Speaker 1: Guess what, Will. What's that, Mango? So, do you remember AOHell? Like, did you or your friends ever use this? I mean, I know AOL. I can't say I remember AOHell. Right. So AOHell was kind of a hacking program you could use in the early days of the internet, and this was back when I was like twelve or thirteen. I mean, I never had it. Like, I was never a great programmer, super into that side of the internet. Like, I was too busy reading Encarta. But, right. But sometimes I'd go over to my friends' houses and watch them play with it. And the thing is, you could cause all sorts of chaos with it. Like, you could log on for free, and, uh, this is when AOL was charging you by the minute, and you could do various pranks. But one of the least used and most powerful things you could do was create a username with AOL in it. What do you mean? Like, you couldn't use AOL in your name? No. So that was one of the things that AOL used to protect itself.
Speaker 1: But through AOHell, you could sign up as, like, AOLCop13 or AOLGuide26, and you could make yourself seem like this legit agent of the company, and my friends used to use this mercilessly. What were they doing, like, stealing credit cards or what? I mean, we're not thieves. Like, my friends would use it for pranks. So often, like, my pal would go into a chat room where it was clear that, like, thirteen-year-old boys were trading photos of, like, scantily clad ladies, and he'd say, "This is AOLCop13. You've been busted." And then he'd say, "Unless you write me a one-page apology for your actions, your parents will be notified and your account will be suspended." And by the end of the summer, you'd have, like, a binder full of weird apologies from kids saying they'd never trade bikini photos online again, and just don't tell their parents, because they're supposed to be playing math games.
Speaker 1: I mean, that is ruthless. I know, but the whole thing did make me wonder, like, how much of our history is still online, and how much does the internet know about all of us? And that's what today's episode is all about. Hey there, podcast listeners! Welcome to Part Time Genius. I'm Will Pearson, and as always I'm joined by my good friend Mangesh Hattikudur, and on the other side of the soundproof glass is our friend and producer Tristan McNeil. I don't know if you noticed, but Tristan seems to be smirking today. Why is he so smirky? Smirks? Good job, Tristan. I guess he's in a good mood. But all right, so it's not Halloween yet, but today we're gonna be exploring a topic that many of us do find pretty creepy, and that's asking the question: what exactly does the internet know about us? And so to find out, we've been combing through user agreements, just to see what sort of personal information we willingly sign away, you know, to use our favorite apps and websites.
Speaker 1: And we'll also take a look behind the digital curtain of the ad-tracking industry, you know, just to learn a little bit about how much our online habits reveal about who we are in the real world. And this was actually a topic suggested by a listener after a trip to the grocery store. She explained that she had purchased a new type of soft drink. She'd never heard of this drink before; she spotted it there, picked it up, bought it, went home. She had never looked it up on her phone, never looked it up on her computer. But then later she found herself getting ads for this same soft drink, and so she was wondering: do these offline purchases somehow connect with what she's seeing online? And so we're gonna be looking into questions like this. But I should take a second to say that if you have a question you'd like us to consider on a future episode, don't forget to reach out to us. We'd love to hear from you. It's parttimegenius at howstuffworks dot com, and on our 24/7 fact hotline, one eight four four PT GENIUS.
Speaker 1: It is still 24/7, right? Around the clock. That's amazing. There aren't many fact hotlines that are 24/7. It's true, it's true. All right. Well, this is also a special episode, because we're teaming up with one of our favorite podcasts to tackle two different but certainly related topics. That's right. To help us cope with some of the unsettling secrets of online privacy, we'll be joined by some of our favorite people: Ben Bowlin, Noel Brown, and Matt Frederick. They've joined us on the show before, but as a reminder, they're the hosts of the awesome podcast Stuff They Don't Want You to Know, and this week the guys did an episode on whether it's possible to wipe your history from the web. So we'll talk to them and see if they have any tips they can pass along for how to get a little bit of privacy on the internet. But if you're interested in this topic, be sure to check out today's episode of Stuff They Don't Want You to Know. I love what those guys do. They really are some of our favorite people.
Speaker 1: I know sometimes we say that, but they actually are some of our favorite people. All of them. They're all good guys. All right. So I thought we could start with just a general overview of what web tracking is and how it works. And so to cover the first part of that, you know, just stop and think about the time you've googled something like, you know, "things to do in Denver," and then the next time you've used Google, you've got a bunch of ads that, you know, are trying to convince you to book a flight to Denver or something like that. Or maybe you've watched a Star Wars trailer on YouTube, and then you sign on to Facebook later that night and your news feed contains a targeted ad for a brand-new Star Wars video game. And I think most of us know by now that those eerily appropriate ads, they aren't by chance. They're the result of web tracking, which is when a series of companies work together and share information. They've compiled all this information on consumers for the sake of tailoring advertisements to suit specific people.
Speaker 1: So, Mango, I know you and Gabe did a bit of digging on this, so why don't you give us a breakdown on how all of this works? Sure. Well, there are actually a few different methods that companies use to gather information about people online, and one of them is this process called canvas fingerprinting. Basically, websites have this feature written into their script so that when you visit the site, it asks your browser to draw a hidden line of text or even a 3D image. And because of differences between browsers and operating systems and computer hardware, the invisible image that each device draws will be different. So I guess this is where the fingerprinting part comes in. Exactly. So the sites that use this method can then share these device fingerprints with third-party advertising companies, and they in turn can use them to identify when and how often the same user on the same device returns to a site, as well as which other sites they're visiting in between.
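The canvas-fingerprinting idea described above can be sketched in a few lines. Real trackers do this in browser JavaScript (draw hidden text to a canvas, read the pixel data back, and hash it); the Python below only illustrates the hashing step, with made-up stand-in bytes in place of actual renderer output:

```python
import hashlib

def canvas_fingerprint(rendered_pixels: bytes) -> str:
    """Hash the invisibly rendered image into a stable device ID."""
    return hashlib.sha256(rendered_pixels).hexdigest()

# Stand-ins for pixel data: different browser/OS/GPU stacks render the
# same hidden text slightly differently, so the bytes differ per device.
device_a = b"Chrome/Windows/NVIDIA rendering of hidden text"
device_b = b"Safari/macOS/Apple GPU rendering of hidden text"

fp_a = canvas_fingerprint(device_a)
fp_b = canvas_fingerprint(device_b)

# Fingerprints differ across devices but stay stable per device,
# which is what lets third parties recognize a returning visitor.
assert fp_a != fp_b
assert fp_a == canvas_fingerprint(device_a)
```

The key property is exactly what's described in the conversation: no cookie is stored, yet the hash acts as a durable identifier any cooperating site can recompute.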
Speaker 1: All right, so you're saying that these techniques like canvas fingerprinting, which I've actually never heard that term before, they allow companies to track your browsing history all across the internet. So it's not just the site where you first have this image drawn? Right. Yeah, because the advertisers can see these fingerprints on any site that allows them. So by looking for which sites the fingerprints show up on, the advertiser can start to get an idea of who this user is and which kinds of products or services they might be most interested in. Actually, I'm curious about this. So these invisible trackers, they're hidden on websites, and they kind of sound like web cookies, which we've all heard of before. Are cookies a form of this same thing? Yeah, that's what I thought too. But cookies are these small pieces of data that a website stores on the browser and then recalls whenever you return to that specific site. So cookies are how your favorite websites remember your name or address, or your password, what's in your shopping cart, like, all that stuff.
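Since cookies come up here, a toy model of the behavior just described, a site storing small pieces of data on the browser and recalling them on return visits, might look like this (illustrative only; real browsers enforce this per-site scoping themselves):

```python
class Browser:
    """Toy cookie jar: each site can only read back what it stored."""

    def __init__(self):
        self.cookies = {}  # site -> {cookie name: value}

    def visit(self, site, set_cookies=None):
        jar = self.cookies.setdefault(site, {})
        if set_cookies:
            jar.update(set_cookies)  # the site stores data on the browser
        return dict(jar)             # ...and recalls it on return visits

b = Browser()
b.visit("shop.example", {"user": "pat", "cart": "soda x2"})

# The same site gets its data back later...
assert b.visit("shop.example") == {"user": "pat", "cart": "soda x2"}
# ...but an unrelated site sees none of it.
assert b.visit("other.example") == {}
```

That per-site scoping is why plain cookies aren't a cross-site tracker on their own, which is the distinction the conversation draws next.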
Speaker 1: So they're definitely related to gathering and storing information about users, but they aren't directly related to advertising on their own. But I do remember Gabe saying something about that. Advertisers are still interested in cookies today; they find ways to make use of them. There's this other process called cookie syncing. It's a way that lets companies share the information they've gathered about you through cookies, so they can trade notes to come up with a clearer picture of who you are. All right, so wait just a minute. Because all these data companies, they collect and they share this information, but it's supposed to be anonymous, right? Like, based on your phone or your computer, they might know where you live, or what town you live in, or maybe even how old you are, but they still don't know your real name or anything too personal about you, do they? Well, it's anonymous to a point. Like, if someone really wanted to, it's entirely possible to connect someone's anonymous cookie data with their real-world identity.
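Cookie syncing, as just described, boils down to two trackers learning that their separate IDs name the same browser so they can merge notes. A minimal sketch, with invented IDs and site names rather than any real company's data:

```python
# Each tracker knows the browser only by its own opaque cookie ID.
tracker_a = {"idA-123": ["news.example", "travel.example"]}
tracker_b = {"idB-987": ["shoes.example"]}

# A "sync" is a hidden request that carries tracker A's ID over to
# tracker B (or vice versa), producing a mapping between ID spaces.
sync_table = {"idA-123": "idB-987"}

def merged_history(id_a):
    """Combine both trackers' notes once their IDs are linked."""
    return tracker_a[id_a] + tracker_b[sync_table[id_a]]

assert merged_history("idA-123") == ["news.example", "travel.example", "shoes.example"]
```

Neither tracker needs your name for this to work; the shared mapping alone yields the "clearer picture" the hosts mention.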
Speaker 1: I mean, we see this kind of thing all the time with cases of identity theft and hacked social media profiles. In fact, it's such a plausible danger that computer scientists don't even use the term "anonymous" anymore. Instead, they say that companies gather "pseudonymous" data on their users. Like, the whole point is that none of us are even anonymous online anymore. Like, we've really just been given a pseudonym in the form of, like, these device fingerprints or these assigned numbers or some other kind of digital trace, which is all a little bit creepy. But I mean, there's still one thing I don't understand, and that's this connection between online advertising and the spending we do in physical stores. So I kind of want to back up to that first question that we asked at the very beginning of this, and that's whether the internet knew my friend had bought that pack of soda. I mean, it sort of feels like it had to know this, right? I mean, otherwise, why, after buying it for the first time, would she suddenly have Facebook ads recommending the same product to her?
Speaker 1: I know, and honestly, that's a really good question. I want to know. But before that, I want to know what soda she bought. What's the new one? I can't remember. We gotta find out. It's gonna be a big one. Hopefully they'll be a sponsor. But it is, you know... I wish you could say it's pure coincidence. You know, it feels a little strange that she bought the soda for the first time, and maybe that was the same day that this big marketing push happened. But, you know, realistically, the odds were against that. I mean, the major data companies gather an insane amount of information about their users. And most folks know that search engines like Google or Bing or Yahoo, like, they keep track of what we search for, but big data's reach goes so much further. So, for instance, like, Google not only knows where you've traveled, thanks to Google Maps and the GPS tracking functions in Android phones, but they also know how fast you were going on the way there. And the same goes for Apple and their iPhones.
Speaker 1: Of course, this means they likely know where you live and work too, since those are the places that you're traveling between the most often. You know, but still, even if my phone, and by extension Google or Apple, you know, they knew I was at the grocery store, how would they know what I bought? And then how would Facebook find out about that? So Gabe did some heavy lifting on this, and in his research on Facebook, it turns out they're actually ahead of the curve in terms of the depth of data they collect on their users, as well as in how they collect it. And the key to that is an advertising tool called Atlas that the company unveiled back in two thousand fourteen. So, according to the vice president of ad tech at Facebook, the Atlas system can match the phone numbers and email addresses of Facebook users with the phone numbers and email addresses that consumers provide in physical stores. All right, so when the stores ask for our email addresses or phone numbers at checkout, it's so that they can then sell that information to Facebook?
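The home-and-work inference mentioned a moment ago, that your most frequent locations give you away, can be sketched with a simple frequency count. The pings and the overnight/business-hours cutoffs below are invented for illustration, not how Google or Apple actually implement this:

```python
from collections import Counter

# (place, local hour) pings, as a location service might log them
pings = [("apt-block", 23), ("apt-block", 2), ("apt-block", 1),
         ("office-park", 10), ("office-park", 14), ("grocery", 18)]

# Home is roughly where the device sits overnight; work is where it
# sits during business hours.
home = Counter(p for p, hr in pings if hr < 6 or hr >= 22).most_common(1)[0][0]
work = Counter(p for p, hr in pings if 9 <= hr < 17).most_common(1)[0][0]

assert (home, work) == ("apt-block", "office-park")
```

Even this crude version shows why a timestamped location history is enough: no label ever says "home," but the pattern does.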
Speaker 1: Is that the only reason they do this? Well, kind of. I mean, sometimes that info is used for that particular store's marketing purposes, you know, to email you their monthly newsletter or coupons or whatever. But there are all kinds of data collection companies that act as middlemen between retail stores and online platforms. So, for instance, Facebook partners with a company called Datalogix to get all its store-related info. Okay, so I've got it. So I scanned my grocery store club card, or I gave the cashier my email address, and that information, along with the purchase connected to it, was then passed along to Datalogix, who then shared it with Facebook's advertising system, which you've told us is called Atlas. That's right, exactly. So Atlas knew that the person with your email address purchased a new kind of soda at that store, and from there, all it had to do was compare that address with the one you've linked to your Facebook account, and presto: now Facebook knows just what kind of ad you're likely to respond to in your news feed.
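The store-to-platform matching described above is, at its core, a join on a shared identifier. In practice, such identifiers are usually exchanged as hashes of a normalized email or phone number rather than as raw text; here's a sketch of that general idea with invented records, not Facebook's actual pipeline:

```python
import hashlib

def normalized_hash(identifier: str) -> str:
    """Hash a normalized identifier so raw emails aren't exchanged."""
    return hashlib.sha256(identifier.strip().lower().encode()).hexdigest()

# What a retail data partner might pass along: hashed email + purchase.
store_records = [(normalized_hash("pat@example.com"), "new soda brand")]

# What the ad platform already holds: hashed email -> account.
accounts = {normalized_hash(" Pat@Example.COM "): "pat_f"}

# Joining on the hash links the in-store purchase to the online
# account; normalization makes differently-typed emails still match.
matches = [(accounts[h], item) for h, item in store_records if h in accounts]

assert matches == [("pat_f", "new soda brand")]
```

Note that the two sides never have to exchange a readable email address, which is partly why this kind of matching stays invisible to the person being matched.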
Speaker 1: After all, you've already bought that soda once. You know, it's weird, because I feel like I'm being clever sometimes when I use this email address that I had never used for, like, real purposes or whatever, you know, to hand to a cashier or to sign up for some account or product. But I'm using the same one for all of them, so they're still able to make this connection. It's probably the same one that's on my Facebook account, you know. So I don't know. I guess it is a pretty ingenious system from a marketing perspective, but, you know, not something that's going to generate all this enthusiasm or goodwill among the public, at least I wouldn't think. Yeah, but here's the creepiest part, and that's that Facebook can do the same with ads that aren't shown on Facebook. I don't even know what that means. So Atlas was made to serve ads on all kinds of different websites, not just Facebook, so the company can match real-world purchases to any ad sourced from the Atlas network, no matter which site it appears on. That's not all, though.
Speaker 1: Facebook can also keep track of its users even when they're logged off of the platform. So anytime you load a web page that features a Facebook Like or Share button, the company gets an alert about who's looking and where, which is basically, like, every page out there on the internet. It's something I had not thought about. I mean, so many of these tracking things, they're not that surprising, but the depth of it, I think, is what's so surprising to me, and that all of this seems to happen under the radar. I mean, I do feel like most of us have this vague awareness of web cookies and targeted advertising and all those sorts of things. But, I don't know, it's still a little unsettling when you start to look into the specifics and just how deep it all goes. Totally. And it only gets more unnerving when you remember that Facebook doesn't exist in a vacuum.
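The Like-button mechanism works because embedding a widget means the browser fetches it from the social network's own servers, and that request carries both the visitor's cookie and the address of the page it came from. A tiny sketch of what the widget's server would see, with field names invented for illustration:

```python
def load_like_button(page_url, user_cookie):
    """Model the request a browser sends when a page embeds the widget."""
    return {
        "host": "social-network.example",  # the embed is served from here,
        "referer": page_url,               # so it learns which page you read
        "cookie": user_cookie,             # and which account was looking
    }

seen = load_like_button("https://news.example/article-42", "uid=abc123")

assert seen["referer"] == "https://news.example/article-42"
assert seen["cookie"] == "uid=abc123"
```

The page you're reading never sends anything itself; simply displaying the button is what phones home, which is why this works on any site that embeds it.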
Speaker 1: I mean, companies like Twitter and Google, like, they've all taken notice and developed their own Atlas-like approaches, and this ability to prove the effectiveness of an online ad gives them more bargaining power with potential clients, which means the sale and trade of our data is only going to increase from here. All right, well, maybe our friends at Stuff They Don't Want You to Know have some ideas for how to shut off the information valves that are telling these companies so much about our lives. So why don't we go ahead and bring them into the studio? Okay, so in this episode we've been talking about all the things the internet knows about us, and as we mentioned at the beginning of the show, our friends over at Stuff They Don't Want You to Know are actually talking about how to remove that information off of the web, or wherever it is sitting. And we're joined by two-thirds of the crew, Ben Bowlin and Matt Frederick. Welcome to Part Time Genius. Yeah, thanks so much, you guys.
Speaker 1: We were really excited to explore this, and, uh, it's fascinating, if a little bit unsettling. A little bit? A lot of it. Thank you. Even hearing just tidbits from you guys as you were talking about putting the episode together, I typically don't consider myself a paranoid person, but it was hard not to go down that path. I think I'm okay. But could you give us just a little bit of an overview of what you guys have been looking into for the episode? Sure. We looked at a pretty basic question, or so we thought in the beginning, which was: can you delete your online footprint, your personal data? We stumbled on the phrase "the digital you," because there's so much information and so many leavings of all your time brushing through the internet, right? Uh, and what we found was that there is no silver bullet, and, uh, you must be very, very careful with the types of terms that are used by various online entities, whether they are a private or a public organization. Yeah, specifically terms of service.
Speaker 1: What Ben is talking about here is that little thing where you go, "Yeah, I accept," that you scroll through. Maybe it makes you scroll to the bottom of the page, and you just hit it and go, okay. Has anyone ever read an entire terms of service? I mean, that's the whole thing, right? Yeah, I did. I did once when my cable was out. It's paged, so you could kind of get through it. It wasn't even in my house; it was at someone else's house. If I had brought a book, I wouldn't have read the terms. Made for a much smarter digital you. So envious of your digital you. So one of my favorite things that I heard you guys talking about was about your AOL account, and Matt, you were actually looking to go back and delete or cancel your AOL account. I was like, why are we having this conversation? So let's go back to two thousand five, two thousand six. Old Matt Frederick is in a band, and every band at that time, what do they have? MySpace. Okay.
Well, also, my email since '97 had been through AOL at this point. For some reason I just stuck with AOL, because my parents had it and I just kept going with it. Anyway, my MySpace page was linked to my AOL account, so in order to find my old password and reset it for MySpace, I had to get into my AOL email, and then I realized: this is still here. And my ex-girlfriend's email is still in there.

Well, the one good thing is that with the AOL service, if you have a free account, they disable your account after, I think, a hundred and eighty days of inactivity, so you're not getting emails anymore. But it's still there if anyone wanted to get in, and my personal information is still inside the profile, which made me super nervous, because I haven't looked at it since then. Anyway, you try to delete the thing, to get rid of the AOL and the MySpace. The MySpace was relatively easy, but for the AOL one, there's supposed to be a little "cancel your account" icon that you can click on in this one specific area.
It wasn't there on the three browsers that I attempted to use, and that made me a little nervous, so I called up AOL. With my phone. Can you believe that? Yeah, I got ahold of somebody. He officially deleted it, and I can still log in right now, which makes me super nervous.

We'd like to make fun of you for that, but you're looking at two people with a Yahoo and, uh... we need to pull that up at some point to see if it still exists.

That's pretty great, and I'm glad to hear AOL is still up to their old tricks. I remember when you used to try to cancel after you'd get the free disc and sign up for however many hours, and they were just awful when you would call and say that you were ready to cancel.

I remember begging a woman. I was like, I know you're reading off a script, just please skip ahead, like, nine pages. I was almost in tears. And it's oddly like a breakup.
And in many cases, you know, the old adage is true: breaking up is hard to do. But in these cases, you run into some misleading and pernicious things.

Right. It's a point that we've made on our episodes before: your account, your personal information, and your use of whatever page or whatever app, it's highly valuable to these companies.

You were talking about friends, though. One of the things you guys talked about was the fact that even if you don't have a social media profile, if you are in your friends' phones or whatever, maybe your phone number, your address, anything like that, it doesn't mean that you're not out there, and your digital you is still out there. So can you talk a little bit about that?

Yeah. Your digital you will still exist, in perhaps somewhat of a less substantial form. So, for instance, say our co-host and colleague Noel Brown did not have a Facebook for some reason. He's actually the most likely of the three of us to have a Facebook.
You would agree with that. If Noel didn't have a Facebook, but Matt or I had the Facebook app on the phone, or one of you guys had the app on your phone, then depending on your OS or the type of phone you're using, the app will automatically have access to all your contacts, which means every phone number and every email that you or your phone touch. Which means that if any of us knew Noel Brown, then they would already start building sort of a framework for him. So they would say: okay, we know there's this guy, his name is Noel Brown, his phone number is, you know, 555-whatever, and he also has this email address, and he knows Mango, and he knows Will, and he knows Matt, and he knows Ben, and they all work together. Therefore he probably... you know what I mean. And this can come from different sources.
So even if you have somehow resisted social media entirely, if you have a phone number and/or an email address and friends, which are fairly easy requirements to meet, then yeah, the odds are overwhelmingly likely that there is some, even rudimentary, version of you online already, without your consent.

Before we let you go, I'm curious. We don't want to give any spoilers, because it is a terrific episode and I hope everyone will check it out. But do you think you guys will change your behaviors in any way based on what you've learned in putting together this episode?

Well, here's the thing: for a long time now we have changed our behaviors, because we've been looking into this kind of stuff for a while. All we can do is encourage everybody else to do your best to just, you know, not give too much away for free. And when you think about all of the apps that you use on a daily, weekly, monthly basis, that you've got on whatever device you use, most of them ask you for your contacts.
Most of them are trying to find out where you're going, what you're doing. Take pictures of yourself and all your friends and your dog and your family. Just think about it.

I did assiduously go through and remove apps on my phone, until, you know, you realize at some level you become like the kid in the old Dutch fairy tale, trying to plug up a dam one hole at a time. There are other avenues for this information exchange, and this means that for a lot of people, myself included... when I was removing apps, it hit me, and I thought, am I really making a difference, or am I just making myself feel better? You know what I mean?

All right. Well, I hope all of our listeners will hop on over to Stuff They Don't Want You to Know and check out this episode, "Can You Really Erase Your Digital Self?" Matt and Ben, thanks so much for joining us.

Thank you. Thanks.
You're listening to Part-Time Genius, and we're talking about all the creepy things the internet knows about us.

All right, Mango. So before the break, we were talking about how the purchases we make in physical stores can inform advertisements we see later. So there's this blurring of the line between the digital world and, I don't know, the real world, I guess. And it kind of got me thinking: what are some of the other ways that our close connection to the web affects our offline lives?

Right. Because with smartphones, tablets, and all the other easy-to-use recording devices we have now, the people alive today are pretty much guaranteed to be the most self-documented generation of humans to date. Which seems doubly true when you add to that the fact that enormous companies are now tracking and recording our interests and behaviors and everything else for marketing purposes. And I mean, if widespread personal documentation has never really been done on this level before, you have to assume there will be all kinds of interesting social and psychological outcomes from it.
You know, for better or for worse.

Yeah, it's true. And not to keep harping on Facebook, but one of the most interesting examples I found for how these digital platforms affect our lives has to do with the company's Memories algorithms. You know, where it'll pop up: you'll sign on, and in your feed you'll see these memories from over the years, or of the past year, or whatever it might have been. And so Facebook uses this algorithm to select these pictures and posts from our past that it then hopes we're most likely to share, which is how they determined that, like, my year in review is best summed up by when I was stuffing my face with ramen last January. Last January, last week, whenever it was.

I think that's pretty much it. And judging by how often I've seen that exact scene, I'm gonna say they're not too off base with it.

But you know, when you share these memories Facebook selects for us, what we're doing is we're helping the algorithm learn a little bit more about our preferences, and then it can play to those more in the future.
And do we know at all how this digital memory creation might be affecting us?

Well, I did come across this really interesting article in Scientific American. It was written by Julia Shaw, and it talks a little bit about the implications of making forms of social media nostalgia like this into a mainstream thing. And here's what she writes. She says: "By having Facebook choose which events are presented as the most meaningful in our lives, it's potentially culling the memories the algorithm ignores. Simultaneously, it's reinforcing the memories it has chosen, potentially making some memories seem more meaningful and memorable than they originally were. Both of these are problematic processes that can distort our personal reality. We may be helping Facebook learn to optimize its features, but the relationship is not symbiotic. Facebook's nostalgia features are messing with our memories."

Which is interesting. But I feel like the photos we keep around have always done this. Like, have you read Susan Sontag's On Photography essay?

I haven't.

I only vaguely remember it from college.
But she talked about how the photo you have on your desk from your family road trip is a picture of a family smiling in a gorgeous place, right? But in capturing that, you're really only showing your most ideal photo. You're forgetting the hours of bickering in the car, and telling your kids to stop fighting, and the boredom and all the arguing that happened, and you're just focusing on this one kind of manufactured pic where everyone's hugging and smiling.

Right. And it's true. I mean, we self-select the photos that go up on social media to such a degree that we're already altering our memories, and Facebook only seems to be putting that into hyperdrive.

I don't know about your family, but all we do on our trips is hug and smile and pose together. We're always, like, one inch apart from each other.
I don't know, maybe you're right. Maybe we're worrying a little too much about this when it comes to Facebook, especially since it might give us a case of paranoia, which is definitely something else I wanted to talk about, because that's another common condition that's connected to all this online data tracking.

Well, I mean, it's hard not to be paranoid, right? There's this almost universal feeling that our online movements are being tracked and recorded, and because the specifics are so hazy, it's just something that you can't get out of your mind. And so this is something that came up a lot during my research. For example, I read this report in The Guardian about last year's Consumer Electronics Show, and it talked about how all these companies that make biometrics and tracking gadgets have lately found themselves dealing with an entirely new clientele.

And so who would that be?

Well, that's the thing. The new customers are ordinary, everyday people.
I mean, previously, the people who were most interested in fingerprint scanners for all their electronic devices, and the zinc-lined cases to shield their phones from electric fields, they were really paranoid. They're the people that Nellie Bowles of The Guardian refers to as the tinfoil hat brigade.

Right. Okay, I think I get it. I mean, these companies were mostly used by customers who had, I don't know, dabbled with a conspiracy theory or two, or perhaps they were a bit untrusting of large corporations that monitor them too closely. I mean, I think we all have an image of that type, I guess.

Exactly. But now they're finding the general public is starting to feel those same kinds of concerns, and they're taking an interest in the products as well, and that totally freaks these suppliers out. In that Guardian article I mentioned, the author describes all these vendors that are pretty uneasy at the thought of their wares being mainstream. So listen to how the author describes the scene.
Quote: "The personal security department at CES, once a quiet, overlooked corner of the flashy gadget show, was packed and frenzied this year. Its popularity stirred an internal reckoning for the security gadget makers, who are now central to the conversation about privacy and politics. Some longtime sellers are worried about their new buyers."

Well, I don't know, and maybe for good reason. Honestly, what does it say about society that the average Joe is now suddenly taking an interest in these privacy apps and the anti-surveillance gadgets and all that kind of thing?

Nothing good. It actually makes me think of, you know, this new parenting technology that you see all over. I mean, this is a little bit different, but you've seen this stuff, and it's stuff that I'm kind of glad didn't exist when our kids were much younger. Stuff like the smart baby beds and infant heart rate readers that are meant to put parents at ease and, you know, keep them over-informed about every little detail of their little one's health. I mean, have you heard of this new robot nanny tech?
I think Mattel put it out earlier this year, and it's called Aristotle.

Of course, call it Aristotle, it must be so smart.

It's basically a version of Amazon's Alexa, but for kids. It's got all these high-tech features, like voice and image recognition and live streaming. But what it also does is it watches, and it listens, and it learns from these babies left under its care. And I don't know, I guess it could help parents keep track of stuff like diaper changes and feedings, but it could also file away all of that user data for Mattel's future commercial endeavors.

Yeah, so that's definitely something to keep an eye on, for sure. But I mean, I want to make sure we aren't coming across as, like, Luddites here. Because, don't get me wrong, there are all kinds of new technologies that can make childcare and other aspects of parenting much, much easier, and the same is true for, like, technologies for other non-kid things. But it's just that all this information can make things more complicated and leave parents more worried than maybe they should be.

Yeah, I mean, that's true.
And having all those stats at their fingertips can make parents a little paranoid, or even obsessed with things about their child that really are no big deal. I mean, it's that constant biometric reporting and these status updates that can make even minor details seem so much more grim and dire. Again, I'm so glad this stuff didn't exist, because I would have totally been that paranoid parent.

Yeah. And then there's another angle on this whole wearable technology, smart device trend that I want to make sure we cover. So I wanted to talk about the Internet of Things. That's the phrase a tech developer named Kevin Ashton came up with to describe the interconnected network of physical devices and vehicles that electronically collect and share data. So think of everything from smartphones to smart refrigerators to robotic vacuums. I mean, the list goes on and on, and pretty soon it's going to be, like, every single product, you know?
And I feel like most of us started hearing about this Internet of Things just a few years ago, but it's actually an older term at this point, right? I was reading about some of this, and it goes back to, I think, the late nineties or so, and this was before the smart tech boom of recent years.

Yeah, it goes back to the late nineties, I think, and it was definitely forward-looking. And I remember my friend was telling me about it at the time, and he said MIT was developing this tablecloth that was going to be a smart tablecloth. And I was like, how can you make a smart tablecloth? And he said, basically, if you're putting a drink down, like a glass of wine, at a wrong angle, it'll account for that and catch the glass. And he was also talking about, like, umbrellas that would tell you it's about to rain and remind you to take them with you.
And this sounded crazy. But flash forward twenty years, and according to the analyst firm Gartner, a large share of the world population is now online, and around 8.4 billion web-connected items are in use.

Wow, that's a lot of devices, especially when you consider that both of those numbers are only going to rise as time goes on. You know, there's this growing catalog of the Internet of Things, and you think about how widespread these devices are, from cars to household appliances, or these health-monitoring devices we were talking about, security sensors, and all of these fitness trackers that everybody seems to be using. And it's not just for them; it's, like, their dogs and their cats and their cows and their babies and all of these things. So it's everywhere, and it's led to this level of connectedness that the world has never seen before. And with that comes this unique brand of paranoia.

That's right.
So the Pew Research Center published a report earlier this year that highlighted the widespread concerns people have about cyberattacks and account hacking and privacy violations related to all these smart things. And you know, it's interesting: the very connectedness of the IoT is what makes it seem like a liability, and it is a liability. Like, I was watching this thing from Wired, and they showed how hackers can hack into a car on the highway and just slow it down to twelve miles per hour. And in fact, I have this one friend who won't buy cars made after a certain point, in the late eighties or something, because he has that same fear. But so why do you think so many of us buy into this idea of connecting and monitoring all these different things, when we do realize that they could be potentially dangerous, or somehow asking for trouble if we do so? I don't know, why do we embrace this so wholeheartedly?
Well, 664 00:33:32,760 --> 00:33:34,400 Speaker 1: it's it's for the same reason most of us don't 665 00:33:34,440 --> 00:33:37,640 Speaker 1: bother changing our privacy settings on Google or even adjusting 666 00:33:37,680 --> 00:33:40,440 Speaker 1: the ad preferences on our Facebook accounts, even though we're 667 00:33:40,560 --> 00:33:43,239 Speaker 1: vaguely aware that they're siphoning our personal info for their 668 00:33:43,280 --> 00:33:48,760 Speaker 1: own gain. And that reason is what? Convenience. Okay, I mean, 669 00:33:48,840 --> 00:33:51,160 Speaker 1: it's it's so much easier to accept that our cool 670 00:33:51,240 --> 00:33:54,000 Speaker 1: tech comes with like a few unsightly strings attached and 671 00:33:54,120 --> 00:33:56,160 Speaker 1: and just move on from there. And I don't know, 672 00:33:56,200 --> 00:33:57,520 Speaker 1: I mean, I guess that makes sense. I mean, think 673 00:33:57,520 --> 00:33:59,720 Speaker 1: about how hard it would be for people to disconnect 674 00:33:59,760 --> 00:34:02,560 Speaker 1: from all these platforms and devices that have become a 675 00:34:02,720 --> 00:34:05,520 Speaker 1: bigger and bigger part of our lives. I mean, I 676 00:34:05,560 --> 00:34:08,239 Speaker 1: would think some people would probably trade any amount of 677 00:34:08,320 --> 00:34:11,440 Speaker 1: privacy just to hold onto that connection. Definitely. And that 678 00:34:11,600 --> 00:34:15,040 Speaker 1: same Pew report I mentioned earlier, like it basically says 679 00:34:15,120 --> 00:34:18,279 Speaker 1: the same thing. They talked to experts and were able 680 00:34:18,320 --> 00:34:21,560 Speaker 1: to conclude from their responses that the desire for convenience 681 00:34:21,640 --> 00:34:24,840 Speaker 1: trumps concerns about safety in most people's minds. 
And and 682 00:34:24,920 --> 00:34:27,160 Speaker 1: here's a quote from Lee Rainie, the co author and 683 00:34:27,280 --> 00:34:31,920 Speaker 1: director of Pew's Internet Technology and Science Research Center: The 684 00:34:32,040 --> 00:34:35,920 Speaker 1: IoT will bring advantages that are useful, and people's desire 685 00:34:36,000 --> 00:34:39,040 Speaker 1: for convenience will usually prevail over their concerns about risk, 686 00:34:39,120 --> 00:34:42,160 Speaker 1: and these factors will make it difficult, if not impossible, 687 00:34:42,360 --> 00:34:46,080 Speaker 1: for people to opt out of a highly connected life. Wow, alright, well, 688 00:34:46,120 --> 00:34:48,360 Speaker 1: I mean, I know the Internet of Things is supposed 689 00:34:48,360 --> 00:34:50,560 Speaker 1: to be this hopeful concept and it points to this, 690 00:34:51,040 --> 00:34:55,359 Speaker 1: I don't know, this more cohesive or connected society. But that's 691 00:34:55,360 --> 00:34:57,760 Speaker 1: a lot to think about. It's it's pretty crazy, 692 00:34:57,920 --> 00:35:00,040 Speaker 1: but I don't want to go too far down a 693 00:35:00,080 --> 00:35:02,719 Speaker 1: paranoia path, because I do have to say, 694 00:35:02,840 --> 00:35:05,200 Speaker 1: on a funnier note, just looking around, it was great 695 00:35:05,280 --> 00:35:07,920 Speaker 1: stumbling into some of these ridiculous attempts to cash in 696 00:35:08,080 --> 00:35:11,600 Speaker 1: on this growing IoT market, and I jotted down a 697 00:35:11,640 --> 00:35:13,319 Speaker 1: few of these, so I kind of want to take 698 00:35:13,320 --> 00:35:15,040 Speaker 1: a few minutes to share some of the more absurd 699 00:35:15,080 --> 00:35:17,360 Speaker 1: examples of these so called smart devices. What do you 700 00:35:17,400 --> 00:35:19,759 Speaker 1: say we do? Yeah, I love that idea, so I'm 701 00:35:19,800 --> 00:35:22,239 Speaker 1: gonna start. 
A funny one I found is called the 702 00:35:22,440 --> 00:35:24,600 Speaker 1: RollScout, but you can go ahead and think of 703 00:35:24,640 --> 00:35:27,560 Speaker 1: it as the Internet of toilet paper. We needed an 704 00:35:27,600 --> 00:35:30,320 Speaker 1: Internet of toilet paper? Basically, it's a toilet paper holder that 705 00:35:30,480 --> 00:35:33,680 Speaker 1: informs you via text, email or app notification when the 706 00:35:33,800 --> 00:35:38,080 Speaker 1: roll is empty, and each holder costs about sixty bucks. I know, 707 00:35:38,600 --> 00:35:41,160 Speaker 1: but you know. According to the creators, RollScout is 708 00:35:41,440 --> 00:35:45,800 Speaker 1: quote especially useful for small businesses such as cafes and 709 00:35:45,920 --> 00:35:49,040 Speaker 1: restaurants that have public restrooms and are focused on providing 710 00:35:49,080 --> 00:35:52,560 Speaker 1: the best customer experience possible. I have to admit I 711 00:35:52,640 --> 00:35:54,000 Speaker 1: was going to say that was one of the dumbest 712 00:35:54,040 --> 00:35:57,440 Speaker 1: things I'd heard of, but actually, for a restaurant or 713 00:35:57,480 --> 00:36:00,040 Speaker 1: a hotel or something like that, I kind of get it. 714 00:36:00,160 --> 00:36:01,719 Speaker 1: I also kind of want to buy one of these 715 00:36:01,800 --> 00:36:04,799 Speaker 1: just because it's so stupid. But sixty bucks, I think 716 00:36:04,840 --> 00:36:07,040 Speaker 1: I'm gonna have to pass on that. All right, Well, 717 00:36:07,120 --> 00:36:08,800 Speaker 1: here's one in a similar vein of things that we 718 00:36:08,960 --> 00:36:11,920 Speaker 1: put into this much easier to just check yourself category. 
719 00:36:12,000 --> 00:36:14,279 Speaker 1: And it's it's a product from Thermos and it's called 720 00:36:14,360 --> 00:36:18,319 Speaker 1: the Connected Hydration Bottle and it it basically keeps track 721 00:36:18,400 --> 00:36:20,719 Speaker 1: of how much water you drink and sends your current 722 00:36:20,840 --> 00:36:24,080 Speaker 1: tally to either your Fitbit app or Thermos's own 723 00:36:24,200 --> 00:36:28,240 Speaker 1: water tracking app. And it costs what, seven thousand dollars? I actually 724 00:36:28,239 --> 00:36:29,799 Speaker 1: didn't see how much it costs, but it just still 725 00:36:29,840 --> 00:36:33,920 Speaker 1: seemed unnecessary. So I found one called the Eggminder, and 726 00:36:34,000 --> 00:36:36,480 Speaker 1: it's a tray that fits inside your refrigerator and keeps 727 00:36:36,520 --> 00:36:39,200 Speaker 1: track of the quantity and freshness of up to fourteen 728 00:36:39,239 --> 00:36:41,960 Speaker 1: of your favorite eggs. And don't try putting a fifteenth 729 00:36:42,000 --> 00:36:44,080 Speaker 1: in there. So this is it's kind of like the 730 00:36:44,160 --> 00:36:47,600 Speaker 1: Internet of egg trays. What is it monitoring? Is it 731 00:36:47,680 --> 00:36:50,279 Speaker 1: monitoring like for cracks in their shells or something like 732 00:36:50,400 --> 00:36:54,000 Speaker 1: that or what? No, in fact, structural integrity is 733 00:36:54,040 --> 00:36:57,200 Speaker 1: a little out of its scope, but it will send 734 00:36:57,239 --> 00:36:59,239 Speaker 1: a wireless signal to your phone to keep you abreast 735 00:36:59,280 --> 00:37:01,680 Speaker 1: of important developments going on in your egg tray, 736 00:37:01,800 --> 00:37:04,920 Speaker 1: so things like how many eggs it currently contains and 737 00:37:05,160 --> 00:37:07,560 Speaker 1: whether any of them are at risk of spoiling. This 738 00:37:07,760 --> 00:37:10,839 Speaker 1: seems riveting. 
I mean, I know, we cover a lot 739 00:37:10,880 --> 00:37:12,759 Speaker 1: of topics on here that seem sort of silly at 740 00:37:12,800 --> 00:37:15,319 Speaker 1: first glance, and then they kind of make more 741 00:37:15,400 --> 00:37:17,120 Speaker 1: sense once you look into them a bit more. And 742 00:37:17,200 --> 00:37:19,440 Speaker 1: I have to admit the toilet paper thing was that 743 00:37:19,600 --> 00:37:21,120 Speaker 1: for me. But I don't know. I'm going to go 744 00:37:21,160 --> 00:37:22,680 Speaker 1: out on a limb here and say the Eggminder 745 00:37:22,840 --> 00:37:25,719 Speaker 1: is not one of those things. Yeah, it's probably a 746 00:37:25,760 --> 00:37:29,000 Speaker 1: safe bet, but you know, the worthiness of other IoT 747 00:37:29,160 --> 00:37:31,839 Speaker 1: products and even data tracking as a whole is still 748 00:37:31,960 --> 00:37:34,560 Speaker 1: something that's up for debate, and chances are we'll be 749 00:37:34,640 --> 00:37:37,200 Speaker 1: hearing a lot more about both their time saving advantages 750 00:37:37,320 --> 00:37:40,960 Speaker 1: and and their privacy infringing drawbacks in the years ahead. Yeah, 751 00:37:41,000 --> 00:37:43,600 Speaker 1: there's this quote from a writer and sociology 752 00:37:43,640 --> 00:37:46,600 Speaker 1: professor named Neil Gross, and it kind of seems fitting 753 00:37:46,680 --> 00:37:49,120 Speaker 1: for this discussion. I jotted it down here and it goes 754 00:37:49,360 --> 00:37:52,240 Speaker 1: like this. It says, um, in the next century, 755 00:37:52,320 --> 00:37:55,680 Speaker 1: planet Earth will don an electronic skin. It will use 756 00:37:55,760 --> 00:37:59,200 Speaker 1: the Internet as a scaffold to support and transmit its sensations. 757 00:37:59,640 --> 00:38:03,120 Speaker 1: This is already being stitched together. 
It consists of millions 758 00:38:03,160 --> 00:38:09,040 Speaker 1: of embedded electronic measuring devices: thermostats, pressure gauges, pollution detectors, cameras, microphones, 759 00:38:09,120 --> 00:38:13,799 Speaker 1: glucose sensors, EKGs, electroencephalographs. These 760 00:38:13,800 --> 00:38:17,799 Speaker 1: will probe and monitor cities and endangered species, the atmosphere, 761 00:38:17,880 --> 00:38:22,600 Speaker 1: our ships, highways and fleets of trucks, our conversations, our bodies, 762 00:38:23,080 --> 00:38:26,800 Speaker 1: even our dreams. Wow. So that that's definitely a smart 763 00:38:26,960 --> 00:38:29,560 Speaker 1: and poetic way to think about the future. But um, 764 00:38:29,800 --> 00:38:31,759 Speaker 1: you know what else is smart and poetic? The Part 765 00:38:31,800 --> 00:38:45,520 Speaker 1: Time Genius fact off. That's right, let's do it. I can 766 00:38:45,640 --> 00:38:47,520 Speaker 1: kick us off here, all right. Here's something to add 767 00:38:47,520 --> 00:38:50,319 Speaker 1: to the already immense list of creepy things Facebook knows 768 00:38:50,400 --> 00:38:53,160 Speaker 1: about you, and that's whether or not your relationship will last. 769 00:38:53,560 --> 00:38:56,400 Speaker 1: In a two thousand fourteen study of Facebook data, scientists 770 00:38:56,440 --> 00:39:00,520 Speaker 1: concluded that, based on activities and status updates, the company 771 00:39:00,600 --> 00:39:03,480 Speaker 1: can make scarily accurate predictions about whether a couple will 772 00:39:03,520 --> 00:39:07,560 Speaker 1: sink or swim. According to this research, being Facebook official 773 00:39:08,040 --> 00:39:10,680 Speaker 1: really helps your chances. 
I wanted to congratulate you and 774 00:39:10,760 --> 00:39:14,440 Speaker 1: Lizzie on being Facebook official. But about half of all Facebook 775 00:39:14,480 --> 00:39:17,440 Speaker 1: relationships that have survived three months are likely to survive 776 00:39:17,560 --> 00:39:21,719 Speaker 1: four years or longer. Well, speaking of gigantic companies that 777 00:39:21,840 --> 00:39:24,279 Speaker 1: know way too much information about us, did you know 778 00:39:24,400 --> 00:39:27,120 Speaker 1: that Target can tell if a female customer is pregnant, 779 00:39:27,400 --> 00:39:31,440 Speaker 1: occasionally even before she knows it herself? I know. So, 780 00:39:31,840 --> 00:39:34,520 Speaker 1: back in two thousand two, a statistician named Andrew 781 00:39:34,600 --> 00:39:36,960 Speaker 1: Pole was hired to help the chain develop a pregnancy 782 00:39:37,080 --> 00:39:40,200 Speaker 1: prediction model to better advertise to soon to be mothers. 783 00:39:40,640 --> 00:39:43,759 Speaker 1: And as soon as the model was implemented, Target's mom 784 00:39:43,840 --> 00:39:47,279 Speaker 1: and baby sales skyrocketed, and they helped to 785 00:39:47,320 --> 00:39:51,399 Speaker 1: grow Target's revenue by thirteen billion dollars in the same period. God, 786 00:39:51,520 --> 00:39:54,000 Speaker 1: that's crazy. All right, Well, how about I Know Where 787 00:39:54,040 --> 00:39:58,040 Speaker 1: Your Cat Lives dot com, a site that helps make 788 00:39:58,080 --> 00:40:01,719 Speaker 1: the Internet a little more transparent. So it uses the invisible geographic 789 00:40:01,800 --> 00:40:04,640 Speaker 1: location data embedded in the cat photos we upload to 790 00:40:04,680 --> 00:40:07,120 Speaker 1: social media. Or other people do. I've never uploaded 791 00:40:07,160 --> 00:40:10,120 Speaker 1: cat photos to social media. 
The site presents a sampling of 792 00:40:10,200 --> 00:40:13,640 Speaker 1: over one million public pictures of cats aligned with their 793 00:40:13,719 --> 00:40:18,759 Speaker 1: real locations on a world map. That sounds terrifying. So, 794 00:40:18,960 --> 00:40:20,800 Speaker 1: we we've been talking so much about what the Internet 795 00:40:20,880 --> 00:40:22,759 Speaker 1: knows about us that I thought it'd be smart to 796 00:40:22,800 --> 00:40:24,840 Speaker 1: turn the tables and mention a few cool things we 797 00:40:25,000 --> 00:40:28,280 Speaker 1: know about the Internet. Yeah. So, so, for for starters, 798 00:40:28,400 --> 00:40:31,240 Speaker 1: all the moving information contained on the Internet weighs about 799 00:40:31,440 --> 00:40:35,400 Speaker 1: as much as a single strawberry. That's according to 800 00:40:35,480 --> 00:40:38,880 Speaker 1: a physicist named Russell Seitz, who determined that the billions 801 00:40:38,960 --> 00:40:41,680 Speaker 1: of bits of data in motion, like all those moving electrons on 802 00:40:41,719 --> 00:40:44,840 Speaker 1: the Internet, they only add up to roughly fifty grams 803 00:40:45,040 --> 00:40:48,239 Speaker 1: or two ounces. And and as for the Internet's data 804 00:40:48,280 --> 00:40:51,840 Speaker 1: at rest, the five million terabytes or so of static 805 00:40:51,960 --> 00:40:54,839 Speaker 1: information in storage, like that adds up to even less 806 00:40:54,880 --> 00:40:57,880 Speaker 1: mass than a grain of sand. Wow. That is incredible, 807 00:40:58,800 --> 00:41:00,600 Speaker 1: all right. 
Well, for for anyone who wants an eerie 808 00:41:00,600 --> 00:41:03,920 Speaker 1: illustration of web tracking at work, check out the experimental 809 00:41:03,960 --> 00:41:06,919 Speaker 1: website known as ClickClickClick. Once you're there, a stream 810 00:41:06,960 --> 00:41:09,640 Speaker 1: of detailed information will call out all the info your 811 00:41:09,680 --> 00:41:12,839 Speaker 1: browser is leaking online, everything from the number of core 812 00:41:12,960 --> 00:41:15,680 Speaker 1: processors in your computer to where your cursor is currently 813 00:41:15,760 --> 00:41:17,840 Speaker 1: hovering on the page. If you want to try it 814 00:41:17,840 --> 00:41:20,760 Speaker 1: out for yourself, go to click click click dot click. 815 00:41:22,320 --> 00:41:25,600 Speaker 1: Pretty easy to remember. So here's another dirt-we've-got-on-the-Internet 816 00:41:25,680 --> 00:41:29,239 Speaker 1: fact. According to an article in New Scientist, 817 00:41:29,600 --> 00:41:32,960 Speaker 1: some researchers now estimate it takes a stunning hundred 818 00:41:33,080 --> 00:41:36,400 Speaker 1: fifty two billion kilowatt hours per year simply 819 00:41:36,480 --> 00:41:39,280 Speaker 1: to power the data centers that allow the Internet to function. 820 00:41:39,480 --> 00:41:41,680 Speaker 1: So if you were to add that power to the 821 00:41:41,800 --> 00:41:44,479 Speaker 1: energy used by all the computers and other devices linked 822 00:41:44,480 --> 00:41:46,520 Speaker 1: to the Net, the total would account for as much 823 00:41:46,520 --> 00:41:49,080 Speaker 1: as two percent of all the CO2 emissions caused 824 00:41:49,120 --> 00:41:52,400 Speaker 1: by humans. And just to give that a little perspective, 825 00:41:52,840 --> 00:41:54,879 Speaker 1: that two percent would put the Internet on the same 826 00:41:55,040 --> 00:41:59,480 Speaker 1: level as the entire aviation industry. 
That's a lot. Okay, 827 00:42:00,040 --> 00:42:02,120 Speaker 1: well, I do have to say that large carbon 828 00:42:02,160 --> 00:42:04,920 Speaker 1: footprint is definitely something we can use to incriminate the 829 00:42:04,960 --> 00:42:07,000 Speaker 1: Internet should it ever come to that. We'll see if 830 00:42:07,040 --> 00:42:10,080 Speaker 1: that happens. But but on that basis alone, I think 831 00:42:10,120 --> 00:42:12,440 Speaker 1: you get to take home today's title, and that's it 832 00:42:12,520 --> 00:42:14,319 Speaker 1: for today's show. But be sure to hop on over 833 00:42:14,400 --> 00:42:15,839 Speaker 1: to Stuff They Don't Want You To Know to check 834 00:42:15,880 --> 00:42:18,440 Speaker 1: out their episode on how people are attempting to remove 835 00:42:18,520 --> 00:42:21,640 Speaker 1: their identity from the Internet. Thanks again to Noel, Ben 836 00:42:21,760 --> 00:42:23,640 Speaker 1: and Matt for joining us, and thanks so much to 837 00:42:23,680 --> 00:42:39,320 Speaker 1: all of you for listening. Everybody, Thanks again for listening. 838 00:42:39,520 --> 00:42:41,640 Speaker 1: Part Time Genius is a production of HowStuffWorks 839 00:42:41,680 --> 00:42:44,200 Speaker 1: and wouldn't be possible without several brilliant people who do 840 00:42:44,320 --> 00:42:47,320 Speaker 1: the important things we couldn't even begin to understand. Christa 841 00:42:47,400 --> 00:42:49,960 Speaker 1: McNeil does the editing thing. Noel Brown made the theme 842 00:42:50,040 --> 00:42:52,920 Speaker 1: song and does the mixy mixy sound thing. Jerry Rowland 843 00:42:52,960 --> 00:42:56,240 Speaker 1: does the exec producer thing. Gabe Luesier is our lead researcher, 844 00:42:56,320 --> 00:42:59,280 Speaker 1: with support from the Research Army including Austin Thompson, Nolan 845 00:42:59,320 --> 00:43:01,480 Speaker 1: Brown and Lucas Sadam. And Eves Jeffcoat gets the 846 00:43:01,520 --> 00:43:03,759 Speaker 1: show to your ears. 
Good job, Eves. If you like 847 00:43:03,880 --> 00:43:05,640 Speaker 1: what you heard, we hope you'll subscribe. And if you 848 00:43:05,760 --> 00:43:07,719 Speaker 1: really really like what you've heard, maybe you could leave 849 00:43:07,719 --> 00:43:09,720 Speaker 1: a good review for us. Did we forget 850 00:43:09,800 --> 00:43:10,960 Speaker 1: Jason? Jason, who