1 00:00:04,400 --> 00:00:07,760 Speaker 1: Welcome to TechStuff, a production from iHeartRadio. 2 00:00:11,800 --> 00:00:15,000 Speaker 1: Hey there, and welcome to TechStuff. I'm your host 3 00:00:15,200 --> 00:00:18,000 Speaker 1: Jonathan Strickland. I'm an executive producer with iHeartRadio. 4 00:00:18,079 --> 00:00:20,680 Speaker 1: And how the tech are you? It's Friday. It's time 5 00:00:20,720 --> 00:00:24,240 Speaker 1: for a classic episode. This episode is called The Evils 6 00:00:24,280 --> 00:00:27,800 Speaker 1: of Data Tracking and it originally published on December six, 7 00:00:28,440 --> 00:00:34,080 Speaker 1: two thousand fifteen, and Ben Johnson, another podcaster and journalist, 8 00:00:34,560 --> 00:00:37,640 Speaker 1: joined the show for this discussion. Keep in mind, you 9 00:00:37,680 --> 00:00:41,280 Speaker 1: know, this was done back in 2015, and the issues with data 10 00:00:41,320 --> 00:00:45,279 Speaker 1: tracking have only gotten worse since then. It has not 11 00:00:46,080 --> 00:00:50,600 Speaker 1: improved by any stretch, except for the various agencies that 12 00:00:50,680 --> 00:00:56,280 Speaker 1: are tracking data, whether those are corporations or governments. So um, 13 00:00:56,320 --> 00:01:01,440 Speaker 1: fun times, but enjoy this classic episode. I got a 14 00:01:01,680 --> 00:01:03,800 Speaker 1: chance to be a guest on a show you 15 00:01:03,840 --> 00:01:07,760 Speaker 1: host called Codebreaker. Yes. And on Codebreaker, when 16 00:01:07,800 --> 00:01:12,720 Speaker 1: we talked, I made a reference to Shakespeare. Yes. Now 17 00:01:12,760 --> 00:01:15,760 Speaker 1: my question for you is, are you ever confused with 18 00:01:15,840 --> 00:01:21,319 Speaker 1: the seventeenth-century British playwright Ben Jonson? Oh man, all 19 00:01:21,400 --> 00:01:25,160 Speaker 1: the time, all the time. Embarrassing. Yeah, I know, 20 00:01:25,440 --> 00:01:28,959 Speaker 1: I've been. You know, I have been to Stratford-upon-Avon. 21 00:01:29,120 --> 00:01:32,640 Speaker 1: I have done that whole thing, and and and I 22 00:01:32,760 --> 00:01:36,320 Speaker 1: used to... you know, I never get confused for that 23 00:01:36,400 --> 00:01:39,759 Speaker 1: Ben Jonson. But in the past, at least, people made 24 00:01:39,840 --> 00:01:44,319 Speaker 1: jokes about steroids because of the other Ben Johnson, the Canadian runner, 25 00:01:44,920 --> 00:01:46,920 Speaker 1: and I used to have a T-shirt that said 26 00:01:46,920 --> 00:01:49,480 Speaker 1: I'm not on steroids, but thanks for asking, which was 27 00:01:49,520 --> 00:01:52,680 Speaker 1: a kind of inside joke among my friends, but it 28 00:01:52,760 --> 00:01:55,720 Speaker 1: was, you know, I've never been... no, I've never been 29 00:01:55,720 --> 00:01:58,440 Speaker 1: that poetic, I've never been that good at sonnets. So 30 00:01:58,560 --> 00:02:05,400 Speaker 1: usually I'm not confused. It's really funny because 31 00:02:05,440 --> 00:02:08,720 Speaker 1: in an upcoming episode of TechStuff, I'm going 32 00:02:08,760 --> 00:02:11,400 Speaker 1: to have my friend Iyaz Akhtar from CNET 33 00:02:11,560 --> 00:02:15,880 Speaker 1: on, and his Twitter handle is @iyaz, 34 00:02:16,440 --> 00:02:19,359 Speaker 1: so he gets a lot of hip hop and rap questions. 35 00:02:20,480 --> 00:02:25,040 Speaker 1: So yeah, it's like... me, not so much. I'm okay 36 00:02:25,040 --> 00:02:27,640 Speaker 1: with that. 
But the reason I have been on today 37 00:02:27,800 --> 00:02:31,160 Speaker 1: is because we're doing something kind of a complementary episode 38 00:02:31,160 --> 00:02:34,440 Speaker 1: to the Codebreaker episode I was on, which was all 39 00:02:34,480 --> 00:02:37,960 Speaker 1: about data tracking and data mining, and to kind of 40 00:02:37,960 --> 00:02:42,080 Speaker 1: talk about what that is, what it's used for, what 41 00:02:42,280 --> 00:02:45,000 Speaker 1: is the best-case scenario, like, why do we 42 00:02:45,120 --> 00:02:49,720 Speaker 1: want it? And what are some really, really good reasons 43 00:02:49,760 --> 00:02:53,000 Speaker 1: why we may not want it. Uh, despite the fact 44 00:02:53,080 --> 00:02:54,840 Speaker 1: that at this point it's here to stay. I think, 45 00:02:54,880 --> 00:02:59,480 Speaker 1: spoiler alert, data tracking is not going away, right, 46 00:02:59,560 --> 00:03:04,240 Speaker 1: until the sort of post-apocalyptic world descends, 47 00:03:04,320 --> 00:03:07,639 Speaker 1: until we're all in, um, a Fallout 4 world or whatever 48 00:03:07,639 --> 00:03:10,680 Speaker 1: it is, or, or we reach the singularity and we're sharing 49 00:03:10,680 --> 00:03:13,840 Speaker 1: the same consciousness anyway, right, yeah, and then it doesn't 50 00:03:13,840 --> 00:03:16,200 Speaker 1: matter who buys what. There's no such thing as an 51 00:03:16,200 --> 00:03:20,400 Speaker 1: individual at that point. So until then we're stuck with it. 52 00:03:20,520 --> 00:03:24,320 Speaker 1: So data tracking and data mining, these are terms that 53 00:03:24,480 --> 00:03:27,120 Speaker 1: kind of relate to another buzz term that you've probably 54 00:03:27,160 --> 00:03:30,360 Speaker 1: heard over and over again, which is big data. Um. 55 00:03:30,760 --> 00:03:33,680 Speaker 1: And the reason why we're even talking about this at 56 00:03:33,680 --> 00:03:38,480 Speaker 1: all is because information has value, but it's hard to 57 00:03:38,960 --> 00:03:42,560 Speaker 1: pin down exactly what the value is because it may 58 00:03:42,560 --> 00:03:47,160 Speaker 1: be very different from one person to another, or between 59 00:03:47,400 --> 00:03:51,560 Speaker 1: the individual whose data it is and some other organization 60 00:03:52,120 --> 00:03:55,520 Speaker 1: or a chain of organizations, and the whole thing gets 61 00:03:55,560 --> 00:04:02,360 Speaker 1: really messy. And uh, yeah, we'll be talking about information 62 00:04:02,440 --> 00:04:06,880 Speaker 1: and information about information, like metadata. I mean, this, it 63 00:04:06,960 --> 00:04:10,200 Speaker 1: becomes a rabbit hole that you can easily get lost 64 00:04:10,240 --> 00:04:13,000 Speaker 1: in as you start to try and unravel everything and 65 00:04:13,040 --> 00:04:17,760 Speaker 1: figure out, okay, well, what's the heart of this story. 66 00:04:17,920 --> 00:04:20,200 Speaker 1: And to be fair, data tracking is something that's been 67 00:04:20,200 --> 00:04:23,640 Speaker 1: going on for ages. It's not new. It's just that 68 00:04:23,720 --> 00:04:27,600 Speaker 1: the Internet and the tools we use today allow us 69 00:04:27,640 --> 00:04:33,120 Speaker 1: to generate, gather, and sift through more information than we've 70 00:04:33,160 --> 00:04:36,080 Speaker 1: ever been able to do before. It's kind of a 71 00:04:36,120 --> 00:04:38,440 Speaker 1: catch-all phrase, right, Jonathan. 
It's, it's this kind of 72 00:04:38,480 --> 00:04:41,680 Speaker 1: catch-all phrase for computers and software to build 73 00:04:42,080 --> 00:04:47,599 Speaker 1: a really comprehensive, um, data profile about you in some ways, 74 00:04:47,640 --> 00:04:50,000 Speaker 1: but it's something that credit card companies and banks and 75 00:04:50,040 --> 00:04:53,960 Speaker 1: advertisers have endeavored to do for decades exactly. I mean, 76 00:04:54,960 --> 00:04:58,599 Speaker 1: and this can be done for very simple, quote unquote 77 00:04:58,600 --> 00:05:01,400 Speaker 1: innocent reasons. Like one of the examples I like to 78 00:05:01,440 --> 00:05:05,039 Speaker 1: use is, you might have a shop, and in 79 00:05:05,080 --> 00:05:08,000 Speaker 1: your shop, you sell ice cream, and you pay attention 80 00:05:08,120 --> 00:05:10,880 Speaker 1: to see which flavors of ice cream sell the best, 81 00:05:11,279 --> 00:05:14,000 Speaker 1: and that informs you of which flavors of ice cream 82 00:05:14,040 --> 00:05:16,000 Speaker 1: you need to stock up on. You are tracking the 83 00:05:16,080 --> 00:05:19,479 Speaker 1: data of sales. In this case, you're not necessarily looking 84 00:05:19,520 --> 00:05:22,320 Speaker 1: at individuals. You don't care who it is that buys 85 00:05:22,400 --> 00:05:25,080 Speaker 1: the vanilla, but if everyone's buying vanilla, you want to 86 00:05:25,080 --> 00:05:27,520 Speaker 1: make sure you have more vanilla than the other flavors 87 00:05:27,520 --> 00:05:30,360 Speaker 1: that aren't selling as well. So that's one, that's like 88 00:05:30,400 --> 00:05:34,160 Speaker 1: the very simplest version of data tracking there, where you 89 00:05:34,200 --> 00:05:37,760 Speaker 1: know you've scrubbed the identity of the individuals. That's not 90 00:05:38,080 --> 00:05:41,320 Speaker 1: even, that doesn't even come up. It's not important. But 91 00:05:41,400 --> 00:05:43,680 Speaker 1: it can go all the way to the other extreme, 92 00:05:44,279 --> 00:05:48,760 Speaker 1: where you want to create targeted advertising that is going 93 00:05:48,800 --> 00:05:53,839 Speaker 1: to appeal to each specific person that visits a particular 94 00:05:54,520 --> 00:05:58,320 Speaker 1: site or uses a particular service, and the whole point 95 00:05:58,320 --> 00:06:02,520 Speaker 1: of that is to attempt to get them to buy 96 00:06:02,600 --> 00:06:06,400 Speaker 1: something or to enroll in something. It's, it's the goal 97 00:06:06,480 --> 00:06:08,680 Speaker 1: of all advertising. It's to get you to act in 98 00:06:08,720 --> 00:06:11,400 Speaker 1: a certain way. And the idea is that, well, if 99 00:06:11,400 --> 00:06:15,600 Speaker 1: we can narrow the target down to the most likely candidates, 100 00:06:15,640 --> 00:06:20,000 Speaker 1: we'll get a better return on our efforts. And this 101 00:06:20,080 --> 00:06:23,480 Speaker 1: is theoretically good for the user, right, Jonathan? Theoretically it's, 102 00:06:23,600 --> 00:06:25,480 Speaker 1: it's sort of like, we want to give you the 103 00:06:25,520 --> 00:06:27,520 Speaker 1: thing that you really want, and we want to give 104 00:06:27,560 --> 00:06:29,919 Speaker 1: it to you in the most efficient way and in 105 00:06:29,960 --> 00:06:33,640 Speaker 1: the most sort of sympatico way with, with what you 106 00:06:34,200 --> 00:06:37,640 Speaker 1: are interested in doing already. Um, but it can sort 107 00:06:37,680 --> 00:06:41,560 Speaker 1: of... like, data tracking to me means so many things, Jonathan, right. 
108 00:06:41,680 --> 00:06:45,000 Speaker 1: It means your online browsing behavior, or the things you 109 00:06:45,080 --> 00:06:48,279 Speaker 1: choose to buy, the platforms, the software you use, the 110 00:06:48,360 --> 00:06:54,120 Speaker 1: things you download, your name, age, race, sexual orientation, when 111 00:06:54,160 --> 00:06:58,200 Speaker 1: you're sick, what, what you're sick with, your eating habits, 112 00:06:58,240 --> 00:07:02,320 Speaker 1: your television-watching habits, your fears, hopes, dreams, right, when 113 00:07:03,760 --> 00:07:07,279 Speaker 1: you're awake, when you're asleep. I mean, like, we, we're 114 00:07:07,320 --> 00:07:10,040 Speaker 1: talking about a world now where, where we, we want 115 00:07:10,120 --> 00:07:13,280 Speaker 1: these wearables. They can give us real-time data feedback 116 00:07:13,360 --> 00:07:16,480 Speaker 1: on how we are doing throughout the day. Maybe you 117 00:07:16,560 --> 00:07:19,840 Speaker 1: are an extreme Type A personality and you are trying 118 00:07:19,880 --> 00:07:22,440 Speaker 1: to schedule your day so that when you're hitting those 119 00:07:22,480 --> 00:07:26,200 Speaker 1: peak moments of productivity, that's when you're tackling the most 120 00:07:26,200 --> 00:07:29,119 Speaker 1: important tasks to you, and you're wearing wearables, you're getting 121 00:07:29,120 --> 00:07:32,000 Speaker 1: that feedback and you're understanding all this stuff. That data 122 00:07:32,080 --> 00:07:34,640 Speaker 1: may also be going to other places, and there may 123 00:07:34,640 --> 00:07:37,559 Speaker 1: not be any use for it right now. Like right now, 124 00:07:37,760 --> 00:07:40,760 Speaker 1: as you're generating that data, that's fine, no one's looking 125 00:07:40,800 --> 00:07:42,760 Speaker 1: at it. But down the road that might not be 126 00:07:42,800 --> 00:07:46,840 Speaker 1: the case for various reasons, some of which are terrifying. 127 00:07:48,120 --> 00:07:49,960 Speaker 1: But, but you know, you were talking about the different 128 00:07:49,960 --> 00:07:52,440 Speaker 1: types of data. There are actually three kind of broad 129 00:07:52,600 --> 00:07:56,960 Speaker 1: categories we can use to classify the data that we are creating 130 00:07:57,360 --> 00:08:01,840 Speaker 1: that, that various entities are really interested in. There's, there's 131 00:08:01,920 --> 00:08:05,880 Speaker 1: volunteered data, and that's the stuff we share, right, that's 132 00:08:05,880 --> 00:08:09,000 Speaker 1: when we go on Facebook, exactly, that's my tweets, your tweets, 133 00:08:09,000 --> 00:08:13,400 Speaker 1: your YouTubes, your Facebooks, your Pinterests, it's, it's the 134 00:08:13,400 --> 00:08:16,840 Speaker 1: stuff where we are participating in the conversation and generating 135 00:08:16,880 --> 00:08:20,240 Speaker 1: that information. We're not just doing that for the people 136 00:08:20,320 --> 00:08:24,320 Speaker 1: that we are trying to impress, or our friends, or 137 00:08:24,520 --> 00:08:27,360 Speaker 1: sometimes our enemies if we want to rub their noses 138 00:08:27,400 --> 00:08:31,480 Speaker 1: in stuff because they think they're so big. But that's, 139 00:08:31,520 --> 00:08:35,600 Speaker 1: that's the stuff that we are actively sharing. Then there's 140 00:08:35,640 --> 00:08:39,400 Speaker 1: the observed data. 
That's the stuff that companies and other 141 00:08:39,520 --> 00:08:42,720 Speaker 1: entities can, can figure out about us just by watching 142 00:08:42,720 --> 00:08:45,240 Speaker 1: our behavior. So this would be kind of the stuff 143 00:08:45,240 --> 00:08:49,199 Speaker 1: we buy, the browsing habits we have, where we log 144 00:08:49,240 --> 00:08:51,600 Speaker 1: in from. Like, if you are doing most of your 145 00:08:51,600 --> 00:08:55,000 Speaker 1: browsing on a mobile device, you better bet there are 146 00:08:55,080 --> 00:08:57,560 Speaker 1: companies out there that want to know that. They want 147 00:08:57,600 --> 00:09:00,800 Speaker 1: to know that you access stuff on your phone more 148 00:09:00,880 --> 00:09:04,400 Speaker 1: than you do on a laptop or desktop. That's important information. 149 00:09:05,360 --> 00:09:08,280 Speaker 1: And then there's the inferred data, and that's the stuff 150 00:09:08,320 --> 00:09:12,559 Speaker 1: that companies guess is relevant to you based upon the 151 00:09:12,600 --> 00:09:15,400 Speaker 1: information they gather from the other two types of sources. 152 00:09:16,040 --> 00:09:20,000 Speaker 1: And it's, it's this collection that's important. And you know, 153 00:09:20,040 --> 00:09:24,080 Speaker 1: I mentioned about value and how value is different for 154 00:09:24,240 --> 00:09:28,960 Speaker 1: different people. So the Telegraph did a survey, uh, in 155 00:09:29,000 --> 00:09:33,599 Speaker 1: which they asked people what they felt their personal information 156 00:09:33,720 --> 00:09:37,520 Speaker 1: was worth to them personally, like, what is your identity 157 00:09:37,600 --> 00:09:41,080 Speaker 1: worth to you, uh, based upon the kind of information 158 00:09:41,080 --> 00:09:44,480 Speaker 1: that gets shared around these entities? Can you make a 159 00:09:44,679 --> 00:09:49,240 Speaker 1: ballpark guess? I'll tell you this. It's, it's, it's less 160 00:09:49,240 --> 00:09:56,160 Speaker 1: than ten thousand dollars, but it's more than five dollars. Man, man, 161 00:09:56,240 --> 00:09:59,600 Speaker 1: I'm gonna go with, I'm gonna go with twenty bucks, 162 00:09:59,760 --> 00:10:02,640 Speaker 1: or no, no, no no, wait, wait, a hundred bucks. You know, 163 00:10:03,320 --> 00:10:05,680 Speaker 1: it's probably because we share so much, Ben, that we 164 00:10:05,840 --> 00:10:10,680 Speaker 1: value it so little. But the average person? Five thousand dollars. 165 00:10:11,240 --> 00:10:13,160 Speaker 1: Oh well, that's good, that, that makes, and that makes 166 00:10:13,160 --> 00:10:16,440 Speaker 1: the average person a lot smarter than me. Well, to 167 00:10:16,480 --> 00:10:19,760 Speaker 1: be fair, it was three thousand, two pounds sterling. But 168 00:10:19,800 --> 00:10:23,319 Speaker 1: I did the conversion. Fair enough, well done. Yeah, and uh, 169 00:10:23,360 --> 00:10:26,679 Speaker 1: and they also mentioned the fact that, um, that while 170 00:10:26,720 --> 00:10:30,200 Speaker 1: they were doing this, they, they found that there was 171 00:10:30,240 --> 00:10:34,679 Speaker 1: a disparity, that, that women in the survey tended to 172 00:10:34,720 --> 00:10:39,280 Speaker 1: answer more frequently that their personal information was priceless, that 173 00:10:39,320 --> 00:10:41,120 Speaker 1: there was no price you could put on it at 174 00:10:41,160 --> 00:10:44,760 Speaker 1: which they would feel comfortable selling it, okay, and men were 175 00:10:44,880 --> 00:10:47,760 Speaker 1: less likely to do that. 
They also cited an interesting 176 00:16:47,840 --> 00:16:50,520 Speaker 1: data point, if you will. Yeah, they also mentioned that 177 00:16:50,559 --> 00:16:53,360 Speaker 1: older people were less likely to want to sell their 178 00:16:53,360 --> 00:16:57,520 Speaker 1: information than younger people, which kind of... younger people 179 00:16:57,520 --> 00:16:59,360 Speaker 1: are just giving it away. Yeah, they are, I mean 180 00:16:59,360 --> 00:17:01,560 Speaker 1: they, you know, we're seeing that more and more with 181 00:17:01,679 --> 00:17:08,120 Speaker 1: the, the, the very cavalier behavior of certain executives like Zuckerberg, 182 00:17:08,120 --> 00:17:10,960 Speaker 1: who have said that privacy is dead. No, privacy is 183 00:17:10,960 --> 00:17:14,280 Speaker 1: no longer a thing. That seems to be kind of 184 00:17:14,320 --> 00:17:19,960 Speaker 1: the message that younger generations have not only absorbed, but 185 00:17:20,080 --> 00:17:24,000 Speaker 1: adopted, at least to some extent. Says you and I 186 00:17:24,200 --> 00:17:28,199 Speaker 1: as old codgers. So we claim, yeah, no, if you're 187 00:17:28,240 --> 00:17:31,880 Speaker 1: Generation X or older, you're like, get your dirty, grubby 188 00:17:31,920 --> 00:17:36,120 Speaker 1: hands off my personal information, you damn millennial, right? So 189 00:17:36,520 --> 00:17:39,800 Speaker 1: now, here. So that's how much the person values their 190 00:17:39,800 --> 00:17:43,280 Speaker 1: personal information, in the thousands-of-dollars range. Okay, I 191 00:17:43,320 --> 00:17:45,120 Speaker 1: know what you're gonna do next, right? You're gonna give 192 00:17:45,120 --> 00:17:49,200 Speaker 1: me the average company's value on our data. What's going 193 00:17:49,240 --> 00:17:52,320 Speaker 1: to happen? That is exactly what's gonna happen. So here's 194 00:17:52,360 --> 00:17:56,320 Speaker 1: the deal. Your personal information to a company, at best, 195 00:17:56,400 --> 00:18:00,600 Speaker 1: like at its peak, for an average amount of information. I'm 196 00:18:00,640 --> 00:18:03,679 Speaker 1: not talking about very specialized stuff, like if you were 197 00:18:03,679 --> 00:18:06,559 Speaker 1: talking about that wearable example I was saying earlier, that 198 00:18:06,600 --> 00:18:10,480 Speaker 1: would, that would be worth more because it's more comprehensive data. 199 00:18:10,559 --> 00:18:13,040 Speaker 1: But your average stuff, like the stuff you might 200 00:18:13,080 --> 00:18:17,000 Speaker 1: find on a Facebook profile page, it is worth point 201 00:18:17,440 --> 00:18:23,880 Speaker 1: zero zero five dollars. So half a cent is how 202 00:18:23,960 --> 00:18:27,240 Speaker 1: much your data is worth at the peak. The average is 203 00:18:27,280 --> 00:18:34,080 Speaker 1: closer to point zero zero zero five dollars. To know that I'm, 204 00:18:34,520 --> 00:18:37,960 Speaker 1: you know, a boring, middle-aged white guy who's probably 205 00:18:37,960 --> 00:18:43,560 Speaker 1: going to see the Star Wars movie. Probably? I mean, 206 00:18:43,720 --> 00:18:46,959 Speaker 1: I would have pegged you for definitely. Okay, I mean, 207 00:18:46,960 --> 00:18:49,520 Speaker 1: it's seven days away, dude. It is seven days away 208 00:18:49,520 --> 00:18:54,640 Speaker 1: as we record this. So, so, at any rate? Yeah, exactly. 209 00:18:54,640 --> 00:18:57,640 Speaker 1: Like, the companies, like, that's how much your information is 210 00:18:57,679 --> 00:19:00,640 Speaker 1: worth to them. 
So to you, it's in the thousands 211 00:13:00,679 --> 00:13:03,719 Speaker 1: of dollars. To them, it doesn't even equal a penny. 212 00:13:03,720 --> 00:13:05,920 Speaker 1: But in the aggregate, right, it's gonna be 213 00:13:05,960 --> 00:13:08,680 Speaker 1: worth so much more. Well, exactly, because, because when a 214 00:13:08,720 --> 00:13:12,680 Speaker 1: company is buying information, they're not buying one person's information. 215 00:13:13,200 --> 00:13:17,839 Speaker 1: They're buying bundles of thousands upon thousands of people's information. 216 00:13:18,440 --> 00:13:22,760 Speaker 1: Then it's that company's job to either sell that information 217 00:13:22,800 --> 00:13:25,400 Speaker 1: off to someone else. These are data brokers. This is 218 00:13:25,480 --> 00:13:29,200 Speaker 1: all they do. That's, that's their job. That's their business. 219 00:13:29,280 --> 00:13:33,240 Speaker 1: They buy and sell info. They don't make stuff, they 220 00:13:33,280 --> 00:13:36,840 Speaker 1: don't provide any services. They aggregate data and sell it 221 00:13:36,880 --> 00:13:40,360 Speaker 1: off to other people. This is a fascinating part of 222 00:13:40,400 --> 00:13:44,280 Speaker 1: this whole thing to me, Jonathan, because it's like, they're, 223 00:13:44,440 --> 00:13:48,679 Speaker 1: these companies are really, I mean, they're, they're kind of 224 00:13:48,760 --> 00:13:52,600 Speaker 1: like in the shadows, you know, they're, they're, they're like 225 00:13:52,679 --> 00:13:57,080 Speaker 1: companies with, like, weird names that, that, that have, you know, 226 00:13:57,160 --> 00:14:00,200 Speaker 1: data centers in, in, in the middle of nowhere, and 227 00:14:00,320 --> 00:14:02,120 Speaker 1: they're really hard to find. And you, you know, I 228 00:14:02,160 --> 00:14:04,880 Speaker 1: mean, you and I probably both remember the days of, 229 00:14:05,040 --> 00:14:07,920 Speaker 1: like, do not call me at home, you know, getting, 230 00:14:08,040 --> 00:14:10,840 Speaker 1: getting that phone call from marketers, the pre-do-not- 231 00:14:11,040 --> 00:14:14,559 Speaker 1: call-list days, right. And we don't really have a 232 00:14:14,679 --> 00:14:17,559 Speaker 1: do-not-call list for data yet. Right. We have 233 00:14:17,600 --> 00:14:19,480 Speaker 1: some tools, but we don't really have that. And I 234 00:14:19,520 --> 00:14:23,120 Speaker 1: feel like we're in this kind of, um, I don't 235 00:14:23,120 --> 00:14:27,600 Speaker 1: know, gold rush on data. Exactly. We don't have, we 236 00:14:27,680 --> 00:14:30,880 Speaker 1: don't have the power to say, here is what you 237 00:14:30,920 --> 00:14:32,840 Speaker 1: can use my data to do, and here is what 238 00:14:32,960 --> 00:14:36,400 Speaker 1: you cannot use my data to do. Uh, you know, 239 00:14:36,440 --> 00:14:39,920 Speaker 1: if you're lucky, or if you're very, 240 00:14:40,600 --> 00:14:43,520 Speaker 1: um, you know, careful and you're, and you're paying really 241 00:14:43,520 --> 00:14:46,360 Speaker 1: close attention, then you might use services where you're looking 242 00:14:46,400 --> 00:14:49,280 Speaker 1: at the user agreement and looking to see, do they 243 00:14:49,280 --> 00:14:52,400 Speaker 1: have a policy? Do they, do they specifically state they 244 00:14:52,440 --> 00:14:56,160 Speaker 1: will not share your information with third parties? 
Um, you 245 00:14:56,160 --> 00:14:59,280 Speaker 1: know, you can find some of those, but honestly, that's 246 00:14:59,320 --> 00:15:01,520 Speaker 1: not the way business works for most of these companies. 247 00:15:01,600 --> 00:15:05,320 Speaker 1: You are the product, right, like, like Google: when the 248 00:15:05,400 --> 00:15:09,880 Speaker 1: service is free, yep, you're the product. That's, that's 249 00:15:09,880 --> 00:15:14,960 Speaker 1: what... you're Google's product, you're Facebook's product, you are 250 00:15:15,000 --> 00:15:20,280 Speaker 1: what they are selling to other interested parties. And uh, 251 00:15:20,480 --> 00:15:23,160 Speaker 1: even if you look at ones that say that, oh, 252 00:15:23,200 --> 00:15:27,200 Speaker 1: don't worry, we scrub all the identifiable stuff from your data. 253 00:15:27,200 --> 00:15:30,720 Speaker 1: Anonymize your data, if you will. Yeah, that, um, that 254 00:15:30,880 --> 00:15:35,680 Speaker 1: don't work so good, Ben. It doesn't work very good. 255 00:15:36,000 --> 00:15:38,440 Speaker 1: There have been studies about this in the last few years, right. 256 00:15:38,480 --> 00:15:39,840 Speaker 1: I think there was a study that came out of 257 00:15:39,880 --> 00:15:43,720 Speaker 1: the UK that suggested, um, that really, in order to 258 00:15:43,840 --> 00:15:48,600 Speaker 1: identify a person, um, you really only needed like one 259 00:15:48,720 --> 00:15:51,320 Speaker 1: or two pieces of information about them. Right. Do you 260 00:15:51,360 --> 00:15:53,080 Speaker 1: remember that study that came out in the last, I 261 00:15:53,120 --> 00:15:55,240 Speaker 1: don't know, year or so? It was like, I remember 262 00:15:55,320 --> 00:15:59,240 Speaker 1: that one. And there's one that Latanya Sweeney at Harvard 263 00:15:59,360 --> 00:16:03,080 Speaker 1: did, where she proved that with three points of data, 264 00:16:03,280 --> 00:16:07,280 Speaker 1: which was gender, birth date, and ZIP code, she could 265 00:16:07,320 --> 00:16:14,520 Speaker 1: identify 87 percent of US residents. So, so nearly 90 percent of all people 266 00:16:14,560 --> 00:16:16,920 Speaker 1: living in the United States she could identify just with 267 00:16:16,960 --> 00:16:20,760 Speaker 1: those three pieces of information. Now, one of the things 268 00:16:20,800 --> 00:16:22,960 Speaker 1: I like to stress to people is your name, which 269 00:16:23,000 --> 00:16:25,000 Speaker 1: you try to protect when you are, when you think 270 00:16:25,000 --> 00:16:27,960 Speaker 1: you're being anonymous on the Internet and you use a handle, 271 00:16:28,440 --> 00:16:30,160 Speaker 1: so you're not using your name. Like, I use my 272 00:16:30,280 --> 00:16:34,200 Speaker 1: name everywhere as my handle, that's just me, but most of 273 00:16:34,200 --> 00:16:35,960 Speaker 1: the time, but a lot of people are like, no, I'm 274 00:16:35,960 --> 00:16:38,400 Speaker 1: going to use an anonymous handle. Your name is the 275 00:16:38,480 --> 00:16:42,080 Speaker 1: least important thing about you. I hate to break it 276 00:16:42,120 --> 00:16:46,280 Speaker 1: to you, but honestly, most of these companies don't care. 277 00:16:46,720 --> 00:16:48,480 Speaker 1: Just like I was saying, the shop owner with the 278 00:16:48,520 --> 00:16:51,160 Speaker 1: ice cream doesn't care who you are. Most of these 279 00:16:51,200 --> 00:16:53,880 Speaker 1: companies don't really care who you are. They care about 280 00:16:53,920 --> 00:16:59,680 Speaker 1: what you do. 
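A minimal sketch of the re-identification idea just discussed, assuming a hypothetical "scrubbed" dataset and a hypothetical public voter roll (all records, names, and field names below are invented and are not from the episode or from Sweeney's actual study): joining on nothing more than gender, birth date, and ZIP code is enough to put names back onto "anonymized" rows whenever that triple is unique.

```python
# Hypothetical illustration of re-identification via quasi-identifiers.
# All data and field names are invented; this is not a real dataset.

# A "scrubbed" dataset: names removed, quasi-identifiers left in place.
scrubbed_records = [
    {"gender": "F", "birth_date": "1962-07-03", "zip": "02138", "diagnosis": "asthma"},
    {"gender": "M", "birth_date": "1980-11-21", "zip": "30301", "diagnosis": "flu"},
]

# A public dataset (think voter roll) pairing the same attributes with names.
public_roll = [
    {"name": "Jane Doe", "gender": "F", "birth_date": "1962-07-03", "zip": "02138"},
    {"name": "John Roe", "gender": "M", "birth_date": "1980-11-21", "zip": "30301"},
]

QUASI_IDENTIFIERS = ("gender", "birth_date", "zip")

def qi_key(record):
    # Build a lookup key from the three quasi-identifier fields.
    return tuple(record[field] for field in QUASI_IDENTIFIERS)

# Index the public records by their quasi-identifier triple.
names_by_key = {qi_key(person): person["name"] for person in public_roll}

# Join: any scrubbed record whose triple matches a unique public record
# gets a name attached again, despite the "anonymization."
for record in scrubbed_records:
    name = names_by_key.get(qi_key(record))
    if name:
        print(f"{name} -> {record['diagnosis']}")
```

The point of the sketch is only that the name column was never what protected anyone; the combination of ordinary attributes does the identifying.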
Yeah, and it's really interesting too. Like Google, 281 00:16:59,720 --> 00:17:03,960 Speaker 1: for instance, leverages large amounts of money and even some 282 00:17:04,040 --> 00:17:06,879 Speaker 1: of their best algorithmic computing power to make sure that 283 00:17:07,000 --> 00:17:11,760 Speaker 1: advertisers and marketers, for instance, don't engage in dishonest practices 284 00:17:12,080 --> 00:17:16,840 Speaker 1: or break Google's own rules about tracking data. And even 285 00:17:16,880 --> 00:17:20,040 Speaker 1: for Google, this is becoming like a herculean task. 286 00:17:20,160 --> 00:17:23,520 Speaker 1: For Codebreaker, I talked to some 287 00:17:23,640 --> 00:17:29,119 Speaker 1: people at Columbia, some Columbia grad students who are studying this, 288 00:17:29,640 --> 00:17:33,520 Speaker 1: and it's really interesting to see what gets pulled from 289 00:17:33,560 --> 00:17:38,040 Speaker 1: even, like, emails that you might send in Gmail, for instance. Yeah, yeah, 290 00:17:38,080 --> 00:17:41,879 Speaker 1: this is, this is kind of, um, kind of terrifying 291 00:17:42,000 --> 00:17:44,320 Speaker 1: also, if you, if you think about it, because you think 292 00:17:44,320 --> 00:17:47,879 Speaker 1: of email as being this, this, uh, private means of 293 00:17:47,920 --> 00:17:50,800 Speaker 1: communication. You know, just as you wouldn't expect someone 294 00:17:50,840 --> 00:17:54,639 Speaker 1: else to read your snail mail, it would be a 295 00:17:54,720 --> 00:17:58,479 Speaker 1: violation of trust to find out that stuff is popping 296 00:17:58,560 --> 00:18:00,360 Speaker 1: up based on things that you've typed. Like, if you've 297 00:18:00,359 --> 00:18:03,800 Speaker 1: written an email recently and then something that is related 298 00:18:03,800 --> 00:18:06,479 Speaker 1: to that email starts seemingly to pop up in places 299 00:18:06,520 --> 00:18:09,120 Speaker 1: you weren't expecting, you might think at first that that's 300 00:18:09,119 --> 00:18:12,360 Speaker 1: a weird coincidence. But if this happens a few times, like, okay, 301 00:18:12,400 --> 00:18:16,560 Speaker 1: this is not a coincidence. What is going on? Yeah, 302 00:18:16,720 --> 00:18:19,840 Speaker 1: it's really interesting. And I think that's something, Jonathan, that 303 00:18:20,000 --> 00:18:23,280 Speaker 1: so many people can relate to, right, this idea of, like, 304 00:18:23,680 --> 00:18:27,160 Speaker 1: oh, I looked at those shoes once, or I looked 305 00:18:27,200 --> 00:18:31,080 Speaker 1: at, uh, you know, whatever, the, whatever item that you 306 00:18:31,119 --> 00:18:36,600 Speaker 1: might buy online once, and that advertisement follows you around forever. Right. Yeah, 307 00:18:36,640 --> 00:18:39,720 Speaker 1: you had that great moment in Codebreaker, in fact, about 308 00:18:40,040 --> 00:18:46,880 Speaker 1: the sandals. Yes, the sandals that followed Molly around. Every 309 00:18:46,880 --> 00:18:49,840 Speaker 1: time I logged into Facebook for a straight week, they 310 00:18:49,840 --> 00:18:52,159 Speaker 1: would pop up. And I was so scared because I 311 00:18:52,200 --> 00:18:54,200 Speaker 1: was like, oh my god, I'm gonna be sitting next 312 00:18:54,240 --> 00:18:56,119 Speaker 1: to my boyfriend and I'm gonna log onto Facebook and 313 00:18:56,160 --> 00:18:58,600 Speaker 1: these sandals are gonna come up, and it's gonna reopen 314 00:18:58,640 --> 00:19:00,639 Speaker 1: the wound. 
And he's gonna know that I was sending 315 00:19:00,680 --> 00:19:04,679 Speaker 1: them to people asking for an opinion. Did you change 316 00:19:04,680 --> 00:19:07,720 Speaker 1: your behavior when you were hanging out with him? Yeah, 317 00:19:07,760 --> 00:19:11,280 Speaker 1: like if we were loafing around, I would open so 318 00:19:11,320 --> 00:19:16,760 Speaker 1: many tabs and none of them would be Facebook. This 319 00:19:16,800 --> 00:19:18,480 Speaker 1: is a really funny story. What does it make you 320 00:19:18,520 --> 00:19:22,280 Speaker 1: think about? It made me wish that I could go 321 00:19:22,320 --> 00:19:25,760 Speaker 1: back to the days where Facebook was just showing me 322 00:19:25,920 --> 00:19:28,520 Speaker 1: things that I wanted to buy and not things that 323 00:19:28,600 --> 00:19:32,240 Speaker 1: I wanted to run away from. That's, that's such a... 324 00:19:32,359 --> 00:19:35,080 Speaker 1: I mean, I've seen the same sort of stuff, and 325 00:19:35,160 --> 00:19:39,480 Speaker 1: I've seen advertisements that are odd, like, things that do 326 00:19:39,560 --> 00:19:42,760 Speaker 1: not interest me at all. They're just completely off base, 327 00:19:43,359 --> 00:19:45,760 Speaker 1: and I can't... It's one of those things where I think, well, 328 00:19:45,760 --> 00:19:48,800 Speaker 1: maybe now, it's just that the system hasn't gathered enough 329 00:19:48,840 --> 00:19:52,280 Speaker 1: information about me to get a good feel, right, so 330 00:19:52,359 --> 00:19:55,400 Speaker 1: I'm seeing stuff. It's like, this is the shotgun approach. 331 00:19:55,440 --> 00:19:59,000 Speaker 1: I'm throwing everything I can and hopefully stuff will work. 332 00:20:00,000 --> 00:20:01,840 Speaker 1: We'll be back to talk more about the evils of 333 00:20:01,880 --> 00:20:15,000 Speaker 1: data tracking after these messages. Systems like Facebook, for example, 334 00:20:15,040 --> 00:20:18,280 Speaker 1: where you can actually go in to an ad and 335 00:20:18,359 --> 00:20:23,120 Speaker 1: say, I am not interested in this, you know, then 336 00:20:23,200 --> 00:20:25,800 Speaker 1: maybe you'll end up getting a different selection of stuff, 337 00:20:25,840 --> 00:20:29,239 Speaker 1: but sometimes you'll just get different versions of the same thing 338 00:20:29,280 --> 00:20:33,040 Speaker 1: you already said, that's not, that's not me, man. I 339 00:20:34,080 --> 00:20:37,040 Speaker 1: appreciate the muscles on this dude, but I am not 340 00:20:37,119 --> 00:20:39,920 Speaker 1: going to be that guy. So I'm into, I'm into 341 00:20:40,000 --> 00:20:42,800 Speaker 1: My Little Pony, but I, I think I've outgrown it. 342 00:20:43,040 --> 00:20:48,040 Speaker 1: Look, my collection, my collection can only be so complete. 343 00:20:48,280 --> 00:20:50,880 Speaker 1: That's, you know, you gotta draw the line after, 344 00:20:50,960 --> 00:20:54,560 Speaker 1: like, series five, I gotta say I'm done. I have 345 00:20:54,640 --> 00:20:57,719 Speaker 1: no more shelf space. Um, yeah, this is, this is 346 00:20:57,800 --> 00:21:03,680 Speaker 1: one of those, those amazing stories, like the story about 347 00:21:03,720 --> 00:21:06,480 Speaker 1: the sandals. Just, just one of those things that has 348 00:21:06,520 --> 00:21:09,840 Speaker 1: happened to pretty much anyone who has been active online 349 00:21:09,840 --> 00:21:12,720 Speaker 1: over the last couple of years. 
And it's probably one 350 00:21:12,760 --> 00:21:15,439 Speaker 1: of those stories, I would imagine, that people start to share, 351 00:21:16,200 --> 00:21:21,760 Speaker 1: not realizing that this is something that other people have experienced. Yeah, yeah, right, 352 00:21:22,000 --> 00:21:24,560 Speaker 1: it is. It feels like a very personal experience, but 353 00:21:24,600 --> 00:21:27,240 Speaker 1: it's something that seems to happen to everyone, and it 354 00:21:27,280 --> 00:21:31,800 Speaker 1: was interesting to see. Relatively recently, Facebook, I believe, um, 355 00:21:32,040 --> 00:21:37,240 Speaker 1: made it easier for people to control how much of 356 00:21:37,320 --> 00:21:42,840 Speaker 1: their ex-girlfriend's or boyfriend's or significant other's information surfaced 357 00:21:42,880 --> 00:21:46,480 Speaker 1: on their own timeline. Did you see that story that 358 00:21:46,560 --> 00:21:50,360 Speaker 1: came out relatively recently? It was interesting, this idea that, like, 359 00:21:50,720 --> 00:21:53,280 Speaker 1: you know, you know, even the tech companies that are really, 360 00:21:53,720 --> 00:21:57,680 Speaker 1: really good at data tracking at this point are still 361 00:21:57,720 --> 00:22:01,440 Speaker 1: trying to figure out when it goes wrong and how 362 00:22:01,560 --> 00:22:06,240 Speaker 1: to help the user in some cases, um, make it 363 00:22:06,280 --> 00:22:10,960 Speaker 1: a more kind of custom experience, right. And, and we 364 00:22:10,960 --> 00:22:15,160 Speaker 1: should also stress that, that the overwhelming majority of companies 365 00:22:15,160 --> 00:22:18,560 Speaker 1: out there do not want it to go wrong, because 366 00:22:18,560 --> 00:22:22,199 Speaker 1: that's, that's counterproductive to what the end, end goal is, 367 00:22:22,240 --> 00:22:25,680 Speaker 1: at least for the person that actually bought the advertisement 368 00:22:25,760 --> 00:22:29,200 Speaker 1: and the person who owns the place where that ad 369 00:22:29,280 --> 00:22:32,240 Speaker 1: is being shown, in the case of targeted advertising, at 370 00:22:32,320 --> 00:22:35,280 Speaker 1: least, like, they don't want that to go badly. But 371 00:22:35,320 --> 00:22:36,800 Speaker 1: in other cases, you know, they don't want it to go 372 00:22:36,840 --> 00:22:41,520 Speaker 1: badly either, that, that's counterproductive. But it is possible for 373 00:22:41,560 --> 00:22:44,800 Speaker 1: this to go wrong even if everyone is trying to 374 00:22:45,880 --> 00:22:50,720 Speaker 1: use it to, to people's benefit. And on Codebreaker, you 375 00:22:50,840 --> 00:22:55,879 Speaker 1: had a heartbreaking story about, about a father who shared 376 00:22:55,960 --> 00:22:59,480 Speaker 1: some information not, not knowing, not even realizing that that 377 00:22:59,560 --> 00:23:01,919 Speaker 1: information was somehow going to make its way into a 378 00:23:02,000 --> 00:23:05,760 Speaker 1: database and then came back to haunt him. Yeah, this 379 00:23:05,800 --> 00:23:09,360 Speaker 1: guy Mike Seay, and, and this happened a few 380 00:23:09,440 --> 00:23:14,920 Speaker 1: years back. He, you know, unfortunately, um, his, his, one 381 00:23:14,920 --> 00:23:17,760 Speaker 1: of his children, his daughter, died in a car accident. 
382 00:23:18,040 --> 00:23:22,480 Speaker 1: And he went to buy some, uh, you know, he 383 00:23:22,520 --> 00:23:27,080 Speaker 1: went to buy some, some, actually some, some, um, some 384 00:23:27,119 --> 00:23:30,480 Speaker 1: stuff for photos, like he was putting together some photo albums, 385 00:23:30,520 --> 00:23:33,080 Speaker 1: and he went to buy some frames for those photos. 386 00:23:33,680 --> 00:23:37,360 Speaker 1: And the place that he called to do that, um, 387 00:23:37,720 --> 00:23:42,080 Speaker 1: he just in passing mentioned, uh, that his daughter had 388 00:23:42,119 --> 00:23:44,199 Speaker 1: died in this car accident. And the person that he 389 00:23:44,280 --> 00:23:48,760 Speaker 1: was talking to actually took down that information. And it's 390 00:23:48,800 --> 00:23:53,000 Speaker 1: still not clear how it, how all of this happened, 391 00:23:53,080 --> 00:23:57,520 Speaker 1: but apparently they took down the information, and that information 392 00:23:57,640 --> 00:24:00,359 Speaker 1: ended up getting sold to a data broker that sold 393 00:24:00,359 --> 00:24:04,440 Speaker 1: the information from the company he originally called to OfficeMax, 394 00:24:04,880 --> 00:24:10,679 Speaker 1: and then OfficeMax sent a letter to him. Um, 395 00:24:10,840 --> 00:24:16,080 Speaker 1: it was addressed to the owner of the house, or, Mike 396 00:24:16,200 --> 00:24:20,840 Speaker 1: Seay, daughter died in a car accident. Um. And that's 397 00:24:20,840 --> 00:24:25,800 Speaker 1: when it goes really, really wrong, right. The picture frames 398 00:24:25,840 --> 00:24:28,600 Speaker 1: that I was, was buying were for my children. In the 399 00:24:28,640 --> 00:24:32,760 Speaker 1: bottom it had Ashley's name and her birth and death 400 00:24:32,840 --> 00:24:35,359 Speaker 1: date and that she loved them, and, you know what 401 00:24:35,440 --> 00:24:39,480 Speaker 1: I mean, that's how the conversation came about, that, 402 00:24:39,600 --> 00:24:43,560 Speaker 1: what is this for? And she took, you know, the 403 00:24:43,600 --> 00:24:47,600 Speaker 1: information that would go on the bottom and put that, 404 00:24:47,640 --> 00:24:49,679 Speaker 1: you know, on whatever to get it stamped. But they 405 00:24:49,680 --> 00:24:52,600 Speaker 1: had nothing to do with it. It didn't say killed in car crash 406 00:24:52,640 --> 00:24:56,439 Speaker 1: or any of that. That was all on her own. And 407 00:24:56,480 --> 00:24:59,199 Speaker 1: this is a totally separate company from OfficeMax. And 408 00:24:59,320 --> 00:25:03,399 Speaker 1: somehow OfficeMax got the information. They bought the information 409 00:25:03,480 --> 00:25:09,480 Speaker 1: from a data group, so that information was sold from 410 00:25:09,600 --> 00:25:14,119 Speaker 1: Things Remembered to a data group that collects data and 411 00:25:14,119 --> 00:25:19,879 Speaker 1: then taken and sold to OfficeMax, and, and Office- 412 00:25:19,880 --> 00:25:23,800 Speaker 1: Max used it without looking at it. What an, an emotional, 413 00:25:25,240 --> 00:25:29,360 Speaker 1: uh, punch to the gut to see something like that. 414 00:25:30,080 --> 00:25:33,880 Speaker 1: And, and honestly, you know, I mean, you know, Office- 415 00:25:33,920 --> 00:25:38,440 Speaker 1: Max did not intend that. There's no sane reason anyone 416 00:25:38,520 --> 00:25:43,360 Speaker 1: would do that. 
So it's clearly an epic mistake that, 417 00:25:43,359 --> 00:25:47,680 Speaker 1: that could result in true emotional trauma, even if it's 418 00:25:47,720 --> 00:25:50,399 Speaker 1: just a momentary thing. You don't want any person to 419 00:25:50,520 --> 00:25:54,840 Speaker 1: experience that kind of thing. And, uh, it was a 420 00:25:54,880 --> 00:25:59,280 Speaker 1: pure accident, I imagine. Based upon... way back in the day, 421 00:25:59,280 --> 00:26:01,879 Speaker 1: I used to work for consultants, and when, when I 422 00:26:01,880 --> 00:26:04,119 Speaker 1: worked for consultants, one of the things I had to 423 00:26:04,119 --> 00:26:09,560 Speaker 1: do was arrange giant mailing lists and print out tons 424 00:26:09,600 --> 00:26:12,920 Speaker 1: and tons and tons of mailing labels. And so you're, 425 00:26:12,920 --> 00:26:17,119 Speaker 1: you're working with these big spreadsheets, they're possibly thousands of 426 00:26:17,280 --> 00:26:21,520 Speaker 1: names long. So I'm guessing what happened was, you had 427 00:26:21,520 --> 00:26:25,880 Speaker 1: the data entered into some sort of spreadsheet, and probably 428 00:26:25,920 --> 00:26:29,480 Speaker 1: the spreadsheet was converted at least once, and in that conversion, 429 00:26:29,880 --> 00:26:33,040 Speaker 1: a tab that had been, or, you know, a cell 430 00:26:33,080 --> 00:26:37,400 Speaker 1: that had just been labeled miscellaneous got put into address 431 00:26:37,440 --> 00:26:41,280 Speaker 1: instead of miscellaneous or title or something along those lines. 432 00:26:41,960 --> 00:26:45,919 Speaker 1: And that's when it went from one point, where it 433 00:26:46,000 --> 00:26:49,800 Speaker 1: was a piece of data, which, as, as Mike said, 434 00:26:49,960 --> 00:26:52,520 Speaker 1: like, why do people even need to know this? No 435 00:26:52,560 --> 00:26:54,600 Speaker 1: one needs to know this. This is, this is not 436 00:26:54,720 --> 00:26:57,400 Speaker 1: a relevant piece of information. It should never have gone 437 00:26:57,440 --> 00:27:00,960 Speaker 1: into a database. But for that to then accidentally get 438 00:27:00,960 --> 00:27:04,400 Speaker 1: transposed into a cell that is going to show up 439 00:27:04,440 --> 00:27:11,400 Speaker 1: on a mailing label is incredibly... it feels incredibly callous, yeah, 440 00:27:11,400 --> 00:27:14,879 Speaker 1: and invasive. And it also, you know, it reminds you that, 441 00:27:16,840 --> 00:27:18,679 Speaker 1: I mean, it reminds us all that the way we 442 00:27:18,760 --> 00:27:23,160 Speaker 1: view ourselves isn't always the way we're viewed by the world, right, 443 00:27:23,880 --> 00:27:27,280 Speaker 1: and, and, and, and that's an interesting aspect of this too, 444 00:27:27,440 --> 00:27:34,000 Speaker 1: this idea that basically a data profile of most users 445 00:27:34,040 --> 00:27:37,080 Speaker 1: on the Internet is being slowly put together over time, 446 00:27:37,440 --> 00:27:41,679 Speaker 1: and, and you might be, we all might be really 447 00:27:41,760 --> 00:27:47,400 Speaker 1: curious to know, um, or disappointed to know, um, what 448 00:27:47,480 --> 00:27:50,639 Speaker 1: that data profile looks like from the other side. Now, 449 00:27:51,160 --> 00:27:53,399 Speaker 1: then let me ask you this. I'm going to paint 450 00:27:53,440 --> 00:27:58,359 Speaker 1: a picture for you of a dystopia. Okay, that's, that's 451 00:27:58,400 --> 00:28:03,280 Speaker 1: sadly not difficult to imagine based on our current events. 
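To make the spreadsheet hypothesis above concrete, here is a small, purely hypothetical sketch. The columns, the placeholder data, and the field map are invented, and this is only a guess at the general kind of conversion failure Jonathan describes, not the actual pipeline behind the OfficeMax mailer: one wrong column mapping while converting a list into labels is enough to push a free-text notes field onto a printed mailing label.

```python
# Hypothetical sketch of a column-mapping bug in a mailing-label conversion.
# The CSV columns, the data, and the field maps are all invented.
import csv
import io

raw_marketing_list = io.StringIO(
    "name,address,misc\n"
    "Jane Doe,123 Main St,Internal note from sales call - do not print\n"
)

# Intended mapping: label lines come from the name and address columns only.
correct_field_map = {"line_1": "name", "line_2": "address"}

# Buggy mapping after a careless conversion: the miscellaneous notes
# column gets treated as a second line of the address.
buggy_field_map = {"line_1": "name", "line_2": "misc", "line_3": "address"}

def make_label(row, field_map):
    # Render a mailing label, one line per mapped, non-empty column.
    return "\n".join(row[col] for col in field_map.values() if row.get(col))

for row in csv.DictReader(raw_marketing_list):
    print(make_label(row, buggy_field_map))
    # The internal note now sits right under the recipient's name,
    # exactly the kind of label that should never have been generated.
```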
452 00:28:03,280 --> 00:28:06,199 Speaker 1: All right, so I'm sure you heard. I mean, I'm 453 00:28:06,240 --> 00:28:09,800 Speaker 1: absolutely certain you heard about the app Peeple, P E 454 00:28:09,800 --> 00:28:13,280 Speaker 1: E P L E. Right? Yes, this was the one 455 00:28:13,320 --> 00:28:16,200 Speaker 1: that was supposed to be the Yelp for people, where 456 00:28:16,240 --> 00:28:21,080 Speaker 1: you would be able to rate other human beings, and 457 00:28:21,160 --> 00:28:23,520 Speaker 1: so then other human beings could see your rating and 458 00:28:23,560 --> 00:28:26,240 Speaker 1: could see ratings of other people. And it just seemed 459 00:28:26,280 --> 00:28:30,200 Speaker 1: like it was going to lead to fallout. That's, 460 00:28:30,200 --> 00:28:33,439 Speaker 1: that's literally an idea, an idea whose time should have 461 00:28:33,480 --> 00:28:37,320 Speaker 1: never come. Exactly. Now, let's take that same basic concept, 462 00:28:37,400 --> 00:28:40,960 Speaker 1: but instead of you rating people, imagine that you get 463 00:28:41,000 --> 00:28:45,000 Speaker 1: to look up what that person's data profile is worth 464 00:28:45,400 --> 00:28:50,520 Speaker 1: compared to your own, because guess what, you could do 465 00:28:50,600 --> 00:28:54,360 Speaker 1: that. You could build something where you start to look 466 00:28:54,400 --> 00:28:58,120 Speaker 1: at the, the robustness of a data profile, at least 467 00:28:58,160 --> 00:29:02,280 Speaker 1: for a certain given set of transactions. Because keep 468 00:29:02,280 --> 00:29:04,400 Speaker 1: in mind, there are a lot of data brokers out there, 469 00:29:04,720 --> 00:29:07,360 Speaker 1: and your profile in one data broker's database may be 470 00:29:07,520 --> 00:29:10,840 Speaker 1: very different from another, because it may be less or 471 00:29:10,880 --> 00:29:15,920 Speaker 1: more complete or from totally different sources than the other one. 472 00:29:16,160 --> 00:29:19,800 Speaker 1: But that would be a weird world where you say, wow, 473 00:29:20,760 --> 00:29:25,720 Speaker 1: my life is worth this much to a potential marketer, 474 00:29:26,200 --> 00:29:29,600 Speaker 1: and my friend's life is worth twice as much. What 475 00:29:29,720 --> 00:29:32,440 Speaker 1: am I doing wrong in my life where I'm worth less? 476 00:29:32,520 --> 00:29:37,960 Speaker 1: I mean, it's... this is how companies are looking at us. Yeah, 477 00:29:38,040 --> 00:29:40,000 Speaker 1: and it is scary. I mean, I think it's also 478 00:29:40,040 --> 00:29:42,520 Speaker 1: fair to say that, as scary as it 479 00:29:42,600 --> 00:29:45,440 Speaker 1: is to imagine, it does also have 480 00:29:45,520 --> 00:29:47,760 Speaker 1: some connection to reality. I mean, when you go into 481 00:29:47,840 --> 00:29:51,640 Speaker 1: a bank and apply for a loan, um, you're also 482 00:29:51,760 --> 00:29:57,680 Speaker 1: being sort of sized up, right? And over time, um, 483 00:29:57,720 --> 00:29:59,960 Speaker 1: you know, we've, we've seen, actually, some really 484 00:30:00,000 --> 00:30:04,640 Speaker 1: big problems arise from how banks look at people 485 00:30:04,920 --> 00:30:07,680 Speaker 1: and look at people's ability to pay things back. 
And 486 00:30:07,680 --> 00:30:10,200 Speaker 1: how credit card companies do this, how your net worth 487 00:30:10,280 --> 00:30:13,960 Speaker 1: is calculated, how your, um, you know, your, your likelihood 488 00:30:13,960 --> 00:30:17,280 Speaker 1: of paying a loan back is calculated. But at 489 00:30:17,280 --> 00:30:20,920 Speaker 1: the same time, over time, there have been some safeguards 490 00:30:20,960 --> 00:30:25,000 Speaker 1: put in against discrimination, right, and a lot of people 491 00:30:25,040 --> 00:30:28,000 Speaker 1: have actually talked about this idea of data discrimination over 492 00:30:28,040 --> 00:30:32,240 Speaker 1: time, and how, um, we really need to think about it 493 00:30:32,360 --> 00:30:34,760 Speaker 1: as we go forward and as more and more data 494 00:30:34,760 --> 00:30:37,760 Speaker 1: gets collected on, on, on users over time. I mean, 495 00:30:37,840 --> 00:30:40,600 Speaker 1: people that were born, you know, in the last ten 496 00:30:40,680 --> 00:30:43,440 Speaker 1: years are going to have a very different life on 497 00:30:43,480 --> 00:30:46,080 Speaker 1: the internet than even you or I have had, right, 498 00:30:46,120 --> 00:30:48,320 Speaker 1: I mean, I wasn't on the internet until I was 499 00:30:48,760 --> 00:30:51,959 Speaker 1: a teenager. So it's interesting to think about it. And 500 00:30:51,960 --> 00:30:54,800 Speaker 1: it's also, you know, you would, you would hope... some 501 00:30:54,840 --> 00:30:58,880 Speaker 1: people are talking about this idea of, like, you know, 502 00:30:59,120 --> 00:31:04,800 Speaker 1: if you are perceived to be a, a person 503 00:31:04,840 --> 00:31:09,960 Speaker 1: of a certain race, a certain education level, a certain 504 00:31:10,080 --> 00:31:14,680 Speaker 1: tax bracket, then you might actually see 505 00:31:15,560 --> 00:31:19,800 Speaker 1: a web that is different from the web that someone 506 00:31:19,880 --> 00:31:23,480 Speaker 1: else with a different perceived data profile would see. Yeah, 507 00:31:23,480 --> 00:31:26,360 Speaker 1: this is, this is like a dark, twisted version of 508 00:31:26,400 --> 00:31:30,640 Speaker 1: what the semantic web is supposed to be. The semantic 509 00:31:30,640 --> 00:31:34,600 Speaker 1: web is supposed to learn about you and then respond 510 00:31:34,680 --> 00:31:37,040 Speaker 1: to what you want, so that you don't have to 511 00:31:37,680 --> 00:31:41,040 Speaker 1: play the game of the web search or the web browser. 512 00:31:41,400 --> 00:31:43,840 Speaker 1: You just say what you want and it returns exactly 513 00:31:43,840 --> 00:31:47,200 Speaker 1: what you want, possibly from multiple sources, so you're not 514 00:31:47,240 --> 00:31:50,360 Speaker 1: even going to a web page in the semantic web, necessarily. 515 00:31:50,400 --> 00:31:53,600 Speaker 1: It may be that it's pulling things from various sources 516 00:31:53,680 --> 00:31:57,160 Speaker 1: and presenting you kind of a unified view. That's one 517 00:31:57,280 --> 00:32:00,760 Speaker 1: realization of what the semantic web would look like. This, 518 00:32:00,760 --> 00:32:03,760 Speaker 1: this is similar to that, but a very dark version 519 00:32:03,840 --> 00:32:07,800 Speaker 1: where it's not responding to what you want, it's responding 520 00:32:07,840 --> 00:32:11,360 Speaker 1: to what the other side wants based upon what they 521 00:32:11,400 --> 00:32:14,400 Speaker 1: think they can get out of you. 
Yeah, and that's 522 00:32:14,400 --> 00:32:18,400 Speaker 1: a really scary thing. I mean, there's also some potential 523 00:32:18,480 --> 00:32:21,920 Speaker 1: good that can come from this, right? Um, and, 524 00:32:21,920 --> 00:32:25,320 Speaker 1: and, you know, other people are talking about this idea 525 00:32:25,440 --> 00:32:30,560 Speaker 1: of, you know, a time at which maybe people actually 526 00:32:30,600 --> 00:32:34,080 Speaker 1: start to... you know, big data is more valuable. Big data, 527 00:32:34,560 --> 00:32:37,080 Speaker 1: you know, is all perceived to be more valuable to 528 00:32:37,280 --> 00:32:41,240 Speaker 1: companies and marketers and, and, and certain organizations in 529 00:32:41,280 --> 00:32:45,200 Speaker 1: the aggregate. Right. So what if we actually reached a 530 00:32:45,320 --> 00:32:49,800 Speaker 1: time where, um, not... I won't say, like, data gets 531 00:32:49,920 --> 00:32:53,680 Speaker 1: unionized, necessarily, but, like, what if you reach the time 532 00:32:53,760 --> 00:33:00,080 Speaker 1: where users actually banded together and did collective bargaining on 533 00:33:00,160 --> 00:33:04,200 Speaker 1: behalf of their data in order to say, hey, okay, 534 00:33:04,240 --> 00:33:09,320 Speaker 1: my data may be only half of a penny in value, but 535 00:33:09,880 --> 00:33:12,320 Speaker 1: when I get together with this large group of other 536 00:33:12,480 --> 00:33:16,320 Speaker 1: users that I'm, that I'm aligned with, it's actually worth 537 00:33:16,360 --> 00:33:19,640 Speaker 1: a lot more. And I want to align myself with 538 00:33:19,680 --> 00:33:23,920 Speaker 1: this large group and then sell that, um, and actually 539 00:33:24,160 --> 00:33:27,520 Speaker 1: gain some agency in that way and gain maybe some, 540 00:33:27,520 --> 00:33:31,040 Speaker 1: some money. You're, you're like creating a brand new career, 541 00:33:31,400 --> 00:33:37,800 Speaker 1: a data agent. Like, agent? Yeah, agent. They're like day- 542 00:33:37,880 --> 00:33:42,520 Speaker 1: walkers, really, wouldn't they, if you want to go the vampire route. Yeah, 543 00:33:42,560 --> 00:33:46,960 Speaker 1: that's, that... that's... I, I honestly had never considered that, 544 00:33:47,000 --> 00:33:50,120 Speaker 1: but that is a really compelling idea, especially when you 545 00:33:50,160 --> 00:33:52,520 Speaker 1: think, even if you go a route where you're like, 546 00:33:52,560 --> 00:33:54,520 Speaker 1: all right, we're not doing this to make money, we're 547 00:33:54,520 --> 00:33:57,440 Speaker 1: doing this so that we're only opting into the things 548 00:33:57,760 --> 00:34:01,680 Speaker 1: that we agree with, that we find acceptable. One of 549 00:34:01,680 --> 00:34:04,080 Speaker 1: the other things about data tracking that I think is 550 00:34:05,280 --> 00:34:11,240 Speaker 1: problematic is that, sure, uh, you might be getting stuff 551 00:34:11,320 --> 00:34:14,560 Speaker 1: that's reacting to your behavior, but a lot of that 552 00:34:14,640 --> 00:34:17,440 Speaker 1: is going to be based on guesswork. So, for example, 553 00:34:17,520 --> 00:34:21,120 Speaker 1: if I have posted a lot about hiking, it may be 554 00:34:21,400 --> 00:34:25,400 Speaker 1: that I get a lot of advertisements for outdoor supply 555 00:34:25,520 --> 00:34:28,040 Speaker 1: stores, like camping gear, that kind of stuff. And it 556 00:34:28,120 --> 00:34:30,960 Speaker 1: may be that that doesn't fully overlap with my interests. 
557 00:34:31,000 --> 00:34:34,439 Speaker 1: That's not terrible or whatever, but it also could mean 558 00:34:34,600 --> 00:34:39,040 Speaker 1: that I start to have a narrower selection of things 559 00:34:39,080 --> 00:34:41,440 Speaker 1: presented to me, which in one way is good, but 560 00:34:41,480 --> 00:34:48,720 Speaker 1: it also means it takes away the possibility of discovery. Yeah, 561 00:34:48,760 --> 00:34:52,360 Speaker 1: this is a huge... I mean, this is a really 562 00:34:52,560 --> 00:34:55,319 Speaker 1: interesting idea and a really interesting topic, right. I mean, 563 00:34:55,800 --> 00:34:59,320 Speaker 1: Amazon is, is kind of thinking about, I think, trying 564 00:34:59,320 --> 00:35:01,560 Speaker 1: to do this. I mean, there, there are companies that 565 00:35:01,600 --> 00:35:03,840 Speaker 1: would, would love to be able to, basically, when you 566 00:35:03,920 --> 00:35:07,040 Speaker 1: go to their homepage, they basically say... not that 567 00:35:07,080 --> 00:35:09,400 Speaker 1: they would ever necessarily do that in the future, but, 568 00:35:09,920 --> 00:35:13,359 Speaker 1: you, whatever way you engage with their platform, they 569 00:35:13,400 --> 00:35:20,160 Speaker 1: would say, here's the three products you can choose from today. Yeah, right, right. 570 00:35:20,680 --> 00:35:25,400 Speaker 1: And that does, as you say, really reduce this idea 571 00:35:25,400 --> 00:35:27,759 Speaker 1: of discovery. And this is, you know, this happens in 572 00:35:27,800 --> 00:35:31,200 Speaker 1: the media world too, right, this idea of, like, um, 573 00:35:31,280 --> 00:35:35,279 Speaker 1: and in the social media world, this idea of, um, 574 00:35:35,320 --> 00:35:39,880 Speaker 1: being able to curate in such a, you know, 575 00:35:39,960 --> 00:35:44,200 Speaker 1: a focused way to the individual user also makes it 576 00:35:44,320 --> 00:35:48,120 Speaker 1: so that user doesn't, I don't know, read the New 577 00:35:48,200 --> 00:35:52,800 Speaker 1: York Times and learn anything about what goes on outside 578 00:35:52,840 --> 00:35:55,279 Speaker 1: of their own interests. It becomes, it becomes like 579 00:35:55,320 --> 00:36:00,000 Speaker 1: the ultimate echo chamber, where, where, you know, you're, you're, 580 00:36:00,440 --> 00:36:03,680 Speaker 1: you're stuck there seeing the same stuff and maybe you 581 00:36:03,719 --> 00:36:06,879 Speaker 1: don't even realize it. We're, we're heading to, like... this 582 00:36:06,920 --> 00:36:09,440 Speaker 1: is almost a Terry Gilliam film, what we're describing 583 00:36:09,440 --> 00:36:13,960 Speaker 1: here, right? Like we're heading to Brazil right now. Let's, 584 00:36:14,080 --> 00:36:16,359 Speaker 1: let's say I'm ready for that movie. I mean, 585 00:36:16,400 --> 00:36:19,480 Speaker 1: I don't want to be in the reality that it depicts. 586 00:36:19,520 --> 00:36:21,799 Speaker 1: I don't want to be... I don't want it to 587 00:36:21,800 --> 00:36:24,719 Speaker 1: be reality. But I would totally watch that movie. Yeah, 588 00:36:24,800 --> 00:36:27,800 Speaker 1: me too. When we're talking big data, when we're talking 589 00:36:27,840 --> 00:36:30,120 Speaker 1: about all the information that's going out there, and we 590 00:36:30,200 --> 00:36:33,960 Speaker 1: mentioned the fact that companies are buying profiles in the 591 00:36:34,080 --> 00:36:38,279 Speaker 1: tens or hundreds of thousands or more, here's why. 
Let 592 00:36:38,280 --> 00:36:41,080 Speaker 1: me give you a few statistics on how much information 593 00:36:41,760 --> 00:36:45,799 Speaker 1: is hitting the Internet on a per-minute basis. This 594 00:36:45,920 --> 00:36:48,920 Speaker 1: is based off of research that was conducted by a 595 00:36:49,040 --> 00:36:52,520 Speaker 1: data company called Domo, and this was published in August 596 00:36:52,640 --> 00:36:55,320 Speaker 1: of two thousand fifteen. So keep in mind these numbers 597 00:36:55,360 --> 00:37:01,239 Speaker 1: are already old. But every single minute, Amazon receives four 598 00:37:01,239 --> 00:37:06,480 Speaker 1: thousand three unique visitors. Well, we're going to pause the 599 00:37:06,640 --> 00:37:20,759 Speaker 1: data tracking conversation for another quick break. So Amazon is 600 00:37:20,800 --> 00:37:23,960 Speaker 1: one of those companies that's really good at doing this 601 00:37:24,080 --> 00:37:27,000 Speaker 1: focused data tracking. You know, this is where you 602 00:37:27,120 --> 00:37:30,040 Speaker 1: buy a product and it says, hey, people who bought 603 00:37:30,040 --> 00:37:35,040 Speaker 1: this product also bought these other things, and then you think, oh, right, right, 604 00:37:35,160 --> 00:37:37,440 Speaker 1: I should get that. It's like, it's like the perfect 605 00:37:37,800 --> 00:37:44,799 Speaker 1: impulse buy shelf, and it's customized for every single purchase. Right, 606 00:37:44,880 --> 00:37:49,560 Speaker 1: I do need batteries and dog biscuits. How did you know? Well, 607 00:37:49,600 --> 00:37:53,160 Speaker 1: there's, there's this other level too, uh, that 608 00:37:53,680 --> 00:37:56,600 Speaker 1: people are starting to imagine when you think about 609 00:37:56,800 --> 00:38:01,520 Speaker 1: companies like Nest, um, and you think about smart televisions 610 00:38:01,560 --> 00:38:04,200 Speaker 1: that actually interact with your refrigerator. And one of the 611 00:38:04,239 --> 00:38:09,200 Speaker 1: most convincing sort of imagined scenarios that was ever presented 612 00:38:09,239 --> 00:38:12,480 Speaker 1: to me in recent years was this idea that, essentially, 613 00:38:12,520 --> 00:38:17,880 Speaker 1: over time, your refrigerator learns that you go for ice cream. 614 00:38:18,080 --> 00:38:21,319 Speaker 1: You go in for that vanilla ice cream, um, right 615 00:38:21,360 --> 00:38:27,120 Speaker 1: around eight p.m., and, um, it tells your television that. 616 00:38:27,640 --> 00:38:31,080 Speaker 1: And then, in the world of streaming that, you know, 617 00:38:31,160 --> 00:38:33,680 Speaker 1: more and more people are doing on their actual television 618 00:38:33,800 --> 00:38:36,000 Speaker 1: instead of having it connected to a, you know, a 619 00:38:36,040 --> 00:38:40,880 Speaker 1: cable channel, um, at eight forty-five 620 00:38:41,120 --> 00:38:45,839 Speaker 1: you get an advertisement for ice cream, which reminds you 621 00:38:46,160 --> 00:38:48,279 Speaker 1: to go and eat some ice cream, and then you 622 00:38:48,400 --> 00:38:50,560 Speaker 1: end up eating more ice cream, and then you end 623 00:38:50,640 --> 00:38:53,560 Speaker 1: up buying... This is the world where the appliances are 624 00:38:53,560 --> 00:38:57,839 Speaker 1: working against you. Yes, they're not telling you to eat 625 00:38:57,880 --> 00:39:00,719 Speaker 1: a kale salad. Or maybe they are.
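As a side note for readers of this transcript: the "people who bought this product also bought" feature described above can be illustrated with simple co-purchase counting. The following is a minimal, hypothetical sketch in Python, not Amazon's actual recommendation system; the basket data and item names are invented for illustration.

```python
from collections import Counter, defaultdict

# Hypothetical purchase baskets: each set is one customer's order.
# The items and orders are made up purely for illustration.
baskets = [
    {"flashlight", "batteries", "dog biscuits"},
    {"batteries", "dog biscuits"},
    {"flashlight", "tent"},
    {"batteries", "tent", "dog biscuits"},
]

# Count how often every pair of items shows up in the same basket.
co_counts = defaultdict(Counter)
for basket in baskets:
    for item in basket:
        for other in basket:
            if other != item:
                co_counts[item][other] += 1

def also_bought(item, k=3):
    """Return the items most often purchased alongside `item`."""
    return [name for name, _ in co_counts[item].most_common(k)]

# "People who bought batteries also bought..." -> dog biscuits first,
# since they appear together in three of the four baskets above.
print(also_bought("batteries"))
```

The same counting idea is the intuition behind real co-purchase recommendations, though production systems layer on normalization, recency, and many more signals than raw pair counts.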
Yeah, that's 626 00:39:00,760 --> 00:39:02,960 Speaker 1: also possible. Like, if you have this 627 00:39:03,040 --> 00:39:05,480 Speaker 1: thing where you're going out and buying kale salads and 628 00:39:05,480 --> 00:39:08,200 Speaker 1: then you're watching a lot of, you know, food-type stuff, 629 00:39:08,280 --> 00:39:10,759 Speaker 1: you might end up getting, hey, here's a 630 00:39:10,840 --> 00:39:13,680 Speaker 1: quick recipe, and here are the fourteen ingredients you don't 631 00:39:13,719 --> 00:39:17,640 Speaker 1: have that you need to go to Whole Foods and buy. Um, yeah, 632 00:39:18,120 --> 00:39:20,800 Speaker 1: it's a world. I host another 633 00:39:20,800 --> 00:39:24,439 Speaker 1: show called Forward Thinking, and that show is 634 00:39:24,440 --> 00:39:26,880 Speaker 1: an optimistic view of the future, and I have often 635 00:39:26,920 --> 00:39:29,839 Speaker 1: looked at this from the optimist's side, and it 636 00:39:29,920 --> 00:39:32,880 Speaker 1: is similar to what you were saying, that every single 637 00:39:32,960 --> 00:39:37,640 Speaker 1: person's Web experience could be unique to them based upon 638 00:39:37,719 --> 00:39:40,720 Speaker 1: these factors. Uh, and there's a way of doing 639 00:39:40,760 --> 00:39:44,600 Speaker 1: that where it is not the kind of scary, oppressive 640 00:39:44,840 --> 00:39:47,400 Speaker 1: type of thing. But I talk about how, with 641 00:39:47,520 --> 00:39:51,239 Speaker 1: the Internet of Things and with this connectivity of different appliances, 642 00:39:52,000 --> 00:39:54,359 Speaker 1: it's not that far of a stretch to imagine that not 643 00:39:54,520 --> 00:39:59,640 Speaker 1: just the Web but your experience of reality could get to 644 00:39:59,640 --> 00:40:02,760 Speaker 1: a point where a lot of it is catered specifically 645 00:40:02,840 --> 00:40:06,120 Speaker 1: to you. Yeah, and there could be a lot of 646 00:40:06,120 --> 00:40:09,200 Speaker 1: good things that come out of that, right, um, and 647 00:40:09,200 --> 00:40:13,440 Speaker 1: I totally believe in that possibility. I mean, 648 00:40:13,480 --> 00:40:18,000 Speaker 1: like you said so thoughtfully, I think, 649 00:40:18,200 --> 00:40:21,759 Speaker 1: when you came in and, uh, 650 00:40:22,120 --> 00:40:25,000 Speaker 1: talked about this stuff with me on Codebreaker, it was 651 00:40:25,040 --> 00:40:29,400 Speaker 1: basically this idea that, look, um, I'm not going 652 00:40:29,440 --> 00:40:32,359 Speaker 1: to pull out the quote as eloquently as you did, 653 00:40:32,400 --> 00:40:36,680 Speaker 1: but look, this technology, it's us that makes it 654 00:40:36,760 --> 00:40:39,920 Speaker 1: good or bad. Right, we have the power, 655 00:40:40,080 --> 00:40:43,239 Speaker 1: theoretically, to make all of this stuff and make 656 00:40:43,320 --> 00:40:46,120 Speaker 1: data tracking work for us as users and work for 657 00:40:46,200 --> 00:40:49,400 Speaker 1: us as organizations and companies.
I mean, I think a 658 00:40:49,440 --> 00:40:52,840 Speaker 1: lot about the possibility of data tracking being really good 659 00:40:53,320 --> 00:40:57,440 Speaker 1: in the future where our climate is changing, um, and 660 00:40:57,440 --> 00:41:00,400 Speaker 1: the fact that we really understand, on a 661 00:41:00,440 --> 00:41:04,960 Speaker 1: global level, a lot more about our climate than we 662 00:41:05,040 --> 00:41:09,239 Speaker 1: did even thirty years ago, thanks to an increased level 663 00:41:09,239 --> 00:41:12,160 Speaker 1: of data tracking, right. And that is an important point 664 00:41:12,160 --> 00:41:15,440 Speaker 1: to make, Ben, that data tracking does not necessarily 665 00:41:15,520 --> 00:41:19,040 Speaker 1: mean it's just about people. Although people are an important 666 00:41:19,080 --> 00:41:21,239 Speaker 1: factor when you're looking even at climate change, because you 667 00:41:21,280 --> 00:41:24,240 Speaker 1: want to see, well, how is this affecting actual human 668 00:41:24,280 --> 00:41:27,359 Speaker 1: beings as well as the environment in general. You need 669 00:41:27,400 --> 00:41:30,040 Speaker 1: to know the whole picture. But data tracking is really, 670 00:41:30,640 --> 00:41:36,839 Speaker 1: it's agnostic as far as what the data actually represents. 671 00:41:36,920 --> 00:41:40,840 Speaker 1: Like, we're talking about data about everything. You know, anything 672 00:41:40,880 --> 00:41:42,960 Speaker 1: that you can imagine that can be quantified in the 673 00:41:43,000 --> 00:41:47,239 Speaker 1: form of data is being tracked by someone for 674 00:41:47,320 --> 00:41:50,319 Speaker 1: some reason. Some of those reasons are good, some of 675 00:41:50,360 --> 00:41:55,120 Speaker 1: those reasons are way not good. Yeah. But it's 676 00:41:55,160 --> 00:41:59,080 Speaker 1: not all just about people. And it's interesting, too, to 677 00:41:59,120 --> 00:42:01,440 Speaker 1: think about, and I wonder what you think about this, but 678 00:42:01,520 --> 00:42:07,520 Speaker 1: it's interesting to think about just how, I think, 679 00:42:07,560 --> 00:42:09,719 Speaker 1: a lot of it feels out of control at this 680 00:42:09,800 --> 00:42:13,040 Speaker 1: point for the user. It feels like we don't fully 681 00:42:13,120 --> 00:42:16,400 Speaker 1: understand it. We don't fully understand how it's being used, 682 00:42:16,800 --> 00:42:19,279 Speaker 1: um, and what data about us is being tracked as 683 00:42:19,400 --> 00:42:23,520 Speaker 1: users. And, um, you know, government, I mean, the 684 00:42:23,560 --> 00:42:26,359 Speaker 1: FTC in the past few years has become much more 685 00:42:26,440 --> 00:42:31,800 Speaker 1: involved in this idea of helping consumers protect themselves against, 686 00:42:32,120 --> 00:42:35,319 Speaker 1: you know, abusive forms of data tracking, if you will. 687 00:42:35,960 --> 00:42:37,879 Speaker 1: Um, and there are a lot of... and 688 00:42:37,880 --> 00:42:40,120 Speaker 1: also it's fair to say that I think 689 00:42:40,120 --> 00:42:43,719 Speaker 1: a lot of tech companies have endeavored, not 690 00:42:43,800 --> 00:42:47,160 Speaker 1: only in terms of service agreements but in the actual 691 00:42:47,239 --> 00:42:50,400 Speaker 1: tools that they give to the user when they manage 692 00:42:50,440 --> 00:42:53,080 Speaker 1: their own accounts, you know, to put the power 693 00:42:53,120 --> 00:42:56,120 Speaker 1: in the user's hands to control this.
But you know, 694 00:42:56,520 --> 00:42:58,759 Speaker 1: let's be honest, right? I mean, you and I don't 695 00:42:58,840 --> 00:43:02,719 Speaker 1: necessarily read all of the terms of service. You're telling me 696 00:43:03,080 --> 00:43:06,279 Speaker 1: you don't read every end user license agreement? You don't? 697 00:43:06,400 --> 00:43:11,160 Speaker 1: You can't recite the EULAs, you know? It's tough. 698 00:43:11,200 --> 00:43:14,040 Speaker 1: It's tough. I can't remember the last time, honestly, I 699 00:43:14,080 --> 00:43:17,080 Speaker 1: cannot remember the last time I actually read a EULA, 700 00:43:17,640 --> 00:43:22,000 Speaker 1: because who has time for that? I want to 701 00:43:22,120 --> 00:43:25,640 Speaker 1: use your shiny new toy. I don't care that I've 702 00:43:25,680 --> 00:43:28,719 Speaker 1: given up all my privacy, as long as 703 00:43:28,760 --> 00:43:31,200 Speaker 1: I get to play with the shiny new toy. If 704 00:43:31,239 --> 00:43:34,480 Speaker 1: my BB-8 remote control droid is secretly telling 705 00:43:34,560 --> 00:43:38,560 Speaker 1: Disney exactly how to make me buy more BB-8 706 00:43:38,560 --> 00:43:43,640 Speaker 1: remote control droids, so be it. Um, it all comes 707 00:43:43,640 --> 00:43:46,799 Speaker 1: back to Star Wars. Well, I mean, you know, the 708 00:43:46,840 --> 00:43:49,440 Speaker 1: Force is strong in my family. My father has it, 709 00:43:49,640 --> 00:43:53,680 Speaker 1: I have it, my sister has it. So, uh, you 710 00:43:53,719 --> 00:43:56,279 Speaker 1: know, this has been a great conversation, a great 711 00:43:56,320 --> 00:43:59,680 Speaker 1: talk about this topic. And, uh, you know, again, if 712 00:43:59,719 --> 00:44:02,839 Speaker 1: you guys haven't checked out Codebreaker, you definitely need 713 00:44:02,840 --> 00:44:07,480 Speaker 1: to remedy that immediately. It is a phenomenal series that 714 00:44:07,560 --> 00:44:10,839 Speaker 1: covers various topics. And this is your first season, right? 715 00:44:11,840 --> 00:44:14,840 Speaker 1: This is our first season, Jonathan, and we're looking forward 716 00:44:14,840 --> 00:44:17,560 Speaker 1: to having you back on for our second season, which 717 00:44:17,600 --> 00:44:21,239 Speaker 1: is coming soon. Um, but yeah, it's been a 718 00:44:21,239 --> 00:44:23,120 Speaker 1: lot of fun, and it was great 719 00:44:23,160 --> 00:44:26,680 Speaker 1: fun to talk to you on Codebreaker and, um, 720 00:44:26,719 --> 00:44:29,640 Speaker 1: have you be a part of the show, and we 721 00:44:29,719 --> 00:44:33,440 Speaker 1: hope you'll keep listening. Absolutely. And guys, you also 722 00:44:33,480 --> 00:44:37,040 Speaker 1: have to know this thing about Codebreaker, because it's 723 00:44:37,040 --> 00:44:40,840 Speaker 1: a gimmick that's so good I was mad at myself 724 00:44:40,960 --> 00:44:45,320 Speaker 1: for never thinking of it. It was brilliant. 725 00:44:45,360 --> 00:44:48,720 Speaker 1: You had all of the shows produced 726 00:44:48,719 --> 00:44:52,640 Speaker 1: and ready to go, and you hid codes in the 727 00:44:52,680 --> 00:44:56,480 Speaker 1: episodes, and if you're able to break the code, you 728 00:44:56,520 --> 00:45:01,160 Speaker 1: can listen to the next episode before it publishes.
Yes, 729 00:45:01,560 --> 00:45:03,759 Speaker 1: well, we should start, you know... I mean, you know, 730 00:45:05,080 --> 00:45:08,000 Speaker 1: I'm ready. Have we embedded a secret code in 731 00:45:08,000 --> 00:45:11,359 Speaker 1: this episode, John? You know, I like to say yes 732 00:45:12,120 --> 00:45:16,400 Speaker 1: just to drive people crazy, but I'm not gonna do that. 733 00:45:16,560 --> 00:45:19,880 Speaker 1: I was just so proud of myself for breaking 734 00:45:19,960 --> 00:45:24,200 Speaker 1: the Morse code in the... You did an episode 735 00:45:24,239 --> 00:45:29,239 Speaker 1: about internet porn, which is an incredible episode. Probably not 736 00:45:29,440 --> 00:45:32,960 Speaker 1: safe for you to listen to on speakers, 737 00:45:34,120 --> 00:45:36,719 Speaker 1: not safe for work or for little ones, obviously, for 738 00:45:36,719 --> 00:45:40,439 Speaker 1: obvious reasons. But it's a real, honest look at 739 00:45:40,440 --> 00:45:45,040 Speaker 1: what is not only a big industry online, but legitimately 740 00:45:45,840 --> 00:45:51,480 Speaker 1: it can make or break certain types of on-the-edge 741 00:45:51,600 --> 00:45:56,680 Speaker 1: technologies. Like, if you wonder why Blu-ray won 742 00:45:56,840 --> 00:46:00,520 Speaker 1: over HD DVD, it's one of the 743 00:46:00,560 --> 00:46:04,360 Speaker 1: big reasons. I mean, you know, it's crazy to 744 00:46:04,440 --> 00:46:08,440 Speaker 1: think about that. If you wonder why the AVN 745 00:46:08,760 --> 00:46:13,719 Speaker 1: conference overlapped with CES for so many years, 746 00:46:14,840 --> 00:46:17,800 Speaker 1: this is why. It's an incredible episode. And I managed 747 00:46:17,840 --> 00:46:20,719 Speaker 1: to actually break the code, and I was probably a 748 00:46:20,760 --> 00:46:26,080 Speaker 1: little more proud of myself than I should have been. Well, 749 00:46:26,120 --> 00:46:28,040 Speaker 1: you should, I mean, you should be proud. It's 750 00:46:28,320 --> 00:46:31,759 Speaker 1: Morse code. It's not used, um, you know, 751 00:46:32,000 --> 00:46:35,320 Speaker 1: very widely anymore, by at least most people, 752 00:46:35,360 --> 00:46:37,799 Speaker 1: but it's, uh, it's always a good thing to know. Yeah, 753 00:46:37,840 --> 00:46:40,560 Speaker 1: it turns out, you know, once you commit 754 00:46:40,640 --> 00:46:44,920 Speaker 1: that Scouts stuff to memory, and have a 755 00:46:44,960 --> 00:46:47,520 Speaker 1: web page opened up that tells you what the dots 756 00:46:47,520 --> 00:46:50,960 Speaker 1: and dashes mean, because I didn't remember. But it's 757 00:46:50,960 --> 00:46:53,719 Speaker 1: still, it's a great way to engage 758 00:46:53,760 --> 00:46:58,239 Speaker 1: the listeners on top of this interesting take you have 759 00:46:58,360 --> 00:47:00,719 Speaker 1: on the different topics. So just make sure you 760 00:47:00,800 --> 00:47:04,799 Speaker 1: go and check that out. Ben, thank you so much 761 00:47:04,840 --> 00:47:06,960 Speaker 1: for agreeing to be part of this. This was, I 762 00:47:06,960 --> 00:47:10,640 Speaker 1: think, a really fun, kind of complementary piece to your 763 00:47:10,800 --> 00:47:15,759 Speaker 1: Codebreaker episode, which, again, uh, I really enjoyed.
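Since the conversation above describes decoding the hidden Morse code with a dots-and-dashes lookup, here is a minimal sketch of that table-driven approach in Python. The table below is only a small excerpt of the International Morse alphabet, and the example message is made up for illustration; it is not the code actually hidden in the Codebreaker episode.

```python
# A small excerpt of the International Morse Code table, enough to show
# the lookup approach; a full decoder would cover the whole alphabet,
# digits, and punctuation.
MORSE = {
    ".-": "A", "-.-.": "C", ".": "E", "..-.": "F", "....": "H",
    "-.": "N", "---": "O", "...": "S", "-": "T", "..-": "U",
}

def decode(message: str) -> str:
    """Decode Morse where letters are space-separated and words use ' / '."""
    words = message.split(" / ")
    return " ".join(
        "".join(MORSE.get(symbol, "?") for symbol in word.split())
        for word in words
    )

# Made-up example message, not the show's actual hidden code.
print(decode("- . -.-. .... / ... - ..- ..-. ..-."))  # prints: TECH STUFF
```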
764 00:47:15,880 --> 00:47:18,279 Speaker 1: You guys let me listen to it before I was even... I 765 00:47:18,560 --> 00:47:22,080 Speaker 1: recorded the final segment on that episode and I got 766 00:47:22,080 --> 00:47:24,680 Speaker 1: to listen to everything up to that point, and I 767 00:47:24,719 --> 00:47:26,879 Speaker 1: was just like, are you sure you want me? 768 00:47:28,040 --> 00:47:30,359 Speaker 1: You're batting cleanup, man, you are our closer. 769 00:47:30,520 --> 00:47:33,600 Speaker 1: We love... Glad, we're glad I could be of service. 770 00:47:33,640 --> 00:47:36,360 Speaker 1: It was an honor. So, well, it was great to 771 00:47:36,400 --> 00:47:38,080 Speaker 1: have you. It was great to have you on, and 772 00:47:38,120 --> 00:47:41,440 Speaker 1: we're big fans of TechStuff, so we're like, um, 773 00:47:41,960 --> 00:47:44,160 Speaker 1: thank you so much for having me 774 00:47:44,200 --> 00:47:46,279 Speaker 1: and for talking to me about this stuff. And 775 00:47:46,320 --> 00:47:49,200 Speaker 1: I think, I don't know, man, you know, I 776 00:47:49,320 --> 00:47:52,200 Speaker 1: end up feeling, you know, I think so many of 777 00:47:52,280 --> 00:47:54,879 Speaker 1: us end up feeling lost, right, when 778 00:47:54,920 --> 00:47:57,480 Speaker 1: it comes to data tracking. And my only hope is, 779 00:47:57,520 --> 00:48:00,000 Speaker 1: I mean, not to say that there 780 00:48:00,080 --> 00:48:02,200 Speaker 1: should be an app for that, but, you know, 781 00:48:02,239 --> 00:48:04,360 Speaker 1: my only hope is that we will at some point 782 00:48:04,480 --> 00:48:08,279 Speaker 1: reach a stage where it's easier for the 783 00:48:08,400 --> 00:48:12,520 Speaker 1: user to really have control over what they're sharing, 784 00:48:12,719 --> 00:48:17,280 Speaker 1: and also users are more educated about 785 00:48:17,280 --> 00:48:20,440 Speaker 1: what they're sharing. Um, and I think we're all, 786 00:48:20,680 --> 00:48:22,640 Speaker 1: you know, we're all a little bit on the hook 787 00:48:22,680 --> 00:48:25,320 Speaker 1: for not being more educated about what we're sharing. 788 00:48:25,360 --> 00:48:29,440 Speaker 1: But we also need better tools 789 00:48:29,640 --> 00:48:32,680 Speaker 1: to understand how we are sharing our data and how 790 00:48:32,680 --> 00:48:34,640 Speaker 1: it's being tracked. And we definitely need to 791 00:48:34,719 --> 00:48:40,800 Speaker 1: hold government accountable for making sure they create and foster 792 00:48:41,960 --> 00:48:47,520 Speaker 1: departments that are specialized in informing other bodies of government 793 00:48:47,560 --> 00:48:51,160 Speaker 1: about these things, keeping up to date, because, as we know, 794 00:48:51,239 --> 00:48:53,880 Speaker 1: I mean, we're still seeing an era where the 795 00:48:53,920 --> 00:48:57,239 Speaker 1: people who are in positions of power often have 796 00:48:57,480 --> 00:49:00,680 Speaker 1: limited to no experience with a lot of the things 797 00:49:00,680 --> 00:49:04,120 Speaker 1: that are affecting us on a daily basis. Oh man, 798 00:49:04,280 --> 00:49:07,919 Speaker 1: those, uh, those Supreme Court transcripts. Man, it's 799 00:49:07,920 --> 00:49:11,560 Speaker 1: a sad world.
So if we're gonna, like... We can't 800 00:49:11,680 --> 00:49:15,080 Speaker 1: wait until the millennials are on the Supreme Court. For 801 00:49:15,160 --> 00:49:18,920 Speaker 1: one thing, I'm gonna be dead by then. So I 802 00:49:18,960 --> 00:49:21,239 Speaker 1: would like to see things improve before that. But that's 803 00:49:21,320 --> 00:49:23,680 Speaker 1: one of those real challenges: we have 804 00:49:24,239 --> 00:49:29,040 Speaker 1: a different population in positions of power who may 805 00:49:29,200 --> 00:49:33,320 Speaker 1: have every desire to help us make sure that these 806 00:49:33,360 --> 00:49:36,120 Speaker 1: things are being used in a responsible way, a 807 00:49:36,160 --> 00:49:41,279 Speaker 1: non-predatory and an ethical way, but, you know, 808 00:49:41,320 --> 00:49:43,560 Speaker 1: to them, it's like if you were to present 809 00:49:43,640 --> 00:49:46,000 Speaker 1: me with a page written in Greek: I would have 810 00:49:46,040 --> 00:49:49,560 Speaker 1: no idea what to do with that. So, what, technology 811 00:49:49,600 --> 00:49:53,520 Speaker 1: moves faster than policy, right? It always has, and 812 00:49:53,520 --> 00:49:55,680 Speaker 1: I think it's as true with data 813 00:49:55,719 --> 00:49:59,000 Speaker 1: tracking as anything else. And it's just, um, a 814 00:49:59,000 --> 00:50:01,080 Speaker 1: reality that we all have to deal with. But it's 815 00:50:01,160 --> 00:50:04,400 Speaker 1: something that hopefully will change over time, 816 00:50:04,640 --> 00:50:09,160 Speaker 1: and not before a lot 817 00:50:09,200 --> 00:50:12,040 Speaker 1: of stuff happens that many people may regret. So it's 818 00:50:12,120 --> 00:50:14,800 Speaker 1: always good to be thinking about it. Yeah. Absolutely. 819 00:50:16,120 --> 00:50:19,320 Speaker 1: That wraps up the classic episode The Evils of 820 00:50:19,400 --> 00:50:23,040 Speaker 1: Data Tracking. Big thanks again to Ben Johnson from 821 00:50:23,040 --> 00:50:25,760 Speaker 1: back in two thousand fifteen for joining the show. That was a pleasure. 822 00:50:25,840 --> 00:50:28,600 Speaker 1: Hope we can sometime get him back on the show. 823 00:50:29,480 --> 00:50:33,600 Speaker 1: And yeah, that's it for this classic episode of TechStuff. 824 00:50:33,640 --> 00:50:36,440 Speaker 1: If you have suggestions, you can always reach out to 825 00:50:36,480 --> 00:50:39,320 Speaker 1: me, either through the I Heart Radio app, using the 826 00:50:39,360 --> 00:50:41,719 Speaker 1: little microphone on the TechStuff page to leave me 827 00:50:41,760 --> 00:50:44,440 Speaker 1: a voicemail, or send me a message on Twitter. 828 00:50:44,640 --> 00:50:46,560 Speaker 1: The handle for the show is TechStuff HSW, 829 00:50:46,760 --> 00:50:55,959 Speaker 1: and I'll talk to you again really soon. Tech 830 00:50:55,960 --> 00:50:59,400 Speaker 1: Stuff is an I Heart Radio production. For more podcasts 831 00:50:59,440 --> 00:51:02,200 Speaker 1: from I Heart Radio, visit the I Heart Radio app, 832 00:51:02,320 --> 00:51:05,480 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows. 833 00:51:10,000 --> 00:51:10,040 Speaker 1: H