1 00:00:02,440 --> 00:00:07,440 Speaker 1: Okay, all right, let's set the stage for this evening's 2 00:00:07,600 --> 00:00:11,320 Speaker 1: classic episode. Fellow conspiracy realists, cast your memory back to 3 00:00:11,440 --> 00:00:16,600 Speaker 1: twenty nineteen. Nobody was really sure what would happen. Aleksandr 4 00:00:16,720 --> 00:00:20,200 Speaker 1: Dugin had some ideas, he had some wishes on his 5 00:00:20,239 --> 00:00:25,479 Speaker 1: wish list, and we got, we got very... we finally 6 00:00:25,520 --> 00:00:27,840 Speaker 1: did an episode on something we had been talking about 7 00:00:27,880 --> 00:00:30,880 Speaker 1: for, gosh, more than a decade at this point: the 8 00:00:30,960 --> 00:00:35,919 Speaker 1: idea of smart devices and surveillance. And I think this 9 00:00:36,000 --> 00:00:38,479 Speaker 1: might have been the one that radicalized us a little bit. 10 00:00:39,120 --> 00:00:43,800 Speaker 2: Maybe a bit, a tiny bit, because this is definitely a 11 00:00:43,840 --> 00:00:47,479 Speaker 2: moment in time when, in real time, we realized just 12 00:00:47,560 --> 00:00:50,760 Speaker 2: how much mass data collection was happening within our phones, 13 00:00:51,800 --> 00:00:56,960 Speaker 2: not necessarily by the government, but by individual third party 14 00:00:57,000 --> 00:00:59,920 Speaker 2: companies that we all say yes to when we down 15 00:01:00,080 --> 00:01:03,320 Speaker 2: load their applications or we use their services. 16 00:01:03,840 --> 00:01:07,600 Speaker 1: Who reads the terms and conditions, right? Who reads down 17 00:01:07,680 --> 00:01:11,440 Speaker 1: to paragraph seventeen of the agreement, especially when you're scrolling 18 00:01:11,520 --> 00:01:14,000 Speaker 1: on a phone and you're in a hurry because you 19 00:01:14,040 --> 00:01:16,480 Speaker 1: got to get the app, dude. 20 00:01:17,319 --> 00:01:20,160 Speaker 2: Well, then, but then when you combine that with companies 21 00:01:20,160 --> 00:01:24,800 Speaker 2: potentially giving that information to governmental entities, and you combine 22 00:01:24,800 --> 00:01:28,840 Speaker 2: that with actual governmental entities that have stated before, or 23 00:01:28,880 --> 00:01:34,600 Speaker 2: it's been leaked before, Stellar Wind, that communications on individual 24 00:01:34,640 --> 00:01:38,360 Speaker 2: phones are being picked up, and even communications through the 25 00:01:38,400 --> 00:01:41,280 Speaker 2: massive pipelines that go underneath the oceans. 26 00:01:41,959 --> 00:01:42,360 Speaker 3: Mm hmm. 27 00:01:42,560 --> 00:01:46,440 Speaker 1: Yeah, it's a PRISM of conspiracy, and it probably takes 28 00:01:46,520 --> 00:01:52,280 Speaker 1: Five Eyes to see it in full, right. This is 29 00:01:52,360 --> 00:01:57,000 Speaker 1: also, this is also a shout out for anybody who ever thought, oh, 30 00:01:57,040 --> 00:02:00,320 Speaker 1: my phone is listening to me, the speakers pick it 31 00:02:00,400 --> 00:02:02,560 Speaker 1: up or something. It's so much deeper. 32 00:02:02,680 --> 00:02:05,880 Speaker 2: Oh my gosh, Ben, I'm gonna say this so I 33 00:02:05,920 --> 00:02:08,240 Speaker 2: say it on record. That is happening so much to 34 00:02:08,320 --> 00:02:13,239 Speaker 2: me recently where it's that echo on the telephone.
It's 35 00:02:13,280 --> 00:02:16,600 Speaker 2: that thing where you're getting feedback when you're talking to 36 00:02:16,680 --> 00:02:19,560 Speaker 2: one other person when you don't have the speakerphone on, 37 00:02:19,919 --> 00:02:22,720 Speaker 2: or when you're using a Bluetooth headset that's got a 38 00:02:22,760 --> 00:02:26,000 Speaker 2: microphone that's close to it, when you're talking on PlayStation. 39 00:02:26,560 --> 00:02:33,320 Speaker 1: Oh buddy, oh man. Well, Matt, I'm with you there. 40 00:02:31,880 --> 00:02:36,480 Speaker 1: If this is the year we get hauled off to 41 00:02:36,520 --> 00:02:39,480 Speaker 1: the crazy house, then I think we'll be in the 42 00:02:39,560 --> 00:02:45,040 Speaker 1: same Uber, because we're not paying for the ambulance. 43 00:02:45,200 --> 00:02:46,320 Speaker 2: That's the joke, anyway. 44 00:02:46,800 --> 00:02:50,800 Speaker 1: Anyway, this is our classic episode: twenty nineteen, smart devices 45 00:02:50,919 --> 00:02:52,440 Speaker 1: and surveillance. 46 00:02:52,960 --> 00:02:57,440 Speaker 4: From UFOs to psychic powers and government conspiracies, history is 47 00:02:57,520 --> 00:03:02,280 Speaker 4: riddled with unexplained events. Turn back now or learn the 48 00:03:02,320 --> 00:03:05,600 Speaker 4: stuff they don't want you to know. A production of 49 00:03:05,680 --> 00:03:08,560 Speaker 4: iHeartRadio's How Stuff Works. 50 00:03:16,919 --> 00:03:18,960 Speaker 2: Welcome back to the show. My name is Matt. 51 00:03:18,960 --> 00:03:19,519 Speaker 3: My name is Noel. 52 00:03:19,960 --> 00:03:22,160 Speaker 1: They call me Ben. We are joined as always with 53 00:03:22,200 --> 00:03:25,600 Speaker 1: our super producer Paul 'Mission Control' Decant, in spirit, 54 00:03:25,800 --> 00:03:30,760 Speaker 1: because today we have our returning super producer Seth Johnson. 55 00:03:30,840 --> 00:03:34,600 Speaker 1: Everybody give him a hand. Most importantly, you are you. 56 00:03:34,600 --> 00:03:37,880 Speaker 1: You are here, and that makes this Stuff They Don't 57 00:03:38,040 --> 00:03:41,400 Speaker 1: Want You to Know. I was thinking about this. How 58 00:03:41,480 --> 00:03:44,560 Speaker 1: many people, on average, do you think are listening to today's 59 00:03:44,560 --> 00:03:48,000 Speaker 1: show on a cell phone, or maybe on a laptop 60 00:03:48,280 --> 00:03:50,920 Speaker 1: with another electronic device on in the background? 61 00:03:51,080 --> 00:03:53,120 Speaker 3: Mm, I would say many, if not most. 62 00:03:53,800 --> 00:03:55,360 Speaker 2: I see. I'm thinking a lot of people probably have 63 00:03:55,400 --> 00:03:58,040 Speaker 2: their phone in their pocket or on one of those 64 00:03:58,200 --> 00:04:01,600 Speaker 2: cool little armbands when they're just... yeah, oh yeah, dude. 65 00:04:01,760 --> 00:04:04,480 Speaker 1: Runners, or people who want to look like they have 66 00:04:04,840 --> 00:04:06,840 Speaker 1: run or will run somewhere in the future. 67 00:04:06,920 --> 00:04:09,040 Speaker 3: Yeah, there's... so I should chase those people. Is that 68 00:04:09,080 --> 00:04:10,000 Speaker 3: what you're saying? Okay? 69 00:04:10,200 --> 00:04:12,440 Speaker 1: Yeah, yeah, give them, give them some stakes, you know 70 00:04:12,480 --> 00:04:14,680 Speaker 1: what I mean. I think that's what's missing for joggers. 71 00:04:14,800 --> 00:04:16,279 Speaker 3: I feel like that, you know.
I feel like jogging 72 00:04:16,360 --> 00:04:19,400 Speaker 3: is just practice for, you know, running away from something, 73 00:04:19,640 --> 00:04:21,599 Speaker 3: you know. I feel like these people are very paranoid. 74 00:04:22,040 --> 00:04:23,839 Speaker 1: One of my... this might be a hot take, but 75 00:04:23,880 --> 00:04:27,240 Speaker 1: one of my very old and dear friends years back, 76 00:04:27,560 --> 00:04:29,640 Speaker 1: you guys know him, he told me. He told me 77 00:04:29,680 --> 00:04:32,240 Speaker 1: that one of the first signs of gentrification in a 78 00:04:32,279 --> 00:04:35,400 Speaker 1: neighborhood was people running when no one's chasing them. 79 00:04:35,440 --> 00:04:35,839 Speaker 3: Brilliant. 80 00:04:36,320 --> 00:04:38,120 Speaker 1: He's a weird guy, but I thought it was a 81 00:04:38,120 --> 00:04:38,640 Speaker 1: good point. 82 00:04:38,680 --> 00:04:39,960 Speaker 3: I think so too. So, so... 83 00:04:40,120 --> 00:04:43,400 Speaker 2: But the reason that we're even mentioning this, like multiple 84 00:04:43,440 --> 00:04:46,160 Speaker 2: devices going on in the same audio space, is because 85 00:04:46,200 --> 00:04:49,000 Speaker 2: that's one of the main things we're going to be 86 00:04:49,000 --> 00:04:54,239 Speaker 2: talking about today: our devices, and potentially the monitoring of them. 87 00:04:54,320 --> 00:04:58,599 Speaker 1: So, right, are you addicted to your phone, your television, 88 00:04:58,680 --> 00:05:00,680 Speaker 1: or your tablet? Are you one of the people, like 89 00:05:00,800 --> 00:05:04,200 Speaker 1: many of us, who can't really hold a conversation without 90 00:05:04,279 --> 00:05:07,280 Speaker 1: at least checking in on this thing? I'm guilty of that. 91 00:05:07,360 --> 00:05:08,360 Speaker 3: He's literally doing it right now. 92 00:05:08,680 --> 00:05:11,120 Speaker 1: Yeah, well, I'm kind of waving it around. Do you 93 00:05:11,160 --> 00:05:14,280 Speaker 1: find yourself tuning out, you know, when, when the conversation 94 00:05:14,400 --> 00:05:17,200 Speaker 1: is taking its course, and then finding yourself tuning into 95 00:05:17,279 --> 00:05:22,080 Speaker 1: the closest device, whatever that is? What, like... right, exactly. So, 96 00:05:22,400 --> 00:05:24,080 Speaker 1: just for the lay of the land here, we have 97 00:05:24,240 --> 00:05:27,800 Speaker 1: three laptops out, open right now. We all have our 98 00:05:27,880 --> 00:05:31,760 Speaker 1: phone somewhere near our person, and we have, you know, 99 00:05:31,880 --> 00:05:36,000 Speaker 1: a bevy of AV equipment that is probably the most 100 00:05:36,080 --> 00:05:40,080 Speaker 1: innocent of the contraptions here, right, in this room. And 101 00:05:40,600 --> 00:05:43,279 Speaker 1: if you are one of those people who constantly finds 102 00:05:43,279 --> 00:05:47,159 Speaker 1: yourself clicking in or dropping out of conversation, tuning into 103 00:05:47,200 --> 00:05:51,679 Speaker 1: something else, you are not alone. Let's reiterate that, because 104 00:05:51,720 --> 00:05:54,159 Speaker 1: at first it sounds kind of nice, right? It sounds 105 00:05:54,200 --> 00:05:57,479 Speaker 1: kind of like, ah, warm, fuzzy huggy time. No, no, no, 106 00:05:57,760 --> 00:06:02,240 Speaker 1: think about it. You are not alone in multiple senses 107 00:06:02,320 --> 00:06:05,320 Speaker 1: of the phrase. Here are the facts. 108 00:06:06,080 --> 00:06:11,760 Speaker 2: Smartphones, smart devices, smart everything.
Really, it's everywhere, and you're 109 00:06:11,800 --> 00:06:14,479 Speaker 2: probably working with one right now just because you need 110 00:06:14,520 --> 00:06:17,840 Speaker 2: the technology to listen to this today. Let's look at 111 00:06:17,839 --> 00:06:21,400 Speaker 2: the Pew Research organization there. They've done... you know, they 112 00:06:21,400 --> 00:06:24,799 Speaker 2: do statistics like nobody else. It is estimated, they say, 113 00:06:25,080 --> 00:06:28,560 Speaker 2: that more than five billion people have mobile devices, and 114 00:06:28,640 --> 00:06:32,040 Speaker 2: over half of these connections are smartphones, so they're doing 115 00:06:32,080 --> 00:06:35,719 Speaker 2: more than just making a connection via satellite to another 116 00:06:35,760 --> 00:06:39,400 Speaker 2: phone somewhere, or from... you know, it's not a landline anymore. 117 00:06:39,680 --> 00:06:42,840 Speaker 2: It is... it's a phone. It's an encyclopedia. It's absolutely 118 00:06:42,880 --> 00:06:44,080 Speaker 2: everything you could possibly do. 119 00:06:44,120 --> 00:06:47,680 Speaker 3: It's like Ziggy from Quantum Leap. Yes, no, we literally have 120 00:06:47,760 --> 00:06:50,520 Speaker 3: that in our hands now. It doesn't quite predict the future, 121 00:06:50,560 --> 00:06:51,599 Speaker 3: but it comes damn close. 122 00:06:52,160 --> 00:06:54,919 Speaker 1: Yeah. And this is... that's a global number, yes, throughout 123 00:06:54,960 --> 00:06:57,520 Speaker 1: the world numbers. So more than half of the people 124 00:06:57,560 --> 00:07:01,080 Speaker 1: alive today have one of these things. It's... I'm holding 125 00:07:01,160 --> 00:07:03,159 Speaker 1: up the phone again like a prop. It's one of 126 00:07:03,240 --> 00:07:08,800 Speaker 1: technology's biggest breakthrough success stories of recent decades, because just 127 00:07:08,839 --> 00:07:11,280 Speaker 1: a few decades ago, no one had them. 128 00:07:11,520 --> 00:07:11,960 Speaker 2: Exactly. 129 00:07:12,040 --> 00:07:14,680 Speaker 1: Now five billion people have these, so get this. 130 00:07:14,800 --> 00:07:17,760 Speaker 3: In the US specifically, we've got nine in ten or 131 00:07:17,840 --> 00:07:21,280 Speaker 3: more Americans aged thirty four and under who have had 132 00:07:21,320 --> 00:07:25,679 Speaker 3: a smartphone since twenty fifteen, while the ownership rate among 133 00:07:25,720 --> 00:07:28,239 Speaker 3: the fifty and older age group has risen from fifty 134 00:07:28,240 --> 00:07:30,200 Speaker 3: three percent to sixty seven percent over the same period. 135 00:07:30,640 --> 00:07:32,440 Speaker 3: One of my dearest, oldest friends, who is in his 136 00:07:33,480 --> 00:07:40,440 Speaker 3: early fifties, very purposefully still has a dumb phone. Oh, Harry. Harry, yeah, yeah, 137 00:07:40,520 --> 00:07:42,760 Speaker 3: very much on purpose. He just... like, he is old 138 00:07:42,800 --> 00:07:45,440 Speaker 3: school in that way where he rejects a lot of 139 00:07:45,480 --> 00:07:50,720 Speaker 3: this, like, overconnectedness, and consequently he's a much more 140 00:07:50,600 --> 00:07:52,560 Speaker 2: thoughtful person than a lot of people that I know. 141 00:07:53,080 --> 00:07:56,000 Speaker 2: Well, after this episode, most of us are going 142 00:07:56,040 --> 00:07:58,680 Speaker 2: to want to switch back. I know, I know Verizon 143 00:07:58,760 --> 00:08:01,160 Speaker 2: where I am offers a flip phone
144 00:08:00,800 --> 00:08:05,480 Speaker 1: option. Those Nokias never break, too. Well, there you go. 145 00:08:05,720 --> 00:08:08,880 Speaker 2: It seems more and more like a good option. We'll 146 00:08:08,920 --> 00:08:09,600 Speaker 2: get into it, 147 00:08:09,480 --> 00:08:13,840 Speaker 1: though, because it doesn't just stop with smartphones. We also 148 00:08:13,880 --> 00:08:17,080 Speaker 1: are talking about smart TVs, and yes, Matt, you're absolutely right. 149 00:08:17,720 --> 00:08:21,760 Speaker 1: This may well ruin some, some people's day, but this 150 00:08:21,880 --> 00:08:26,000 Speaker 1: is important to know. So we checked out Statista when 151 00:08:26,000 --> 00:08:29,040 Speaker 1: we wanted to find some more stats and specs on 152 00:08:29,280 --> 00:08:34,400 Speaker 1: smart TVs. The global TV set unit sales are projected 153 00:08:34,440 --> 00:08:37,400 Speaker 1: to increase from two hundred and twenty nine million units 154 00:08:37,400 --> 00:08:40,520 Speaker 1: in twenty sixteen to two hundred and fifty nine million 155 00:08:40,600 --> 00:08:45,360 Speaker 1: by twenty twenty. That's pretty nuts, because, you know, televisions 156 00:08:45,600 --> 00:08:50,240 Speaker 1: shouldn't be this sort of disposable resource. They never were, right? 157 00:08:50,280 --> 00:08:52,760 Speaker 1: But now more and more people are buying TVs more 158 00:08:52,800 --> 00:08:53,560 Speaker 1: and more frequently. 159 00:08:53,720 --> 00:08:56,640 Speaker 3: I mean, I have a smart TV that is made 160 00:08:56,880 --> 00:08:59,000 Speaker 3: by Amazon. I'm sure it's made by some third party 161 00:08:59,040 --> 00:09:00,679 Speaker 3: manufacturer overseas. 162 00:09:01,200 --> 00:09:02,240 Speaker 2: It is Amazon. 163 00:09:02,880 --> 00:09:05,120 Speaker 3: You know why, though, you guys? Because it was dirt 164 00:09:05,200 --> 00:09:09,040 Speaker 3: cheap and it's... it works great, and I like the interconnectedness 165 00:09:09,040 --> 00:09:11,080 Speaker 3: of it. But we'll get into some of the features 166 00:09:11,080 --> 00:09:13,720 Speaker 3: that this... well, you call it a feature, it's really 167 00:09:13,760 --> 00:09:19,240 Speaker 3: more like a bug. No, literally a listening surveillance feature, 168 00:09:19,280 --> 00:09:20,280 Speaker 3: but just not for you. 169 00:09:20,520 --> 00:09:22,840 Speaker 1: Yeah, feature or bug is in the eye of the beholder. 170 00:09:22,840 --> 00:09:24,800 Speaker 3: That's absolutely right. And in this case, the eye of 171 00:09:24,840 --> 00:09:27,160 Speaker 3: the beholder is big data. 172 00:09:27,280 --> 00:09:30,559 Speaker 1: Yeah. As of twenty eighteen, seventy percent of the televisions 173 00:09:30,559 --> 00:09:34,360 Speaker 1: being sold across the planet are smart TVs. And a 174 00:09:34,400 --> 00:09:37,959 Speaker 1: smart TV, at the most basic explanatory level, is a 175 00:09:38,000 --> 00:09:42,000 Speaker 1: television that combines a lot of features one would associate 176 00:09:42,040 --> 00:09:45,760 Speaker 1: with a computer. Right? So, if you, like Noel, own 177 00:09:45,800 --> 00:09:48,760 Speaker 1: a smart TV, you can watch your favorite shows, but 178 00:09:48,800 --> 00:09:50,360 Speaker 1: you don't have to just watch them when they're on. 179 00:09:50,559 --> 00:09:52,800 Speaker 1: You can also, you know, dial it up on demand, 180 00:09:52,960 --> 00:09:55,480 Speaker 1: for instance. You can connect it with your phone.
181 00:09:55,880 --> 00:09:59,040 Speaker 3: Not to mention, you can... it's very customizable. You can 182 00:09:59,120 --> 00:10:02,520 Speaker 3: combine all of the different services into one kind of 183 00:10:03,200 --> 00:10:05,679 Speaker 3: widget box, let's call it, where you have your Netflix, 184 00:10:05,880 --> 00:10:08,640 Speaker 3: you got your Hulu, you got your Amazon, which obviously 185 00:10:08,800 --> 00:10:11,400 Speaker 3: my Amazon TV leans pretty heavily on. The global search 186 00:10:11,640 --> 00:10:15,440 Speaker 3: on my Amazon TV searches, like, all of Amazon, and 187 00:10:15,480 --> 00:10:18,320 Speaker 3: it gives you products, it gives you TV shows, it 188 00:10:18,360 --> 00:10:22,040 Speaker 3: gives you other stuff that's in your set of apps 189 00:10:22,120 --> 00:10:25,920 Speaker 3: or subscriptions, but very much leaning toward the Amazon side 190 00:10:25,960 --> 00:10:26,319 Speaker 3: of things. 191 00:10:26,400 --> 00:10:29,400 Speaker 2: Yeah, the most important thing about a smart TV, when 192 00:10:29,440 --> 00:10:31,640 Speaker 2: we call it that, is that it's able to communicate 193 00:10:31,720 --> 00:10:34,480 Speaker 2: with your network, or the network that it's attached to, 194 00:10:35,040 --> 00:10:38,240 Speaker 2: and could possibly see all of the other devices that 195 00:10:38,280 --> 00:10:39,960 Speaker 2: are attached to that network. 196 00:10:39,600 --> 00:10:43,920 Speaker 1: Could possibly, sure. This is... this is true, though. So, 197 00:10:44,360 --> 00:10:49,280 Speaker 1: Android is probably the most widely used operating system among 198 00:10:49,360 --> 00:10:52,240 Speaker 1: smart TVs, but that by no means should be taken 199 00:10:52,280 --> 00:10:55,440 Speaker 1: to indicate that other OSs aren't in there. iPhones are 200 00:10:55,480 --> 00:10:57,600 Speaker 1: in there as well. Apple has a hand in this. 201 00:10:58,160 --> 00:11:01,440 Speaker 1: And while smart device addiction is real, especially when we 202 00:11:01,480 --> 00:11:04,840 Speaker 1: talk about mobile devices, it's... I think we should bracket 203 00:11:04,840 --> 00:11:07,440 Speaker 1: that as the subject of its own episode in the future, 204 00:11:07,440 --> 00:11:10,800 Speaker 1: assuming we don't get black bagged or disappeared. There's more 205 00:11:10,840 --> 00:11:15,440 Speaker 1: to the story behind the purposefully addictive technology here. You see, 206 00:11:15,720 --> 00:11:19,480 Speaker 1: while we stare into that electronic abyss, even though we 207 00:11:19,600 --> 00:11:23,480 Speaker 1: might not know it, sometimes things in that abyss, in 208 00:11:23,559 --> 00:11:26,880 Speaker 1: this sort of black mirror, are staring back at you. 209 00:11:27,440 --> 00:11:29,240 Speaker 2: And we'll get into that right after a quick word 210 00:11:29,280 --> 00:11:30,040 Speaker 2: from our sponsor. 211 00:11:35,240 --> 00:11:40,320 Speaker 1: Here's where it gets crazy. So we talked about smart devices. 212 00:11:40,360 --> 00:11:43,160 Speaker 1: They're popular. Everybody loves them. They're the hottest thing since 213 00:11:44,080 --> 00:11:45,559 Speaker 1: fresh baked sliced bread. Whatever. 214 00:11:45,640 --> 00:11:47,240 Speaker 2: Yeah, they do all the stuff we need, they do. 215 00:11:47,240 --> 00:11:51,080 Speaker 1: All the stuff we want too, right? But with smart 216 00:11:51,080 --> 00:11:56,360 Speaker 1: devices comes the concept of surveillance.
And we've talked about 217 00:11:56,360 --> 00:11:58,760 Speaker 1: this a little bit before in our previous episodes on 218 00:11:58,800 --> 00:12:01,520 Speaker 1: big data, Big Data, whatever your preference may be. 219 00:12:01,760 --> 00:12:04,120 Speaker 2: Oh yeah, and if you're worried about, you know, your 220 00:12:04,120 --> 00:12:06,280 Speaker 2: smart device tracking you or anything, and you still 221 00:12:06,320 --> 00:12:08,960 Speaker 2: have one of these Amazon Echoes or maybe a Google 222 00:12:09,000 --> 00:12:11,400 Speaker 2: personal assistant like a Home or something plugged in and 223 00:12:11,440 --> 00:12:14,880 Speaker 2: turned on, you can stop worrying. They've already got you. 224 00:12:15,080 --> 00:12:18,920 Speaker 2: It's over. It's just too late. Yeah, we're having a 225 00:12:18,960 --> 00:12:21,679 Speaker 2: little bit of fun. There's... that's not fully true. You 226 00:12:21,720 --> 00:12:25,199 Speaker 2: think that's fun, Matt. But there is some sand to 227 00:12:25,240 --> 00:12:27,280 Speaker 2: this idea, and we're gonna get into it a little 228 00:12:27,280 --> 00:12:27,800 Speaker 2: bit later. 229 00:12:27,920 --> 00:12:32,720 Speaker 3: Remember back when PDAs were a thing? Sure, personal digital assistants. Yeah, 230 00:12:32,800 --> 00:12:36,400 Speaker 3: now we have robot overlords that are, like, doing our bidding. 231 00:12:36,400 --> 00:12:38,720 Speaker 3: But are they really? Are we not really just doing 232 00:12:38,800 --> 00:12:39,320 Speaker 3: their bidding? 233 00:12:39,400 --> 00:12:39,920 Speaker 2: Oh, God. 234 00:12:40,240 --> 00:12:42,840 Speaker 1: Right, it's like the old... oh, man. There's so many 235 00:12:43,040 --> 00:12:45,080 Speaker 1: weird ways to go with this. But we should talk 236 00:12:45,120 --> 00:12:47,880 Speaker 1: about the nuts and bolts too, right? Because we know 237 00:12:48,160 --> 00:12:51,280 Speaker 1: our smart devices have to keep track of the user. 238 00:12:51,520 --> 00:12:53,800 Speaker 1: You have GPS, you have the Waze app or something 239 00:12:53,880 --> 00:12:56,400 Speaker 1: if you're driving, you have Lyft, you have Uber, what have you. 240 00:12:56,400 --> 00:12:59,120 Speaker 3: You know, all things that make the stuff function and 241 00:12:59,160 --> 00:13:00,360 Speaker 3: make it convenient for you. 242 00:13:00,640 --> 00:13:03,360 Speaker 1: And then there are also a lot of apps that say, hey, 243 00:13:03,520 --> 00:13:07,000 Speaker 1: we want permission to access your microphone or your location, 244 00:13:07,160 --> 00:13:10,080 Speaker 1: and you're like, wow, Candy Crush, this is getting serious. 245 00:13:10,360 --> 00:13:14,959 Speaker 1: You know, I'm picking that as an example... that's not one? Okay, great. Well, 246 00:13:15,000 --> 00:13:18,000 Speaker 1: at least we have Candy Crush to remain as a 247 00:13:18,040 --> 00:13:21,880 Speaker 1: sacrosanct example of good programming. But when our smart devices 248 00:13:21,880 --> 00:13:24,719 Speaker 1: are keeping track of us, the kind of surveillance that 249 00:13:24,760 --> 00:13:29,240 Speaker 1: they have is, as far as we can tell, squarely aimed at 250 00:13:29,280 --> 00:13:32,560 Speaker 1: tracking our preferences. "Let me figure out what you like," 251 00:13:32,920 --> 00:13:36,880 Speaker 1: says your mobile device, "such that I can give you 252 00:13:37,040 --> 00:13:39,880 Speaker 1: better offers, make it easier for you to say yes 253 00:13:39,920 --> 00:13:42,240 Speaker 1: to things in the future."
And that's why, if you 254 00:13:42,320 --> 00:13:45,160 Speaker 1: were on our Facebook group, Here's Where It Gets Crazy, 255 00:13:45,440 --> 00:13:50,560 Speaker 1: you'll see... you'll see all the strange, insidious examples, ranging 256 00:13:50,600 --> 00:13:56,319 Speaker 1: from hilarious to disquieting, about how, just how, these algorithms 257 00:13:56,360 --> 00:13:58,720 Speaker 1: can hone in. I think, Noel, you posted a meme 258 00:13:58,800 --> 00:14:02,040 Speaker 1: recently that was Facebook related. 259 00:14:02,160 --> 00:14:02,640 Speaker 2: Yeah, I did. 260 00:14:02,679 --> 00:14:04,920 Speaker 3: It was one of these great Simpsons memes where it 261 00:14:04,960 --> 00:14:07,079 Speaker 3: was just an image of, like, Bart Simpson in bed 262 00:14:07,320 --> 00:14:11,240 Speaker 3: and Homer leaning in really creepily, like eyeball to eyeball, 263 00:14:11,559 --> 00:14:15,760 Speaker 3: and it was... I think Homer was labeled as Facebook 264 00:14:15,840 --> 00:14:20,160 Speaker 3: ads and Bart was labeled as things I said out 265 00:14:20,200 --> 00:14:22,720 Speaker 3: loud but never actually Google searched, or whatever. 266 00:14:22,720 --> 00:14:25,680 Speaker 1: Or some people even say, "I was just thinking about this." 267 00:14:26,240 --> 00:14:29,000 Speaker 3: It definitely started a conversation of people giving examples of 268 00:14:29,080 --> 00:14:31,320 Speaker 3: these things. And I've experienced it too. We have a 269 00:14:31,360 --> 00:14:33,840 Speaker 3: lot of advertisers that, like, vet stuff through us, and 270 00:14:34,080 --> 00:14:36,560 Speaker 3: sometimes I feel like I just say it, or like 271 00:14:36,600 --> 00:14:38,360 Speaker 3: I'm talking to you guys about it, and I've never 272 00:14:38,400 --> 00:14:40,600 Speaker 3: even, like, read copy or seen an email or gone 273 00:14:40,640 --> 00:14:43,240 Speaker 3: to the site. Next thing, you know, Facebook's serving me 274 00:14:43,360 --> 00:14:46,320 Speaker 3: up, you know, Tushy or whatever. That's a bad example. But 275 00:14:46,400 --> 00:14:47,800 Speaker 3: like, you know, you know what I'm talking about. You've seen it, 276 00:14:47,840 --> 00:14:48,040 Speaker 3: haven't you? 277 00:14:48,120 --> 00:14:50,640 Speaker 2: Sure. No, there's a reason for that. So, you know, 278 00:14:50,720 --> 00:14:54,320 Speaker 2: when we get into why... what are the motives behind all 279 00:14:54,360 --> 00:14:56,160 Speaker 2: of this surveillance? We kind of talked a little 280 00:14:56,160 --> 00:15:00,760 Speaker 2: bit about tracking our preferences and everything, but honestly, like, what 281 00:15:00,800 --> 00:15:02,520 Speaker 2: are you going to do with all of that? If 282 00:15:02,600 --> 00:15:05,000 Speaker 2: you, if you start to really think about and understand 283 00:15:05,080 --> 00:15:07,800 Speaker 2: what the economic model is behind all this stuff, you 284 00:15:07,880 --> 00:15:11,600 Speaker 2: realize that it's because we, you, me, each of us, 285 00:15:11,720 --> 00:15:15,040 Speaker 2: we are the batteries of the economic system, its products. 286 00:15:15,120 --> 00:15:18,880 Speaker 2: It's literally The Matrix. We are living in The Matrix, everybody. 287 00:15:18,920 --> 00:15:21,280 Speaker 2: We are inside our pods. Our pods consist of your 288 00:15:21,280 --> 00:15:23,920 Speaker 2: smartphone and your smart TV and all the things you 289 00:15:23,960 --> 00:15:26,920 Speaker 2: interact with, your laptop. That is us.
We are the 290 00:15:26,960 --> 00:15:29,800 Speaker 2: byproducts of a lifestyle obsession, a la Fight Club. 291 00:15:29,920 --> 00:15:31,800 Speaker 3: No, and here's the thing too. We mentioned... I mentioned 292 00:15:31,800 --> 00:15:34,680 Speaker 3: this off air. I got that Amazon TV because it 293 00:15:34,760 --> 00:15:37,400 Speaker 3: had a lot of features, it had really high resolution, 294 00:15:37,600 --> 00:15:41,000 Speaker 3: and it was dirt cheap, and TV prices way down, 295 00:15:41,160 --> 00:15:42,720 Speaker 3: and as we saw at the beginning of the show, 296 00:15:42,840 --> 00:15:45,640 Speaker 3: TV sales way up. And I think you can't ignore 297 00:15:46,040 --> 00:15:48,360 Speaker 3: that there's an exchange going on there, with, like, we're 298 00:15:48,400 --> 00:15:50,960 Speaker 3: giving up this part of ourselves in exchange for cheaper 299 00:15:51,320 --> 00:15:53,800 Speaker 3: and better, more efficient technology. 300 00:15:54,000 --> 00:15:57,520 Speaker 1: Well, one must ask, at a certain point, one must 301 00:15:57,560 --> 00:16:01,800 Speaker 1: ask where the income for the company is actually 302 00:16:02,480 --> 00:16:06,120 Speaker 1: arriving from. We've mentioned before one of my favorite examples, 303 00:16:06,120 --> 00:16:08,480 Speaker 1: and I won't go into it now because longtime listeners 304 00:16:08,520 --> 00:16:12,200 Speaker 1: have already heard this for a time. Target, the corporation, 305 00:16:12,320 --> 00:16:15,000 Speaker 1: the retail store, was not making most of its money 306 00:16:15,200 --> 00:16:20,760 Speaker 1: off of selling people baby toys and trousers. Do people 307 00:16:20,760 --> 00:16:24,640 Speaker 1: still say trou... trousers? Okay, sure, knickerbockers, whatever. They were 308 00:16:24,720 --> 00:16:27,960 Speaker 1: making the bulk of their income, a huge proportion of 309 00:16:27,960 --> 00:16:33,479 Speaker 1: it, from selling their security system infrastructure to other companies, 310 00:16:33,760 --> 00:16:36,120 Speaker 1: kind of like the way McDonald's makes most of its 311 00:16:36,160 --> 00:16:39,400 Speaker 1: money through real estate. So it's okay to sell a 312 00:16:39,440 --> 00:16:43,760 Speaker 1: television at a phenomenal loss, right, when you know you're 313 00:16:43,760 --> 00:16:46,840 Speaker 1: going to recoup that money and then some on something else. 314 00:16:46,840 --> 00:16:49,080 Speaker 1: And I think that's what's happening with the televisions. Would 315 00:16:49,080 --> 00:16:49,440 Speaker 1: you agree? 316 00:16:49,680 --> 00:16:51,000 Speaker 3: I mean, it sure seems like that to me. 317 00:16:51,600 --> 00:16:55,720 Speaker 1: So we know, we know that no matter who you are, 318 00:16:56,040 --> 00:16:58,720 Speaker 1: no matter where you are in this wide world, or 319 00:16:58,840 --> 00:17:04,000 Speaker 1: just orbiting around it, you have something that really wants 320 00:17:04,080 --> 00:17:06,040 Speaker 1: to be your friend, wants to be your best friend, 321 00:17:06,880 --> 00:17:10,399 Speaker 1: your teacher, your mother, your secret lover, to quote Homer Simpson. 322 00:17:10,800 --> 00:17:13,840 Speaker 1: And this thing that can't wait to be your best 323 00:17:13,880 --> 00:17:17,680 Speaker 1: friend is called the advertising industry. It's had its eye 324 00:17:17,680 --> 00:17:20,000 Speaker 1: on you for a while. It already knows a lot 325 00:17:20,040 --> 00:17:21,080 Speaker 1: about you, right, Matt?
326 00:17:21,119 --> 00:17:23,240 Speaker 2: Oh, yeah, it knows a great deal about you. But 327 00:17:23,320 --> 00:17:25,439 Speaker 2: it wants to go deeper. It wants to go, uh, 328 00:17:25,480 --> 00:17:28,240 Speaker 2: therapist style on you. It wants to know what you love, 329 00:17:28,320 --> 00:17:30,600 Speaker 2: what you hate, who do you trust? And also, 330 00:17:30,600 --> 00:17:32,719 Speaker 2: how much liquid cash can you get your hands on 331 00:17:32,760 --> 00:17:35,159 Speaker 2: in the short term? Now, okay, again, we're joking a 332 00:17:35,160 --> 00:17:37,399 Speaker 2: little bit, but you get the point. For real, they 333 00:17:37,440 --> 00:17:40,000 Speaker 2: want to know how much you can spend and what 334 00:17:40,040 --> 00:17:42,360 Speaker 2: you would want to spend it on if you absolutely could, 335 00:17:42,480 --> 00:17:45,359 Speaker 2: right now. If somebody just popped something in front of 336 00:17:45,400 --> 00:17:48,000 Speaker 2: your face right now, what is the number one thing 337 00:17:48,119 --> 00:17:50,160 Speaker 2: you would buy? Because we'll find it and we will 338 00:17:50,160 --> 00:17:51,479 Speaker 2: show it to you and you will buy it. 339 00:17:51,560 --> 00:17:54,240 Speaker 3: Yeah, and companies just gobble up all of this data, 340 00:17:54,440 --> 00:17:59,359 Speaker 3: because the level of technology that we're at right now 341 00:17:59,520 --> 00:18:02,560 Speaker 3: isn't quite as sophisticated as they would like. We're getting there, 342 00:18:02,640 --> 00:18:05,880 Speaker 3: they're certainly pushing it every day, but right now it's 343 00:18:05,920 --> 00:18:07,800 Speaker 3: just kind of like a throwing-everything-at-the-wall 344 00:18:07,840 --> 00:18:09,040 Speaker 3: and-seeing-what-sticks approach. 345 00:18:09,160 --> 00:18:09,320 Speaker 2: You know. 346 00:18:09,680 --> 00:18:13,080 Speaker 1: Artificial intelligence, it turns out, in this regard at least, 347 00:18:13,320 --> 00:18:16,240 Speaker 1: is not that intelligent, and it needs a ton of help. 348 00:18:16,760 --> 00:18:20,680 Speaker 2: So it's efficient as all hell, but it is not. 349 00:18:20,880 --> 00:18:24,000 Speaker 2: It cannot make the connections a lot of times unless 350 00:18:24,040 --> 00:18:25,720 Speaker 2: it is helped out by a human user. 351 00:18:25,880 --> 00:18:28,200 Speaker 1: Still a black box, though. Great example of how it works: 352 00:18:28,280 --> 00:18:31,280 Speaker 3: I often get served ads for things that I've already bought. 353 00:18:31,440 --> 00:18:33,840 Speaker 1: That's... yeah, I was gonna say too. That's like... it's... 354 00:18:33,840 --> 00:18:37,399 Speaker 1: we're at the stage where, in the Terminator franchise, the 355 00:18:37,440 --> 00:18:43,120 Speaker 1: original cyborgs were easily discernible from organic humans, right? Because 356 00:18:43,240 --> 00:18:47,080 Speaker 1: it's a bit ridiculous. There's not any human advertiser who 357 00:18:47,080 --> 00:18:50,840 Speaker 1: would say, well, this person just bought a toilet, so 358 00:18:50,880 --> 00:18:53,160 Speaker 1: you know what, they need five more toilets. 359 00:18:53,720 --> 00:18:54,000 Speaker 2: You know. 360 00:18:54,160 --> 00:18:57,480 Speaker 1: That's like, like, what's gonna happen? Are you gonna, are 361 00:18:57,480 --> 00:18:59,560 Speaker 1: you gonna... you just bought a toilet.
You'll see an ad 362 00:18:59,600 --> 00:19:01,520 Speaker 1: for it, and then you'll go, no... I don't know, 363 00:19:01,600 --> 00:19:05,800 Speaker 1: maybe I'll just treat myself. It's, it's bizarre. 364 00:19:05,800 --> 00:19:08,000 Speaker 2: Well, maybe you're a contractor. Maybe it thinks we're all 365 00:19:08,080 --> 00:19:10,640 Speaker 2: contractors and we're all building out bathrooms. Fair points. 366 00:19:10,960 --> 00:19:14,560 Speaker 1: Yeah, and we know that, we know that companies have 367 00:19:14,640 --> 00:19:17,280 Speaker 1: a lot they want to do with this data, even 368 00:19:17,320 --> 00:19:20,639 Speaker 1: if they can't entirely get the rubber to meet the 369 00:19:20,720 --> 00:19:23,960 Speaker 1: road in practice. And we'll, we'll expand that picture in 370 00:19:24,000 --> 00:19:27,359 Speaker 1: a frightening way a little bit later, but for now, 371 00:19:27,920 --> 00:19:31,480 Speaker 1: let's think of it this way. Companies give all the 372 00:19:31,560 --> 00:19:34,480 Speaker 1: data they can get their hands on, even the stuff 373 00:19:34,480 --> 00:19:37,080 Speaker 1: you would think is unimportant... they give all of that 374 00:19:37,320 --> 00:19:40,280 Speaker 1: equal weighting when it comes to picking it up. There's 375 00:19:40,320 --> 00:19:43,360 Speaker 1: nothing that gets ignored if it's able to be monitored 376 00:19:43,359 --> 00:19:46,480 Speaker 1: and captured. And that's because AI programs, as we said, 377 00:19:46,600 --> 00:19:49,240 Speaker 1: just aren't that intelligent yet. They are efficient, Matt, but 378 00:19:49,240 --> 00:19:53,119 Speaker 1: they're not that intelligent. Like, think about home surveillance systems. 379 00:19:53,160 --> 00:19:56,520 Speaker 1: You have a... you know, Noel, you and I probably 380 00:19:56,520 --> 00:19:59,439 Speaker 1: have Amazon Echoes, yes, yeah, and you have a Google 381 00:19:59,520 --> 00:20:00,480 Speaker 1: Home or something like that. 382 00:20:00,680 --> 00:20:01,040 Speaker 2: Correct. 383 00:20:01,119 --> 00:20:06,160 Speaker 1: Okay, so these home surveillance systems, which is the correct 384 00:20:06,280 --> 00:20:10,119 Speaker 1: term for them... these home surveillance systems have these, uh, 385 00:20:10,280 --> 00:20:14,400 Speaker 1: these assistants, these programs that will guess at what they 386 00:20:14,560 --> 00:20:18,680 Speaker 1: think you said, but they still frequently miss the mark. 387 00:20:18,840 --> 00:20:20,080 Speaker 1: You know what I mean, right? But they... 388 00:20:19,960 --> 00:20:23,399 Speaker 3: also aren't listening, I think, in theory, unless 389 00:20:23,400 --> 00:20:25,479 Speaker 3: you say that wake word, right, whatever that might be. 390 00:20:26,480 --> 00:20:29,800 Speaker 2: No, they are always listening. Oh, hey... because they 391 00:20:29,800 --> 00:20:30,960 Speaker 2: have to hear the wake word. 392 00:20:31,119 --> 00:20:33,399 Speaker 1: I love doing this. Let's prank someone who's listening to 393 00:20:33,400 --> 00:20:37,440 Speaker 1: one of those in their house right now. Alexa, no... play... 394 00:20:37,480 --> 00:20:42,639 Speaker 1: Alexa, play... don't, please, don't... the remix or the original. 395 00:20:42,880 --> 00:20:45,480 Speaker 1: We're kidding, we're kidding. I hope we're all still friends 396 00:20:45,520 --> 00:20:49,359 Speaker 1: with our, with our various devices. But, but we say 397 00:20:49,359 --> 00:20:52,400 Speaker 1: this to point out that these things are far from perfect.
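A quick technical aside on that wake-word gating: the commonly described design is that audio streams through a small on-device buffer and nothing is transmitted until a local detector fires. Here is a minimal Python sketch of that idea, where read_audio_frame, detects_wake_word, and send_to_cloud are hypothetical stand-ins rather than any vendor's real API:

from collections import deque

BUFFER_FRAMES = 50    # short rolling pre-roll, kept on the device only
REQUEST_FRAMES = 100  # rough length of the clip captured per request

def listen_forever(read_audio_frame, detects_wake_word, send_to_cloud):
    # The mic is always "on": frames stream through a small ring buffer,
    # and old audio simply falls off the end without being stored.
    ring = deque(maxlen=BUFFER_FRAMES)
    while True:
        frame = read_audio_frame()
        ring.append(frame)
        # Only a local, offline check runs on the buffered audio.
        if detects_wake_word(ring):
            # Only now does audio leave the device: capture the short
            # request window after the wake word and ship it off.
            request = [read_audio_frame() for _ in range(REQUEST_FRAMES)]
            send_to_cloud(request)

That split, always listening locally but only recording and uploading the short window after the trigger, is exactly the distinction Matt and Noel are drawing here.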
398 00:20:52,720 --> 00:20:55,560 Speaker 1: And there are a lot of people employed by these companies, 399 00:20:55,760 --> 00:20:59,040 Speaker 1: human listeners, right, just like many of us listening today, 400 00:20:59,320 --> 00:21:04,560 Speaker 1: who are tasked with going through these things and 401 00:21:04,960 --> 00:21:08,120 Speaker 1: listening to the recording of what someone said, and then seeing 402 00:21:08,280 --> 00:21:11,600 Speaker 1: what the assistant thought they said, and then reconciling the 403 00:21:11,680 --> 00:21:15,000 Speaker 1: two to build a better mousetrap for your personal information. 404 00:21:15,480 --> 00:21:19,360 Speaker 2: Yes, and that is where we get into the Amazon 405 00:21:19,440 --> 00:21:21,399 Speaker 2: Echo story out of Bloomberg that we were going to 406 00:21:21,440 --> 00:21:24,640 Speaker 2: talk about, and that is the fact that there are 407 00:21:25,480 --> 00:21:30,879 Speaker 2: thousands of Amazon employees and contractors who, like Ben said, 408 00:21:30,960 --> 00:21:35,800 Speaker 2: are tasked with literally listening to what the microphone recorded 409 00:21:35,800 --> 00:21:39,040 Speaker 2: in your living room. So... so you were asking, Noel, 410 00:21:39,040 --> 00:21:42,080 Speaker 2: about, is it always listening? Yes, the microphone is 411 00:21:42,119 --> 00:21:44,359 Speaker 2: always turned on. As long as you've got your Alexa 412 00:21:44,400 --> 00:21:46,680 Speaker 2: plugged in or your Google Home plugged in, that mic 413 00:21:46,800 --> 00:21:50,480 Speaker 2: is on and it is listening. It doesn't record anything 414 00:21:50,720 --> 00:21:53,800 Speaker 2: until you say Alexa or computer or Echo or whatever. 415 00:21:53,880 --> 00:21:56,920 Speaker 3: The key, right, is the wake word? Yeah, sure. But... which 416 00:21:56,920 --> 00:21:58,639 Speaker 3: is a creepy phrase in and of itself, if you 417 00:21:58,720 --> 00:21:59,040 Speaker 3: ask me. 418 00:21:59,119 --> 00:22:02,119 Speaker 2: But that is literally an open mic sitting in your house. 419 00:22:02,240 --> 00:22:05,920 Speaker 2: It's a hot mic, and that is where it gets 420 00:22:05,920 --> 00:22:08,600 Speaker 2: really creepy. But it also gets really creepy like... okay, 421 00:22:08,880 --> 00:22:11,640 Speaker 2: on the surface, it makes complete sense. It's what Ben 422 00:22:11,760 --> 00:22:14,080 Speaker 2: was saying. It's quality assurance, right? It's trying to make 423 00:22:14,080 --> 00:22:17,320 Speaker 2: that AI better. It's an educational thing for the system. 424 00:22:17,600 --> 00:22:20,040 Speaker 2: But below the surface, like, if you really break it 425 00:22:20,119 --> 00:22:22,840 Speaker 2: down and you take away some of the words that 426 00:22:22,880 --> 00:22:25,520 Speaker 2: are in there that make it feel like a fun 427 00:22:25,560 --> 00:22:29,000 Speaker 2: and exciting new thing: there is this mic in your room. 428 00:22:29,080 --> 00:22:33,240 Speaker 2: It's recording information, things that you're saying in your private home, 429 00:22:33,760 --> 00:22:36,200 Speaker 2: and it's sending it to some person that you've never met. 430 00:22:36,440 --> 00:22:39,400 Speaker 2: And then this stranger is going to transcribe exactly what 431 00:22:39,440 --> 00:22:41,560 Speaker 2: you said in your living room.
Then it's going to 432 00:22:41,640 --> 00:22:43,639 Speaker 2: feed it back in that system, so that when that 433 00:22:43,680 --> 00:22:47,200 Speaker 2: mic hears you talk again, it knows exactly what you said. 434 00:22:47,240 --> 00:22:51,359 Speaker 3: But again, it's only after you've said the wake word. It's 435 00:22:51,400 --> 00:22:54,720 Speaker 3: not... they're not transcribing all your conversations in your living room. 436 00:22:55,040 --> 00:22:58,000 Speaker 3: It's all, in theory, stuff that you are attempting to 437 00:22:58,040 --> 00:23:01,160 Speaker 3: communicate to the device. Otherwise it wouldn't be any 438 00:23:01,240 --> 00:23:03,080 Speaker 3: use to them; it wouldn't help them improve the algorithm 439 00:23:03,119 --> 00:23:03,400 Speaker 3: at all. 440 00:23:03,560 --> 00:23:06,640 Speaker 2: But my point here is that that is how it functions 441 00:23:07,080 --> 00:23:10,199 Speaker 2: according to the way the creators who made this device 442 00:23:10,280 --> 00:23:14,359 Speaker 2: want it to function right now. Yes, that is the 443 00:23:14,480 --> 00:23:17,480 Speaker 2: forward-facing thing. And I'm not saying Amazon is doing 444 00:23:17,520 --> 00:23:21,040 Speaker 2: anything illegal or, you know, scary like that, but there 445 00:23:21,200 --> 00:23:24,840 Speaker 2: is, there is an easy route there to exploit that 446 00:23:24,960 --> 00:23:26,400 Speaker 2: microphone that's in your living room. 447 00:23:26,440 --> 00:23:30,000 Speaker 3: That's... sure. So you're saying, maybe if someone, a bad 448 00:23:30,040 --> 00:23:32,119 Speaker 3: actor, let's say, got a hold of this... or do 449 00:23:32,119 --> 00:23:34,000 Speaker 3: you think Amazon could potentially be the bad 450 00:23:33,840 --> 00:23:39,040 Speaker 1: actor? Amazon's partners. Let's, let's say... let's foreshadow it that way: 451 00:23:39,119 --> 00:23:43,160 Speaker 1: Amazon's, Amazon's buddies, the folks in bed with it. But, Matt, 452 00:23:43,200 --> 00:23:45,280 Speaker 1: what's interesting to me about this is that you are 453 00:23:45,320 --> 00:23:50,280 Speaker 1: talking in terms of an above-the-surface level. Yes, 454 00:23:50,520 --> 00:23:53,440 Speaker 1: my spider sense tells me you've got a below-the-surface 455 00:23:53,480 --> 00:23:58,600 Speaker 2: take. No, well, the below-the-surface take is just, in 456 00:23:58,640 --> 00:24:01,720 Speaker 2: my mind, the reality of the situation. We've talked 457 00:24:01,720 --> 00:24:03,480 Speaker 2: about it before on here. We kind of hit it 458 00:24:03,480 --> 00:24:05,840 Speaker 2: a couple times in this episode already: just that we 459 00:24:05,880 --> 00:24:09,359 Speaker 2: are literally bugging ourselves. And you know, and you think, 460 00:24:09,600 --> 00:24:13,280 Speaker 2: when you think about a world in which perhaps the 461 00:24:13,320 --> 00:24:17,359 Speaker 2: powers that be end up ruling... let's just say this 462 00:24:17,560 --> 00:24:20,280 Speaker 2: United States that we live in. Let's say that some 463 00:24:20,440 --> 00:24:22,800 Speaker 2: group comes along and takes over, and now it is 464 00:24:22,880 --> 00:24:25,560 Speaker 2: illegal in this land to do X. And let's say 465 00:24:25,600 --> 00:24:29,320 Speaker 2: your family or your living situation is X.
Now there 466 00:24:29,359 --> 00:24:31,840 Speaker 2: is a microphone in your living room or your kitchen 467 00:24:31,880 --> 00:24:34,479 Speaker 2: or wherever it is, and if you're just having a 468 00:24:34,560 --> 00:24:37,600 Speaker 2: regular conversation about what your life is and what you 469 00:24:37,640 --> 00:24:40,000 Speaker 2: are doing, but it is illegal in this land, and 470 00:24:40,000 --> 00:24:43,520 Speaker 2: there's a mic in there, right, there's potential that 471 00:24:44,240 --> 00:24:48,080 Speaker 2: you could be abused in some way or persecuted. 472 00:24:48,160 --> 00:24:51,359 Speaker 3: That all reminds me of the telescreens in Nineteen Eighty-Four, 473 00:24:51,920 --> 00:24:54,359 Speaker 3: which were, like, on one level, seen as 474 00:24:54,440 --> 00:24:56,879 Speaker 3: like a luxury and as like a really cool technological 475 00:24:56,920 --> 00:24:59,760 Speaker 3: gadget where you could watch all of this, whatever entertainment 476 00:25:00,080 --> 00:25:02,080 Speaker 3: you so wished. But it was a two-way thing. It 477 00:25:02,119 --> 00:25:06,199 Speaker 3: was monitoring you. But there's a certain acceptance of it, 478 00:25:06,359 --> 00:25:08,920 Speaker 3: you know, like it's not secret monitoring. Everyone knows they're 479 00:25:08,960 --> 00:25:11,440 Speaker 3: being monitored. They just know to stay in line and 480 00:25:11,440 --> 00:25:14,760 Speaker 3: not fall outside of the party, you know, doctrine or whatever. 481 00:25:15,000 --> 00:25:17,040 Speaker 3: And we've kind of found ourselves in a very similar 482 00:25:17,080 --> 00:25:20,920 Speaker 3: situation where we're, like, complicit in our own surveillance. 483 00:25:20,560 --> 00:25:24,240 Speaker 1: Orwell was nothing if not prescient in that regard. 484 00:25:24,560 --> 00:25:28,320 Speaker 1: There's a... there's... here's a real-life example, or something 485 00:25:28,359 --> 00:25:30,960 Speaker 1: that could play out plausibly. And this is heavy stuff. 486 00:25:31,240 --> 00:25:35,479 Speaker 1: So imagine that you live in a country that is 487 00:25:36,640 --> 00:25:41,320 Speaker 1: a least economically developed country, and that country 488 00:25:41,440 --> 00:25:45,280 Speaker 1: has an authoritarian government and they have strict religious laws 489 00:25:45,320 --> 00:25:49,159 Speaker 1: of one sort or another. Let's say that for a 490 00:25:49,280 --> 00:25:52,280 Speaker 1: time there was a different regime, and you were maybe 491 00:25:52,320 --> 00:25:55,400 Speaker 1: in a same-sex relationship, and you and your partner 492 00:25:55,840 --> 00:25:59,280 Speaker 1: lived, lived your normal, everyday life. Right? You just happen 493 00:25:59,320 --> 00:26:01,440 Speaker 1: to have your device because you like to hear music 494 00:26:01,440 --> 00:26:04,040 Speaker 1: when you cook. Who doesn't like that? But then the 495 00:26:04,080 --> 00:26:10,080 Speaker 1: regime changes, and now, again, same-sex relationships are forbidden 496 00:26:10,200 --> 00:26:13,560 Speaker 1: or haram or whatever. And now that stuff that you 497 00:26:13,800 --> 00:26:18,800 Speaker 1: said, that got hoovered up into the cloud, now it 498 00:26:18,840 --> 00:26:21,959 Speaker 1: makes you complicit in what that government sees as a crime.
499 00:26:22,080 --> 00:26:26,000 Speaker 1: And that means that, according to that government, the stuff 500 00:26:26,000 --> 00:26:28,879 Speaker 1: that you did, which was perfectly fine... your relationship was 501 00:26:28,960 --> 00:26:33,320 Speaker 1: perfectly fine until someone retroactively decided it wasn't. And now, 502 00:26:34,000 --> 00:26:39,159 Speaker 1: because you wanted to hear the remix to Ignition, now, 503 00:26:39,200 --> 00:26:42,160 Speaker 1: just because of the conversation that occurred around that time, 504 00:26:42,480 --> 00:26:46,040 Speaker 1: now you are in hot water and there's not a 505 00:26:46,119 --> 00:26:50,040 Speaker 1: recourse to help you. That's a terrifying possibility. Matt, you 506 00:26:50,080 --> 00:26:54,280 Speaker 1: were telling us off air that at least some companies 507 00:26:54,520 --> 00:27:00,000 Speaker 1: like Amazon attempt to quell those fears by publicly stating 508 00:27:00,160 --> 00:27:03,800 Speaker 1: there are hard constraints on how long an Echo can 509 00:27:03,840 --> 00:27:04,360 Speaker 1: record something. 510 00:27:04,359 --> 00:27:08,200 Speaker 2: Oh yeah, absolutely. And again, in Amazon's defense, I am 511 00:27:08,240 --> 00:27:12,680 Speaker 2: being completely conspiratorial in this, and it's just one of 512 00:27:12,760 --> 00:27:14,639 Speaker 2: those weird foresight things that both Ben and I were 513 00:27:14,640 --> 00:27:18,040 Speaker 2: talking about there. But Amazon has stated that only a 514 00:27:18,080 --> 00:27:22,560 Speaker 2: fraction of one percent of interactions with their devices actually 515 00:27:23,080 --> 00:27:25,840 Speaker 2: gets transcribed in this way by a human, by a 516 00:27:25,920 --> 00:27:28,200 Speaker 2: human where it gets sent off to be transcribed. Since 517 00:27:28,240 --> 00:27:31,239 Speaker 2: the beginning of twenty nineteen, and 518 00:27:31,280 --> 00:27:33,480 Speaker 2: this is, by the way, from August of twenty nineteen, 519 00:27:33,520 --> 00:27:38,440 Speaker 2: from The Ambient... so eight months of transcribing had 520 00:27:38,480 --> 00:27:41,240 Speaker 2: gone through, and only point two percent of all requests 521 00:27:41,240 --> 00:27:45,000 Speaker 2: to Alexa had actually been transcribed. So that's very... a 522 00:27:45,160 --> 00:27:48,200 Speaker 2: very, very small number of conversations that actually get listened 523 00:27:48,240 --> 00:27:49,119 Speaker 2: to and transcribed. 524 00:27:49,280 --> 00:27:52,399 Speaker 1: Interesting how they put it in percentage, though, because putting 525 00:27:52,440 --> 00:27:56,000 Speaker 1: it in percentage can make something seem smaller than 526 00:27:56,000 --> 00:27:57,040 Speaker 1: it is in reality. 527 00:27:57,200 --> 00:28:02,560 Speaker 3: Right, it's a massive sample size, right? That small percentage 528 00:28:02,880 --> 00:28:04,080 Speaker 3: is actually a massive 529 00:28:03,840 --> 00:28:07,359 Speaker 1: number, right? Right. It just sounds a little more reasonable. 530 00:28:07,520 --> 00:28:07,760 Speaker 3: Right. 531 00:28:07,880 --> 00:28:12,680 Speaker 2: Well, here's the great thing. When you say, hey, whatever, 532 00:28:13,080 --> 00:28:15,720 Speaker 2: Alexa's gonna do it, you're gonna do it. Spread it
But when you do that, generally you ask 534 00:28:18,359 --> 00:28:21,359 Speaker 2: a very short request or a very short question or 535 00:28:21,400 --> 00:28:22,600 Speaker 2: something to that effect, and then. 536 00:28:22,520 --> 00:28:24,520 Speaker 3: It does the thing and then like that window of 537 00:28:25,040 --> 00:28:26,400 Speaker 3: monitoring is over kind of. 538 00:28:26,400 --> 00:28:29,240 Speaker 2: Right, generally, yeah, it lasts for about two seconds, that's 539 00:28:29,280 --> 00:28:33,000 Speaker 2: the average. So when somebody is transcribing, that's literally all 540 00:28:33,320 --> 00:28:33,960 Speaker 2: all it says. 541 00:28:34,119 --> 00:28:35,800 Speaker 3: You can they make good money doing that? 542 00:28:35,920 --> 00:28:38,160 Speaker 1: Or is it like they know it's Amazon, They're not 543 00:28:38,200 --> 00:28:39,520 Speaker 1: getting paid very well at all? 544 00:28:39,640 --> 00:28:41,480 Speaker 3: Is it sort of like we as a company, we 545 00:28:42,120 --> 00:28:44,479 Speaker 3: do a lot of transcribing interviews and stuff. You think 546 00:28:44,480 --> 00:28:46,360 Speaker 3: it's very similar to that. You think even outsource it. 547 00:28:46,400 --> 00:28:46,840 Speaker 3: They couldn't. 548 00:28:46,880 --> 00:28:47,680 Speaker 2: It is contractors. 549 00:28:47,760 --> 00:28:48,480 Speaker 3: It is contractors. 550 00:28:48,560 --> 00:28:50,600 Speaker 2: Some of its contractors, some of its employees got it. 551 00:28:50,760 --> 00:28:55,920 Speaker 1: And this practice has no currently has no real legal 552 00:28:56,040 --> 00:29:01,160 Speaker 1: constraints because, as we know, technology always outpaces legislation tale 553 00:29:01,160 --> 00:29:04,800 Speaker 1: as old as time. However, I get the feeling that 554 00:29:04,920 --> 00:29:08,000 Speaker 1: a lot of us were sort of aware that something's 555 00:29:08,040 --> 00:29:10,360 Speaker 1: off with these home assistants, or that there is some 556 00:29:10,480 --> 00:29:13,640 Speaker 1: kind of transaction at play. If it's not terrible, if 557 00:29:13,680 --> 00:29:16,560 Speaker 1: it's something we're okay with, we knew there was still something. 558 00:29:16,840 --> 00:29:19,160 Speaker 1: And you can hear, you know, when things go wrong 559 00:29:19,240 --> 00:29:21,840 Speaker 1: in nine one one calls and all these other spooky 560 00:29:21,880 --> 00:29:26,280 Speaker 1: stories about things going south with Google or Amazon. But 561 00:29:26,520 --> 00:29:30,320 Speaker 1: what about the other devices. We have some news for 562 00:29:30,360 --> 00:29:34,000 Speaker 1: you about smart TVs. Look around your room or wherever 563 00:29:34,040 --> 00:29:36,520 Speaker 1: you happen to find yourself, is there a TV in there? 564 00:29:36,800 --> 00:29:39,440 Speaker 1: Things are about to get very interesting for you. After 565 00:29:39,480 --> 00:29:45,360 Speaker 1: a word from our sponsors. 566 00:29:45,760 --> 00:29:49,200 Speaker 2: All right, we're back. Let's jump in to something we 567 00:29:49,360 --> 00:29:52,400 Speaker 2: learned about thanks to New York Times in a twenty 568 00:29:52,560 --> 00:29:57,440 Speaker 2: eighteen article about filing. New York Times, they are not failing. 569 00:29:57,480 --> 00:30:00,600 Speaker 1: I don't believe, I mean not after this bump. I 570 00:30:00,640 --> 00:30:04,400 Speaker 1: mentioned this article in the gang Stalking episode. So you're welcome, 571 00:30:04,480 --> 00:30:04,880 Speaker 1: New York. 
572 00:30:05,080 --> 00:30:08,600 Speaker 2: Yes you did, Ben. Ben did bring this up, and 573 00:30:08,680 --> 00:30:10,320 Speaker 2: we decided we were going to look into it, and 574 00:30:10,520 --> 00:30:13,760 Speaker 2: we did, and now we can't look away, forever. So 575 00:30:14,000 --> 00:30:18,400 Speaker 2: it's a thing called Samba TV. That sounds fun, Samba TV. It 576 00:30:18,360 --> 00:30:20,320 Speaker 3: sounds really fun. You know what else sounds fun? I'm gonna 577 00:30:20,320 --> 00:30:22,640 Speaker 3: skip down just, just a tad. How does this sound 578 00:30:22,680 --> 00:30:22,880 Speaker 3: to you, 579 00:30:22,920 --> 00:30:24,200 Speaker 1: guys? Hey! 580 00:30:24,280 --> 00:30:26,400 Speaker 3: How about: you want to interact with your favorite shows, 581 00:30:26,560 --> 00:30:30,520 Speaker 3: get recommendations based on the content you love, connect your 582 00:30:30,560 --> 00:30:35,600 Speaker 3: devices for exclusive content and special offers. How about: Samba 583 00:30:35,600 --> 00:30:39,360 Speaker 3: Interactive TV lets you engage with your TV in a 584 00:30:39,400 --> 00:30:42,200 Speaker 3: whole new way. That sounds great. Sounds 585 00:30:42,280 --> 00:30:42,800 Speaker 2: damn good. 586 00:30:42,800 --> 00:30:43,360 Speaker 3: I'm into that. 587 00:30:43,880 --> 00:30:45,320 Speaker 2: So what is Samba TV? 588 00:30:45,640 --> 00:30:45,800 Speaker 3: Oh? 589 00:30:45,840 --> 00:30:47,040 Speaker 1: Hey man, what's Samba TV? 590 00:30:47,840 --> 00:30:51,560 Speaker 2: Okay, I'll tell you. It is a software, a piece 591 00:30:51,600 --> 00:30:55,160 Speaker 2: of software, that is present in a lot of television models, 592 00:30:55,160 --> 00:30:58,560 Speaker 2: some models from nearly a dozen smart TV brands. And 593 00:30:58,600 --> 00:31:02,640 Speaker 2: again, this is as of late twenty eighteen. That has changed. 594 00:31:03,240 --> 00:31:06,960 Speaker 2: There are more included now, but it is Sony, Sharp, Philips, 595 00:31:07,560 --> 00:31:09,280 Speaker 2: a lot of that, all the hits, all the big 596 00:31:09,320 --> 00:31:13,120 Speaker 2: ones. This software in particular identifies what is 597 00:31:13,160 --> 00:31:17,880 Speaker 2: being watched on the monitor, the television, by literally analyzing 598 00:31:17,920 --> 00:31:21,320 Speaker 2: the pixels displayed and then comparing that data to a 599 00:31:21,360 --> 00:31:23,520 Speaker 2: set of known media that exists out there. 600 00:31:23,560 --> 00:31:25,920 Speaker 3: It's similar to the way audio things or even YouTube 601 00:31:25,920 --> 00:31:28,120 Speaker 3: things are flagged for copyright violations. 602 00:31:28,360 --> 00:31:31,040 Speaker 2: Yes, but in this case it is the end user, 603 00:31:31,560 --> 00:31:35,640 Speaker 2: the actual piece of hardware, that is being monitored. 604 00:31:35,720 --> 00:31:39,240 Speaker 1: Right. And this was always coming. Nielsen ratings... like, the 605 00:31:39,320 --> 00:31:41,520 Speaker 1: Nielsen institution wanted this. 606 00:31:41,920 --> 00:31:45,000 Speaker 2: Yes, there needed to be a way to find out 607 00:31:45,000 --> 00:31:49,360 Speaker 2: who was watching what, when. And in particular, when you're 608 00:31:49,360 --> 00:31:53,320 Speaker 2: saying about who, it means everybody who is around, not 609 00:31:53,480 --> 00:31:58,200 Speaker 2: just that this household is watching something.
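A brief technical aside on that pixel-comparison idea, often called automatic content recognition: one generic approach is to downscale a frame, reduce it to a coarse brightness fingerprint, and look that fingerprint up against fingerprints of known media. Here is a minimal Python sketch of the technique; the 8x8 average hash below is a common textbook illustration, not a claim about Samba TV's actual algorithm:

def average_hash(frame):
    """frame: 8x8 list of grayscale pixel values (0-255), already downscaled."""
    pixels = [p for row in frame for p in row]
    mean = sum(pixels) / len(pixels)
    # One bit per pixel, brighter than average or not: a 64-bit fingerprint.
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def identify(frame, catalog):
    """catalog: dict mapping known fingerprints to titles of known media."""
    fingerprint = average_hash(frame)
    # Exact lookup kept simple here; real systems tolerate a few differing bits.
    return catalog.get(fingerprint, "unknown content")

Real deployments match against enormous fingerprint catalogs, but the shape of the trick is the same: the TV only has to ship a tiny fingerprint upstream, not the video itself.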
But here's the 610 00:31:58,280 --> 00:32:02,280 Speaker 2: idea: your viewing history is then in part used to suggest, 611 00:32:02,520 --> 00:32:05,520 Speaker 2: as Noel was saying, in the flowery language there that's 612 00:32:05,560 --> 00:32:07,840 Speaker 2: actually present in the PR from Samba TV. 613 00:32:07,920 --> 00:32:09,200 Speaker 1: Oh yeah, they wrote that copy. 614 00:32:09,080 --> 00:32:10,720 Speaker 3: Well, it's not only the PR, it's the opt 615 00:32:10,800 --> 00:32:11,440 Speaker 3: in message. 616 00:32:11,520 --> 00:32:15,200 Speaker 2: Yes, yes, it's used to suggest the next content that 617 00:32:15,400 --> 00:32:19,880 Speaker 2: Samba TV believes you will yourself enjoy. But that is 618 00:32:19,920 --> 00:32:23,600 Speaker 2: not all that Samba TV does. It also identifies all 619 00:32:23,640 --> 00:32:26,440 Speaker 2: the other devices that are connected to the same network 620 00:32:26,720 --> 00:32:29,200 Speaker 2: through which it is accessing the internet. 621 00:32:29,400 --> 00:32:32,360 Speaker 1: So your friend comes over, they have a phone with 622 00:32:32,480 --> 00:32:35,400 Speaker 1: Wi-Fi. Now they're in the loop as well. 623 00:32:35,520 --> 00:32:38,160 Speaker 2: Yes, if you are Netflix and chilling, or whatever the 624 00:32:38,240 --> 00:32:41,000 Speaker 2: kids call it these days, at somebody else's house or 625 00:32:41,040 --> 00:32:44,360 Speaker 2: apartment and Samba TV is there, it knows that you're 626 00:32:44,400 --> 00:32:46,920 Speaker 2: there, because it can identify your device and the MAC 627 00:32:46,960 --> 00:32:47,960 Speaker 2: address and all those things. 628 00:32:48,000 --> 00:32:50,960 Speaker 3: And here's the thing. This company claims that it is 629 00:32:51,040 --> 00:32:54,760 Speaker 3: adhering very closely to privacy guidelines set forth by the 630 00:32:54,960 --> 00:32:58,400 Speaker 3: Federal Trade Commission, that it does not directly sell any 631 00:32:58,440 --> 00:33:01,440 Speaker 3: of this data. Instead, advertisers can pay the company 632 00:33:01,720 --> 00:33:05,120 Speaker 3: to kind of guide the hand of the ads and 633 00:33:05,200 --> 00:33:08,840 Speaker 3: the placement, which makes sense. Doesn't sound too insidious, right? 634 00:33:09,200 --> 00:33:13,320 Speaker 2: Well, it's directing ads to the other devices that are 635 00:33:13,400 --> 00:33:17,560 Speaker 2: present, who they believe are watching the television program, right. 636 00:33:17,520 --> 00:33:20,720 Speaker 1: Right, right. And so your opt in stuff happens at 637 00:33:20,720 --> 00:33:23,920 Speaker 1: the television, right at the TV, but it doesn't happen 638 00:33:23,960 --> 00:33:27,480 Speaker 1: at your smartphone necessarily if you walk into someone else's house. 639 00:33:28,560 --> 00:33:31,040 Speaker 3: And so, how do they get around the legality of 640 00:33:31,280 --> 00:33:34,600 Speaker 3: other people? Like, are they automatically part of your opt 641 00:33:34,640 --> 00:33:38,560 Speaker 3: in when they're there? 642 00:33:37,040 --> 00:33:39,720 Speaker 1: That's what I'm saying, there's not informed consent. Interesting. And 643 00:33:39,760 --> 00:33:42,800 Speaker 1: also, think about this: this technology is amazing.
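As a concrete aside on the device-enumeration piece: any box on a home network can see, at minimum, the IP and MAC addresses of neighbors it has recently exchanged traffic with. A minimal sketch, assuming a Linux system, is to read the kernel's ARP cache; nothing here is specific to Samba TV, it just shows how little privilege that visibility requires.

```python
# Minimal sketch (Linux-only): list IP/MAC pairs of devices on the same
# LAN by reading the kernel's ARP cache. Entries appear for hosts that
# have exchanged traffic with this machine recently.

def arp_neighbors(path="/proc/net/arp"):
    neighbors = []
    with open(path) as f:
        next(f)  # skip the header row
        for line in f:
            fields = line.split()
            ip, mac = fields[0], fields[3]  # columns: IP ... HW address ...
            if mac != "00:00:00:00:00:00":  # skip incomplete entries
                neighbors.append((ip, mac))
    return neighbors

if __name__ == "__main__":
    for ip, mac in arp_neighbors():
        print(f"{ip}\t{mac}")
```

A MAC address is a stable per-device identifier, which is exactly why "it knows your friend's phone is there" is more than a figure of speech.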
If our 644 00:33:42,920 --> 00:33:46,920 Speaker 1: species was less of a garbage fire, we could use 645 00:33:46,960 --> 00:33:50,160 Speaker 1: this to do wonderful things for, you know, say, someone's 646 00:33:50,200 --> 00:33:53,080 Speaker 1: mental health, right, and someone's like, okay, I see you've 647 00:33:53,120 --> 00:33:56,080 Speaker 1: watched Faces of Death four nine times in a row. 648 00:33:56,280 --> 00:33:58,880 Speaker 1: I'd like to recommend The Great British Baking Show. Do 649 00:33:58,920 --> 00:33:59,560 Speaker 1: you know what I mean? 650 00:34:00,560 --> 00:34:03,040 Speaker 3: You could do a better job with your content consumption. No, 651 00:34:03,160 --> 00:34:07,120 Speaker 3: it's totally true. And when you, let's say, you plug 652 00:34:07,160 --> 00:34:11,600 Speaker 3: in your new smart TV and it has Samba on it, 653 00:34:11,600 --> 00:34:15,000 Speaker 3: it will present you with that very flowery language, that's 654 00:34:15,040 --> 00:34:18,520 Speaker 3: the opt in message. There is a giant terms of 655 00:34:18,560 --> 00:34:22,000 Speaker 3: service and privacy policy, you know, page that you can 656 00:34:22,239 --> 00:34:25,440 Speaker 3: peruse if you wish. I believe it's sixty five hundred 657 00:34:25,440 --> 00:34:27,960 Speaker 3: words for the terms of service and four thousand words 658 00:34:27,960 --> 00:34:30,160 Speaker 3: for the privacy policy. But why would you even bother 659 00:34:30,280 --> 00:34:33,000 Speaker 3: doing that when you can interact with your favorite shows, 660 00:34:33,160 --> 00:34:36,200 Speaker 3: get recommendations based on the content that you love? 661 00:34:36,600 --> 00:34:38,840 Speaker 1: That terms of service is a real page turner. 662 00:34:38,880 --> 00:34:41,399 Speaker 3: Oh yeah, yeah. But why would you want to do 663 00:34:41,480 --> 00:34:44,399 Speaker 3: that when this seems so innocuous and you just want 664 00:34:44,400 --> 00:34:46,080 Speaker 3: to start playing your Fallout four? 665 00:34:46,320 --> 00:34:49,399 Speaker 2: Yeah, well, you see, that's the real insidious thing 666 00:34:49,520 --> 00:34:53,080 Speaker 2: here. Let's put yourself in the position of: 667 00:34:53,200 --> 00:34:55,720 Speaker 2: you've spent, let's say, the last couple of months saving 668 00:34:55,800 --> 00:34:57,840 Speaker 2: up money because you really, you know, you need to 669 00:34:57,840 --> 00:35:00,839 Speaker 2: get this new TV. You're really excited about it. You 670 00:35:00,920 --> 00:35:04,719 Speaker 2: finally get it, right, and you're installing it, your, you know, 671 00:35:04,880 --> 00:35:06,960 Speaker 2: your hands are all sweaty because, you know, Fallout four, 672 00:35:07,080 --> 00:35:09,040 Speaker 2: the next playthrough is about to happen. You're like, oh 673 00:35:09,040 --> 00:35:11,680 Speaker 2: my god, this is, I'm so excited. You plug this 674 00:35:11,760 --> 00:35:15,000 Speaker 2: thing in, you start going through the initialization process, the 675 00:35:15,080 --> 00:35:18,560 Speaker 2: Samba thing pops up, and, you know, you 676 00:35:18,640 --> 00:35:20,839 Speaker 2: literally have to decide if you're going to spend an 677 00:35:20,880 --> 00:35:24,239 Speaker 2: hour parsing through all of that legalese or if 678 00:35:24,280 --> 00:35:26,440 Speaker 2: you're gonna get to whatever it is you wanted 679 00:35:26,440 --> 00:35:31,120 Speaker 2: to get to, and most people just click enable and 680 00:35:31,160 --> 00:35:33,440 Speaker 2: move forward.
It has nothing to do with the flowery 681 00:35:33,520 --> 00:35:36,640 Speaker 2: language or anything like that. It says enable. Okay, yes, 682 00:35:36,760 --> 00:35:39,000 Speaker 2: this gets me to the next thing, just click enable. 683 00:35:39,239 --> 00:35:41,279 Speaker 2: And that is exactly what most people do. 684 00:35:41,840 --> 00:35:46,120 Speaker 1: Right, on the order of an estimated ninety percent. A 685 00:35:46,200 --> 00:35:50,000 Speaker 1: vast majority of people do click enable. And once this 686 00:35:50,040 --> 00:35:53,200 Speaker 1: stuff is up and running, it's Katie bar the 687 00:35:53,280 --> 00:35:56,360 Speaker 1: door, as they used to say in days of yore. Samba 688 00:35:56,400 --> 00:36:00,880 Speaker 1: sees everything that is displayed on the monitor, regardless of 689 00:36:00,920 --> 00:36:04,400 Speaker 1: what you're watching or playing or how you're displaying it. 690 00:36:04,400 --> 00:36:08,279 Speaker 1: It doesn't matter if you're watching TV. It doesn't matter 691 00:36:08,320 --> 00:36:10,600 Speaker 1: if you're watching a film. It doesn't matter if you're 692 00:36:10,640 --> 00:36:12,760 Speaker 1: broadcasting a home video. 693 00:36:12,920 --> 00:36:15,759 Speaker 2: Right, yeah, it could be literally anything. You know, if 694 00:36:15,800 --> 00:36:18,360 Speaker 2: you are broadcasting a home video of something that you 695 00:36:18,400 --> 00:36:22,520 Speaker 2: wouldn't want anyone else to see, Samba TV is analyzing it. 696 00:36:22,520 --> 00:36:25,560 Speaker 2: It isn't necessarily matching up with any known media, sure, 697 00:36:25,600 --> 00:36:27,920 Speaker 2: but if you broadcast the same kind of home videos, 698 00:36:28,000 --> 00:36:31,239 Speaker 2: let's say of your kids, maybe a romantic video you 699 00:36:31,280 --> 00:36:36,120 Speaker 2: made with your partner, I mean, honestly, who knows. That's 700 00:36:36,160 --> 00:36:38,919 Speaker 2: again taking it a little bit further than the known technology 701 00:36:39,000 --> 00:36:41,400 Speaker 2: or the known reasons for using it. But it could 702 00:36:41,440 --> 00:36:46,000 Speaker 2: be used in the future by someone to figure out 703 00:36:46,960 --> 00:36:49,120 Speaker 2: very personal, intimate things about you. But it's sort of like 704 00:36:49,120 --> 00:36:51,000 Speaker 3: when we read about the NSA and the way the 705 00:36:51,080 --> 00:36:53,840 Speaker 3: NSA was monitoring people's phone calls. They weren't recording the 706 00:36:53,880 --> 00:36:57,400 Speaker 3: actual audio. They were just capturing the metadata, so they 707 00:36:57,440 --> 00:36:59,360 Speaker 3: knew how long a call lasted, or, like, you know, 708 00:36:59,360 --> 00:37:03,160 Speaker 3: who called who, this web of interconnectedness or whatever. It's similar with this. 709 00:37:03,280 --> 00:37:07,719 Speaker 3: It's not like they're recording actually what you're streaming. They're 710 00:37:07,760 --> 00:37:10,600 Speaker 3: just capturing the data of what it is, how long 711 00:37:10,600 --> 00:37:12,840 Speaker 3: you watched it for, et cetera, of the pixels, of 712 00:37:12,880 --> 00:37:13,440 Speaker 3: the pixels. 713 00:37:13,800 --> 00:37:19,799 Speaker 1: Right. So this, okay, this is true.
Even if we 714 00:37:19,960 --> 00:37:24,040 Speaker 1: want to be as skeptical, or, I should say, as 715 00:37:24,120 --> 00:37:27,680 Speaker 1: credulous as we can, and if we take those pieces 716 00:37:27,719 --> 00:37:31,600 Speaker 1: of PR copy at their word, this still 717 00:37:31,719 --> 00:37:37,080 Speaker 1: has a ton of hilarious, cartoonish vulnerabilities. You can learn 718 00:37:37,120 --> 00:37:40,000 Speaker 1: too much about people, and there's no way for the 719 00:37:40,120 --> 00:37:43,040 Speaker 1: end user to stop it, other than trying to opt out. 720 00:37:43,040 --> 00:37:45,000 Speaker 1: But opting out doesn't delete all the stuff it 721 00:37:45,040 --> 00:37:49,080 Speaker 1: has already learned about you. Are we being paranoid? Perhaps. 722 00:37:49,520 --> 00:37:57,480 Speaker 1: Or perhaps we should introduce you to Alphonso. Oh God, 723 00:37:57,600 --> 00:37:59,040 Speaker 1: I love that, Alphonso. 724 00:37:59,280 --> 00:38:02,440 Speaker 2: Okay. So to break this down thus far, we've got 725 00:38:02,520 --> 00:38:08,759 Speaker 2: our personal assistants that are always on, no matter 726 00:38:08,760 --> 00:38:10,879 Speaker 2: if we're saying the keywords or not. They have their 727 00:38:10,920 --> 00:38:13,920 Speaker 2: microphone on and they are listening. They aren't necessarily recording 728 00:38:13,960 --> 00:38:16,040 Speaker 2: all the time, but they are. Now you have your 729 00:38:16,040 --> 00:38:20,200 Speaker 2: smart TV over there that is literally watching what you're watching too, 730 00:38:20,360 --> 00:38:23,760 Speaker 2: and it is making informed decisions about what you watch 731 00:38:23,840 --> 00:38:27,360 Speaker 2: and sending ads to all the devices in your house. Now, 732 00:38:27,600 --> 00:38:30,520 Speaker 2: let's say you're on one of those devices. Let's say 733 00:38:30,680 --> 00:38:33,239 Speaker 2: it's an Android device. Let's say you went to the 734 00:38:33,280 --> 00:38:37,439 Speaker 2: Google Play, whatever it is, app store thing, and you've 735 00:38:37,440 --> 00:38:40,680 Speaker 2: downloaded some apps and some games. Well, a lot of 736 00:38:40,719 --> 00:38:43,480 Speaker 2: these apps and games, not all of them, but a 737 00:38:43,480 --> 00:38:47,759 Speaker 2: lot of them, have partnered with this thing called Alphonso. 738 00:38:47,960 --> 00:38:58,360 Speaker 2: So this is a really interesting little piece of software 739 00:38:58,400 --> 00:39:00,960 Speaker 2: that's attached to these apps, and what it will do 740 00:39:01,480 --> 00:39:06,160 Speaker 2: is prompt you to enable the use of your microphone. 741 00:39:06,400 --> 00:39:10,160 Speaker 1: Right, and these would be things that do not ostensibly 742 00:39:10,239 --> 00:39:12,839 Speaker 1: need that kind of access. Pool three D, beer pong 743 00:39:12,920 --> 00:39:16,239 Speaker 1: trick shot, real bowling strike ten pin, you know, these 744 00:39:16,320 --> 00:39:20,560 Speaker 1: kind of word salady names, little fun waste of time apps. 745 00:39:20,680 --> 00:39:24,080 Speaker 2: Well, and not just those, some anti spying software. There's 746 00:39:24,200 --> 00:39:25,800 Speaker 2: a ton of apps out 747 00:39:25,680 --> 00:39:30,920 Speaker 1: there. Right, because this Alphonso is app agnostic. Yes. 748 00:39:31,239 --> 00:39:35,520 Speaker 1: So here's what happens. Here's why they want that microphone access.
749 00:39:36,040 --> 00:39:38,520 Speaker 1: Because when you're using this app, or when you grant 750 00:39:38,560 --> 00:39:43,480 Speaker 1: this app microphone access, Alphonso can figure out what you 751 00:39:43,640 --> 00:39:48,320 Speaker 1: happen to watch by identifying audio signals in television ads 752 00:39:48,360 --> 00:39:51,680 Speaker 1: and shows, and even matching that information with the places 753 00:39:51,760 --> 00:39:55,600 Speaker 1: people visit and the movies they see. Really quickly, here 754 00:39:55,680 --> 00:39:58,839 Speaker 1: is how it works. So we're all hanging out, we're 755 00:39:58,960 --> 00:40:03,040 Speaker 1: watching some television show that we're into. Let's go with Lost, 756 00:40:03,160 --> 00:40:06,520 Speaker 1: something with commercials. So when the show switches to commercial, 757 00:40:06,840 --> 00:40:12,040 Speaker 1: there is a pitch, an audio signal, that goes out 758 00:40:12,239 --> 00:40:15,400 Speaker 1: to the room. You cannot hear it, your pets cannot 759 00:40:15,440 --> 00:40:18,640 Speaker 1: hear it, your kids cannot hear it. No one can 760 00:40:18,680 --> 00:40:21,000 Speaker 1: hear it, and no one is supposed to hear it. 761 00:40:21,000 --> 00:40:24,680 Speaker 1: It's only for your phones, and that's what they do. 762 00:40:24,719 --> 00:40:28,360 Speaker 1: They communicate with your phone, and then the phone will 763 00:40:28,719 --> 00:40:36,719 Speaker 1: also let people know via Alphonso. The phone will let 764 00:40:36,760 --> 00:40:40,400 Speaker 1: the users of the app, the real app, the users 765 00:40:40,400 --> 00:40:44,880 Speaker 1: of Alphonso, understand who is in that room, where they 766 00:40:44,920 --> 00:40:48,080 Speaker 1: came from, maybe where they're going, and what they would 767 00:40:48,160 --> 00:40:50,400 Speaker 1: like to buy. That's not what you sign up for 768 00:40:50,520 --> 00:40:52,560 Speaker 1: when you, you know, go to a potluck 769 00:40:52,600 --> 00:40:55,080 Speaker 1: at your friend's house to watch some kind of film, right. 770 00:40:55,680 --> 00:40:57,919 Speaker 1: And how many apps are we talking here? 771 00:40:58,040 --> 00:41:00,000 Speaker 2: Well, okay, so according to the New York Times, there 772 00:41:00,040 --> 00:41:02,240 Speaker 2: were over two hundred and fifty apps on the Google 773 00:41:02,239 --> 00:41:06,399 Speaker 2: Play Store with this feature, right. And if you want, 774 00:41:06,440 --> 00:41:08,359 Speaker 2: if you head over to the Google Play Store and 775 00:41:08,440 --> 00:41:12,960 Speaker 2: you type in, in quotations, Alphonso Automated, that's A L P 776 00:41:13,200 --> 00:41:15,640 Speaker 2: H O N S O A U T O M 777 00:41:15,680 --> 00:41:18,319 Speaker 2: A T E D, you will find all of 778 00:41:18,320 --> 00:41:20,719 Speaker 2: the various apps that have this thing installed. But then 779 00:41:20,760 --> 00:41:22,759 Speaker 2: if you look at an interview with some 780 00:41:22,840 --> 00:41:28,280 Speaker 2: Alphonso people, they said that there are thousands of apps 781 00:41:28,360 --> 00:41:30,640 Speaker 2: that they've partnered with, and they didn't want to disclose 782 00:41:30,680 --> 00:41:33,760 Speaker 2: all of them because they have competitors who are trying 783 00:41:33,760 --> 00:41:37,880 Speaker 2: to basically get in on their territory. Yeah, approach their territory. 784 00:41:37,880 --> 00:41:40,640 Speaker 3: Remember when spyware was a big concern.
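For the curious, here is roughly how an app with microphone access could detect a tone like that. This is a hedged sketch, not Alphonso's proprietary detector: the 18.5 kHz beacon frequency and the detection threshold are assumed values, and the Goertzel algorithm is just one standard way to measure energy at a single frequency.

```python
# Sketch of near-ultrasonic beacon detection with the Goertzel algorithm.
# Illustrative only: beacon frequency, threshold, and block size are
# assumptions, not known parameters of any real ad-tracking SDK.
import math

def goertzel_power(samples, sample_rate, target_hz):
    """Relative power at one frequency bin over a block of audio samples."""
    n = len(samples)
    k = round(n * target_hz / sample_rate)  # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def beacon_present(samples, sample_rate=44100, beacon_hz=18500.0,
                   threshold=1e6):
    # In practice the threshold would be calibrated against background
    # noise and the block length, since power scales with block size.
    return goertzel_power(samples, sample_rate, beacon_hz) > threshold

# Demo with one second of a synthetic 18.5 kHz "beacon" tone:
tone = [math.sin(2 * math.pi * 18500.0 * t / 44100) for t in range(44100)]
print(beacon_present(tone))   # True
print(beacon_present([0.0] * 44100))  # False for silence
```

The reason a single-bin check like this matters: it is cheap enough to run continuously in the background, which is exactly what makes the scheme attractive to an SDK and unsettling to everyone else.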
This is like 785 00:41:40,680 --> 00:41:42,160 Speaker 3: some next level spyware. 786 00:41:42,400 --> 00:41:43,200 Speaker 1: This is spyware. 787 00:41:43,280 --> 00:41:45,320 Speaker 3: Yeah, but it's like different, right? It's like, it literally 788 00:41:45,560 --> 00:41:48,160 Speaker 3: is opt in, right? It's spying on you. 789 00:41:48,600 --> 00:41:50,799 Speaker 2: Yeah, it's crazy, and now we know it's cyclical. So 790 00:41:50,960 --> 00:41:55,200 Speaker 2: it's all of the devices functioning together in this web 791 00:41:55,440 --> 00:41:57,800 Speaker 2: of trying to figure out what you want the most 792 00:41:57,880 --> 00:42:00,960 Speaker 2: and how to display that thing to you most effectively. 793 00:42:01,080 --> 00:42:05,320 Speaker 1: A gestalt effect, I guess. Yeah. This also, this problem becomes 794 00:42:06,160 --> 00:42:12,400 Speaker 1: complicated even further when we realize that private entities, institutions, 795 00:42:12,440 --> 00:42:15,440 Speaker 1: are not the only actors in this sphere. Indeed, they 796 00:42:15,560 --> 00:42:19,279 Speaker 1: may be some of the more innocuous. I love that 797 00:42:19,320 --> 00:42:22,480 Speaker 1: you mentioned spyware, Noel, because the best spyware right 798 00:42:22,560 --> 00:42:25,960 Speaker 1: now is being built not by private industry but by 799 00:42:26,080 --> 00:42:31,959 Speaker 1: state actors. We mentioned Amazon's partners, right, Amazon's partners using 800 00:42:32,000 --> 00:42:37,319 Speaker 1: the data. Amazon's partners are alphabet soup intelligence agencies, or 801 00:42:38,000 --> 00:42:41,200 Speaker 1: strongly thought to be so. 802 00:42:40,880 --> 00:42:43,760 Speaker 2: Allegedly. Especially thanks to that early cash injection. 803 00:42:44,200 --> 00:42:47,759 Speaker 1: Right, exactly so. According to a Washington Post article from 804 00:42:47,800 --> 00:42:51,600 Speaker 1: twenty seventeen, the United States government has already turned theoretical 805 00:42:51,640 --> 00:42:55,440 Speaker 1: exploits and vulnerabilities in this kind of stuff into functioning 806 00:42:55,520 --> 00:43:00,600 Speaker 1: attack tools. One of these goes by the objectively badass 807 00:43:00,680 --> 00:43:05,560 Speaker 1: name Weeping Angel. Weeping Angel is specifically meant to target 808 00:43:05,680 --> 00:43:09,880 Speaker 1: Samsung TVs. This is just a small, microcosmic example, and 809 00:43:09,920 --> 00:43:12,120 Speaker 1: this is at least what it was doing two years ago, 810 00:43:12,640 --> 00:43:18,000 Speaker 1: according to WikiLeaks. After infestation, Weeping Angel places a target 811 00:43:18,040 --> 00:43:21,240 Speaker 1: TV in a fake off mode so that the owner 812 00:43:21,360 --> 00:43:24,480 Speaker 1: believes the TV's off when it's still on. And then 813 00:43:24,680 --> 00:43:27,800 Speaker 1: in this fake off mode, the TV operates as a bug, 814 00:43:28,120 --> 00:43:31,359 Speaker 1: recording conversations in the room and then sending them over 815 00:43:31,400 --> 00:43:35,520 Speaker 1: the cloud to a covert CIA server. This sounds bonkers. 816 00:43:35,560 --> 00:43:37,560 Speaker 1: This sounds bananas. I can't believe it's real. 817 00:43:37,920 --> 00:43:41,799 Speaker 2: Why would anybody ever be paranoid? Right? 818 00:43:42,680 --> 00:43:43,359 Speaker 1: Why would they? 819 00:43:43,480 --> 00:43:44,320 Speaker 2: That's crazy.
820 00:43:45,480 --> 00:43:48,279 Speaker 1: And I hope whoever is listening to this around a 821 00:43:48,320 --> 00:43:52,000 Speaker 1: smart television has unplugged their headphones and is listening on speaker. 822 00:43:53,520 --> 00:43:57,239 Speaker 2: You know, okay, look, I just have to say this: 823 00:43:57,320 --> 00:44:00,400 Speaker 2: everything we've been discussing today, if you are of 824 00:44:00,440 --> 00:44:04,560 Speaker 2: a certain mind, perhaps like myself, quite a lot, quite frequently, 825 00:44:05,440 --> 00:44:08,560 Speaker 2: it could lead you down a dark pathway where it 826 00:44:08,640 --> 00:44:13,160 Speaker 2: feels as though there's surveillance everywhere and you're being targeted 827 00:44:13,560 --> 00:44:16,680 Speaker 2: in some way. We can assure you this is not 828 00:44:16,920 --> 00:44:20,680 Speaker 2: just about you, no matter what you may think or 829 00:44:20,960 --> 00:44:25,239 Speaker 2: no matter what you may believe. Sure it is. It's 830 00:44:25,360 --> 00:44:30,000 Speaker 2: mass, it's everybody, and again, it is not necessarily nefarious. 831 00:44:31,560 --> 00:44:32,240 Speaker 2: But it's real. 832 00:44:33,000 --> 00:44:36,200 Speaker 1: I... that's a matter of perspective. There is a certain 833 00:44:36,920 --> 00:44:40,960 Speaker 1: self-importance or self-aggrandizing that occurs when people are 834 00:44:41,000 --> 00:44:47,239 Speaker 1: suffering from paranoid delusions. Right. But being paranoid about this 835 00:44:47,320 --> 00:44:50,680 Speaker 1: sort of stuff does not make you delusional. It means 836 00:44:50,719 --> 00:44:54,480 Speaker 1: that you have unfortunately turned over the rock and you've 837 00:44:54,520 --> 00:44:57,640 Speaker 1: seen the thing squirming in the darkness beneath. This is 838 00:44:57,840 --> 00:44:58,800 Speaker 1: very real stuff. 839 00:44:59,400 --> 00:45:02,359 Speaker 2: I love that, and I also am terrified by it. 840 00:45:03,320 --> 00:45:06,799 Speaker 3: But you know, it's not all bad. I mean, there 841 00:45:06,840 --> 00:45:11,080 Speaker 3: are ways of kind of at least stemming some of 842 00:45:11,080 --> 00:45:13,920 Speaker 3: this stuff a little bit, right? Yeah. So, How-To 843 00:45:14,040 --> 00:45:16,080 Speaker 3: Geek actually has an easy to follow guide on 844 00:45:16,120 --> 00:45:19,080 Speaker 3: how to stop Google Home from recording you all the time. 845 00:45:19,120 --> 00:45:21,719 Speaker 3: Google Home has a thing where it actually saves your 846 00:45:21,800 --> 00:45:24,200 Speaker 3: voice memos. You can check that out. You have to 847 00:45:24,239 --> 00:45:28,000 Speaker 3: opt in for constant recording, allegedly, while you can, if 848 00:45:28,040 --> 00:45:29,480 Speaker 3: you're an existing user, opt out. 849 00:45:29,920 --> 00:45:32,600 Speaker 2: Yeah, that's the whole thing. They've updated their 850 00:45:32,680 --> 00:45:36,760 Speaker 2: terms of service, basically, Google Home has, right, and actually 851 00:45:37,400 --> 00:45:39,799 Speaker 2: Amazon has done something similar there, where you have more 852 00:45:39,880 --> 00:45:42,719 Speaker 2: choices now. But I think, if 853 00:45:42,719 --> 00:45:45,480 Speaker 2: you're a legacy user, you actually can't get out of 854 00:45:45,480 --> 00:45:50,280 Speaker 2: some of the agreements you already signed into.
Yeah, somebody 855 00:45:50,360 --> 00:45:52,360 Speaker 2: fact check me on that, but I recall reading that 856 00:45:52,440 --> 00:45:56,160 Speaker 2: this morning. Here's the good thing. Remember Samba TV, that we 857 00:45:56,160 --> 00:45:59,000 Speaker 2: were talking about? It felt so creepy. Literally, all you 858 00:45:59,080 --> 00:46:01,520 Speaker 2: have to do is select disable when you get to 859 00:46:01,560 --> 00:46:04,239 Speaker 2: that screen when you're installing your TV. That's all you 860 00:46:04,320 --> 00:46:05,600 Speaker 2: have to do, and you're done. 861 00:46:05,640 --> 00:46:07,440 Speaker 3: Do you really not think it has something to do with, like, why 862 00:46:07,480 --> 00:46:09,239 Speaker 3: do you think people are so prone? Ninety percent is 863 00:46:09,280 --> 00:46:11,920 Speaker 3: such a massive, like, amount. Why are people so prone to 864 00:46:12,040 --> 00:46:13,160 Speaker 3: click enable? 865 00:46:12,920 --> 00:46:16,320 Speaker 1: Because you've got a new toy and you want to 866 00:46:16,360 --> 00:46:19,839 Speaker 1: take full advantage, because it's presented, like I said, as 867 00:46:19,880 --> 00:46:23,319 Speaker 1: this lovely way of, like, making this a better experience 868 00:46:23,400 --> 00:46:24,200 Speaker 1: for you, the user. 869 00:46:24,520 --> 00:46:25,600 Speaker 3: Why wouldn't I want that? 870 00:46:25,719 --> 00:46:25,879 Speaker 1: Well? 871 00:46:25,920 --> 00:46:28,200 Speaker 2: And it's a menu that you have to click through, right? 872 00:46:28,280 --> 00:46:31,240 Speaker 2: So think about it this way. If it's on enable, 873 00:46:31,360 --> 00:46:33,680 Speaker 2: so your cursor is on enable when the screen 874 00:46:33,760 --> 00:46:36,720 Speaker 2: pops up, you'd have to go down to terms of service, 875 00:46:36,800 --> 00:46:39,839 Speaker 2: down to privacy policy, down to learn more, down one 876 00:46:39,880 --> 00:46:44,120 Speaker 2: more to disable. Those clicks, as stupid as that sounds, 877 00:46:44,680 --> 00:46:48,520 Speaker 2: and, you know, as benign as five clicks or four clicks are, 878 00:46:49,200 --> 00:46:53,000 Speaker 2: people will take the easier route and just say, okay, fine. 879 00:46:52,840 --> 00:46:55,520 Speaker 1: Enable, I'm in a hurry. I can't do it, I got 880 00:46:55,640 --> 00:46:57,399 Speaker 1: MF places to be, you know what I mean. 881 00:46:57,640 --> 00:46:58,240 Speaker 2: It's true. 882 00:46:58,480 --> 00:47:02,440 Speaker 1: So it is true, and it's an exploit not 883 00:47:02,520 --> 00:47:07,120 Speaker 1: just of technology, but an exploit of our own hardwired physiology. 884 00:47:07,760 --> 00:47:12,239 Speaker 1: Our brains are built to function this way, right? And 885 00:47:12,600 --> 00:47:16,839 Speaker 1: this leads us to some conclusions in what is very 886 00:47:16,920 --> 00:47:20,600 Speaker 1: much an ongoing event, right? The first conclusion is that 887 00:47:20,920 --> 00:47:23,960 Speaker 1: there are some issues remaining. There's a lack of accountability. 888 00:47:24,280 --> 00:47:28,000 Speaker 1: One of the primary issues in this conversation is the 889 00:47:28,200 --> 00:47:31,640 Speaker 1: utter lack of accountability on the part of private institutions 890 00:47:31,880 --> 00:47:36,040 Speaker 1: as well as government agencies. It is not difficult to 891 00:47:36,120 --> 00:47:42,719 Speaker 1: imagine these companies cooperating with intelligence agencies, further exacerbating the 892 00:47:42,760 --> 00:47:46,960 Speaker 1: legal pitfalls involved.
And again, it's important to point out, 893 00:47:47,400 --> 00:47:49,640 Speaker 1: just as a cheapskate, it's important to point out 894 00:47:49,960 --> 00:47:53,680 Speaker 1: that the people getting their data gathered are not paid 895 00:47:53,680 --> 00:47:56,520 Speaker 1: for that information, quite the opposite. It used to be, 896 00:47:56,920 --> 00:47:58,960 Speaker 1: you know, what's that old adage we always said: if 897 00:47:58,960 --> 00:48:01,360 Speaker 1: you're not paying for it, you are not the customer, 898 00:48:01,480 --> 00:48:06,160 Speaker 1: you're the product. Yes, right. But now the pendulum swings 899 00:48:06,360 --> 00:48:09,200 Speaker 1: a little bit further in the wrong direction, in my opinion, 900 00:48:09,360 --> 00:48:13,840 Speaker 1: because we are paying for these services. We are paying Amazon, 901 00:48:13,920 --> 00:48:18,080 Speaker 1: we're paying Google to spy on us to whatever end, 902 00:48:18,120 --> 00:48:20,839 Speaker 1: and we are not accounting, not only for this, 903 00:48:21,000 --> 00:48:24,520 Speaker 1: we're not accounting for the larger problem, which is that 904 00:48:24,840 --> 00:48:31,200 Speaker 1: insurance companies aggregate this information, your financial institutions aggregate this information, 905 00:48:31,719 --> 00:48:35,920 Speaker 1: and there is nothing that stops them from cooperating together 906 00:48:36,120 --> 00:48:40,680 Speaker 1: to build a footprint of you that's close enough. The idea 907 00:48:40,880 --> 00:48:45,279 Speaker 1: is that this footprint, this digital impression of you, will 908 00:48:45,320 --> 00:48:48,920 Speaker 1: one day have the fidelity such that it can predict 909 00:48:49,040 --> 00:48:50,680 Speaker 1: future actions you will take. 910 00:48:50,800 --> 00:48:53,720 Speaker 3: So you're saying that it could in theory be used 911 00:48:53,840 --> 00:48:55,520 Speaker 3: against you? Yes, very much. 912 00:48:55,600 --> 00:48:58,120 Speaker 2: So we're saying they're going to make android versions of 913 00:48:58,160 --> 00:48:59,440 Speaker 2: you, Noel? I'm cool with that. 914 00:48:59,600 --> 00:49:00,480 Speaker 3: That's what, some help. 915 00:49:00,600 --> 00:49:03,319 Speaker 2: Yeah, but it's not gonna be about you anymore. No, no, 916 00:49:03,560 --> 00:49:05,320 Speaker 2: it's gonna be Amazon. 917 00:49:04,880 --> 00:49:10,000 Speaker 1: And it's fascinating when you think about it, 918 00:49:10,080 --> 00:49:11,759 Speaker 1: because now, I know, 919 00:49:11,719 --> 00:49:14,680 Speaker 3: you're not joking. It's like a shape of 920 00:49:14,719 --> 00:49:17,000 Speaker 3: me that is my data, you know, like, it's out 921 00:49:17,040 --> 00:49:20,959 Speaker 3: there in the matrix. It's a Noel shaped data cluster. 922 00:49:20,840 --> 00:49:23,800 Speaker 1: Or a Matt shaped data cluster, a Seth shaped cluster, 923 00:49:25,000 --> 00:49:26,720 Speaker 1: Seth shaped cluster. 924 00:49:26,840 --> 00:49:27,680 Speaker 2: That sounds like a band. 925 00:49:27,880 --> 00:49:30,000 Speaker 1: Yes, it does, like a tasty treat, it does. I 926 00:49:30,000 --> 00:49:32,440 Speaker 1: feel like it's maybe, it's maybe like a Hostess thing. 927 00:49:33,040 --> 00:49:38,640 Speaker 1: So, brilliant ideas for dessert treats aside, 928 00:49:39,120 --> 00:49:43,000 Speaker 1: we are all in this together. We are looking at 929 00:49:43,040 --> 00:49:45,680 Speaker 1: the end of privacy as we recognize it.
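To put a toy example on the footprint-building Ben describes above: once two data sets share any stable identifier, a MAC address in this sketch, merging them into a single profile is trivial. Every record below is invented for illustration.

```python
# Toy illustration of cross-source aggregation: two unrelated data sets
# that share a stable key (a MAC address here) collapse into one profile.
# All values are fabricated examples.
from collections import defaultdict

acr_viewing_log = [      # the sort of record an ACR vendor might hold
    {"mac": "aa:bb:cc:dd:ee:ff", "watched": "late night poker stream"},
]
ad_sdk_location_log = [  # the sort of record an ad SDK might hold
    {"mac": "aa:bb:cc:dd:ee:ff", "visited": "casino parking lot"},
]

profiles = defaultdict(dict)
for record in acr_viewing_log + ad_sdk_location_log:
    key = record["mac"]
    # Merge every non-key field from this source into the shared profile.
    profiles[key].update({k: v for k, v in record.items() if k != "mac"})

print(dict(profiles))
# {'aa:bb:cc:dd:ee:ff': {'watched': 'late night poker stream',
#                        'visited': 'casino parking lot'}}
```

The episode's worry is exactly this join, performed at scale and sold onward, with inferences (creditworthiness, insurability) drawn from the combined profile rather than from either source alone.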
And that's 930 00:49:45,680 --> 00:49:48,160 Speaker 1: sort of tricky. That sounds more dramatic than it really is, 931 00:49:48,239 --> 00:49:50,880 Speaker 1: because the concept of privacy as we know and enjoy 932 00:49:50,920 --> 00:49:55,480 Speaker 1: it today is relatively recent. Yes, right. And everything we've 933 00:49:55,600 --> 00:49:59,040 Speaker 1: learned indicates that type of privacy we idealize may end 934 00:49:59,120 --> 00:50:03,880 Speaker 1: up becoming a short lived fad to future historians. We're entering, 935 00:50:04,200 --> 00:50:06,400 Speaker 1: you know, Matt, you and I talked about this a 936 00:50:06,440 --> 00:50:10,040 Speaker 1: long time ago, an inequality of privacy. Right? Privacy is 937 00:50:10,080 --> 00:50:14,960 Speaker 1: a new currency. Some of the world's most influential, powerful, 938 00:50:15,000 --> 00:50:19,240 Speaker 1: successful people still have this kind of privacy, right? Yeah. 939 00:50:19,680 --> 00:50:22,040 Speaker 2: A weird example is, just think about how much it 940 00:50:22,120 --> 00:50:27,359 Speaker 2: costs to get a good tint on your windows. I'm 941 00:50:27,400 --> 00:50:30,960 Speaker 2: not kidding. I'm not kidding. If you see someone drive 942 00:50:31,040 --> 00:50:35,919 Speaker 2: by with perfectly tinted, like, the darkest windows you've ever seen, 943 00:50:36,200 --> 00:50:41,359 Speaker 2: that is expensive. Well, and that's literally privacy just in 944 00:50:41,400 --> 00:50:42,160 Speaker 2: your car, anyway. 945 00:50:42,360 --> 00:50:47,879 Speaker 1: I'm sorry, I feel like something's going on. Is your car okay? 946 00:50:47,080 --> 00:50:49,600 Speaker 2: No, no, I'm just saying that that amount of privacy, 947 00:50:49,719 --> 00:50:53,759 Speaker 2: just to be on the road driving, costs money, right? Yeah. 948 00:50:53,800 --> 00:50:56,680 Speaker 2: And if you think about really good shutters on a 949 00:50:56,719 --> 00:51:02,160 Speaker 2: home or something like that, in those little examples, it 950 00:51:03,000 --> 00:51:06,880 Speaker 2: takes quite a bit of means to protect yourself just 951 00:51:06,960 --> 00:51:09,719 Speaker 2: from someone viewing with their eyeballs where you are at 952 00:51:09,719 --> 00:51:12,440 Speaker 2: any time. And then if you apply that to the 953 00:51:12,520 --> 00:51:15,759 Speaker 2: digital space, it gets more and more expensive. 954 00:51:15,880 --> 00:51:19,880 Speaker 1: It's creating something very similar in nuts and bolts and 955 00:51:19,920 --> 00:51:23,640 Speaker 1: the mechanics of it to the infamous Sesame Credit that's 956 00:51:23,640 --> 00:51:26,640 Speaker 1: occurring on the Chinese mainland, precisely. And I'm not being 957 00:51:26,680 --> 00:51:29,080 Speaker 1: alarmist about this, and I don't, you know, I don't 958 00:51:29,080 --> 00:51:33,080 Speaker 1: want people to be any more frightened than is absolutely appropriate. 959 00:51:33,360 --> 00:51:37,200 Speaker 1: You should be a little. We're at the Pandora problem, right? 960 00:51:37,400 --> 00:51:41,239 Speaker 1: Once the Pandora's jar is unscrewed, once the lid is off, 961 00:51:41,280 --> 00:51:44,239 Speaker 1: there is no going back.
There were some rumblings in 962 00:51:44,280 --> 00:51:48,439 Speaker 1: Congress about investigating what is essentially a smart TV spy ring, 963 00:51:48,760 --> 00:51:51,719 Speaker 1: but the advantages of keeping the technology in play for 964 00:51:51,800 --> 00:51:55,799 Speaker 1: now seem to outweigh the problems of consent, and the 965 00:51:55,840 --> 00:51:59,080 Speaker 1: fact that consent is not occurring, and the fact that, yes, 966 00:51:59,440 --> 00:52:03,319 Speaker 1: this could end up combined with your information from 967 00:52:03,360 --> 00:52:06,480 Speaker 1: other places, such that it might affect your ability to 968 00:52:06,480 --> 00:52:09,360 Speaker 1: get a car loan. It might affect where you can live. 969 00:52:10,239 --> 00:52:12,360 Speaker 1: This can get very dirty, very quickly. 970 00:52:12,440 --> 00:52:17,239 Speaker 3: It's a slippery slope, and especially, I mean, what if, 971 00:52:17,239 --> 00:52:19,880 Speaker 3: like, we can't opt out anymore, you know, if 972 00:52:19,880 --> 00:52:23,600 Speaker 3: that goes away? Like, are we really owed that right 973 00:52:23,680 --> 00:52:25,440 Speaker 3: to opt out? Like, it's sort of almost a 974 00:52:25,480 --> 00:52:28,200 Speaker 3: PR move to allow us to opt out. Like, you 975 00:52:28,440 --> 00:52:30,640 Speaker 3: could very easily, as the manufacturer of a product, 976 00:52:30,719 --> 00:52:33,400 Speaker 3: say, well, if you don't want us to have your data, 977 00:52:33,440 --> 00:52:36,160 Speaker 3: don't buy the product. Like, it's sort of almost a courtesy, 978 00:52:36,200 --> 00:52:37,959 Speaker 3: if you think about it, to allow people to opt 979 00:52:37,960 --> 00:52:38,319 Speaker 3: out of this. 980 00:52:38,520 --> 00:52:39,799 Speaker 1: Yeah, but what, it could just. 981 00:52:39,719 --> 00:52:41,840 Speaker 3: Are you owed a fancy television? 982 00:52:41,960 --> 00:52:45,240 Speaker 1: But, no, but are you owed, and are you owed, 983 00:52:45,600 --> 00:52:48,960 Speaker 1: are you required, to participate in some of these systems? 984 00:52:48,960 --> 00:52:50,919 Speaker 1: Insurance, one is required to. 985 00:52:50,840 --> 00:52:52,960 Speaker 3: Do so, absolutely. That's different, I think. But I guess 986 00:52:53,000 --> 00:52:55,120 Speaker 3: what I'm saying is, like, with the television, all these 987 00:52:55,160 --> 00:52:59,040 Speaker 3: are gadgets that, like, you could not necessarily call necessities. 988 00:52:59,480 --> 00:53:04,840 Speaker 2: Well, think about most people with a steady job, 989 00:53:05,600 --> 00:53:09,480 Speaker 2: think about email communication nowadays, or an app that's used 990 00:53:09,560 --> 00:53:13,880 Speaker 2: wherever you work, or, you know, there, you have to 991 00:53:13,960 --> 00:53:17,080 Speaker 2: have some kind of connection like that. You really do, 992 00:53:17,560 --> 00:53:20,239 Speaker 2: and with most of these devices, you're going to run 993 00:53:20,239 --> 00:53:21,120 Speaker 2: into these issues. 994 00:53:22,000 --> 00:53:25,840 Speaker 1: Or add to that the compounding, complicating factor that 995 00:53:25,920 --> 00:53:29,840 Speaker 1: for a huge proportion of people who have mobile phones, 996 00:53:30,040 --> 00:53:34,360 Speaker 1: it's their only way to access not just the Internet, 997 00:53:34,360 --> 00:53:38,200 Speaker 1: but it's their primary tool for any financial dealings. Like, 998 00:53:38,239 --> 00:53:42,200 Speaker 1: people's lives hinge on this thing.
Yes. So I see 999 00:53:42,280 --> 00:53:46,040 Speaker 1: both sides of that. But here's what I think we 1000 00:53:46,080 --> 00:53:48,000 Speaker 1: can end with. We can say it's not just the 1001 00:53:48,120 --> 00:53:52,560 Speaker 1: United States. Way back in twenty seventeen, when WikiLeaks released this, 1002 00:53:53,320 --> 00:53:56,719 Speaker 1: they showed that digital spying is going to continue to grow. 1003 00:53:56,719 --> 00:53:57,680 Speaker 1: It's not going to go away. 1004 00:53:57,880 --> 00:53:59,680 Speaker 2: We're talking about the Weeping Angel thing, right? Right. 1005 00:53:59,600 --> 00:54:02,480 Speaker 1: Weeping Angel in specific, which again is just for Samsung 1006 00:54:02,560 --> 00:54:04,680 Speaker 1: TVs, and it's relatively low tech. You have to put 1007 00:54:04,680 --> 00:54:08,480 Speaker 1: a USB stick in there. But now you don't. Now 1008 00:54:08,560 --> 00:54:11,840 Speaker 1: it's a whole different thing. You know, anybody 1009 00:54:11,880 --> 00:54:14,960 Speaker 1: got sort of irritated when you bought a new cell 1010 00:54:15,000 --> 00:54:17,560 Speaker 1: phone and it had stuff pre installed that you can't remove? 1011 00:54:17,840 --> 00:54:20,799 Speaker 1: Think about this times a thousand. That's what's happening. That's 1012 00:54:20,800 --> 00:54:24,120 Speaker 1: what's going to happen. And it's not just happening in 1013 00:54:24,120 --> 00:54:28,799 Speaker 1: the US. Other advanced nations, China, Russia, Britain, Israel and 1014 00:54:28,840 --> 00:54:33,640 Speaker 1: so on, are creating newer, more robust, powerful tools to 1015 00:54:33,760 --> 00:54:38,320 Speaker 1: do this, and any nation that can gain access 1016 00:54:38,320 --> 00:54:41,359 Speaker 1: to this kind of spying technology is going to do so, 1017 00:54:41,680 --> 00:54:43,760 Speaker 1: and they can do it through a web of private industry. 1018 00:54:43,920 --> 00:54:46,879 Speaker 1: There are no laws. This is the Wild West, and it's 1019 00:54:47,040 --> 00:54:49,759 Speaker 1: very bad. It's very bad for the people who are 1020 00:54:49,760 --> 00:54:51,399 Speaker 1: not at the top of the food chain. 1021 00:54:52,320 --> 00:54:55,719 Speaker 2: And with that, we all threw away our devices. We 1022 00:54:55,920 --> 00:55:00,640 Speaker 2: got out our acoustic guitars or ukuleles and started singing Kumbaya. 1023 00:55:00,840 --> 00:55:04,720 Speaker 1: I started writing my version of Ralph Waldo Emerson's On Nature. 1024 00:55:04,800 --> 00:55:08,719 Speaker 2: Right, that's correct. I got my djembe out and 1025 00:55:08,760 --> 00:55:10,000 Speaker 2: we just started playing. 1026 00:55:09,719 --> 00:55:14,080 Speaker 3: Finger cymbals from the... Yeah. 1027 00:55:13,120 --> 00:55:15,440 Speaker 2: And we just, you know, communed with nature for the 1028 00:55:15,440 --> 00:55:19,480 Speaker 2: rest of our lives and watched the sunset dissipate over 1029 00:55:19,520 --> 00:55:20,000 Speaker 2: the horizon. 1030 00:55:20,040 --> 00:55:21,440 Speaker 3: Had an ashram in Quebec. 1031 00:55:22,440 --> 00:55:24,600 Speaker 2: Yeah, and then we all woke up and went back 1032 00:55:24,600 --> 00:55:24,959 Speaker 2: to work. 1033 00:55:25,040 --> 00:55:27,680 Speaker 3: Yeah, that's true. And hey, listen, I'm just being devil's 1034 00:55:27,719 --> 00:55:29,880 Speaker 3: advocate about, like, are you owed a smart TV? And 1035 00:55:29,920 --> 00:55:32,839 Speaker 3: I agree, it's a great question.
Well, with the phone, though, 1036 00:55:32,880 --> 00:55:36,080 Speaker 3: you're right, the phone is for many an affordable entry 1037 00:55:36,080 --> 00:55:39,200 Speaker 3: point to the Internet, because it doubles as a, you know, 1038 00:55:39,320 --> 00:55:42,840 Speaker 3: a crucial communication device and a way to access the Internet, 1039 00:55:42,840 --> 00:55:45,279 Speaker 3: which we can all agree is a necessity for things 1040 00:55:45,320 --> 00:55:49,200 Speaker 3: like banking, everything, you know. Everything starts on the Internet, 1041 00:55:49,440 --> 00:55:54,400 Speaker 3: the Web, of course. But I would argue that for 1042 00:55:54,480 --> 00:55:56,600 Speaker 3: the things like, you know, an Alexa, do 1043 00:55:56,640 --> 00:56:00,439 Speaker 3: we need an Alexa? Maybe for accessibility. Maybe 1044 00:56:00,440 --> 00:56:03,320 Speaker 3: that's a thing for, like, people with disabilities, and an Alexa 1045 00:56:03,360 --> 00:56:08,000 Speaker 3: could be a very important addition to a home. For others, 1046 00:56:08,040 --> 00:56:09,520 Speaker 3: I think it's more of a luxury and sort of 1047 00:56:09,520 --> 00:56:10,440 Speaker 3: like a neat little gadget. 1048 00:56:10,520 --> 00:56:12,840 Speaker 2: You know, you're absolutely right, and, y'all, nobody needs 1049 00:56:12,960 --> 00:56:17,640 Speaker 2: a smart television monitor, right? But very soon the only 1050 00:56:17,800 --> 00:56:22,360 Speaker 2: available televisions will be smart televisions, at least ones that 1051 00:56:22,400 --> 00:56:23,480 Speaker 2: are easily purchased. 1052 00:56:23,560 --> 00:56:25,160 Speaker 3: I think what I was getting at when Ben talked 1053 00:56:25,160 --> 00:56:27,440 Speaker 3: about the Sesame Credit and the slippery slope of all 1054 00:56:27,480 --> 00:56:30,399 Speaker 3: this is, what if there does come a time where 1055 00:56:30,440 --> 00:56:33,480 Speaker 3: you can't opt out anymore? Just by buying the thing 1056 00:56:33,520 --> 00:56:35,880 Speaker 3: and installing it in your home, you're opting in. The 1057 00:56:35,920 --> 00:56:37,759 Speaker 3: only way to opt out is to not buy it, 1058 00:56:38,160 --> 00:56:39,279 Speaker 3: or to buy something else. 1059 00:56:39,520 --> 00:56:42,840 Speaker 2: There you go, or live in the woods. 1060 00:56:42,920 --> 00:56:47,520 Speaker 1: And that's our classic episode for this evening. We can't 1061 00:56:47,520 --> 00:56:48,359 Speaker 1: wait to hear your thoughts. 1062 00:56:48,440 --> 00:56:49,759 Speaker 3: That's right, let us know what you think. You can 1063 00:56:49,840 --> 00:56:52,359 Speaker 3: reach us at the handle Conspiracy Stuff, where we exist on 1064 00:56:52,400 --> 00:56:56,800 Speaker 3: Facebook, X, and YouTube; on Instagram and TikTok we're Conspiracy 1065 00:56:56,800 --> 00:56:57,319 Speaker 3: Stuff Show. 1066 00:56:57,360 --> 00:56:59,560 Speaker 2: If you want to call us, dial one eight three 1067 00:56:59,719 --> 00:57:04,880 Speaker 2: three STDWYTK. That's our voicemail system. You've got three minutes. 1068 00:57:04,920 --> 00:57:07,120 Speaker 2: Give yourself a cool nickname and let us know if 1069 00:57:07,120 --> 00:57:09,200 Speaker 2: we can use your name and message on the air. 1070 00:57:09,560 --> 00:57:11,120 Speaker 2: If you got more to say than can fit in 1071 00:57:11,120 --> 00:57:13,919 Speaker 2: that voicemail, why not instead send us a good old 1072 00:57:13,960 --> 00:57:14,720 Speaker 2: fashioned email.
1073 00:57:14,960 --> 00:57:17,880 Speaker 1: We are the entities that read every single piece of 1074 00:57:17,880 --> 00:57:22,320 Speaker 1: correspondence we receive. Be aware, yet not afraid. Sometimes the 1075 00:57:22,400 --> 00:57:44,919 Speaker 1: void writes back. Conspiracy at iHeartRadio dot com. 1076 00:57:45,160 --> 00:57:47,200 Speaker 2: Stuff They Don't Want You to Know is a production 1077 00:57:47,320 --> 00:57:51,840 Speaker 2: of iHeartRadio. For more podcasts from iHeartRadio, visit the iHeartRadio app, 1078 00:57:51,920 --> 00:57:55,120 Speaker 2: Apple Podcasts, or wherever you listen to your favorite shows.