1 00:00:00,720 --> 00:00:04,600 Speaker 1: Welcome to Tech Stuff, a production of iHeart Podcasts and Kaleidoscope. 2 00:00:05,000 --> 00:00:07,520 Speaker 1: I'm Oz Woloshyn, and today Cara Price and I will 3 00:00:07,520 --> 00:00:10,959 Speaker 1: bring you the headlines this week, including a bipartisan bill 4 00:00:11,039 --> 00:00:13,800 Speaker 1: that could change the future of the Internet. Then, in 5 00:00:13,840 --> 00:00:16,840 Speaker 1: our Tech Support segment, we'll talk to Geoffrey Fowler of 6 00:00:16,840 --> 00:00:20,160 Speaker 1: the Washington Post about the twenty three and Me bankruptcy 7 00:00:20,560 --> 00:00:23,560 Speaker 1: and what that might mean for your genetic data. All 8 00:00:23,640 --> 00:00:37,400 Speaker 1: of that on The Week in Tech. It's Friday, March twenty eighth. Well, 9 00:00:37,440 --> 00:00:40,080 Speaker 1: I'm very, very excited to finally be able to say it. 10 00:00:40,159 --> 00:00:41,479 Speaker 1: Welcome back, Cara Price. 11 00:00:41,960 --> 00:00:45,120 Speaker 2: Hello, it's good to be back, Oz Woloshyn. 12 00:00:45,560 --> 00:00:48,880 Speaker 1: You're armed with a document dossier. 13 00:00:49,240 --> 00:00:54,920 Speaker 2: I am, I have papers. I created a dossier of 14 00:00:54,960 --> 00:00:56,600 Speaker 2: all the tech stories that I want to bring up. 15 00:00:56,600 --> 00:00:59,880 Speaker 2: And then, of course, you've already reported what happens with me. 16 00:01:00,520 --> 00:01:02,920 Speaker 2: But no, some of them, some of them are true 17 00:01:03,600 --> 00:01:06,520 Speaker 2: to my own weirdness and obsession. One of them is 18 00:01:06,600 --> 00:01:09,959 Speaker 2: Kim Kardashian's photoshoot for Perfect magazine, which came out a 19 00:01:09,959 --> 00:01:10,839 Speaker 2: couple of weeks ago. 20 00:01:11,200 --> 00:01:13,640 Speaker 1: Well, tell me about that. I remember, Kim Kardashian, you 21 00:01:13,640 --> 00:01:16,640 Speaker 1: and I were entertained last November by her sort of 22 00:01:16,680 --> 00:01:19,280 Speaker 1: thirst trap photos with the Tesla robot. But is this 23 00:01:19,319 --> 00:01:20,560 Speaker 1: a development? 24 00:01:20,680 --> 00:01:24,800 Speaker 2: So if you're Kim, you're gonna pose, and if you're Kim, 25 00:01:24,840 --> 00:01:27,080 Speaker 2: you're gonna pose with the hottest stuff. Right, what's the 26 00:01:27,080 --> 00:01:30,160 Speaker 2: hottest stuff? The Cybertruck and a Tesla robot. And 27 00:01:30,280 --> 00:01:35,119 Speaker 2: those poses are not exactly of the DOGE variety. They 28 00:01:35,200 --> 00:01:39,720 Speaker 2: are sultry. But no, you know, there's actually a more 29 00:01:39,760 --> 00:01:44,000 Speaker 2: serious story here. The Associated Press reports that Tesla has 30 00:01:44,040 --> 00:01:48,240 Speaker 2: recalled nearly all Cybertrucks, so it's more than forty 31 00:01:48,360 --> 00:01:51,960 Speaker 2: six thousand, in fact, and the National Highway Traffic Safety 32 00:01:51,960 --> 00:01:55,400 Speaker 2: Administration warned that panels on either side of the truck's 33 00:01:55,440 --> 00:01:59,320 Speaker 2: windshield are in danger of detaching and creating a road 34 00:01:59,360 --> 00:02:01,720 Speaker 2: hazard. It's not funny, but it kind of is 35 00:02:01,720 --> 00:02:03,240 Speaker 2: when you see it. 36 00:02:03,280 --> 00:02:05,640 Speaker 1: It's funny because it's ironic, because it's supposed to look like 37 00:02:05,680 --> 00:02:08,120 Speaker 1: you're living in the future, but it's falling apart.
38 00:02:09,120 --> 00:02:13,520 Speaker 2: It's RoboCop but with like an old Hyundai vibe. And 39 00:02:13,600 --> 00:02:17,320 Speaker 2: so the issue is actually the adhesive. It's vulnerable to 40 00:02:17,520 --> 00:02:20,600 Speaker 2: environmental embrittlement, according to the NHTSA. 41 00:02:20,760 --> 00:02:21,800 Speaker 1: I thought that was your phrase. 42 00:02:24,720 --> 00:02:28,840 Speaker 2: So drivers can get the panel replaced by Tesla for free, actually, 43 00:02:29,040 --> 00:02:32,280 Speaker 2: which, I mean, can you imagine going to do that? 44 00:02:32,360 --> 00:02:35,880 Speaker 1: A fool's errand. Yeah, I mean, Tesla shares are down forty 45 00:02:35,919 --> 00:02:38,240 Speaker 1: two percent for this year, and I can't imagine this 46 00:02:38,280 --> 00:02:41,400 Speaker 1: story will help. On the subject of corporate drama, I 47 00:02:41,480 --> 00:02:44,639 Speaker 1: was drawn to this story this week of tech enabled 48 00:02:44,800 --> 00:02:49,600 Speaker 1: business espionage. Here's the headline from TechCrunch: Rippling sues Deel, 49 00:02:50,120 --> 00:02:54,600 Speaker 1: Deel denies all legal wrongdoing, and Slack is the main witness. 50 00:02:54,960 --> 00:02:58,440 Speaker 2: Can you explain to me how Slack is a witness? Actually, 51 00:02:58,919 --> 00:03:02,560 Speaker 2: Slack is a witness to a lot of our bacchanalia 52 00:03:02,960 --> 00:03:05,120 Speaker 2: for this show, so Slack as a witness actually makes 53 00:03:05,160 --> 00:03:05,520 Speaker 2: sense to me. 54 00:03:05,639 --> 00:03:07,239 Speaker 1: So yes, I mean, this is kind of a story 55 00:03:07,320 --> 00:03:10,880 Speaker 1: about who's watching you on your work software or computer. 56 00:03:11,480 --> 00:03:14,720 Speaker 1: But Rippling and Deel are HR startups that offer payroll 57 00:03:14,720 --> 00:03:18,120 Speaker 1: and other HR resources, and they're rivals. This is the 58 00:03:18,480 --> 00:03:21,000 Speaker 1: Pepsi and Coke of HR software. 59 00:03:21,000 --> 00:03:23,120 Speaker 2: Well, that's the sexiest sentence. 60 00:03:22,760 --> 00:03:27,520 Speaker 1: Rippling recently announced a lawsuit against Deel. In a 61 00:03:27,520 --> 00:03:32,600 Speaker 1: fifty page complaint, it alleges racketeering, unfair competition, misappropriation of 62 00:03:32,600 --> 00:03:35,920 Speaker 1: trade secrets and more. And here's the real intrigue. The 63 00:03:36,000 --> 00:03:40,320 Speaker 1: lawsuit centers around an employee that Rippling alleges acts as 64 00:03:40,360 --> 00:03:41,360 Speaker 1: a spy for Deel. 65 00:03:41,600 --> 00:03:45,920 Speaker 2: The worst thing you could be at Coke is a Pepsi 66 00:03:45,920 --> 00:03:49,560 Speaker 2: spy, and vice versa. But then again, I'm like, who 67 00:03:49,600 --> 00:03:52,120 Speaker 2: cares about trade secrets at Rippling and Deel? But I 68 00:03:52,120 --> 00:03:54,200 Speaker 2: guess it's the stakes. 69 00:03:54,000 --> 00:03:57,280 Speaker 1: Twelve and thirteen billion dollar HR companies. They're almost exactly 70 00:03:57,280 --> 00:03:59,520 Speaker 1: the same size, they have a very similar offering, and 71 00:03:59,680 --> 00:04:02,240 Speaker 1: so, yeah, I think the stakes are high, and 72 00:04:02,560 --> 00:04:05,160 Speaker 1: you know, there's been a kind of public relations war 73 00:04:05,320 --> 00:04:09,080 Speaker 1: going on. In parallel, Rippling created a game on their 74 00:04:09,120 --> 00:04:11,680 Speaker 1: website called the Snake Game. You of course remember Snake.
75 00:04:11,840 --> 00:04:13,839 Speaker 2: I mean, I just downloaded Block Puzzle again, which 76 00:04:13,840 --> 00:04:15,880 Speaker 2: is the closest I can get to Nokia. 77 00:04:15,520 --> 00:04:17,000 Speaker 1: Well, it's not the closest you can get, because you 78 00:04:17,000 --> 00:04:21,120 Speaker 1: can now play Snake on Rippling's website. Here are the instructions: 79 00:04:21,800 --> 00:04:24,159 Speaker 1: Deel often claims to be a one stop solution for 80 00:04:24,200 --> 00:04:26,720 Speaker 1: all your global payroll needs, but their customers pay the 81 00:04:26,800 --> 00:04:29,719 Speaker 1: price for gaps beneath the surface. Play this game to 82 00:04:29,800 --> 00:04:32,440 Speaker 1: find the difference between Deel's claims and the reality of 83 00:04:32,440 --> 00:04:32,800 Speaker 1: their product. 84 00:04:32,960 --> 00:04:35,480 Speaker 2: This is the pettiest. This is pettier than some of 85 00:04:35,520 --> 00:04:37,400 Speaker 2: the stuff that my friends did in college. 86 00:04:38,040 --> 00:04:40,120 Speaker 1: But I'm sure you want to play the game, right? Yeah. 87 00:04:39,960 --> 00:04:42,960 Speaker 2: So it won't be interesting to the listener to listen 88 00:04:43,000 --> 00:04:46,960 Speaker 2: to us play. But just, like, imagine playing Nokia Snake: 89 00:04:47,240 --> 00:04:50,440 Speaker 2: you gobble up all of the alleged Deel falsehoods that 90 00:04:50,480 --> 00:04:54,920 Speaker 2: are at play. But the lawsuit was filed by Rippling. 91 00:04:54,720 --> 00:04:57,120 Speaker 1: That's right. And the reason it fascinates me, as I mentioned, 92 00:04:57,160 --> 00:05:01,040 Speaker 1: is because it's kind of about privacy, especially privacy as 93 00:05:01,040 --> 00:05:03,800 Speaker 1: an employee, right? Like, if you think you're alone on 94 00:05:03,839 --> 00:05:07,479 Speaker 1: your work computer, you're wrong, man. 95 00:05:07,720 --> 00:05:08,840 Speaker 2: Well, especially if you're a spy. 96 00:05:09,160 --> 00:05:12,800 Speaker 1: Absolutely. According to Rippling's own lawyers, the company keeps a 97 00:05:12,800 --> 00:05:15,400 Speaker 1: detailed log of what people do on Slack, like when 98 00:05:15,400 --> 00:05:18,440 Speaker 1: an employee accesses Slack channels, or conducts searches on Slack, 99 00:05:18,520 --> 00:05:21,360 Speaker 1: or opens a document on Slack. All those things are logged. 100 00:05:21,880 --> 00:05:24,600 Speaker 1: And so the lawsuit contends that, according to this logged activity, 101 00:05:24,880 --> 00:05:27,640 Speaker 1: the Rippling employee, who is allegedly a spy for Deel, 102 00:05:28,200 --> 00:05:31,760 Speaker 1: started looking at content associated with the word Deel at 103 00:05:31,800 --> 00:05:36,400 Speaker 1: a much higher rate beginning in November, including perusing sales related 104 00:05:36,400 --> 00:05:40,400 Speaker 1: Slack channels that weren't necessary for his job in payroll operations. 105 00:05:40,680 --> 00:05:44,680 Speaker 2: But when they say looking, what are we looking at? 106 00:05:45,320 --> 00:05:46,800 Speaker 1: Well, I'll give you, I'll give you a clear example 107 00:05:46,839 --> 00:05:48,680 Speaker 1: of this. So this morning, when you slacked me and 108 00:05:48,720 --> 00:05:50,960 Speaker 1: said, how can I find the log in details to 109 00:05:51,040 --> 00:05:55,279 Speaker 1: all of our subscriptions, what I did was I searched 110 00:05:55,440 --> 00:05:57,840 Speaker 1: log in in Slack, and that's how I found it.
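To make the kind of logging at issue in the lawsuit concrete, here is a minimal sketch of how an investigator might tally logged search events from an exported workplace activity log. The JSON shape below is only loosely modeled on Slack's enterprise audit logs; the field names (action, actor, details.query) and the search_performed event type are illustrative assumptions, not the tooling Rippling's lawyers actually used.

```python
import json
from collections import Counter

# Hypothetical export: one JSON object per line, loosely shaped like an
# enterprise audit-log entry. All field names here are assumptions.
SAMPLE = """\
{"action": "search_performed", "actor": {"user": {"email": "employee@rippling.example"}}, "details": {"query": "deel"}}
{"action": "file_downloaded", "actor": {"user": {"email": "coworker@rippling.example"}}, "details": {}}
{"action": "search_performed", "actor": {"user": {"email": "employee@rippling.example"}}, "details": {"query": "deel pricing"}}
"""

def searches_per_user(log_lines, term):
    """Count logged search events whose query mentions `term`, keyed by user."""
    counts = Counter()
    for line in log_lines:
        entry = json.loads(line)
        if entry["action"] != "search_performed":
            continue
        query = entry.get("details", {}).get("query", "")
        if term in query.lower():
            counts[entry["actor"]["user"]["email"]] += 1
    return counts

print(searches_per_user(SAMPLE.splitlines(), "deel"))
# Counter({'employee@rippling.example': 2})
```

The point of the sketch is the one the hosts are making: none of this requires spyware, just the ordinary event log a workplace platform keeps by default.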
111 00:05:57,920 --> 00:06:00,800 Speaker 1: So you used me as a human search function, but 112 00:06:00,839 --> 00:06:02,320 Speaker 1: there actually is a search function. 113 00:06:02,480 --> 00:06:04,440 Speaker 2: They're like, Oz has been trying to log in. 114 00:06:04,480 --> 00:06:07,320 Speaker 1: Quite a bit recently. So that's exactly right. So this 115 00:06:07,400 --> 00:06:11,240 Speaker 1: dude was typing Deel into the Rippling corporate Slack and 116 00:06:11,240 --> 00:06:13,560 Speaker 1: seeing what he could find, allegedly. 117 00:06:13,560 --> 00:06:17,560 Speaker 2: Fascinating. And so Slack was basically, I mean, Slack wasn't 118 00:06:17,560 --> 00:06:20,520 Speaker 2: doing anything. But, like, when people went looking for what 119 00:06:20,560 --> 00:06:22,479 Speaker 2: this guy was doing, Slack was like, here's what this 120 00:06:22,520 --> 00:06:23,120 Speaker 2: guy's doing. 121 00:06:23,200 --> 00:06:29,160 Speaker 1: People didn't just go looking. They actually created a honeytrap. Let me explain. 122 00:06:29,960 --> 00:06:33,640 Speaker 1: So Rippling created a Slack channel and started a rumor 123 00:06:33,680 --> 00:06:37,200 Speaker 1: that it had a bunch of embarrassing information about Deel 124 00:06:37,360 --> 00:06:40,640 Speaker 1: in it, and of course, allegedly, this employee headed straight 125 00:06:40,680 --> 00:06:41,960 Speaker 1: for that Slack channel. 126 00:06:41,960 --> 00:06:44,960 Speaker 2: And the spy employee searched for this channel, and that 127 00:06:45,080 --> 00:06:47,000 Speaker 2: was picked up by the Slack logs. 128 00:06:46,920 --> 00:06:50,159 Speaker 1: That's according to the lawsuit, anyway. It also claims that 129 00:06:50,279 --> 00:06:52,640 Speaker 1: when the alleged spy was asked by court order to 130 00:06:52,640 --> 00:06:55,760 Speaker 1: hand over his phone, he escaped to the bathroom, locked the 131 00:06:55,839 --> 00:06:58,719 Speaker 1: door behind him, and possibly even attempted to flush the 132 00:06:58,760 --> 00:06:59,799 Speaker 1: phone down the toilet. 133 00:07:00,040 --> 00:07:03,240 Speaker 2: As someone who's had anxiety attacks, at first I'm like, well, honey, 134 00:07:03,279 --> 00:07:04,960 Speaker 2: that phone ain't going down the toilet. And then I'm like, 135 00:07:05,000 --> 00:07:07,159 Speaker 2: you know what, I would flush a phone down the toilet, 136 00:07:07,760 --> 00:07:09,600 Speaker 2: especially if it was a Nokia phone. That thing goes 137 00:07:09,680 --> 00:07:10,760 Speaker 2: right down. Not my iPhone. 138 00:07:10,840 --> 00:07:11,040 Speaker 3: I know. 139 00:07:11,080 --> 00:07:13,440 Speaker 1: You have to imagine fishing your wet phone out of 140 00:07:13,480 --> 00:07:15,320 Speaker 1: the toilet after failing to flush it is not fun. 141 00:07:15,360 --> 00:07:18,440 Speaker 2: The way I have fished phones out of a toilet, 142 00:07:18,760 --> 00:07:20,520 Speaker 2: I'd be a millionaire if you paid me. 143 00:07:21,000 --> 00:07:23,760 Speaker 1: Next, I have a story that doesn't exactly debunk the 144 00:07:23,840 --> 00:07:27,200 Speaker 1: claim that crypto is the Wild West. The story is 145 00:07:27,200 --> 00:07:30,560 Speaker 1: about something called a crypto mixer. Can you guess what 146 00:07:30,640 --> 00:07:31,440 Speaker 1: that is, Cara? 147 00:07:31,440 --> 00:07:35,200 Speaker 2: Like something between the world's saddest job fair and the 148 00:07:35,240 --> 00:07:36,000 Speaker 2: saddest prom?
149 00:07:37,560 --> 00:07:40,920 Speaker 1: That's what I thought too, but then I asked Google Gemini. 150 00:07:41,120 --> 00:07:44,920 Speaker 1: And here's how Gemini explains it: A crypto mixer, also known 151 00:07:44,960 --> 00:07:49,320 Speaker 1: as a tumbler, aggregates cryptocurrencies from multiple users, mixes them, 152 00:07:49,320 --> 00:07:52,480 Speaker 1: and redistributes them randomly, making it harder to trace the 153 00:07:52,520 --> 00:07:55,160 Speaker 1: origin and ownership of the funds. So, of course, there's 154 00:07:55,160 --> 00:07:58,440 Speaker 1: a public ledger for cryptocurrency, the blockchain, but after the 155 00:07:58,480 --> 00:08:00,920 Speaker 1: mixing process, the coins are redistributed back to the 156 00:08:00,960 --> 00:08:03,160 Speaker 1: original depositors in a random manner. 157 00:08:03,520 --> 00:08:06,000 Speaker 2: It's like if everybody bought a bottle of vodka and 158 00:08:06,040 --> 00:08:09,800 Speaker 2: you made a huge batch of martinis, and then you were like, 159 00:08:09,800 --> 00:08:11,320 Speaker 2: can I have my drink? You get your drink, and 160 00:08:11,320 --> 00:08:12,320 Speaker 2: you're like, I don't know if that was from my 161 00:08:12,360 --> 00:08:14,160 Speaker 2: bottle or somebody else's vodka bottle. 162 00:08:14,240 --> 00:08:16,120 Speaker 1: I think that that is very, very well put, and 163 00:08:16,160 --> 00:08:21,560 Speaker 1: I'm sober. The metaphor the founders of this particular mixer chose, however, 164 00:08:21,720 --> 00:08:24,880 Speaker 1: was the tornado, hence the name Tornado Cash. 165 00:08:24,920 --> 00:08:26,560 Speaker 2: That sounds like a celebrity baby name. 166 00:08:27,120 --> 00:08:29,880 Speaker 1: Well, wait till we get to the alleged perpetrator. In 167 00:08:29,880 --> 00:08:33,559 Speaker 1: twenty twenty two, the Justice Department alleged that hackers, including the 168 00:08:33,640 --> 00:08:37,000 Speaker 1: Lazarus Group, which is allegedly run by the North Korean government, 169 00:08:37,640 --> 00:08:41,520 Speaker 1: laundered billions of dollars in stolen assets using Tornado Cash. 170 00:08:41,679 --> 00:08:44,480 Speaker 1: So the US took action and sanctioned the company. 171 00:08:44,400 --> 00:08:47,160 Speaker 2: And so nobody in the US could use Tornado Cash anymore. 172 00:08:47,280 --> 00:08:50,320 Speaker 1: Yeah, that was one of the 173 00:08:50,360 --> 00:08:53,800 Speaker 1: consequences of the sanctions. Another was criminal charges against two 174 00:08:53,880 --> 00:08:57,640 Speaker 1: founders of Tornado Cash. The Wall Street Journal recently spoke 175 00:08:57,679 --> 00:09:00,040 Speaker 1: to one of the founders, whose name is, get this, 176 00:09:00,679 --> 00:09:01,600 Speaker 1: Roman Storm. 177 00:09:01,880 --> 00:09:04,520 Speaker 2: I can't. These are Kardashian children's names. 178 00:09:05,080 --> 00:09:08,559 Speaker 1: That's right. So Roman Storm, the founder of Tornado Cash. 179 00:09:08,920 --> 00:09:11,640 Speaker 1: But you know, all jokes aside, he was actually arrested 180 00:09:11,720 --> 00:09:15,000 Speaker 1: at gunpoint after federal agents stormed his home in twenty 181 00:09:15,040 --> 00:09:18,880 Speaker 1: twenty three to arrest him for his involvement in Tornado Cash. 182 00:09:18,960 --> 00:09:21,640 Speaker 1: He's now out on bail and says he's not guilty 183 00:09:21,720 --> 00:09:25,400 Speaker 1: of charges of money laundering and sanctions violations.
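Cara's batch-of-martinis metaphor maps neatly onto code. Below is a minimal, purely illustrative sketch of the tumbler idea Gemini describes: everyone deposits the same fixed amount into a pool and later withdraws it to a fresh address, so a public observer sees equal-sized deposits and withdrawals with no link between them. Note this is a toy under stated assumptions; the real Tornado Cash is an Ethereum smart contract that uses zero-knowledge proofs so that even the pool itself cannot link a withdrawal to its deposit.

```python
import secrets

class ToyMixer:
    """Fixed-denomination pool: every deposit is the same size,
    so amounts can't be used to match deposits to withdrawals."""
    DENOMINATION = 1.0

    def __init__(self):
        self._notes = set()  # secrets proving a right to withdraw
        self.ledger = []     # what a public observer sees

    def deposit(self, from_addr):
        note = secrets.token_hex(16)  # kept private by the depositor
        self._notes.add(note)
        self.ledger.append(("deposit", from_addr, self.DENOMINATION))
        return note

    def withdraw(self, note, to_addr):
        self._notes.remove(note)  # raises KeyError for a bogus note
        self.ledger.append(("withdraw", to_addr, self.DENOMINATION))

mixer = ToyMixer()
notes = {addr: mixer.deposit(addr) for addr in ("alice", "bob", "carol")}
for addr, fresh_addr in [("carol", "c2"), ("alice", "a2"), ("bob", "b2")]:
    mixer.withdraw(notes[addr], fresh_addr)

# The ledger shows three equal deposits and three equal withdrawals
# to fresh addresses -- nothing on it links any pair together.
print(mixer.ledger)
```

In this toy, the mixer operator still knows the link between note and depositor, which is exactly the trust problem zero-knowledge designs like Tornado Cash were built to remove.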
184 00:09:25,400 --> 00:09:28,360 Speaker 1: In this week's Wall Street Journal article, he maintains that the software 185 00:09:28,360 --> 00:09:32,040 Speaker 1: he created is neutral and has both good and bad 186 00:09:32,240 --> 00:09:32,920 Speaker 1: use cases. 187 00:09:33,240 --> 00:09:36,920 Speaker 2: But it's my understanding that the whole point of the 188 00:09:37,000 --> 00:09:40,960 Speaker 2: blockchain is to create a ledger that traces back the 189 00:09:41,000 --> 00:09:46,680 Speaker 2: original transaction. So what could possibly be a good use 190 00:09:47,040 --> 00:09:49,280 Speaker 2: of crypto mixing? 191 00:09:49,040 --> 00:09:52,600 Speaker 1: Well, Storm said, it's financial privacy. So an example he used, actually, 192 00:09:52,720 --> 00:09:55,520 Speaker 1: was that you could donate crypto to assist Ukraine in 193 00:09:55,559 --> 00:09:59,719 Speaker 1: their war effort without identifying yourself, for example, to Russian authorities. 194 00:10:00,000 --> 00:10:01,800 Speaker 2: So that actually makes a lot of sense. 195 00:10:02,200 --> 00:10:04,240 Speaker 1: There is some good news for Storm, though: 196 00:10:04,360 --> 00:10:07,960 Speaker 1: the US actually lifted sanctions against Tornado Cash, and while 197 00:10:08,080 --> 00:10:10,680 Speaker 1: Storm is still awaiting trial, his allies seem to be 198 00:10:10,760 --> 00:10:13,920 Speaker 1: encouraged by the Trump administration's desire to make the US 199 00:10:14,240 --> 00:10:15,600 Speaker 1: the crypto capital of the world. 200 00:10:15,920 --> 00:10:18,960 Speaker 2: Unbelievable. I love that story, Oz. The next story that 201 00:10:19,000 --> 00:10:21,200 Speaker 2: I want to talk to you about is one that 202 00:10:21,240 --> 00:10:24,280 Speaker 2: I have not shut up about since probably we started 203 00:10:24,320 --> 00:10:26,240 Speaker 2: doing Sleepwalkers. 204 00:10:25,720 --> 00:10:27,400 Speaker 1: Seven years ago, seven years ago. 205 00:10:27,679 --> 00:10:29,439 Speaker 2: And you know what, a lot of people in the 206 00:10:29,559 --> 00:10:31,319 Speaker 2: US government have not shut up about it over the 207 00:10:31,400 --> 00:10:34,120 Speaker 2: last seven years either. But this is the first time 208 00:10:34,679 --> 00:10:38,480 Speaker 2: in my research that I've felt like, okay, there's actually 209 00:10:38,520 --> 00:10:42,400 Speaker 2: going to be some movement on this, because 210 00:10:42,400 --> 00:10:44,920 Speaker 2: bipartisan members of the US government are getting involved. 211 00:10:44,920 --> 00:10:48,280 Speaker 2: So, Section two thirty sounds extremely boring, and I bring 212 00:10:48,280 --> 00:10:50,240 Speaker 2: it up at parties and people look at me like, shut up. 213 00:10:51,040 --> 00:10:53,600 Speaker 2: But if I want to kind of plagiarize the way 214 00:10:53,640 --> 00:10:56,480 Speaker 2: that Section two thirty has been described by those who 215 00:10:56,520 --> 00:11:00,040 Speaker 2: care about it, it is the twenty six words that 216 00:11:00,080 --> 00:11:03,200 Speaker 2: created the Internet. The twenty six words are the following: 217 00:11:03,679 --> 00:11:08,200 Speaker 2: No provider or user of an interactive computer service shall 218 00:11:08,240 --> 00:11:12,440 Speaker 2: be treated as the publisher or speaker of any information 219 00:11:12,840 --> 00:11:17,319 Speaker 2: provided by another information content provider. What does that mean 220 00:11:17,360 --> 00:11:17,560 Speaker 2: to you?
221 00:11:18,280 --> 00:11:20,040 Speaker 1: It doesn't mean that much to me on the face of it. 222 00:11:21,520 --> 00:11:24,319 Speaker 1: But I know that those twenty six words have been 223 00:11:24,360 --> 00:11:29,400 Speaker 1: some of the most consequential in the history of technology legislation, 224 00:11:29,600 --> 00:11:34,000 Speaker 1: and arguably our lifetimes. They represent this addition to the 225 00:11:34,000 --> 00:11:36,880 Speaker 1: Communications Decency Act that was passed in the mid nineties, 226 00:11:37,080 --> 00:11:40,160 Speaker 1: and those twenty six words that created the Internet basically 227 00:11:40,240 --> 00:11:43,800 Speaker 1: meant that Internet platforms were not legally liable for third 228 00:11:43,800 --> 00:11:46,680 Speaker 1: party content, like content created by users that lives on 229 00:11:46,720 --> 00:11:50,079 Speaker 1: their platforms. When the law was passed, it was basically 230 00:11:50,120 --> 00:11:54,960 Speaker 1: designed to allow these Internet platforms to grow and to 231 00:11:55,000 --> 00:11:57,480 Speaker 1: host content without fear of liability. 232 00:11:58,000 --> 00:11:59,000 Speaker 2: And they did that successfully. 233 00:11:59,000 --> 00:11:59,760 Speaker 1: Boy, did that work. 234 00:12:00,000 --> 00:12:03,400 Speaker 2: And it was also related to free speech concerns, which 235 00:12:03,440 --> 00:12:06,480 Speaker 2: is still something that, you know, people are talking about today, 236 00:12:06,480 --> 00:12:09,440 Speaker 2: which is this idea of, like, how much should the 237 00:12:09,480 --> 00:12:10,760 Speaker 2: Internet be moderated? 238 00:12:10,840 --> 00:12:10,960 Speaker 3: Right? 239 00:12:11,320 --> 00:12:15,360 Speaker 2: And so, ultimately, as you were saying, Section two thirty 240 00:12:15,400 --> 00:12:19,120 Speaker 2: allowed companies to put less effort into regulating their content, 241 00:12:19,280 --> 00:12:20,959 Speaker 2: because they're not treated like publishers. 242 00:12:20,960 --> 00:12:25,040 Speaker 1: Publishers obviously are constantly subject to lawsuits for saying stuff 243 00:12:25,080 --> 00:12:29,439 Speaker 1: which is defamatory or for publishing content which is extreme, 244 00:12:29,840 --> 00:12:32,520 Speaker 1: whereas websites and social media platforms are immune. 245 00:12:32,640 --> 00:12:36,240 Speaker 2: Right. And in the nineties, nobody could have anticipated how 246 00:12:36,280 --> 00:12:38,920 Speaker 2: this law would fuel the explosion of what we now 247 00:12:38,960 --> 00:12:42,199 Speaker 2: know as social media. The story here is that thirty 248 00:12:42,280 --> 00:12:46,600 Speaker 2: years later, there is now some bipartisan support for modifying 249 00:12:46,840 --> 00:12:50,200 Speaker 2: Section two thirty, and two senators, one being Lindsey Graham, 250 00:12:50,200 --> 00:12:53,079 Speaker 2: who's a Republican from South Carolina, and the other Dick Durbin, 251 00:12:53,160 --> 00:12:57,240 Speaker 2: a Democrat from Illinois, plan on introducing a bill to 252 00:12:57,400 --> 00:13:00,679 Speaker 2: set an expiration date for Section two thirty, and that 253 00:13:00,800 --> 00:13:04,079 Speaker 2: date is January one, twenty twenty seven. Mark your calendar, 254 00:13:04,200 --> 00:13:05,840 Speaker 2: less than two years from now, less than two years 255 00:13:05,880 --> 00:13:10,400 Speaker 2: from now.
But The Information reported that the lawmakers don't 256 00:13:10,440 --> 00:13:14,360 Speaker 2: actually want to repeal Section two thirty outright. It's really 257 00:13:14,400 --> 00:13:18,120 Speaker 2: more of an invitation to the tech companies to negotiate, 258 00:13:18,480 --> 00:13:20,760 Speaker 2: which is something that they have been unwilling to do 259 00:13:20,840 --> 00:13:22,160 Speaker 2: without an ultimatum thus far. 260 00:13:22,320 --> 00:13:24,360 Speaker 1: I think one of the interesting things about this story 261 00:13:24,480 --> 00:13:26,960 Speaker 1: is that the first place it was reported was in The 262 00:13:27,000 --> 00:13:29,640 Speaker 1: Information, which is like the go-to source for the 263 00:13:29,679 --> 00:13:30,960 Speaker 1: whole tech industry. 264 00:13:30,840 --> 00:13:33,800 Speaker 2: And I think what that signals is that big tech 265 00:13:33,960 --> 00:13:38,240 Speaker 2: is actually watching very closely. And one of the things 266 00:13:38,240 --> 00:13:41,960 Speaker 2: that I've noticed is just this increased anger towards tech 267 00:13:41,960 --> 00:13:45,080 Speaker 2: companies in recent years from both sides of the aisle. 268 00:13:45,240 --> 00:13:47,439 Speaker 2: You know, there's the growing concern for the safety of 269 00:13:47,480 --> 00:13:51,040 Speaker 2: kids online and frustrations about failed attempts to regulate big 270 00:13:51,080 --> 00:13:54,680 Speaker 2: tech, again on both sides of the aisle, which may 271 00:13:54,760 --> 00:13:57,000 Speaker 2: bode well for this yet-to-be-introduced bill. 272 00:13:57,800 --> 00:13:59,679 Speaker 1: It's definitely a story we'll be following up on over 273 00:13:59,720 --> 00:14:02,679 Speaker 1: the coming months. But right now, let's take a quick break. 274 00:14:03,040 --> 00:14:05,080 Speaker 1: We'll be back with a few more headlines, and then 275 00:14:05,120 --> 00:14:07,960 Speaker 1: we'll hear from Geoffrey Fowler of The Washington Post about 276 00:14:07,960 --> 00:14:11,839 Speaker 1: what twenty three and Me's bankruptcy filing could mean for your genetic data. 277 00:14:12,160 --> 00:14:28,280 Speaker 1: Stay with us. Welcome back. We have a few more 278 00:14:28,360 --> 00:14:30,480 Speaker 1: quick headlines for you this week before we dive into 279 00:14:30,480 --> 00:14:33,120 Speaker 1: our conversation with Geoffrey Fowler about twenty three and Me. 280 00:14:33,680 --> 00:14:36,240 Speaker 1: Casey Newton wrote this week in his newsletter about two 281 00:14:36,320 --> 00:14:39,520 Speaker 1: new studies, one from the MIT Media Lab and one 282 00:14:39,560 --> 00:14:42,960 Speaker 1: actually from OpenAI itself, which suggest that heavy chatbot 283 00:14:43,080 --> 00:14:47,120 Speaker 1: use is correlated with loneliness and reduced socialization. Now, of course, 284 00:14:47,320 --> 00:14:50,960 Speaker 1: as Casey Newton and the studies both point out, correlation 285 00:14:51,160 --> 00:14:54,600 Speaker 1: is not causation. There's always a possibility that loneliness is 286 00:14:54,600 --> 00:14:57,440 Speaker 1: what propels someone to seek an emotional bond with a bot.
287 00:14:58,040 --> 00:15:01,440 Speaker 1: But the studies do point at a potential danger, which 288 00:15:01,440 --> 00:15:04,560 Speaker 1: is that engaging with compelling chatbots might pull people away 289 00:15:04,600 --> 00:15:07,680 Speaker 1: from connections with other humans, which may in turn drive 290 00:15:07,800 --> 00:15:11,320 Speaker 1: more loneliness and more dependence on a computer companion. 291 00:15:11,720 --> 00:15:14,640 Speaker 2: This is the digital chicken and egg. Am I lonely, seeking 292 00:15:14,640 --> 00:15:17,360 Speaker 2: a chatbot? Or is the chatbot so hot that I 293 00:15:17,400 --> 00:15:18,240 Speaker 2: can't stay away? 294 00:15:18,680 --> 00:15:19,080 Speaker 1: That's it. 295 00:15:19,880 --> 00:15:22,680 Speaker 2: There's another story, that Semafor reported on, that an off 296 00:15:22,720 --> 00:15:26,400 Speaker 2: Broadway theater in New York is offering live AI powered 297 00:15:26,440 --> 00:15:30,800 Speaker 2: translations for the play Perfect Crime. So theatergoers can scan 298 00:15:30,880 --> 00:15:34,840 Speaker 2: a QR code and choose from sixty languages. So actors 299 00:15:34,880 --> 00:15:39,040 Speaker 2: wear microphones that feed their voices directly into the translation system, 300 00:15:39,160 --> 00:15:42,720 Speaker 2: so no side conversations or audience noises get accidentally picked 301 00:15:42,760 --> 00:15:45,680 Speaker 2: up by the AI powered translation services. This is an 302 00:15:45,720 --> 00:15:48,000 Speaker 2: incredible use of AI, which is like, you could be 303 00:15:48,040 --> 00:15:51,120 Speaker 2: a tourist from anywhere and see this show and still 304 00:15:51,240 --> 00:15:51,880 Speaker 2: understand it. 305 00:15:52,640 --> 00:15:57,200 Speaker 1: Finally, obviously we can't pass the headlines by without a 306 00:15:57,280 --> 00:15:58,760 Speaker 1: nod to this week. 307 00:15:58,720 --> 00:16:00,080 Speaker 2: The dopest Pentagon prank. 308 00:16:00,120 --> 00:16:02,800 Speaker 1: Hashtag SignalGate. Just a few weeks ago we had 309 00:16:02,800 --> 00:16:05,960 Speaker 1: Meredith Whittaker, who runs Signal, the encrypted messaging app, on 310 00:16:06,120 --> 00:16:08,960 Speaker 1: Tech Stuff, and Signal is absolutely the flavor of the 311 00:16:08,960 --> 00:16:12,320 Speaker 1: week because Jeffrey Goldberg, the editor of The Atlantic, was 312 00:16:12,320 --> 00:16:15,440 Speaker 1: accidentally included in a Signal group with J.D. Vance and 313 00:16:15,480 --> 00:16:19,240 Speaker 1: Pete Hegseth to discuss highly sensitive battle plans for Yemen, 314 00:16:19,280 --> 00:16:22,480 Speaker 1: which are now absolutely all over the Internet. The lesson 315 00:16:22,480 --> 00:16:24,720 Speaker 1: here is that it turns out encryption doesn't work if 316 00:16:24,760 --> 00:16:29,200 Speaker 1: you invite journalists into your group chat. The story broke 317 00:16:29,200 --> 00:16:32,960 Speaker 1: the internet. Per Axios, hashtag SignalGate is the most 318 00:16:33,040 --> 00:16:37,040 Speaker 1: interacted-with story of the year to date. The Atlantic story, 319 00:16:37,120 --> 00:16:40,560 Speaker 1: which is titled The Trump Administration Accidentally Texted Me Its 320 00:16:40,560 --> 00:16:44,520 Speaker 1: War Plans, is approaching five hundred thousand interactions on social media, 321 00:16:44,840 --> 00:16:47,400 Speaker 1: which is more than the second and third place stories 322 00:16:47,400 --> 00:16:50,560 Speaker 1: of the year combined.
For your reference, those two are 323 00:16:51,040 --> 00:16:53,720 Speaker 1: Meet the World War II veteran who recently celebrated his 324 00:16:53,800 --> 00:16:57,520 Speaker 1: hundredth birthday, from the venerable news source 11Alive, and 325 00:16:58,040 --> 00:17:01,120 Speaker 1: Elon Musk's net worth dropped twenty nine billion dollars in one 326 00:17:01,200 --> 00:17:05,920 Speaker 1: day as Tesla stock tanks, from Business Insider. So lots 327 00:17:06,000 --> 00:17:08,760 Speaker 1: of emojis in government communication, many more than 328 00:17:08,760 --> 00:17:22,560 Speaker 1: one would suspect. Fire, flames. So for our next segment, we're 329 00:17:22,600 --> 00:17:25,480 Speaker 1: going to explore what happens when a company that you've 330 00:17:25,480 --> 00:17:29,280 Speaker 1: given your personal data to hits a crisis. Now, companies 331 00:17:29,280 --> 00:17:32,320 Speaker 1: file for bankruptcy all the time, from major retailers and 332 00:17:32,359 --> 00:17:37,320 Speaker 1: restaurant chains to tech startups and bitcoin exchanges, but they 333 00:17:37,400 --> 00:17:41,840 Speaker 1: usually don't own the DNA sequences of millions of people. 334 00:17:42,320 --> 00:17:45,080 Speaker 1: Such is the case, Cara, with twenty three and Me. 335 00:17:44,920 --> 00:17:47,360 Speaker 2: Yeah, they have my... oh boy, do they have my data. 336 00:17:48,119 --> 00:17:49,399 Speaker 1: So you were one of the people in the 337 00:17:49,440 --> 00:17:51,600 Speaker 1: twenty tens, when this was the hottest game in town, 338 00:17:51,640 --> 00:17:52,520 Speaker 1: who couldn't resist. 339 00:17:52,720 --> 00:17:57,919 Speaker 2: I did it in twenty twenty one, because it was 340 00:17:57,960 --> 00:18:02,720 Speaker 2: a thing that somebody bought 341 00:18:02,800 --> 00:18:05,040 Speaker 2: me as a gift, oh okay, and I kind of 342 00:18:05,040 --> 00:18:06,080 Speaker 2: sat on it for a little bit. 343 00:18:06,320 --> 00:18:09,200 Speaker 1: You're buying the dip as a user, not as an investor. 344 00:18:08,880 --> 00:18:12,680 Speaker 2: One thousand percent. And I, I don't know, I just 345 00:18:12,720 --> 00:18:15,560 Speaker 2: looked at it. I'm obsessed with my family history, and 346 00:18:15,800 --> 00:18:17,440 Speaker 2: to get a little bit more serious, you know, it's 347 00:18:17,480 --> 00:18:20,760 Speaker 2: like, I have a lot of dead relatives and I 348 00:18:20,960 --> 00:18:23,680 Speaker 2: don't have the ability to ask them, you know, who 349 00:18:23,760 --> 00:18:27,520 Speaker 2: was our extended family. And so I decided to take 350 00:18:27,560 --> 00:18:30,280 Speaker 2: matters into my own hands. And, you know, lo and 351 00:18:30,280 --> 00:18:34,000 Speaker 2: behold, found out that I'm incredibly Jewish, which doesn't surprise me, 352 00:18:34,440 --> 00:18:40,720 Speaker 2: and that's it. But that was still exciting. I struggle 353 00:18:40,760 --> 00:18:42,879 Speaker 2: with something that I think is really at the core 354 00:18:43,080 --> 00:18:46,560 Speaker 2: of this bankruptcy filing and how it affects people, which 355 00:18:46,600 --> 00:18:51,840 Speaker 2: is, I love to give my data away. I don't care. 356 00:18:52,240 --> 00:18:55,400 Speaker 2: I want it now.
I want Wi-Fi at LaGuardia, 357 00:18:55,680 --> 00:18:57,200 Speaker 2: I want Wi-Fi in the middle of the street, 358 00:18:57,480 --> 00:18:58,800 Speaker 2: and I'm going to give you my password, I'm 359 00:18:58,800 --> 00:19:01,520 Speaker 2: going to give you my email address. I also report 360 00:19:01,560 --> 00:19:03,679 Speaker 2: on a technology podcast, so I know that that's a 361 00:19:03,800 --> 00:19:06,679 Speaker 2: terrible thing to do. And such is the case with 362 00:19:06,680 --> 00:19:09,520 Speaker 2: twenty three and Me, where everyone was like, Cara, don't 363 00:19:09,560 --> 00:19:12,880 Speaker 2: do that, just don't do it. Talk to your family 364 00:19:12,880 --> 00:19:14,399 Speaker 2: members and don't do it. 365 00:19:14,880 --> 00:19:16,760 Speaker 1: Now, can I ask you? One of the things that 366 00:19:16,800 --> 00:19:18,440 Speaker 1: twenty three and Me was trying to do, to kind 367 00:19:18,440 --> 00:19:21,680 Speaker 1: of make its business model more robust than just providing 368 00:19:21,680 --> 00:19:24,680 Speaker 1: a one off test that you literally never needed to repeat, 369 00:19:24,720 --> 00:19:26,960 Speaker 1: was to kind of build in these, like, social aspects 370 00:19:26,960 --> 00:19:32,200 Speaker 1: to the platform. Did you connect with any distant, previously 371 00:19:32,280 --> 00:19:33,639 Speaker 1: uncontacted relatives? 372 00:19:33,720 --> 00:19:38,320 Speaker 2: Yes, and we figured out we were related. You know, 373 00:19:39,200 --> 00:19:41,440 Speaker 2: it was not a profound thing. And look, 374 00:19:41,480 --> 00:19:43,560 Speaker 2: I know that for a lot of people, what happened 375 00:19:44,000 --> 00:19:46,119 Speaker 2: was, like, people found out they had, like, a brother 376 00:19:46,800 --> 00:19:48,520 Speaker 2: or another family. You know, I think it 377 00:19:48,560 --> 00:19:50,320 Speaker 2: had a profound effect on a lot 378 00:19:50,320 --> 00:19:52,439 Speaker 2: of people. It did not have a profound effect on me. 379 00:19:52,560 --> 00:19:56,520 Speaker 1: So my sister is pretty interested in genealogy, family trees, et cetera. 380 00:19:57,000 --> 00:20:01,480 Speaker 1: And she, to her enormous credit, said to my father, 381 00:20:02,359 --> 00:20:04,280 Speaker 1: would you mind if I do twenty three and Me? 382 00:20:05,119 --> 00:20:07,680 Speaker 1: And he said, I would really prefer it if you didn't, 383 00:20:08,280 --> 00:20:11,040 Speaker 1: because I don't want to know if there's a serial 384 00:20:11,080 --> 00:20:16,480 Speaker 1: killer in the family. We both raised our eyebrows at this, 385 00:20:16,760 --> 00:20:20,000 Speaker 1: and of course the question arose, was there a suspect 386 00:20:20,040 --> 00:20:20,560 Speaker 1: in mind? 387 00:20:20,800 --> 00:20:23,360 Speaker 2: You're like, are you the suspect? 388 00:20:24,960 --> 00:20:27,400 Speaker 1: So she didn't do it. You know how paranoid I am. 389 00:20:27,440 --> 00:20:30,439 Speaker 1: Of course I didn't do it either. But fifteen million 390 00:20:30,480 --> 00:20:33,200 Speaker 1: people did. And many of those people who know about 391 00:20:33,200 --> 00:20:35,919 Speaker 1: the bankruptcy, I'm sure, are concerned about what's going to 392 00:20:35,920 --> 00:20:39,120 Speaker 1: happen to their personal data.
Here to help us understand 393 00:20:39,240 --> 00:20:41,760 Speaker 1: is Geoffrey Fowler, who writes a column for The Washington 394 00:20:41,800 --> 00:20:45,240 Speaker 1: Post all about the user experience of technology, the good 395 00:20:45,240 --> 00:20:47,600 Speaker 1: and the bad. And what I particularly like about his 396 00:20:47,640 --> 00:20:51,520 Speaker 1: writing is that he really centers the user in his column. 397 00:20:51,560 --> 00:20:54,000 Speaker 1: It's tech journalism, but it's practical tech journalism. What does 398 00:20:54,040 --> 00:20:56,639 Speaker 1: this mean for me? How can I make things better? 399 00:20:57,200 --> 00:20:59,040 Speaker 1: What should I know? And so he wrote a column 400 00:20:59,040 --> 00:20:59,960 Speaker 1: this week that caught both of our eyes. 401 00:21:00,480 --> 00:21:01,919 Speaker 2: Geoffrey, welcome to Tech Stuff. 402 00:21:02,160 --> 00:21:04,879 Speaker 1: Hello, hello. So you wrote a story this week, and 403 00:21:04,920 --> 00:21:07,760 Speaker 1: the headline was Delete your DNA from twenty three and 404 00:21:07,880 --> 00:21:11,480 Speaker 1: Me right now, which was quite a captivating headline. Could 405 00:21:11,480 --> 00:21:13,239 Speaker 1: you just start off by giving us a little bit 406 00:21:13,240 --> 00:21:15,800 Speaker 1: of the background on twenty three and Me, and then 407 00:21:15,960 --> 00:21:17,960 Speaker 1: why you wrote this story? 408 00:21:17,680 --> 00:21:23,120 Speaker 4: I wrote it because fifteen million people around the world spat into 409 00:21:23,160 --> 00:21:25,760 Speaker 4: a little vial and sent their DNA to a company 410 00:21:25,760 --> 00:21:29,480 Speaker 4: in Silicon Valley with the promise of unlocking all sorts 411 00:21:29,520 --> 00:21:32,480 Speaker 4: of things about our ethnicity, our background, our family tree, 412 00:21:32,760 --> 00:21:36,800 Speaker 4: and our health. But it turns out that the company 413 00:21:36,960 --> 00:21:41,320 Speaker 4: that we trusted with that information, that really, really precious information, 414 00:21:41,760 --> 00:21:46,680 Speaker 4: was a terrible business, and it declared bankruptcy late Sunday night. 415 00:21:47,200 --> 00:21:48,800 Speaker 4: And if there's one thing I know, it's that you 416 00:21:48,840 --> 00:21:51,240 Speaker 4: do not want a company that is bankrupt to be 417 00:21:51,280 --> 00:21:55,720 Speaker 4: responsible for protecting your very precious information. And that's the 418 00:21:55,760 --> 00:21:58,440 Speaker 4: situation we find ourselves in. So the moment I heard 419 00:21:58,480 --> 00:22:01,160 Speaker 4: about it, I was like, we need to press publish 420 00:22:01,160 --> 00:22:03,280 Speaker 4: on this story right now. And it seems like a 421 00:22:03,320 --> 00:22:06,440 Speaker 4: lot of people agreed, because so many people have been 422 00:22:06,480 --> 00:22:08,560 Speaker 4: trying to delete their data from twenty three and Me 423 00:22:08,800 --> 00:22:11,840 Speaker 4: that the site has been down for large periods of 424 00:22:11,840 --> 00:22:12,640 Speaker 4: this week. 425 00:22:12,680 --> 00:22:14,879 Speaker 2: And yet you, like me, could not resist getting a 426 00:22:14,920 --> 00:22:17,480 Speaker 2: twenty three and Me account. What were your reasons for 427 00:22:17,520 --> 00:22:19,199 Speaker 2: getting an account originally?
428 00:22:19,600 --> 00:22:23,080 Speaker 4: You know, I think it sort of tracks the arc 429 00:22:23,200 --> 00:22:26,080 Speaker 4: of a lot of our relationships with technology and with 430 00:22:26,200 --> 00:22:28,720 Speaker 4: data over the last twenty years. Look, this was the 431 00:22:28,760 --> 00:22:35,720 Speaker 4: two thousands for me. There was so much possibility out there, right? 432 00:22:36,200 --> 00:22:39,840 Speaker 4: And, you know, in two thousand whatever, when I did 433 00:22:39,880 --> 00:22:42,480 Speaker 4: my twenty three and Me test, what I could imagine 434 00:22:42,520 --> 00:22:45,159 Speaker 4: was what the company was promising, which was just a 435 00:22:45,160 --> 00:22:49,199 Speaker 4: fascinating idea, right? That together, if they were able to 436 00:22:49,240 --> 00:22:51,520 Speaker 4: gather enough DNA data, they could use the power of 437 00:22:51,560 --> 00:22:55,119 Speaker 4: technology to unlock all kinds of secrets about the human body. 438 00:22:55,320 --> 00:22:57,560 Speaker 4: They could already, even in those early days, tell you 439 00:22:57,640 --> 00:23:00,639 Speaker 4: curious things about yourself. Like, I remember one piece of information they 440 00:23:00,640 --> 00:23:02,880 Speaker 4: gave me is that I have a gene that makes 441 00:23:02,880 --> 00:23:04,600 Speaker 4: me likely to have wet ear wax. 442 00:23:06,359 --> 00:23:09,280 Speaker 1: Wow, that is... I feel so seen. I was always wondering. 443 00:23:09,640 --> 00:23:10,600 Speaker 1: It's a thing. It is. 444 00:23:10,680 --> 00:23:11,880 Speaker 2: It's very validating. 445 00:23:12,359 --> 00:23:14,160 Speaker 4: Yeah. And so while the wet ear wax 446 00:23:14,200 --> 00:23:15,960 Speaker 4: is not going to change the world, they maybe could 447 00:23:16,040 --> 00:23:19,600 Speaker 4: learn things about cancer, or all kinds of mysteries about 448 00:23:19,640 --> 00:23:22,919 Speaker 4: the human body. That was, and remains, a 449 00:23:22,960 --> 00:23:24,960 Speaker 4: pretty exciting idea. 450 00:23:25,080 --> 00:23:27,200 Speaker 2: All of these things are true about twenty three and 451 00:23:27,240 --> 00:23:30,560 Speaker 2: Me. And as a user, I also experienced, you know, 452 00:23:30,600 --> 00:23:34,760 Speaker 2: some interesting information and connection with people that I didn't 453 00:23:34,800 --> 00:23:38,360 Speaker 2: even know were distant cousins. But as you said earlier, 454 00:23:38,840 --> 00:23:42,200 Speaker 2: this is no longer a viable company. Can you talk 455 00:23:42,240 --> 00:23:45,320 Speaker 2: a little bit about why it went bankrupt and how 456 00:23:45,359 --> 00:23:46,679 Speaker 2: it started to run into trouble? 457 00:23:47,080 --> 00:23:49,200 Speaker 4: I'll preface this by saying I've not been 458 00:23:49,320 --> 00:23:52,480 Speaker 4: tracking the financial fortunes of twenty three and Me over 459 00:23:52,520 --> 00:23:55,200 Speaker 4: the years, but I know at a high level they've 460 00:23:55,240 --> 00:23:58,120 Speaker 4: tried a lot of different ideas to make the business work.
461 00:23:58,240 --> 00:24:00,760 Speaker 4: Because it turns out that when you ask people to 462 00:24:00,800 --> 00:24:03,600 Speaker 4: spend roughly one hundred bucks to send you their DNA, 463 00:24:03,960 --> 00:24:06,680 Speaker 4: they only want to do that once, right? So twenty 464 00:24:06,720 --> 00:24:09,399 Speaker 4: three and Me had this problem, which was once people 465 00:24:09,440 --> 00:24:11,400 Speaker 4: did the test, you couldn't get any more money 466 00:24:11,440 --> 00:24:13,160 Speaker 4: out of them. So they had to figure out other 467 00:24:13,200 --> 00:24:16,199 Speaker 4: ways to keep going as a business, to make some 468 00:24:16,240 --> 00:24:18,040 Speaker 4: money out of the data that they had. And I 469 00:24:18,080 --> 00:24:19,760 Speaker 4: think the big idea that they had was that it 470 00:24:19,800 --> 00:24:22,560 Speaker 4: was going to be useful for developing drugs, and they 471 00:24:22,600 --> 00:24:25,840 Speaker 4: made a big partnership with GSK, and GSK took a 472 00:24:25,880 --> 00:24:28,520 Speaker 4: stake in the company, and that went on for several years, 473 00:24:28,520 --> 00:24:31,600 Speaker 4: but it wasn't producing results at a scale that was 474 00:24:31,640 --> 00:24:34,239 Speaker 4: helping them, and I guess the costs associated with it 475 00:24:34,240 --> 00:24:37,440 Speaker 4: were just gigantic. They then tried other things. They tried, 476 00:24:37,520 --> 00:24:41,600 Speaker 4: I think most recently, selling GLP-1s as kind of 477 00:24:42,240 --> 00:24:44,280 Speaker 4: an at-home kind of health service, a pivot 478 00:24:44,320 --> 00:24:47,440 Speaker 4: to health, and it just didn't work. Around 479 00:24:47,440 --> 00:24:49,840 Speaker 4: twenty twenty one, they went public, and they were 480 00:24:49,880 --> 00:24:53,920 Speaker 4: valued at six billion dollars, roughly, and then as of 481 00:24:53,920 --> 00:24:56,800 Speaker 4: Sunday night, when I checked, it was around fifty million dollars. 482 00:24:56,840 --> 00:24:58,439 Speaker 4: I have no idea what it would be now that 483 00:24:58,480 --> 00:25:04,000 Speaker 4: it's in bankruptcy court. Probably less, probably less. And along 484 00:25:04,040 --> 00:25:05,560 Speaker 4: the way, they had one other thing that we should 485 00:25:05,560 --> 00:25:08,720 Speaker 4: definitely talk about, which was they had a big hacking attack. 486 00:25:09,160 --> 00:25:11,560 Speaker 1: Yeah. I wanted to ask you about that, because I think, 487 00:25:11,840 --> 00:25:13,879 Speaker 1: you know, you sort of pointed out that there was 488 00:25:14,320 --> 00:25:16,639 Speaker 1: a time in the two thousands where we just couldn't 489 00:25:16,640 --> 00:25:19,440 Speaker 1: wait to give our data to all comers and see 490 00:25:19,440 --> 00:25:22,600 Speaker 1: what kind of surprising results we got back. That changed, 491 00:25:22,600 --> 00:25:25,000 Speaker 1: which presumably became a headwind for twenty three and Me. 492 00:25:25,080 --> 00:25:28,320 Speaker 1: But there were also some specific issues they had that may 493 00:25:28,359 --> 00:25:29,560 Speaker 1: have put customers off. 494 00:25:30,160 --> 00:25:32,800 Speaker 4: Yeah, and I think the hacking attack is a big one.
495 00:25:33,160 --> 00:25:36,160 Speaker 4: My memory of it is that they had a problem 496 00:25:36,200 --> 00:25:39,800 Speaker 4: with users that were reusing passwords, or using passwords that 497 00:25:39,800 --> 00:25:42,280 Speaker 4: had been otherwise compromised on the internet. This is a 498 00:25:42,280 --> 00:25:45,080 Speaker 4: common thing. To all our listeners: if you do not 499 00:25:45,160 --> 00:25:47,480 Speaker 4: already have a password manager, and use it to get 500 00:25:47,840 --> 00:25:50,560 Speaker 4: distinct passwords on every single site and app you use, go 501 00:25:50,560 --> 00:25:52,960 Speaker 4: do that now, because that is a big risk. But it 502 00:25:53,200 --> 00:25:55,480 Speaker 4: turns out a lot of twenty three and Me users fit 503 00:25:55,520 --> 00:25:58,280 Speaker 4: in that bucket, and folks were able to get into 504 00:25:58,560 --> 00:26:01,159 Speaker 4: a whole bunch of accounts, and then even offered to 505 00:26:01,400 --> 00:26:04,200 Speaker 4: sell some of what they learned online. 506 00:26:04,320 --> 00:26:06,600 Speaker 4: I think it was mostly information about, like, family trees. 507 00:26:06,640 --> 00:26:09,439 Speaker 4: I don't think they got people's, like, DNA samples, but 508 00:26:09,520 --> 00:26:12,680 Speaker 4: still, it was enough to really, really spook people. And, 509 00:26:12,880 --> 00:26:14,960 Speaker 4: you know, at the core of it, I think 510 00:26:15,000 --> 00:26:18,200 Speaker 4: people are, and even back then were, pretty nervous about 511 00:26:18,240 --> 00:26:20,880 Speaker 4: the idea of spitting into a vial, and any breach 512 00:26:20,920 --> 00:26:23,239 Speaker 4: of that trust by twenty three and Me was 513 00:26:23,359 --> 00:26:24,840 Speaker 4: just killer. 514 00:26:30,400 --> 00:26:32,960 Speaker 1: After the break, we're going to get Cara to finally 515 00:26:32,960 --> 00:26:47,680 Speaker 1: delete her account. Stay with us. So, Geoffrey, what happens 516 00:26:47,680 --> 00:26:49,919 Speaker 1: if you don't delete your data from twenty three and Me? Like, 517 00:26:49,920 --> 00:26:50,919 Speaker 1: what would happen then? 518 00:26:51,040 --> 00:26:53,320 Speaker 4: If you don't delete your data from twenty three and 519 00:26:53,400 --> 00:26:57,359 Speaker 4: Me, it is now essentially up for sale. So the 520 00:26:57,440 --> 00:27:01,520 Speaker 4: company has said that it is in bankruptcy court, which 521 00:27:01,520 --> 00:27:03,480 Speaker 4: means it has to either find a buyer 522 00:27:03,480 --> 00:27:05,920 Speaker 4: for the whole company or sell it off for parts. 523 00:27:06,080 --> 00:27:08,840 Speaker 4: And the most valuable asset they have is the DNA 524 00:27:09,000 --> 00:27:12,800 Speaker 4: data of, ostensibly, still millions of people. Who is going 525 00:27:12,840 --> 00:27:15,879 Speaker 4: to buy that? We don't know. There's lots of speculation, 526 00:27:16,040 --> 00:27:17,320 Speaker 4: maybe insurance companies. 527 00:27:16,880 --> 00:27:22,040 Speaker 3: Insurance? We have, well, we have some laws, some 528 00:27:22,160 --> 00:27:25,600 Speaker 3: laws in the US that protect 529 00:27:25,680 --> 00:27:27,240 Speaker 3: people from 530 00:27:27,000 --> 00:27:30,439 Speaker 4: having their data, their genetic data, used to keep 531 00:27:30,480 --> 00:27:34,000 Speaker 4: them from getting things like healthcare coverage.
But it's not 532 00:27:34,080 --> 00:27:36,440 Speaker 4: an airtight law, and it doesn't apply to other kinds 533 00:27:36,480 --> 00:27:39,040 Speaker 4: of insurance. And again, it can also be bought by 534 00:27:39,040 --> 00:27:41,600 Speaker 4: somebody who, we have no idea what they're going to 535 00:27:41,680 --> 00:27:43,520 Speaker 4: do with it, and they may not try to do 536 00:27:43,520 --> 00:27:47,080 Speaker 4: anything with it for, like, a long time. Because that's 537 00:27:47,119 --> 00:27:49,000 Speaker 4: the thing about data. I think that is the key 538 00:27:49,440 --> 00:27:52,119 Speaker 4: takeaway lesson for everybody from this: you 539 00:27:52,119 --> 00:27:53,879 Speaker 4: can think you know what's going to happen to it 540 00:27:53,920 --> 00:27:56,480 Speaker 4: at one time, and then in the future it has 541 00:27:56,480 --> 00:27:58,840 Speaker 4: a totally different use. And that applies to our genetic data, 542 00:27:58,880 --> 00:28:00,560 Speaker 4: but all kinds of things about our lives. 543 00:28:01,080 --> 00:28:03,440 Speaker 1: Can anybody delete their twenty three and Me data? Because I 544 00:28:03,440 --> 00:28:05,520 Speaker 1: think I read in the piece there's a California law 545 00:28:05,520 --> 00:28:09,080 Speaker 1: that makes this particularly easy to do, or at least 546 00:28:09,080 --> 00:28:11,720 Speaker 1: possible to do. But what about for other people elsewhere 547 00:28:11,720 --> 00:28:13,160 Speaker 1: in the US, or elsewhere in the world? 548 00:28:13,480 --> 00:28:15,879 Speaker 4: I'm glad you asked this question. So we've been pretty 549 00:28:15,960 --> 00:28:19,040 Speaker 4: sort of gloomy so far in this conversation. But there's 550 00:28:19,080 --> 00:28:21,439 Speaker 4: a glimmer of good news here, so let's talk about that. 551 00:28:21,440 --> 00:28:21,840 Speaker 1: I love it. 552 00:28:23,119 --> 00:28:29,320 Speaker 4: Starting back in twenty eighteen, California passed a law that said: you, 553 00:28:29,760 --> 00:28:34,159 Speaker 4: dear consumer, user, citizen, have some privacy rights, and among 554 00:28:34,200 --> 00:28:37,080 Speaker 4: those is the right to delete data that 555 00:28:37,119 --> 00:28:40,080 Speaker 4: companies collect about you. And then a couple years later, 556 00:28:40,120 --> 00:28:43,480 Speaker 4: actually during COVID, California passed another law that everybody was 557 00:28:43,520 --> 00:28:46,560 Speaker 4: so distracted by COVID that we forgot it even existed, which 558 00:28:46,600 --> 00:28:49,800 Speaker 4: is a genetic information privacy law. It goes even deeper 559 00:28:49,960 --> 00:28:52,200 Speaker 4: and says you not only have the right to delete 560 00:28:52,640 --> 00:28:55,640 Speaker 4: your data from a company, you also have the 561 00:28:55,640 --> 00:28:57,880 Speaker 4: special right to delete genetic data. You have 562 00:28:57,920 --> 00:29:00,240 Speaker 4: the right to tell them to destroy your sample, and 563 00:29:00,360 --> 00:29:02,040 Speaker 4: you have the right to tell them, withdraw me from 564 00:29:02,080 --> 00:29:06,240 Speaker 4: any research that's going on. So hooray California for this law.
565 00:29:06,560 --> 00:29:09,720 Speaker 4: Other states, seeing that the federal government in the US 566 00:29:10,080 --> 00:29:13,280 Speaker 4: was doing absolutely nothing to protect our data privacy rights, 567 00:29:13,440 --> 00:29:15,600 Speaker 4: copied it, and I am now happy to say there 568 00:29:15,600 --> 00:29:19,200 Speaker 4: are about twenty states that have versions of this law that 569 00:29:19,440 --> 00:29:22,120 Speaker 4: require giving people the right to delete their data. I 570 00:29:22,120 --> 00:29:24,320 Speaker 4: have a little map that I keep, that gets updated, 571 00:29:24,360 --> 00:29:26,520 Speaker 4: which is like my little sign of hope in the 572 00:29:26,600 --> 00:29:29,440 Speaker 4: dark, dark days, that regulation can work and we can 573 00:29:29,480 --> 00:29:32,120 Speaker 4: have some help. So the reality is the majority of 574 00:29:32,160 --> 00:29:35,000 Speaker 4: Americans are now covered by these state laws, and so companies, 575 00:29:35,040 --> 00:29:37,920 Speaker 4: including twenty three and Me, basically treat all Americans the 576 00:29:37,960 --> 00:29:40,920 Speaker 4: same way now, which is to say, yes, we will 577 00:29:40,960 --> 00:29:43,080 Speaker 4: delete your data. Now, a lot of people have been 578 00:29:43,080 --> 00:29:44,680 Speaker 4: writing to me saying, like, well, how do we know 579 00:29:44,680 --> 00:29:48,040 Speaker 4: they're really deleting it? It boils down to, like, if 580 00:29:48,080 --> 00:29:50,479 Speaker 4: they don't, they could get into big trouble. 581 00:29:50,680 --> 00:29:53,080 Speaker 1: But who's they? I mean, is this the receivers, the creditors? 582 00:29:53,080 --> 00:29:56,040 Speaker 1: I mean, in a bankrupt company, what are the stakes? I mean, 583 00:29:56,080 --> 00:29:58,440 Speaker 1: who do you go after in the event they don't 584 00:29:58,640 --> 00:30:00,800 Speaker 1: take this duty seriously through the bankruptcy process? 585 00:30:01,360 --> 00:30:03,880 Speaker 4: It's true that, like, they have very little left to lose, 586 00:30:04,040 --> 00:30:06,520 Speaker 4: you might argue, but there is a spotlight on them, 587 00:30:06,520 --> 00:30:09,400 Speaker 4: because they're going through this process, right? And so there's 588 00:30:09,440 --> 00:30:12,040 Speaker 4: going to be a judge involved. And to be fair 589 00:30:12,080 --> 00:30:15,240 Speaker 4: to twenty three and Me, they have already said, look, 590 00:30:15,280 --> 00:30:17,840 Speaker 4: we are going to handle your data the same way 591 00:30:17,880 --> 00:30:21,360 Speaker 4: we always have throughout this process, and we're going to 592 00:30:21,400 --> 00:30:23,000 Speaker 4: try to look for a buyer that will uphold the 593 00:30:23,080 --> 00:30:26,760 Speaker 4: same values. But the truth of it is, because America 594 00:30:26,840 --> 00:30:29,960 Speaker 4: has no real laws that cover this kind of data, 595 00:30:30,400 --> 00:30:32,320 Speaker 4: all that the new buyer would really have to do is 596 00:30:32,480 --> 00:30:34,960 Speaker 4: update a privacy 597 00:30:35,000 --> 00:30:38,000 Speaker 4: policy and give you notice of that, and then 598 00:30:38,360 --> 00:30:40,560 Speaker 4: again give you that chance to delete it if 599 00:30:40,560 --> 00:30:41,920 Speaker 4: you don't want to be a part of that. And 600 00:30:41,960 --> 00:30:44,160 Speaker 4: the truth is most Americans are not paying enough attention.
601 00:30:44,640 --> 00:30:46,800 Speaker 4: Even if, you know, even if I do another Washington 602 00:30:46,840 --> 00:30:50,600 Speaker 4: Post headline, but this time in all capital letters, you 603 00:30:50,640 --> 00:30:52,959 Speaker 4: know, a lot of people wouldn't pay attention to that. 604 00:30:53,720 --> 00:30:56,480 Speaker 1: So you haven't started a genetic bank run. We don't 605 00:30:56,480 --> 00:30:58,320 Speaker 1: know how many are actually trying to delete their data, 606 00:30:58,360 --> 00:31:01,360 Speaker 1: certainly enough to crash the website, but not a majority, as 607 00:31:01,360 --> 00:31:01,800 Speaker 1: far as we know. 608 00:31:02,520 --> 00:31:06,840 Speaker 4: I asked twenty three and me yesterday and they wouldn't say. 609 00:31:05,960 --> 00:31:12,600 Speaker 2: You've convinced an undisclosed number of people to delete twenty 610 00:31:12,600 --> 00:31:16,280 Speaker 2: three and me. I'm gonna be honest, Jeffrey. I read 611 00:31:16,280 --> 00:31:23,160 Speaker 2: your article. I didn't do anything, and it's because I'm lazy. 612 00:31:23,200 --> 00:31:25,560 Speaker 2: I mean, there's no good reason. Like, you 613 00:31:25,600 --> 00:31:29,959 Speaker 2: wrote an incredibly compelling article. I went on Instagram instead. 614 00:31:30,160 --> 00:31:31,520 Speaker 2: Like, there's no good reason. 615 00:31:31,560 --> 00:31:32,560 Speaker 1: But how do you actually do it? 616 00:31:32,960 --> 00:31:34,680 Speaker 4: Okay, you want to do it together? Right now? 617 00:31:35,240 --> 00:31:38,480 Speaker 2: Yeah? Yeah, hold on one second. Can we get your phone? 618 00:31:38,600 --> 00:31:38,800 Speaker 1: Yeah? 619 00:31:38,800 --> 00:31:41,880 Speaker 2: Okay, so I'm opening, of course, I'm, like, I'm opening 620 00:31:41,880 --> 00:31:44,800 Speaker 2: it with my face. So tell me how to delete. 621 00:31:44,840 --> 00:31:45,640 Speaker 2: I'm now in the app. 622 00:31:45,640 --> 00:31:48,680 Speaker 4: It says, hey, Kara. So go up to, I think 623 00:31:48,680 --> 00:31:50,200 Speaker 4: it's on the upper right corner. 624 00:31:50,320 --> 00:31:52,480 Speaker 2: Okay, so I'm there. God, I'm gonna miss my cousins. 625 00:31:52,960 --> 00:31:53,280 Speaker 2: Keep on. 626 00:31:53,760 --> 00:31:57,440 Speaker 4: You can look for settings once you tap on your 627 00:31:57,480 --> 00:31:58,600 Speaker 4: little profile. 628 00:31:58,400 --> 00:32:00,560 Speaker 2: It's asking me for updated information. 629 00:32:01,080 --> 00:32:03,320 Speaker 4: Okay. Once you're in settings, keep scrolling all the way 630 00:32:03,320 --> 00:32:06,040 Speaker 4: down, and towards the very end you'll see an area 631 00:32:06,120 --> 00:32:07,080 Speaker 4: called twenty 632 00:32:06,800 --> 00:32:07,960 Speaker 2: three and me data. Got it? 633 00:32:08,000 --> 00:32:08,280 Speaker 1: Got it? 634 00:32:08,360 --> 00:32:08,560 Speaker 3: Yep? 635 00:32:09,080 --> 00:32:10,800 Speaker 4: Okay, then click view. 636 00:32:11,080 --> 00:32:14,600 Speaker 2: Well, there's an access your data or a delete your data option. 637 00:32:14,600 --> 00:32:17,160 Speaker 4: Oh, you want to delete? 638 00:32:18,440 --> 00:32:20,760 Speaker 1: We were like, shook, we were already at a fork in the 639 00:32:20,800 --> 00:32:22,600 Speaker 1: road there. Jeffrey, thank you. 640 00:32:22,720 --> 00:32:23,480 Speaker 2: I wasn't sure.
641 00:32:23,840 --> 00:32:27,000 Speaker 4: It will give you a chance to download some of 642 00:32:27,040 --> 00:32:31,800 Speaker 4: the data, including everything from their report about your health 643 00:32:31,840 --> 00:32:33,000 Speaker 4: and your family. 644 00:32:32,760 --> 00:32:34,440 Speaker 2: And report summary. Here we go. 645 00:32:35,120 --> 00:32:37,479 Speaker 4: And then after you get that stuff, okay, you're going 646 00:32:37,520 --> 00:32:39,600 Speaker 4: to scroll down, you're going to click delete. Now, a 647 00:32:39,640 --> 00:32:42,080 Speaker 4: couple of things to mention about this while you're clicking 648 00:32:42,120 --> 00:32:45,440 Speaker 4: on things. When you ask to delete, they also do 649 00:32:45,560 --> 00:32:50,320 Speaker 4: two more things. One, they also delete your specimen, if 650 00:32:50,360 --> 00:32:52,520 Speaker 4: you'd left them the physical vial that you sent them 651 00:32:52,520 --> 00:32:56,000 Speaker 4: in the mail. And two, they withdraw you from any 652 00:32:56,120 --> 00:32:57,920 Speaker 4: health studies that you might have opted into. 653 00:32:58,280 --> 00:33:00,000 Speaker 2: I'm sorry, I just want to share one thing, Jeffrey, 654 00:33:00,080 --> 00:33:02,479 Speaker 2: and this has nothing to do with our podcast. You 655 00:33:02,520 --> 00:33:05,520 Speaker 2: and I, Jeffrey, are both likely to get wet earwax. 656 00:33:06,080 --> 00:33:07,400 Speaker 4: Yeah, let me tell you, if you ever live in 657 00:33:07,400 --> 00:33:10,000 Speaker 4: a really moist place, your earwax is going to 658 00:33:10,040 --> 00:33:13,840 Speaker 4: sneak up on you. 659 00:33:14,160 --> 00:33:14,760 Speaker 1: I'm deleting. 660 00:33:14,880 --> 00:33:18,760 Speaker 2: Here we go, permanently delete. Oh gosh, here we go. 661 00:33:19,120 --> 00:33:21,360 Speaker 2: I'm like, in this house, we delete twenty three and 662 00:33:21,440 --> 00:33:25,400 Speaker 2: me. And then it says profile data, so delete data. 663 00:33:26,880 --> 00:33:28,840 Speaker 2: So here what it says is: we have received your 664 00:33:28,880 --> 00:33:31,480 Speaker 2: request to delete your data and have sent an email 665 00:33:31,520 --> 00:33:32,959 Speaker 2: to the email address linked to your twenty three 666 00:33:33,000 --> 00:33:35,120 Speaker 2: and me account. Please locate this email and follow the instructions to 667 00:33:35,120 --> 00:33:36,080 Speaker 2: confirm your request. 668 00:33:36,560 --> 00:33:38,640 Speaker 4: That's right. So you go into your email app, and 669 00:33:38,640 --> 00:33:41,920 Speaker 4: there's an email in there, and then you press do it. 670 00:33:41,760 --> 00:33:44,960 Speaker 2: Do it. Permanently delete all records. Permanent. 671 00:33:45,680 --> 00:33:49,680 Speaker 4: Do remember, should you decide that you really still want 672 00:33:49,720 --> 00:33:55,160 Speaker 4: to know about such things, you could swab again. The 673 00:33:55,200 --> 00:33:57,880 Speaker 4: one thing about your DNA is it's not changing. And 674 00:33:57,920 --> 00:33:59,760 Speaker 4: that's, in some cases, I guess, a good thing, and 675 00:34:00,200 --> 00:34:01,440 Speaker 4: sometimes a bad thing. 676 00:34:01,680 --> 00:34:04,800 Speaker 1: Yeah, Jeffrey, it just decays. I have two questions for you.
677 00:34:05,520 --> 00:34:08,800 Speaker 1: The first is: how often, as the technology columnist for 678 00:34:08,840 --> 00:34:11,279 Speaker 1: the Washington Post, do you have to talk people through 679 00:34:11,440 --> 00:34:13,600 Speaker 1: how to do banal technology tasks? 680 00:34:13,800 --> 00:34:16,959 Speaker 4: We actually have a special thing on our website where 681 00:34:16,960 --> 00:34:18,839 Speaker 4: it's like, this is the box for us to give 682 00:34:18,880 --> 00:34:20,799 Speaker 4: people instructions for things. So in fact, if you pull 683 00:34:20,880 --> 00:34:23,080 Speaker 4: up my column on this and you scroll down a 684 00:34:23,120 --> 00:34:25,120 Speaker 4: little bit, I've got one of those boxes that I 685 00:34:25,160 --> 00:34:26,920 Speaker 4: was literally reading off of while we were going through 686 00:34:26,920 --> 00:34:30,279 Speaker 4: this process together. And that's fine. Look, this stuff is 687 00:34:30,320 --> 00:34:34,400 Speaker 4: hard. Yep, these companies can act in evil ways, and 688 00:34:34,440 --> 00:34:37,719 Speaker 4: it's not your fault. So the whole premise of what 689 00:34:37,800 --> 00:34:39,600 Speaker 4: I try to do as a tech columnist at the 690 00:34:39,719 --> 00:34:42,280 Speaker 4: Washington Post is be on the side of the user 691 00:34:42,520 --> 00:34:45,640 Speaker 4: and, like, help you fight back when you can, right? 692 00:34:45,800 --> 00:34:48,320 Speaker 4: And, like, I think that's super important. 693 00:34:49,040 --> 00:34:51,680 Speaker 1: And the second is: what do you want people to take 694 00:34:51,680 --> 00:34:53,160 Speaker 1: away from the twenty three and me story? 695 00:34:54,000 --> 00:34:57,600 Speaker 4: One, perhaps we shouldn't have all sent our DNA to 696 00:34:58,920 --> 00:35:04,320 Speaker 4: a Silicon Valley corporation that has Silicon Valley corporation values and 697 00:35:05,239 --> 00:35:08,600 Speaker 4: ways of doing business. And the bigger one is that, 698 00:35:09,120 --> 00:35:11,799 Speaker 4: like we've been talking about here, it's really hard 699 00:35:11,840 --> 00:35:16,000 Speaker 4: to know in any given moment what your data could 700 00:35:16,040 --> 00:35:18,480 Speaker 4: be useful for at some point in the future. And 701 00:35:18,560 --> 00:35:23,120 Speaker 4: so the only really reasonable thing to do to protect 702 00:35:23,160 --> 00:35:26,480 Speaker 4: yourself is to allow as little of it as possible 703 00:35:26,520 --> 00:35:29,880 Speaker 4: to be collected, which sounds like an insane-person thing 704 00:35:29,960 --> 00:35:32,759 Speaker 4: to say in our modern economy, when we're literally being 705 00:35:32,840 --> 00:35:37,040 Speaker 4: watched in every single potential dimension. I once did a 706 00:35:37,080 --> 00:35:40,400 Speaker 4: piece for the Post where I hacked into my iPhone 707 00:35:40,440 --> 00:35:42,279 Speaker 4: to watch what it did while I was sleeping at night. 708 00:35:42,600 --> 00:35:45,520 Speaker 4: Oh wow, and saw it sending data out to, like, 709 00:35:46,080 --> 00:35:48,279 Speaker 4: just, like, hundreds of data brokers and all these 710 00:35:48,440 --> 00:35:50,319 Speaker 4: sorts of things. So it was terrifying. I wasn't 711 00:35:50,320 --> 00:35:52,640 Speaker 4: even using the phone, but it was, you know, it 712 00:35:52,680 --> 00:35:55,560 Speaker 4: was communicating all this personal information about me. So how 713 00:35:55,560 --> 00:35:57,520 Speaker 4: do we deal with that fact?
This runs a little 714 00:35:57,520 --> 00:35:59,759 Speaker 4: bit in tension with, like, I love technology. I want to 715 00:35:59,840 --> 00:36:04,359 Speaker 4: use a cool phone, and I have seventy five 716 00:36:04,400 --> 00:36:07,560 Speaker 4: connected gadgets in my home that make all sorts of 717 00:36:07,560 --> 00:36:09,719 Speaker 4: cool things happen. I think the answer is you just 718 00:36:09,760 --> 00:36:14,280 Speaker 4: have to be vigilant, and you need the help of regulations. 719 00:36:14,640 --> 00:36:18,120 Speaker 4: And so that's, you know, regulations that have our interests 720 00:36:18,160 --> 00:36:20,279 Speaker 4: at heart, to sort of put boundaries around what these 721 00:36:20,280 --> 00:36:20,880 Speaker 4: companies can do. 722 00:36:20,960 --> 00:36:22,720 Speaker 3: And that's why I'm 723 00:36:22,080 --> 00:36:24,760 Speaker 4: here to sing the praises of the California privacy protection 724 00:36:25,080 --> 00:36:28,360 Speaker 4: law, and hopeful that we can maybe get some 725 00:36:28,440 --> 00:36:30,360 Speaker 4: more. 726 00:36:30,160 --> 00:36:32,000 Speaker 2: Jeffrey, thank you. Thank you so much, Jeffrey, and thank you for 727 00:36:32,040 --> 00:36:34,359 Speaker 2: helping me do something that I should have done many 728 00:36:34,440 --> 00:36:36,000 Speaker 2: days ago. You bet. 729 00:36:41,120 --> 00:36:43,200 Speaker 1: That's it for this week for Tech Stuff. I'm Oz 730 00:36:43,239 --> 00:36:44,760 Speaker 1: Vlosian, and I'm Cara Price. 731 00:36:45,239 --> 00:36:48,360 Speaker 2: This episode was produced by Eliza Dennis and Victoria Dominguez. 732 00:36:48,680 --> 00:36:51,719 Speaker 2: It was executive produced by me, Oz Vlosian, and Kate 733 00:36:51,760 --> 00:36:55,560 Speaker 2: Osborne for Kaleidoscope, and Katrina Norvel for iHeart Podcasts. The 734 00:36:55,640 --> 00:36:58,920 Speaker 2: engineer is Bihied Fraser, and Kyle Murdoch mixed this episode, 735 00:36:59,080 --> 00:36:59,920 Speaker 2: and he also wrote our 736 00:36:59,840 --> 00:37:03,439 Speaker 1: theme. Join us next Wednesday for a very special edition 737 00:37:03,480 --> 00:37:05,920 Speaker 1: of Tech Stuff: The Story, when we'll share an in-depth 738 00:37:05,960 --> 00:37:09,840 Speaker 1: conversation with Zach Brown, the CEO of McLaren Racing, 739 00:37:10,280 --> 00:37:14,040 Speaker 1: from the McLaren Technology Center. So, F1 fans, tune 740 00:37:14,120 --> 00:37:17,040 Speaker 1: in, and please rate, review, and reach out to us 741 00:37:17,080 --> 00:37:19,799 Speaker 1: at tech Stuff podcast at gmail dot com. We really 742 00:37:19,800 --> 00:37:20,480 Speaker 1: want to hear from you.