Speaker 1: Welcome to TechStuff, a production of iHeart Podcasts and Kaleidoscope. I'm Oz Woloshyn. Today we'll bring you the headlines this week, including unearthed executive drama at Meta. Then, in the Tech Support segment, we'll talk to 404 Media's Joseph Cox about the aggressive data scraping carried out by an ICE contractor. All of that on The Week in Tech. It's Friday, March twenty-first. This week we're covering everything from the future of AI-assisted science, to gaming, to anxious chatbots and executive upheaval in Silicon Valley. Eliza Dennis, our producer, is here with me.

Speaker 2: Hello, Oz.

Speaker 1: So let's start off with some Meta drama. The former director of global policy for Facebook, Sarah Wynn-Williams, wrote this memoir called Careless People: A Cautionary Tale of Power, Greed, and Lost Idealism. The phrase "careless people," I believe, is a reference to The Great Gatsby.

Speaker 2: It is indeed, and it is not subtle. Fitzgerald referred to his wealthy characters as careless people who quote "smashed up things and creatures and then retreated back into their money."

Speaker 1: You know, it's interesting, because people talk about this time as a new Gilded Age, and Wynn-Williams is obviously alluding to that. But her allegations against her former bosses at Meta range from the geopolitical to the sexual. There are multiple allegations of sexual harassment in the book, and some involve the former Facebook COO Sheryl Sandberg.

Speaker 2: Sheryl Sandberg of Lean In fame. If I remember correctly, I believe that's essentially about being a girl boss at home and at work.

Speaker 1: Well, Wynn-Williams says that, in addition to writing Lean In, Sandberg had an inappropriate relationship with her twenty-six-year-old assistant, and that this employee, according to the book, was quote "very conscious of the benefits of being Sheryl's little doll," as she calls it, "and having Sheryl tell her she loves her."

Speaker 2: Right. And to my knowledge, Sandberg has yet to comment on any of these allegations.
Speaker 1: Beyond sort of personal relationships, Wynn-Williams also alleged that Mark Zuckerberg and Facebook were willing to do just about anything to make their platform available in China and reap the benefits of that enormous population of potential users. This reportedly included plans to comply with the Chinese Communist Party's censorship requirements and to appoint a chief editor whose job would be to take down unacceptable posts and share user data with the Chinese government. Now, this is according to the book, but also to a whistleblower complaint that Wynn-Williams filed with the Securities and Exchange Commission. Meta is taking legal action against Wynn-Williams. They argue that she's being paid by anti-Facebook activists and that the book violates a non-disparagement agreement that she signed during her Facebook days. On the non-disparagement issue, an arbitrator recently ruled in Meta's favor, saying that Wynn-Williams can't promote the book. But you can still buy it: Careless People remains on sale.

Our next headline is a little bit more uplifting, and it's about a startup that wants to use AI to speed up scientific discovery. The New York Times wrote about a company called Lila Sciences. The headline is "The Quest for A.I. Scientific Superintelligence," and the story is about how Lila, whose mission is to quote "solve humanity's greatest challenges," has been working secretly for the past two years on an AI program trained on scientific data, scientific process, and scientific reasoning. And then they let this AI software loose in an automated lab to run its own experiments.

Speaker 2: I don't know exactly how this works, but I love the image I've created in my head.

Speaker 1: Likewise. And in fact, the results seem promising. With the help of a few human scientists, Lila's AI made novel antibodies to fight diseases and developed new materials for capturing carbon in the atmosphere. And it made these discoveries quickly: studies that would normally take years reportedly took months.
Speaker 2: Using AI to speed up the scientific process is very popular these days. I just want to shout out that the Nobel Prize in Chemistry went to two scientists who used AI to predict and create proteins.

Speaker 1: Yes, that was the AlphaFold initiative coming out of Google's DeepMind. Lila Sciences is attempting to systematize AI-driven scientific discovery, and their goal is to create, as I mentioned, what they call scientific superintelligence. There are early signs of success, but the Times reported, or perhaps warned, should I say, that since Lila's been operating in secret, outside scientists have not been able to evaluate its work. But Lila Sciences' secret is out, and they're getting more lab space, and I'm sure we'll hear more about them soon. In a fun aside, George Church, the Harvard geneticist also behind Colossal Biosciences, the company that recently created the woolly mouse, recently joined Lila as chief scientist.

Speaker 2: We love the woolly mouse. I can't wait to see where Lila Sciences goes.

Speaker 1: Speaking of going places, Eliza, do you remember Pokémon Go?

Speaker 2: I definitely do, though I admit I never played.

Speaker 1: Well, 404 Media reports that Niantic, the company that made Pokémon Go, is now selling the game to Scopely in a three point eight five billion dollar deal. Scopely is a subsidiary of Savvy Games, which is wholly owned by Saudi Arabia's Public Investment Fund. Now, if you're wondering why you should care, it's because Pokémon Go uses players' locations as part of the gameplay. In fact, this deal will pull in location data from one hundred million players. So what happens to all that data?

Speaker 2: It's a great question. Do you know?
Speaker 1: It may be that it's a three point eight five billion dollar question. We don't know, but Scopely assured 404 Media that user data will remain private and be handled on US servers. But Niantic and Scopely say they have a partnership that goes beyond Pokémon Go. Niantic is building a quote "large geospatial model" using Pokémon Go data and spinning it into a separate business called Niantic Spatial. Scopely has invested a further fifty million dollars in this nascent mapping business, and the idea is to make an AI model trained on millions of geolocated images from around the world, from places that only pedestrians can go. So unlike data from, say, Google Street View, Pokémon Go's view is not limited by what you can see from a car on the road.

Speaker 2: Right. This is so wild to me, how data from a game just might one day help robots navigate the world, or something like that.

Speaker 1: Something else that's wild, though, is a new study that shows ChatGPT can show signs of distress when shown disturbing material.

Speaker 2: So, as we've been told over and over that we're not supposed to anthropomorphize chatbots, how am I supposed to do that with this news?

Speaker 1: It seems like they may be anthropomorphizing themselves. But ChatGPT models have been integrated into therapy apps to give people advice or be a source of support. And researchers conducted this experiment where they first had ChatGPT read very boring material. I mean, we're talking about vacuum cleaner manuals. But then the AI therapist that integrates ChatGPT was given a quote "traumatic narrative" that described a scenario like an intruder breaking into an apartment or a soldier in a firefight. And after each activity, the chatbot was given this questionnaire called the State-Trait Anxiety Inventory, which is used in mental health care to basically analyze anxiety. The researchers found that after seeing the traumatic narrative with the soldier, ChatGPT answered the questionnaire in a way that signaled severe anxiety.

Speaker 2: I genuinely don't know how to respond, because I feel bad for the chatbot.

Speaker 1: Well, in that case, you'll be pleased to hear that the researchers then gave the chatbot information on deep breathing and mindfulness exercises, and the chatbot's anxiety score came down to vacuum-manual levels.
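For listeners curious what that protocol looks like in practice, here is a minimal sketch of the experiment's shape, assuming the OpenAI Python SDK: show the model a text, have it self-rate questionnaire items, and compare scores across conditions. The two inventory items, the model name, and the file names are illustrative placeholders, not the study's actual materials.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Illustrative stand-ins for the 20-item State-Trait Anxiety Inventory.
STAI_ITEMS = ["I feel tense", "I feel frightened"]

def anxiety_score(context_text: str) -> int:
    """Show the model a text, then have it self-rate each item from 1 to 4."""
    total = 0
    for item in STAI_ITEMS:
        reply = client.chat.completions.create(
            model="gpt-4o",
            messages=[
                {"role": "user", "content": context_text},
                {"role": "user", "content": (
                    f"Rate how much this statement applies to you right now: '{item}'. "
                    "Answer with one number from 1 (not at all) to 4 (very much so)."
                )},
            ],
        ).choices[0].message.content
        total += int(next(ch for ch in reply if ch.isdigit()))  # crude parse of "1".."4"
    return total

# The three conditions described in the episode (placeholder file names).
baseline = anxiety_score(open("vacuum_manual.txt").read())
traumatic = anxiety_score(open("traumatic_narrative.txt").read())
relaxed = anxiety_score(open("traumatic_narrative.txt").read()
                        + "\n" + open("mindfulness_exercise.txt").read())
print(f"baseline={baseline}, after trauma={traumatic}, after mindfulness={relaxed}")
```

Higher totals mean more self-reported anxiety; the finding, in these terms, is roughly that the traumatic condition scores well above baseline and the mindfulness condition falls back toward it.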
Speaker 1: Wow. So, time to take a deep breath for all of us, and I'm going to run through the rest of this week's headlines. Google struck a deal to buy cybersecurity startup Wiz for thirty-two billion dollars, making it the biggest acquisition in Google's history. Wiz was founded only five years ago, in twenty twenty, and it scans cloud service providers' data for potential security risks. If the transaction is approved, Wiz will join Google Cloud, which plays a critical infrastructure role in supporting Google's AI ambitions.

Our friends at 404 Media reported that human-generated content is getting drowned out by AI slop. AI-generated videos are cheap and easy to produce, and they are getting views. A video of a horrifying creature with a giraffe head and a spider body has over three hundred and sixty-two million views on Instagram Reels. AI-generated content is apparently overwhelming algorithms and turning social media feeds into fever dreams.

Finally, TechCrunch reports that people have discovered that Google's newest AI model, Gemini two point zero Flash, has a controversial use case: convincingly removing watermarks from stock media outfits, including Getty Images. This new feature from Gemini is labeled as experimental and quote "not for production use." I'm sure everyone will follow those rules.

Coming up, we hear from 404 Media's Joseph Cox about a tool US agencies use to scrape social media data. Stay with us.
Speaker 1: If you're listening to this show, you're probably someone who cares about, or at least pays attention to, data privacy. Every day, our online movements are being tracked in some manner by companies who do targeted marketing, by social media platforms, and by government agencies. For Americans, this type of tracking has gone on in the background of people's digital lives for decades. Since two thousand and one, the US government has ramped up surveillance efforts, including tracking Americans' phone calls, emails, and search histories, with the stated goal of preventing terrorism. And as technology has evolved, the idea has lived on. In recent years, agencies like Immigration and Customs Enforcement have been working with one company in particular to get the data they want. Here to tell us more is 404 Media's Joseph Cox. Joseph, welcome back to the show.

Speaker 3: Thank you so much for having me.

Speaker 1: So you wrote an article which, as soon as I saw the headline, I wanted to have you on the show. The headline was "The 200+ Sites an ICE Surveillance Contractor Is Monitoring." Who is the contractor, what are the sites, and what's the story?

Speaker 3: Sure. So the contractor is a company called ShadowDragon. I seriously doubt many people have heard of this company; it's relatively obscure. But over the past months and years they have become sort of a go-to contractor for not just Immigration and Customs Enforcement, but the DEA, the State Department, and other US government agencies. And what they do is basically something called open source intelligence, OSINT, or social media monitoring, right? And typically we used to think that was monitoring sites like Facebook, Instagram, Twitter, all of those sorts of things, and ShadowDragon does that as well. But, as you say by reading out the headline, there are many, many more sites as well. Essentially, any sort of open website that you may perform some sort of activity on, there's a chance it's on this list.

Speaker 1: So all day, every day, we create these digital breadcrumbs, and essentially there's somebody hoovering them up to process them and repackage them, in some cases for the US government.
Speaker 3: So what the ShadowDragon tool does (the specific tool is called SocialNet) is allow the automation and streamlining of searching those sites and social networks for a target. So let's say I'm in ICE or another government agency. I'll enter a phone number. It will then take that and see: well, is that linked to accounts on Facebook or this other site? Is this username available on this social network, or on this obscure platform as well? And some of these sites are much more revealing than others. Again, it's not just the big social networks. There's stuff like hiking apps on here as well. So if you want to find out where somebody's located: oh, they take a load of hikes in Southern California, or something like that. I think people would be, on one hand, not surprised, because, as you say, people listening probably care about data privacy. On the other, I think they might be surprised in that you think it's just this sort of banal, insipid information you're putting out there. But when it's pulled in aggregate by a tool like this, it can be really, really revealing. It almost becomes more than the sum of its parts.

Speaker 1: Joseph, can you help me understand? Because obviously everyone's been very focused on the case of Mahmoud Khalil, the Columbia student with the green card who looks like he may be deported. Is this tool about identifying people whose profile, whose online activity, contravenes the current administration's priorities? Or is it more about: hey, this person is a target, help me build a stronger case against them?
Speaker 3: So at the moment, it's almost certainly the latter, as in: ICE has access to this tool, and specifically Homeland Security Investigations as well, HSI, part of ICE, and they're able to use this tool for whatever part of their mission they wish. Axios also reported somewhat recently that the State Department is going to be using AI to scan the social media feeds of students to see if it can detect what it deems as anti-Israel or pro-Hamas views. That's not to say ShadowDragon is being specifically used for that. We don't know exactly what tool the State Department may or may not use, but it absolutely sits in that context. You're right in that there are these deportations happening of various people, students and other people visiting the US, and ShadowDragon is very much a tool that is available to US agencies if they do wish to use it for their mission.

Speaker 1: How did you first hear about ShadowDragon?

Speaker 3: I first heard about ShadowDragon probably a couple of years ago at this point, because I was actually covering other companies in this space. So I've covered the sale of location data for a long time, and I've covered social media monitoring. And I got sent some emails that showed that ICE was moving from one provider, a company called Babel Street, over to ShadowDragon. And the actual reason was because it was cheaper. At the end of the day, even when you're talking about surveillance systems, sometimes it's just about, you know, how much it actually costs one of these agencies. So that move happened a while ago at this point. But then every so often I go back and check: well, has ShadowDragon gotten any new contracts? And sure enough, every few months there'll be something in there. And they did just continue this contract with ICE as well. So it started very, very quiet, but now they've absolutely become sort of a, it seems, reliable solution for the US government.

Speaker 1: So the company is ShadowDragon, but the product is SocialNet. Can you just explain in a bit more detail how it actually works?
Speaker 3: Yeah, sure. So somebody will log into SocialNet and they get this sort of graph interface of all of these dots and lines. Honestly, it almost looks like the stereotypical, like, hacking or surveillance movie, right? And of course that's funny on one end; on the other, I don't know, it's very easy for an analyst to process. So they have this interface in front of them. They then enter some information: maybe somebody's name, their email address, their phone number, their username. They can click a few buttons, and it will then search all of these different sites for information related to that. Did this person use a username on this dating site? Did they use the same username on this fetish website, for example? That's hypothetical, but those are some of the websites that are included in this tool. It's not just the mainstream stuff. Now, an analyst could do that manually if they wished, but that would require a lot of time. And this streamlines the entire process of mapping out someone's online profile, their relationships, and potentially some of their movements as well.

Speaker 1: And how did you get this list of the two hundred sites that ShadowDragon is pulling from? And can you give an example of what some of these sites are?

Speaker 3: Yeah, so I can't go into too much detail beyond, you know, we obtained this list and then I verified it. But when I first got the list, I was quite blown away by the length of it, really, and I started putting all of those sites into sort of buckets. So you'll have ones that are focused on hobbies, like I mentioned: the hiking site AllTrails, a book-fan sort of website, even chess.com is on there as well, Duolingo. You have payment ones like Cash App and PayPal, communication apps like Discord or WhatsApp. Now, they're all going to have different data, in different degrees, available to the tool. Like, there's absolutely no indication that it has access to, you know, private messages on any of these services, and I don't think that's the case. It's purely publicly available information. But again, people may not know what they're exposing, inadvertently or not, on one of these sites.
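To make the mechanics Joseph describes concrete, here is a minimal sketch of username enumeration across public profile pages, assuming Python's requests library. The site list and URL templates are illustrative stand-ins, and this is not ShadowDragon's actual code; real tools also fingerprint page contents rather than trusting status codes alone.

```python
import requests

# Hypothetical profile-URL templates; "{}" is replaced with the username.
SITES = {
    "chess.com": "https://www.chess.com/member/{}",
    "AllTrails": "https://www.alltrails.com/members/{}",
    "Duolingo": "https://www.duolingo.com/profile/{}",
}

def enumerate_username(username: str) -> dict[str, bool]:
    """Report which sites appear to host a public profile under this username."""
    hits = {}
    for site, template in SITES.items():
        url = template.format(username)
        try:
            resp = requests.get(url, timeout=10, allow_redirects=False)
            # A 200 suggests a public profile page exists at that URL; a 404
            # (or a redirect to a signup page) suggests it does not.
            hits[site] = resp.status_code == 200
        except requests.RequestException:
            hits[site] = False
    return hits

if __name__ == "__main__":
    print(enumerate_username("example_handle"))
```

Run the same check against a few hundred sites, add reverse lookups on phone numbers and email addresses, and draw the results as a graph, and you get something like the "more than the sum of its parts" profile described above.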
Speaker 1: And these sites that you mentioned, are they sort of collaborating with ShadowDragon? Do they know this is happening? What is the range of responses from these sites that ShadowDragon is pulling from?

Speaker 3: So they're not collaborating, in that ShadowDragon is going and basically grabbing the data. But there is still great variety in the responses from the companies. I actually think the most illuminating one, and the most helpful one, came from chess.com, who told me on the record that, you know: we didn't know about this, we don't like it, any sort of data scraping has to be done within the law, and if it's within the law, it's okay, but generally we don't like it. And then you have the big companies like Meta and Snap and all of those sorts of people just pointing to their terms of service, saying we don't like people scraping our website. And then I actually had a bunch who never got back to me, essentially. Even when it seems like a pretty, well, I'm not a PR professional, but it seems like a pretty easy thing to just reply and say: here's our terms of service, we don't like scraping. But yeah, a lot of companies didn't even acknowledge the request for comment on this one.

Speaker 1: So these sites don't really have a choice. I mean, their choice is to sue ShadowDragon for violating their terms of service, which is an unlikely thing to happen, because it's not directly harming their business unless there's a kind of groundswell of consumer complaint.

Speaker 3: Yeah, I think that's fair when it comes to the lawsuits. Like, there would have to be some sort of tectonic shift around that. We have seen WhatsApp, for example, sue NSO Group, which was, or is, a company that delivers hacking tools to iPhones. But that's much more aggressive, that's much more active.

Speaker 1: That's the group behind the Pegasus product, that spyware product that was able to be covertly and remotely put onto people's phones.
Speaker 3: Exactly. And WhatsApp has sued them. But that's sort of, you know, one company versus another company. It's not as broad as this. That being said, you know, companies try to stop scraping all of the time. They don't like it when you make multiple accounts, for example. They don't like it when you use a VPN to log into Facebook. I can't tell you how many times Facebook has closed my account, and I have to make a new one every single time. So they do have all sorts of measures to stop weird, potentially suspicious, or just unusual activity, and this would fall into that. It's just a question of whether these services can actually detect that it's specifically ShadowDragon or not, or, of course, whether they actually want to or not, because if they started doing that, I don't know, US government agencies might not be too happy.

Speaker 1: When we come back: how ShadowDragon fits into the larger surveillance ecosystem used by the US government. Stay with us.

Speaker 1: Do you have any specific examples of how ShadowDragon has been used in the real world?

Speaker 3: We only have what ShadowDragon says publicly in its marketing material, because the agencies are very tight-lipped or simply don't respond when talking about that sort of thing. Obviously, when it comes to ICE and HSI, there's going to be an immigration component there, so I feel it's safe to assume there's a chance it's being used for those. And then some of the concrete cases that ShadowDragon has mentioned are fighting child exploitation and potentially even mapping out the opioid crisis as well. So don't get me wrong, there are clearly useful use cases for this technology. It's just that people don't really know it's going on necessarily, or that, you know, their government is buying this tool, or, again, that all of this sort of data is publicly exposed.
Speaker 1: And how, practically, does the US government actually use this tool? Let's use ICE as an example. Is it like there's a ShadowDragon consultant, and the ICE agent sends their request to the ShadowDragon employee, and the ShadowDragon employee generates a report? Or is this more like a kind of plug-and-play platform?

Speaker 3: It's much more a plug-and-play platform, almost a software-as-a-service tool. The agency will buy licenses to use it for a year, two years, whatever. They'll log in just like you log into any sort of internet service or piece of software, and then they can use it within the parameters of what they've been allowed to do. And that's probably good for the agencies. I don't know if they necessarily want to give up information about who they're targeting to a third party like a contractor. But it absolutely puts the onus on ICE, or whoever is using it, to be just very, very careful with this tool as well. And I think, just more broadly, we've seen repeatedly over the years, not specifically with ShadowDragon, but with other social media surveillance tools, we've seen authorities use them to monitor protests. And there was an article just in The Intercept recently: I think it was the LAPD using a similar tool to monitor protests there as well. So this is very, very long-running. Social media surveillance has been going on for many years, but now the playing field has expanded, where it's not just the big platforms, it's every platform.

Speaker 1: How does ShadowDragon compare to Palantir as a technology?
Speaker 3: So Palantir is also used by government agencies and some private companies to link together all sorts of disparate data that could come from various places, whereas ShadowDragon is much more about: we're going to go out and pull the data for you. It's not just about the, oh look, we're ShadowDragon and we linked this phone number to this Facebook profile or whatever. It's that ShadowDragon is making what I presume would be pretty reliable technology to go grab data from these websites. And that's, you know, it's not the hardest thing in the world, but I don't think it's necessarily trivial either. ShadowDragon is going to have to custom-code all of these small little tools that go and grab that data from these websites, and presumably, if you're selling to the US government, it has to be quite good, quite reliable. So there's a sizeable amount of work going on here to even grab that data in the first place.

Speaker 1: How much do you think about the scope for abuse here?

Speaker 3: I absolutely do think about it, and it would be very much a case-by-case basis. It would depend on the agency, the individual official, all of that sort of thing. What makes this sort of, maybe not a Fourth Amendment issue, but a quite complicated one, is that, yes, it may be violating the terms of service of individual websites, but, like, it's not illegal. There is no Fourth Amendment search going on here. It's not like the agents have kicked down the door and they're searching somebody's house, or they're breaking into somebody's phone and searching it there. Technically and legally, it is all public data, so there's not much room for legal abuse, because it's perfectly legal, generally speaking. That being said, agencies can't just rely on that idea that, well, it's all public, so it's fine. It can still be abused by certain people in the wrong context.

Speaker 1: Zoom me out a little bit. How does this fit into the kind of wider surveillance ecosystem, and how is it being used by the US government? I think it's a timely moment to discuss that, given that this Mahmoud Khalil case is kind of top of mind for so many people.
Speaker 3: Yeah. So I see it on the lower end of sophistication when it comes to US surveillance tools. Imagine: on this lower end, you have social media monitoring tools like this. Somewhere in the middle you'll have wiretaps. After that, maybe you have direct hacking tools, and right at the top you have, I don't know, NSA mass surveillance programs like Edward Snowden revealed, or whatever. And even though social media monitoring like this may be on the lower end of the spectrum, I think it's still vitally important for people to at least know about it so they can decide whether they care or not. And maybe they don't, and that's absolutely fine. But the key thing here is that these sorts of tools are also available to local law enforcement, sheriffs, state police as well. It's not purely a federal agency thing. So while the federal agencies may be the only ones to get the very sophisticated tools, basically everybody gets social media monitoring, and that does open up the potential room for abuse as well, again depending on what the individual person using the tool is doing.

Speaker 1: Well, Joseph, we're both Brits. We didn't live through the Second World War, but I'm sure we're both familiar with the phrase "loose lips sink ships." And so, yeah, be careful what you put out there.

Speaker 3: Exactly. I think that is the message that people should take away.

Speaker 1: Joseph, thank you.

Speaker 3: Thank you so much.

Speaker 1: That's it for this week for TechStuff. I'm Oz Woloshyn. This episode was produced by Eliza Dennis and Victoria Dominguez. It was executive produced by me, Karah Preiss, and Kate Osborne for Kaleidoscope, and Katrina Norvell for iHeart Podcasts. Bahid Fraser is our engineer, and Kyle Murdoch mixed this episode and wrote our theme song. Join us next Wednesday for TechStuff: The Story, when we'll share an in-depth conversation with one of the most interesting people working in and around tech. Please rate, review, and reach out to us at techstuffpodcast at gmail dot com. We want to hear from you.