Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? If you've been listening to Tech Stuff for a while, or if you keep up with tech news in general, the name Andy Stone might ring a bell. Now, unlike a lot of the other folks that I talk about on the show, you know, like geniuses like Ada Lovelace or Nikola Tesla, or company founders like Gordon Moore or Steve Wozniak, or even controversial corporate leaders like Elon Musk or Mark Zuckerberg, Andy Stone isn't that deeply rooted in the world of tech. In fact, until twenty fourteen, he spent his professional career working in government and campaign positions and stuff related to that. And now, according to reports, Russia's Interior Ministry has added Andy Stone's name to its wanted list. So I thought we could do a short episode about Stone, the job he's held since twenty fourteen, and what might have led him to be considered a criminal in Russia.
Speaker 1: Here in the States, there are people who certainly take issue with some of the things Stone has said, but I'm not sure anyone would go so far as to call him a criminal. Here in the US, Stone's forte is, at least presumably, in communications. He attended George Washington University, a prestigious college here in the US, and he studied political communication while he was there. That school happens to be located in Washington, D.C. There are a couple of other schools in that area where the students typically end up working with the government in some form or another. It's pretty common; a lot of jobs are directly or indirectly connected to the US government, or at least a government there. I know because my partner attended Georgetown, which is in D.C., and worked for the French embassy for a while. So again, it's a place where you make connections. Stone worked for presidential candidate John Kerry's campaign in two thousand and three, and in late two thousand and four he joined M and R Strategic Services as a consultant.
Speaker 1: Now, that's a company that primarily focuses on political communications, so not affiliated with any specific candidate or party, but there to act as a consultant for the purposes of creating political communications. So again, very much in line with what he had already been doing. He would then become a communications director in Congress in two thousand and seven, and you kind of see where this is going. The nature of politics here in America means that any role you hold in government, like if you're actually connected to a political party, is likely on borrowed time. Election cycles change up who's in charge on the regular. Now there are, of course, federal employees who typically hold positions from administration to administration, and they don't necessarily change each time there's a change in leadership, although there are efforts to make that very different. Anyway, that's getting into a whole different realm.
Speaker 1: But from two thousand and three to twenty fourteen, pretty much all of Andy Stone's work focused on political communications, either as part of a campaign, or part of a governmental body, or part of the private sector but still catering to politics. Now, that changed in the spring of twenty fourteen. That is when Meta, then known as Facebook (they had not changed their name yet), offered Stone a gig as a policy communications manager for Facebook. Now, by this time, Facebook had already sailed past the incredible milestone of having a billion active users per month. That had happened back in twenty twelve, so it was definitely already a huge company by twenty fourteen. They had also acquired Instagram, also back in twenty twelve. And just before Stone would join the company, Facebook had announced it was acquiring WhatsApp. This was a truly huge acquisition deal, in the billions of dollars, so a lot was going on.
Speaker 1: In fact, when Stone joined Facebook, that very same month the company announced its intention to acquire Oculus VR. So a lot of the components that make up, you know, various foundational elements of Meta today were all being acquired around the same time that Andy Stone joined the company. Now, on top of all that, the company was also launching some very progressive features. For example, one month before Stone joined Facebook, the company updated its platform to allow for different gender designations in support of the LGBTQ plus community. So you suddenly had a lot more pronouns that you could use when you were creating your profile, and thus could represent yourself more accurately. This was a pretty progressive approach, especially in twenty fourteen, and Andy Stone had focused a lot of his career on how to communicate progressive policies to the public, because most of the time he was focusing on socially progressive Democratic campaigns and policies. So this was kind of a type of synergy when he joined Facebook. I don't have very much background about Andy Stone beyond this. I'm sure there's stuff that's out there.
Speaker 1: Right now, the news cycle is just flooded with this story about Russia putting him on a wanted list. But when I started trying to dig around and just learn a little bit more about him (I mean, I've seen his name dozens of times), there really wasn't a whole lot to find. Again, I'm sure if I did a super deep dive, I could change that. But it is interesting to me that someone who has dedicated his career to public communications has also managed to maintain a pretty low profile. Stone's name has popped up a lot since twenty fourteen. These days, according to LinkedIn anyway, his title is communications director, and that means that when you come across a story that includes something like Facebook says it's disappointed in the EU's decision or whatever, there's a darn good chance that it was actually Andy Stone who said it. He's been the spokesperson for the company in many of its trials and tribulations over the last several years.
Speaker 1: Stone's job is really to serve as the mouthpiece for the company and to craft communications on the company's behalf, and I imagine his job has grown a lot more challenging over recent years in the wake of a series of scandals and difficult events, such as the unraveling of the Cambridge Analytica scandal. Now, if you don't remember the Cambridge Analytica story, good for you. But it refers to a political consulting company that leaned on an app that was used to scrape data from Facebook without first getting user permissions. And they were able to do this because, for a while, Facebook really didn't provide adequate protections with its application programming interface. So essentially, before Facebook actually addressed this issue, a developer could create an app for Facebook that, when installed, could access way more information than what was actually needed for the app to work.
Speaker 1: For example, maybe you create a game, a very simple game that is within Facebook or just uses Facebook credentials in order for you to log into the game. But as part of that installation process, the app gets permissions from the user to access all kinds of data. Like, maybe everything that is personal information related to the user gets sent to the developer. Maybe it gives the developer access to view all of that user's friends and contacts on Facebook. And none of that is necessary for the game to work. But really, until the mid twenty tens, Facebook kind of let that go. Like, they just let developers get access to all sorts of information, whether it was necessary for the app to work or not. So in the case of Cambridge Analytica, the company was making use of a survey app, and this particular survey would actually offer a fee to users. It would pay people to take the survey. Folks like getting money, so that was a huge incentive. A lot of folks said sure, and they ended up installing this little app so they could take the survey.
Speaker 1: Now, what they may not have realized is that by installing the app, they were essentially opening up all of their connections to scrutiny, because now Cambridge Analytica, through this app, could view not just the survey taker's information, but all the people who that person was friends with. They could view those profiles as if they were the person who took the survey. So it suddenly opened up the opportunity to scrape data on a much larger scale. In fact, millions of accounts were affected by this, and you didn't have to be the person who downloaded the survey. You just had to have a friend who did it, and if they did it, you were affected too. So Cambridge Analytica accessed millions of people's profiles without their permission, and Andy Stone had the unenviable task of announcing findings to the media about that situation. So when Facebook did an investigation and estimated that around eighty seven million accounts were affected (this number would fluctuate quite a bit throughout the investigations, but this was an early estimate), well, Andy Stone was the person who had to actually deliver that information to the media.
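[Editor's note: the over-broad permission model described above can be sketched in miniature. This is a hypothetical illustration only; the scope names and profile data are invented, and this is not Facebook's actual Graph API.]

```python
# Hypothetical sketch of an over-broad app-permission model.
# Scope names and user data are invented for illustration; this is
# NOT Facebook's actual API.

USERS = {
    "alice": {"email": "alice@example.com", "friends": ["bob", "carol"]},
    "bob":   {"email": "bob@example.com",   "friends": ["alice"]},
    "carol": {"email": "carol@example.com", "friends": ["alice"]},
}

# What a simple survey app needs to function, versus what it asks for
# at install time under a permissive API:
NEEDED_SCOPES    = {"basic_profile"}
REQUESTED_SCOPES = {"basic_profile", "email", "friends_profiles"}

def install_app(user, granted_scopes):
    """Return every profile record the app can read after one install."""
    harvested = {user: USERS[user]}
    # The core problem: one user's consent exposes their friends' data too.
    if "friends_profiles" in granted_scopes:
        for friend in USERS[user]["friends"]:
            harvested[friend] = USERS[friend]
    return harvested

# One survey taker installs the app...
data = install_app("alice", REQUESTED_SCOPES)

# ...and the developer now holds profiles for people who never consented.
print(sorted(data))                              # ['alice', 'bob', 'carol']
print(sorted(REQUESTED_SCOPES - NEEDED_SCOPES))  # ['email', 'friends_profiles']
```

The point of the sketch is the multiplier: each install exposes not one account but that account plus its whole friends list, which is how a survey app taken by hundreds of thousands of people could affect tens of millions of accounts.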
Speaker 1: Not a whole lot of fun. Anyway, we're gonna come back to talk more about Andy Stone in just a moment. Before we can do that, let's take a quick break to thank our sponsors.

Speaker 1: So I was talking before the break about Andy Stone having to deliver rough news about the Cambridge Analytica scandal to the media as it was unfolding. But this was around the same time when Stone began to build a reputation for being, let's call it, assertive on platforms like Twitter, and using that platform to refute or criticize Facebook's own critics, kind of like striking back at the critics. One of the journalists who covered the Cambridge Analytica story was Carole Cadwalladr. So on top of being a journalist, Cadwalladr became a member of the independent oversight board for Facebook. You might remember this board reviews Facebook's decisions to determine if the company is actually aligning with its own policies. You may also remember that the board can make recommendations to Facebook, but the company is under no obligation to actually adopt those recommendations, which obviously brings up the question of how effective the board can actually be.
Speaker 1: But anyway, Cadwalladr says that Stone attempted to discredit her by sharing quote deliberate deceptions regarding Cambridge Analytica and deliberately trolled me. It was just in no way appropriate for the corporate PR of a trillion dollar company to behave like that toward a journalist end quote. So this would not be the last time that journalists, as well as others, would complain about Andy Stone's approach, particularly on Twitter. Another high profile case was when former Facebook employee Frances Haugen came forward in late twenty twenty one with a whistleblower report on the company. She shared thousands of internal documents that didn't exactly put Facebook in a good light. In fact, we're still learning about stuff that was in those documents, and a lot of the stuff in there suggests that the people in Facebook, now known as Meta, were more aware of the negative effects that the company and its platforms had on certain users than had been previously disclosed.
Speaker 1: Anyway, Andy Stone frequently had to go to bat for the company and defend it as report after report rolled in about issues like the company's influence on young users in particular. And I'm not sure that Stone's efforts were always effective when it came to deflecting her testimony in front of Congress, because again he took to Twitter to try and discredit her, or to hang some doubt upon her and her point. One message he posted to Twitter, of course now known as X, read quote just pointing out the fact that Frances Haugen did not work on child safety or Instagram or research these issues, and has no direct knowledge of the topic from her work at Facebook end quote, which, if you were to ask me, seems like a pretty darn weak defense.
Speaker 1: Because if someone dumped tens of thousands of actual corporate documents in my lap and then told me that this stuff comes from inside company so-and-so, and it says the company has knowingly engaged in some questionable stuff and continues to do so even knowing the potential consequences, I wouldn't really be concerned about that person's expertise in whatever the field was. Right? Assuming that I could determine that the documents were legitimate, that they in fact came from within the company that was claimed, then the level of expertise of the person who brought them to me doesn't matter. I would be more concerned about the actual content of the documents, and the people who produced those documents presumably were experts in their field. It doesn't matter if the messenger is an expert. In fact, Andy Stone should know that quite well, because he has been the messenger many times. He is not the person who creates the different messages or the different policies.
Speaker 1: I should say, he creates the messages around them. So he should know that a person's expertise isn't really the matter of concern when you're talking about documents that have some pretty damning content in them. Now, Stone's posts on Twitter prompted some pretty interesting responses around the journalism field and beyond. So, for example, Bob Pickard, who is the CEO of Signal Leadership Communication: this is a consulting group that helps executives create communications to the public so that they can do so in a way that is responsible and accountable, that's going to have the best impact, that's not going to run afoul of the law, that kind of stuff. This is high level communications we're talking about here. Well, Pickard said, quote, Facebook's crisis comms on this issue are embarrassingly bad. Everyone is talking about Facebook's poor PR, which is often a proxy for other issues, but I think the comms themselves are indeed crap, end quote. Now, that wasn't just referring to Andy Stone, but certainly Stone was being lumped in with this criticism.
Speaker 1: Several PR professionals questioned why Stone was even posting on Twitter at all about this stuff, why he was taking this particular tactic. They said it was antithetical to the way you would typically try to build up a communications department within a company. They said that, you know, usually you want to build relationships with the press so that you can get more favorable coverage, and using Twitter to snipe at journalists who have posted stories that critique the company isn't really conducive toward doing that. They compared it more to his work in politics, in that politics often treats communications as a quote unquote zero sum game where there has to be a winner and a loser, and in those sorts of avenues it was more common to see this kind of banter going back and forth between a PR person and the press. Less so when it comes to corporate communications. Now, earlier this year, Stone served as the messenger for another sticky situation. See, Meta slash Facebook has a strict policy against data scraping. The most valuable asset that Meta has is access to our information. We are the product.
Speaker 1: You've probably heard that if you go someplace and the thing is free, then it turns out you are the product that's for sale. That's definitely the case when it comes to Meta's platforms. All of our interactions, everything we post to the platform, everything we look at and engage with on those platforms, all of that is valuable, right? That's all data that's incredibly valuable, because Meta can package this up and then sell it to advertisers. So the more you use Meta's products, the more valuable you become, and the more information you share about other people in your life, the more valuable you become. We are a gold mine for Meta, and Meta guards that gold mine jealously. The fact that people don't flip out more over how their personal information is exploited for the benefit of a truly massive corporation amazes me. I guess not everyone fully understands it, and the people who do understand it ultimately think, ah, it's not that big a deal. I mean, it's just some personal information. But it's really not. I mean, it's really a big deal.
Speaker 1: It's really not something you should just dismiss. It's important, and by the point where it becomes important for you specifically, as opposed to just in general, it's too late. But this is a potentially fragile arrangement, and Meta recognizes that. If people wake up to the fact that they really need to protect their personal information in a more aggressive way, and if you start to see governments around the world pass laws to help actually achieve this goal, Meta's revenue is in danger. So Meta is very, very careful to not bring too much attention to this part of its business, at least not in a way that is obvious to users. But it also means Meta is very jealous of this information. If some other company swoops in and starts to crawl Meta's platforms in order to build out databases, that can threaten Meta's own business model, because if we notice that this other company is doing it, and we're worried about this other company, we might start asking questions about Meta itself. So take Clearview AI, for example. This company created a huge facial recognition database.
Speaker 1: But in order to do that, you first have to grab a whole bunch of pictures of people. Now, that's obvious, right? Like, you have to get the pictures in order to actually have a database of facial recognition photos. But where do you get the pictures from? Right? You know, you're not just going door to door and snapping photos of people. No, what Clearview AI did, and the CEO admitted as much, was scrape various social network platforms, including Facebook, to build out its database of thirty billion photographs. And Clearview AI's customers included authoritarian organizations, you know, we're talking about law enforcement agencies, that kind of thing, that would use Clearview AI's databases to try and do things like identify suspects, that sort of stuff. So it kind of ties into issues like a surveillance state. So obviously this kind of story can have a negative impact on companies like Meta, and Meta has made it clear that it is against its terms of service for other companies to scrape Meta's platforms, you know, like Facebook or Instagram, to get data.
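[Editor's note: "scraping" in the sense used throughout this segment just means programmatically fetching pages and extracting structured data from them. A minimal, generic sketch using only Python's standard library is below; the page and field names are invented, and scraping a real platform without permission typically violates its terms of service, which is exactly the tension this segment describes.]

```python
# Minimal generic sketch of data scraping: parse HTML, extract fields.
# The sample page is invented; a real scraper would fetch pages over
# HTTP, but permission and terms of service determine whether it's allowed.
from html.parser import HTMLParser

SAMPLE_PAGE = """
<html><body>
  <div class="profile" data-name="Alice Example"></div>
  <div class="profile" data-name="Bob Example"></div>
</body></html>
"""

class ProfileScraper(HTMLParser):
    """Collect the data-name attribute of every 'profile' div."""

    def __init__(self):
        super().__init__()
        self.names = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)  # attrs arrives as a list of (name, value) pairs
        if tag == "div" and attrs.get("class") == "profile":
            self.names.append(attrs["data-name"])

scraper = ProfileScraper()
scraper.feed(SAMPLE_PAGE)
print(scraper.names)  # ['Alice Example', 'Bob Example']
```

Run against millions of public pages, the same loop becomes a database, which is how a scraper can assemble something like a thirty-billion-photo index without any platform's cooperation.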
But what if journalists discover that 320 00:21:08,960 --> 00:21:13,159 Speaker 1: Meta is also engaged in data scraping from other websites. 321 00:21:13,960 --> 00:21:16,119 Speaker 1: And this brings us to the curious case of Meta 322 00:21:16,240 --> 00:21:20,240 Speaker 1: and a data collection company called Bright Data. So Meta 323 00:21:20,280 --> 00:21:24,560 Speaker 1: recently sued Bright Data in January of this year, twenty 324 00:21:24,640 --> 00:21:28,240 Speaker 1: twenty three, and accused the company of scraping data from 325 00:21:28,280 --> 00:21:32,359 Speaker 1: Meta platforms. However, it has also come to light that 326 00:21:32,440 --> 00:21:37,000 Speaker 1: Bright Data and Meta have been engaged in a subcontractor 327 00:21:37,040 --> 00:21:41,480 Speaker 1: relationship, that Bright Data has worked for Meta and has 328 00:21:41,920 --> 00:21:45,639 Speaker 1: pulled data from other websites so that Meta could do 329 00:21:45,720 --> 00:21:50,600 Speaker 1: stuff like build out brand profiles and reportedly do things 330 00:21:50,640 --> 00:21:54,200 Speaker 1: like identify potential bad actors out there. So, in other words, 331 00:21:54,200 --> 00:21:58,399 Speaker 1: Bright Data was pulling data or scraping data off of 332 00:21:58,480 --> 00:22:02,080 Speaker 1: other sites in order to supply that information to Meta. 333 00:22:02,600 --> 00:22:04,960 Speaker 1: So Stone was pulled into action to comment on this, 334 00:22:05,359 --> 00:22:08,919 Speaker 1: and he said, quote, the collection of data from websites 335 00:22:08,960 --> 00:22:13,800 Speaker 1: can serve legitimate integrity and commercial purposes if done lawfully 336 00:22:14,040 --> 00:22:17,760 Speaker 1: and in accordance with those websites' terms end quote.
This 337 00:22:17,920 --> 00:22:21,880 Speaker 1: was a really awkward situation because he had to defend 338 00:22:21,960 --> 00:22:26,239 Speaker 1: Meta's work with a data collection company, while Meta is 339 00:22:26,560 --> 00:22:30,320 Speaker 1: famous for going after companies that were data scraping its 340 00:22:30,320 --> 00:22:32,960 Speaker 1: own platforms. So, in other words, Meta is saying, it's not 341 00:22:33,080 --> 00:22:35,280 Speaker 1: cool for you to do that to us, but it's 342 00:22:35,280 --> 00:22:38,119 Speaker 1: totally cool if we do that to other websites. You 343 00:22:38,160 --> 00:22:42,200 Speaker 1: know, that's fine. So, do as we say, not as 344 00:22:42,200 --> 00:22:46,000 Speaker 1: we do. Okay, we're gonna take another quick break. When 345 00:22:46,040 --> 00:22:49,320 Speaker 1: we come back, we'll talk more about the actual lead-in 346 00:22:49,400 --> 00:22:55,240 Speaker 1: to Andy Stone being named a wanted individual by Russia's government. 347 00:22:55,520 --> 00:23:08,840 Speaker 1: But first, a word from our sponsors. Okay, why was 348 00:23:08,960 --> 00:23:13,720 Speaker 1: Andy Stone, a communications director for Meta, put on a 349 00:23:13,880 --> 00:23:16,919 Speaker 1: wanted list? In fact, some outlets call it a most 350 00:23:17,119 --> 00:23:23,200 Speaker 1: wanted list for Russia. Well, it relates to Russia's invasion 351 00:23:23,280 --> 00:23:27,800 Speaker 1: of Ukraine and potentially to some temporary changes to Meta's 352 00:23:27,800 --> 00:23:33,560 Speaker 1: policies regarding violent content on Facebook, or, to be more specific, calls for violence 353 00:23:33,800 --> 00:23:38,760 Speaker 1: on Facebook. And in that regard, 354 00:23:38,800 --> 00:23:41,439 Speaker 1: if that's all true, this very much becomes a shoot- 355 00:23:41,520 --> 00:23:45,720 Speaker 1: the-messenger kind of story.
I should point out that apparently 356 00:23:45,800 --> 00:23:49,159 Speaker 1: Stone's name has actually been on this wanted list since 357 00:23:49,400 --> 00:23:53,359 Speaker 1: February twenty twenty two. We only learned about it this 358 00:23:53,480 --> 00:23:56,040 Speaker 1: week when a media outlet that keeps an eye on 359 00:23:56,080 --> 00:23:58,400 Speaker 1: such things posted a story about it and it went 360 00:23:58,400 --> 00:24:01,240 Speaker 1: viral from there. But to be clear, the actual 361 00:24:01,400 --> 00:24:05,639 Speaker 1: charges against Stone remain undisclosed. The media reports simply that 362 00:24:05,680 --> 00:24:09,120 Speaker 1: Stone is quote wanted under an article of the Criminal 363 00:24:09,160 --> 00:24:13,400 Speaker 1: Code of the Russian Federation end quote. That's not helpful. 364 00:24:13,440 --> 00:24:16,920 Speaker 1: It's not descriptive at all. We don't know what code, 365 00:24:17,520 --> 00:24:20,760 Speaker 1: what article he's wanted under, right, we don't know what 366 00:24:21,000 --> 00:24:26,840 Speaker 1: law he supposedly broke under Russian Federation law. But 367 00:24:26,960 --> 00:24:29,640 Speaker 1: let's look back at the early days of Russia's invasion 368 00:24:29,720 --> 00:24:33,399 Speaker 1: of Ukraine to kind of at least get a working 369 00:24:33,480 --> 00:24:38,280 Speaker 1: hypothesis here. There had been aggressive actions between Russia and 370 00:24:38,359 --> 00:24:41,280 Speaker 1: Ukraine dating back to at least twenty fourteen, but the 371 00:24:41,320 --> 00:24:46,280 Speaker 1: actual invasion began on February twenty fourth, twenty twenty two. 372 00:24:46,520 --> 00:24:53,159 Speaker 1: On March tenth, Meta made a really big change.
And traditionally, 373 00:24:53,200 --> 00:24:57,640 Speaker 1: posts on Meta's platforms that include calls for violence against 374 00:24:57,680 --> 00:25:02,879 Speaker 1: people are a huge violation of Meta's rules. Meta 375 00:25:02,920 --> 00:25:07,440 Speaker 1: would take such things down immediately or at least investigate them, 376 00:25:07,720 --> 00:25:11,000 Speaker 1: assuming that someone had flagged the post. That's at least 377 00:25:11,000 --> 00:25:13,119 Speaker 1: according to the policy. Whether or not that's actually how 378 00:25:13,200 --> 00:25:17,639 Speaker 1: it plays out depends on the situation, but often it 379 00:25:17,720 --> 00:25:21,960 Speaker 1: means that Meta ends up sending the flag to 380 00:25:22,240 --> 00:25:26,720 Speaker 1: someone like a subcontractor to review the incident and determine 381 00:25:26,720 --> 00:25:28,879 Speaker 1: whether or not it was actually against Meta's policies and 382 00:25:28,920 --> 00:25:32,240 Speaker 1: then figure out where to go from there. That might mean 383 00:25:32,880 --> 00:25:36,240 Speaker 1: removing the post or banning the person for a given 384 00:25:36,280 --> 00:25:39,000 Speaker 1: amount of time. It all just depends on the situation. 385 00:25:39,880 --> 00:25:43,640 Speaker 1: But on March tenth, Meta announced it would allow users, 386 00:25:43,680 --> 00:25:47,240 Speaker 1: at least in certain countries, to post calls for violence 387 00:25:47,680 --> 00:25:52,440 Speaker 1: against Russians and specifically Russian soldiers, and this was all 388 00:25:52,480 --> 00:25:56,640 Speaker 1: within the context of the Ukraine invasion.
Reuters reported that 389 00:25:56,840 --> 00:26:00,359 Speaker 1: internal messages within Meta even indicated that posts that would 390 00:26:00,359 --> 00:26:04,080 Speaker 1: call for the deaths of Russia's president Vladimir Putin or 391 00:26:04,119 --> 00:26:08,800 Speaker 1: the president of Belarus, Alexander Lukashenko, were fair game. Again, 392 00:26:08,880 --> 00:26:12,840 Speaker 1: this is something that typically is very much against Meta's policies, 393 00:26:13,520 --> 00:26:18,480 Speaker 1: and Reuters reported that, well, they didn't name the Meta spokesperson, 394 00:26:18,520 --> 00:26:21,280 Speaker 1: but later it was revealed to be Andy Stone. They 395 00:26:21,320 --> 00:26:25,240 Speaker 1: reported that a spokesperson released the statement, quote, as a result 396 00:26:25,359 --> 00:26:28,760 Speaker 1: of the Russian invasion of Ukraine, we have temporarily made 397 00:26:28,800 --> 00:26:32,760 Speaker 1: allowances for forms of political expression that would normally violate 398 00:26:32,840 --> 00:26:36,440 Speaker 1: our rules, like violent speech such as death to the 399 00:26:36,520 --> 00:26:40,480 Speaker 1: Russian invaders. We still won't allow credible calls for violence 400 00:26:40,520 --> 00:26:45,399 Speaker 1: against Russian civilians end quote. A couple of weeks later, 401 00:26:45,960 --> 00:26:52,400 Speaker 1: a Russian court declared that Meta qualifies as an extremist organization. Effectively, 402 00:26:53,119 --> 00:26:57,720 Speaker 1: this banned Meta's platforms inside Russia, with the exclusion of 403 00:26:57,720 --> 00:27:01,600 Speaker 1: WhatsApp, because that's not a public platform. Facebook and Instagram 404 00:27:01,640 --> 00:27:06,280 Speaker 1: were both banned, and WhatsApp was still allowed to operate within 405 00:27:06,320 --> 00:27:10,600 Speaker 1: the country.
Interestingly, again, by that time, Andy Stone had 406 00:27:10,640 --> 00:27:14,240 Speaker 1: already been on the wanted list according to this independent 407 00:27:14,560 --> 00:27:18,040 Speaker 1: website called Mediazona. By the way, Mediazona 408 00:27:18,160 --> 00:27:23,560 Speaker 1: monitors the Russian prison system, among other things. And again 409 00:27:23,680 --> 00:27:25,400 Speaker 1: they had said that he had been on the list 410 00:27:25,440 --> 00:27:29,359 Speaker 1: since February twenty twenty two, even before Meta had made 411 00:27:29,600 --> 00:27:32,359 Speaker 1: the announced change to the policies, and right around the 412 00:27:32,359 --> 00:27:36,159 Speaker 1: time when Russia was actually invading Ukraine. So while that 413 00:27:36,400 --> 00:27:39,639 Speaker 1: was the precipitating factor that would see Russia slap an 414 00:27:39,720 --> 00:27:43,240 Speaker 1: extremist label on the company, according to Mediazona, Stone 415 00:27:43,240 --> 00:27:46,399 Speaker 1: had already been in Russia's crosshairs. It's just that no 416 00:27:46,480 --> 00:27:49,560 Speaker 1: one in Russia actually announced this, so it never made 417 00:27:49,600 --> 00:27:51,560 Speaker 1: the news. So, in other words, they had put his 418 00:27:51,640 --> 00:27:53,400 Speaker 1: name on the list, but they didn't announce the fact 419 00:27:53,400 --> 00:27:56,880 Speaker 1: that he was on the list. Also, interestingly, Mark Zuckerberg 420 00:27:56,960 --> 00:28:01,000 Speaker 1: himself wouldn't be singled out by Russia till April twenty 421 00:28:01,080 --> 00:28:05,520 Speaker 1: twenty two, that's when Russia formally banned Zuckerberg from 422 00:28:05,640 --> 00:28:09,520 Speaker 1: entering the country. So what the heck got Stone on 423 00:28:09,640 --> 00:28:12,760 Speaker 1: that list so early?
Well, first, there's a chance that 424 00:28:12,800 --> 00:28:16,359 Speaker 1: the report is wrong and that Stone's name may have joined 425 00:28:16,359 --> 00:28:20,680 Speaker 1: this list after the announced changes in Meta's policy. That's 426 00:28:20,720 --> 00:28:23,960 Speaker 1: my best guess: that Russian officials, in an effort to 427 00:28:24,240 --> 00:28:28,679 Speaker 1: push back against the company, named Andy Stone as a 428 00:28:28,840 --> 00:28:32,240 Speaker 1: wanted individual, not keeping in mind that he's really just 429 00:28:32,320 --> 00:28:37,760 Speaker 1: a mouthpiece for Meta, because again, Andy Stone doesn't create 430 00:28:37,800 --> 00:28:41,800 Speaker 1: the policies, he communicates them. He's not the one responsible 431 00:28:41,840 --> 00:28:44,840 Speaker 1: for determining that Meta's going to allow this to happen. 432 00:28:45,440 --> 00:28:48,600 Speaker 1: He just has to tell the world that that's what's happening. Now, 433 00:28:48,640 --> 00:28:53,640 Speaker 1: he sometimes does communicate in a way that's combative and insulting, 434 00:28:54,160 --> 00:28:58,280 Speaker 1: so perhaps this has contributed to the issue. But Stone, 435 00:28:58,480 --> 00:29:01,920 Speaker 1: again, is not responsible for whether or not Meta allows 436 00:29:01,960 --> 00:29:05,480 Speaker 1: calls for violence against Russian invaders. It really is 437 00:29:05,520 --> 00:29:08,000 Speaker 1: a shoot-the-messenger kind of situation here, if in 438 00:29:08,040 --> 00:29:10,320 Speaker 1: fact that's why he was put on the list. But 439 00:29:11,200 --> 00:29:14,240 Speaker 1: maybe there's some other reason for Russia to name Stone 440 00:29:14,280 --> 00:29:18,320 Speaker 1: as a wanted individual.
That's possible, but based on what 441 00:29:18,440 --> 00:29:21,120 Speaker 1: we know, it seems to me that this is mostly 442 00:29:21,200 --> 00:29:24,480 Speaker 1: a symbolic gesture made by a country that's in a 443 00:29:24,480 --> 00:29:28,160 Speaker 1: position that's very difficult to defend in a global court 444 00:29:28,240 --> 00:29:32,840 Speaker 1: of public opinion. Russia's government has long struggled to control 445 00:29:32,960 --> 00:29:36,520 Speaker 1: media in an effort to push propaganda and, in particular, Putin's narrative. 446 00:29:36,520 --> 00:29:41,120 Speaker 1: Platforms like Facebook and Instagram threaten that strategy 447 00:29:41,400 --> 00:29:45,400 Speaker 1: because Russia has no direct control over those platforms, so 448 00:29:45,720 --> 00:29:51,040 Speaker 1: they are legitimate threats to authoritarian control. It almost became 449 00:29:51,120 --> 00:29:54,600 Speaker 1: necessary for Russia to label Meta as an extremist organization 450 00:29:55,120 --> 00:29:58,000 Speaker 1: just to shut down a place where opposing opinions could potentially 451 00:29:58,040 --> 00:30:02,880 Speaker 1: take hold in Russia. Anyway, my best guess is that 452 00:30:02,960 --> 00:30:06,560 Speaker 1: this will have little to no real impact on Andy 453 00:30:06,560 --> 00:30:10,040 Speaker 1: Stone's life. I mean, assuming he wasn't planning on vacationing 454 00:30:10,120 --> 00:30:12,840 Speaker 1: in Saint Petersburg or something. If he had plans to 455 00:30:12,880 --> 00:30:15,880 Speaker 1: travel to Russia, I imagine those are no longer on 456 00:30:15,920 --> 00:30:19,719 Speaker 1: his calendar. Otherwise, I'm pretty sure this is just going 457 00:30:19,760 --> 00:30:22,880 Speaker 1: to be a little bit of intrigue in his professional career, a 458 00:30:22,920 --> 00:30:26,560 Speaker 1: little interesting bit of trivia.
I also expect he will 459 00:30:26,600 --> 00:30:29,480 Speaker 1: continue to speak for Meta, at least as long as 460 00:30:29,480 --> 00:30:33,400 Speaker 1: he is employed there. After that, who knows. It'd be 461 00:30:33,480 --> 00:30:36,920 Speaker 1: really interesting to talk with him post his career at 462 00:30:36,960 --> 00:30:43,000 Speaker 1: Meta to find out, like, did he really believe in 463 00:30:43,040 --> 00:30:47,040 Speaker 1: the cause that he was championing. And maybe he does, 464 00:30:47,400 --> 00:30:50,560 Speaker 1: I don't know. I don't know Andy Stone. I feel 465 00:30:50,600 --> 00:30:56,080 Speaker 1: like he's employing a skill set for an employer; whether 466 00:30:56,120 --> 00:31:00,320 Speaker 1: he believes in that employer's actual choices, I don't know. 467 00:31:00,680 --> 00:31:04,640 Speaker 1: I couldn't say. Maybe he does. It just seems 468 00:31:04,640 --> 00:31:08,760 Speaker 1: odd to me considering his previous employers. But who knows. 469 00:31:08,880 --> 00:31:10,600 Speaker 1: I mean, I don't know everything that's going on 470 00:31:10,680 --> 00:31:14,280 Speaker 1: inside Meta either. I'm just basing this on things like 471 00:31:14,320 --> 00:31:18,680 Speaker 1: the whistleblower report and that sort of thing. Anyway, that 472 00:31:19,040 --> 00:31:23,800 Speaker 1: is the interesting and currently unfolding story of Andy Stone 473 00:31:23,840 --> 00:31:29,719 Speaker 1: being wanted by Russia's Interior Ministry. We're gonna keep an 474 00:31:29,720 --> 00:31:32,280 Speaker 1: eye on this. If I see updates, I will definitely 475 00:31:32,320 --> 00:31:34,160 Speaker 1: follow up with it. I don't expect there to be 476 00:31:34,240 --> 00:31:36,280 Speaker 1: many updates. I think this is just a really kind 477 00:31:36,280 --> 00:31:41,120 Speaker 1: of intriguing story that breaks on a Monday after a holiday.
478 00:31:42,200 --> 00:31:44,920 Speaker 1: I don't necessarily think it's going to lead to anything 479 00:31:44,920 --> 00:31:49,000 Speaker 1: else that's particularly newsworthy, but we'll see. You never know. 480 00:31:49,480 --> 00:31:51,640 Speaker 1: I hope you are all well. I hope those of 481 00:31:51,640 --> 00:31:55,680 Speaker 1: you who celebrated Thanksgiving in the US had a wonderful time. 482 00:31:56,360 --> 00:31:59,520 Speaker 1: I hope everybody had a great week last week. As 483 00:31:59,520 --> 00:32:03,440 Speaker 1: a reminder, this week we're back to normal, although tomorrow 484 00:32:03,520 --> 00:32:06,640 Speaker 1: we'll have a Smart Talks with IBM episode published, and 485 00:32:06,680 --> 00:32:11,040 Speaker 1: then next week I'm on vacation, so you're likely to 486 00:32:11,080 --> 00:32:14,000 Speaker 1: hear some reruns next week. I'll see if I can 487 00:32:14,040 --> 00:32:17,240 Speaker 1: record some new stuff for next week in advance, but 488 00:32:17,600 --> 00:32:20,160 Speaker 1: it's a pretty packed week for me right now, so 489 00:32:20,200 --> 00:32:23,920 Speaker 1: we'll see. I'll do my best anyway. I wish you 490 00:32:24,040 --> 00:32:27,680 Speaker 1: all well, and I'll talk to you again really soon. 491 00:32:34,080 --> 00:32:38,760 Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, 492 00:32:39,080 --> 00:32:42,800 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you listen 493 00:32:42,800 --> 00:32:47,560 Speaker 1: to your favorite shows.