1 00:00:04,760 --> 00:00:12,960 Speaker 1: Sleepwalkers is a production of iHeartRadio and Unusual Productions. Hey, 2 00:00:14,440 --> 00:00:18,680 Speaker 1: how are you doing? Are you in a curious situation? 3 00:00:19,280 --> 00:00:24,120 Speaker 1: But not really? Where are you? Can you say where 4 00:00:24,160 --> 00:00:30,080 Speaker 1: you are? You sound like you can't talk. Yeah, okay, 5 00:00:31,000 --> 00:00:36,320 Speaker 1: I've been working on Sleepwalkers so much. Yeah, so I 6 00:00:36,400 --> 00:00:40,360 Speaker 1: know what you're thinking: Kara isn't normally that distracted. But 7 00:00:40,440 --> 00:00:44,200 Speaker 1: the truth is that wasn't her speaking. We're playing prerecorded 8 00:00:44,200 --> 00:00:47,239 Speaker 1: fakes of her voice to her cousin, created by AI. 9 00:00:47,520 --> 00:00:50,720 Speaker 1: Are you sleeping? Did I wake you? I feel so tired. 10 00:00:51,600 --> 00:00:56,800 Speaker 1: I'm sorry. I wanted to talk to you about, huh, 11 00:00:57,120 --> 00:01:00,400 Speaker 1: I stupidly left my wallet at home and I need 12 00:01:00,440 --> 00:01:02,880 Speaker 1: to order tickets to the screening before it sells out. 13 00:01:03,920 --> 00:01:09,720 Speaker 1: What screening? I'm not... are you, you know... Could you 14 00:01:09,720 --> 00:01:13,360 Speaker 1: read me a card number real fast, or text me 15 00:01:13,400 --> 00:01:17,640 Speaker 1: a pic of your card? I'll pay you back. Are 16 00:01:17,680 --> 00:01:23,440 Speaker 1: you... you're talking to me, your cousin Leslie, right? Hello? Yeah? 17 00:01:23,680 --> 00:01:27,959 Speaker 1: I think we're crossing paths here. You're not answering me... 18 00:01:28,160 --> 00:01:36,920 Speaker 1: you're answering me in a weird way. 19 00:01:37,560 --> 00:01:41,959 Speaker 1: So what was it like hearing Leslie respond to robot Kara? Well, 20 00:01:41,959 --> 00:01:44,880 Speaker 1: it reminded me that it's very easy to prank people 21 00:01:45,360 --> 00:01:47,920 Speaker 1: when they have no context for what you're doing. It 22 00:01:48,040 --> 00:01:50,680 Speaker 1: took her like a full minute to be like, Okay, 23 00:01:50,640 --> 00:01:53,040 Speaker 1: wait, that's not Kara. You know, it's 24 00:01:53,040 --> 00:01:54,680 Speaker 1: like with my dad, and I'll say to him 25 00:01:54,680 --> 00:01:59,200 Speaker 1: after a minute, Dad, are you playing internet chess? Well, 26 00:01:59,200 --> 00:02:01,840 Speaker 1: there is what they call tech brain, which is 27 00:02:01,920 --> 00:02:05,480 Speaker 1: when someone's texting and talking to you at the same time, and 28 00:02:05,560 --> 00:02:07,160 Speaker 1: that's sort of what it sounds like. She was like, 29 00:02:07,200 --> 00:02:11,200 Speaker 1: are you having another conversation? Has she forgiven you? She's 30 00:02:11,280 --> 00:02:15,880 Speaker 1: forgiven robot Kara. I'm still not off the hook. Sorry. 31 00:02:17,240 --> 00:02:19,480 Speaker 1: Fake audio and fake video can be a lot of 32 00:02:19,480 --> 00:02:22,480 Speaker 1: fun for pranks, and there are some life-changing 33 00:02:22,600 --> 00:02:25,720 Speaker 1: positive uses for synthetic media that we'll hear about later. 34 00:02:26,360 --> 00:02:29,600 Speaker 1: But just how much trouble could deep fakes get us into? 35 00:02:30,040 --> 00:02:32,200 Speaker 1: And as they get easier to make, how can we 36 00:02:32,280 --> 00:02:34,359 Speaker 1: keep them out of the hands of the wrong people?
37 00:02:35,440 --> 00:02:52,519 Speaker 1: I'm Oz Woloshyn. Welcome to Sleepwalkers. The plan originally was 38 00:02:52,600 --> 00:02:55,880 Speaker 1: to get cousin Leslie's credit card details. That failed. Yeah, 39 00:02:56,160 --> 00:03:00,160 Speaker 1: Julian had the idea of having Kara AI ask for 40 00:03:00,200 --> 00:03:02,960 Speaker 1: credit card information, basically to prove how easy it is 41 00:03:03,360 --> 00:03:06,600 Speaker 1: to get somebody's credit card information. You can imagine if 42 00:03:07,080 --> 00:03:09,560 Speaker 1: it was a little bit better and you were talking 43 00:03:09,560 --> 00:03:12,239 Speaker 1: to someone and they were like, oh my god, my grandchild, 44 00:03:12,480 --> 00:03:14,320 Speaker 1: you know, needs money. Oh my god, my grandchild is 45 00:03:14,320 --> 00:03:17,080 Speaker 1: in trouble, that they would say, Okay, hold on a minute, 46 00:03:17,200 --> 00:03:18,840 Speaker 1: I'll get you the credit card number, you know what 47 00:03:18,880 --> 00:03:20,920 Speaker 1: I mean. Yeah, And I think that's what's so frightening 48 00:03:20,919 --> 00:03:23,880 Speaker 1: about this technology. We're going to dive later into how 49 00:03:23,919 --> 00:03:28,040 Speaker 1: you synthesized your voice, but it's the same technological underpinning 50 00:03:28,160 --> 00:03:30,720 Speaker 1: of the video that many people have seen of Jordan 51 00:03:30,800 --> 00:03:35,480 Speaker 1: Peele basically speaking through Barack Obama's mouth. We're entering an 52 00:03:35,520 --> 00:03:37,680 Speaker 1: era in which our enemies can make it look like 53 00:03:37,720 --> 00:03:41,080 Speaker 1: anyone is saying anything at any point in time, even 54 00:03:41,080 --> 00:03:44,240 Speaker 1: if they would never say those things. For instance, they 55 00:03:44,240 --> 00:03:47,280 Speaker 1: could have me say things like President Trump is a 56 00:03:47,320 --> 00:03:51,520 Speaker 1: total and complete dipshit. So that was a computer neural 57 00:03:51,560 --> 00:03:55,520 Speaker 1: network faking Barack Obama's facial features and mouth movement to 58 00:03:55,560 --> 00:03:57,920 Speaker 1: literally look like he was speaking the words that Jordan 59 00:03:57,960 --> 00:04:00,600 Speaker 1: Peele said, and that actually makes it even more persuasive 60 00:04:00,720 --> 00:04:03,040 Speaker 1: than the fake audio we just heard of your voice, 61 00:04:03,160 --> 00:04:06,000 Speaker 1: because when you see something, you tend to believe it. 62 00:04:06,040 --> 00:04:10,840 Speaker 1: That's why the phrase is seeing is believing. We're going 63 00:04:10,840 --> 00:04:13,200 Speaker 1: to come back to deep fakes, but before we get there, 64 00:04:13,240 --> 00:04:15,240 Speaker 1: we're going to take a look at some other online 65 00:04:15,240 --> 00:04:18,800 Speaker 1: trickery, because the scariest part is that fakes actually don't 66 00:04:18,800 --> 00:04:21,719 Speaker 1: have to be as sophisticated as your call to cousin 67 00:04:21,800 --> 00:04:26,479 Speaker 1: Leslie to wreak havoc. This is particularly true on Facebook. So 68 00:04:26,720 --> 00:04:29,400 Speaker 1: we went to their headquarters in Palo Alto to meet 69 00:04:29,480 --> 00:04:33,719 Speaker 1: Nathaniel Gleicher. He's the head of cybersecurity policy at Facebook, 70 00:04:33,760 --> 00:04:36,440 Speaker 1: and he told me about an incident last summer that 71 00:04:36,520 --> 00:04:40,400 Speaker 1: created a true dilemma for him and his team.
In July, 72 00:04:41,320 --> 00:04:44,200 Speaker 1: we conducted a takedown of a fairly small network of 73 00:04:44,320 --> 00:04:47,320 Speaker 1: pages that were operating in the US and showed links back 74 00:04:47,360 --> 00:04:50,360 Speaker 1: to Russian actors, and what they were doing was, among 75 00:04:50,400 --> 00:04:54,000 Speaker 1: other things, creating events where they were inviting Americans to 76 00:04:54,040 --> 00:04:57,039 Speaker 1: come to protests, and in particular this was around the 77 00:04:57,279 --> 00:05:00,360 Speaker 1: Unite the Right 2 movement, which happened in 2018. It was 78 00:05:00,400 --> 00:05:04,080 Speaker 1: the anniversary of the bloody clashes in Charlottesville in 2017, 79 00:05:04,520 --> 00:05:07,720 Speaker 1: and the far right wanted to gather again. This time 80 00:05:08,040 --> 00:05:11,000 Speaker 1: Russia was watching, and there was an event that popped 81 00:05:11,040 --> 00:05:13,760 Speaker 1: up which was the No Unite the Right 2 movement. 82 00:05:13,880 --> 00:05:17,359 Speaker 1: This was a counter-protest. There were authentic counter-protests 83 00:05:17,360 --> 00:05:21,159 Speaker 1: being planned, but this one was being convened by a 84 00:05:21,240 --> 00:05:24,120 Speaker 1: group of inauthentic pages and accounts which were linked back 85 00:05:24,120 --> 00:05:27,120 Speaker 1: to Russia that were clearly attempting to sort of bring 86 00:05:27,160 --> 00:05:29,080 Speaker 1: Americans together in a space where they would go into 87 00:05:29,120 --> 00:05:32,720 Speaker 1: physical conflict. Immediately after creating the event, they then went 88 00:05:32,760 --> 00:05:37,400 Speaker 1: out and invited legitimate, unwitting activists to co-host the 89 00:05:37,440 --> 00:05:40,680 Speaker 1: event with them. Let's pause for a moment. This is 90 00:05:40,760 --> 00:05:44,400 Speaker 1: Russia we're talking about, and they're creating a Facebook event 91 00:05:44,440 --> 00:05:47,920 Speaker 1: to appeal to liberal activists, designed to draw them into 92 00:05:47,960 --> 00:05:51,159 Speaker 1: physical conflict with the far right and create the kind 93 00:05:51,160 --> 00:05:54,440 Speaker 1: of scenes that tear at our social fabric. But the 94 00:05:54,440 --> 00:05:58,120 Speaker 1: people co-hosting it are not Russian agitators; they're U 95 00:05:58,240 --> 00:06:01,560 Speaker 1: S citizens acting in good faith. What we saw in 96 00:06:01,600 --> 00:06:04,440 Speaker 1: that case, and what we're increasingly seeing, is these actors 97 00:06:04,640 --> 00:06:08,520 Speaker 1: trying to blur their behavior with domestic actors to force 98 00:06:08,600 --> 00:06:10,880 Speaker 1: not just the platforms but all of us to ask, 99 00:06:10,920 --> 00:06:14,480 Speaker 1: how do you separate these? Ultimately, Facebook had to make 100 00:06:14,480 --> 00:06:19,279 Speaker 1: a decision. We removed that event from Facebook because it 101 00:06:19,320 --> 00:06:22,520 Speaker 1: was created by inauthentic actors. If someone else had created it, 102 00:06:22,520 --> 00:06:25,159 Speaker 1: that event would have been fine. So we removed the event. 103 00:06:25,560 --> 00:06:27,919 Speaker 1: But then we reached out to the co-hosts, the 104 00:06:27,960 --> 00:06:30,360 Speaker 1: authentic hosts, and we explained to them what had happened, 105 00:06:30,360 --> 00:06:32,400 Speaker 1: and we made clear if you want to host your 106 00:06:32,400 --> 00:06:34,599 Speaker 1: own event, you should do that.
We just want to 107 00:06:34,600 --> 00:06:36,680 Speaker 1: make sure that everyone understands what's happening. And what 108 00:06:36,720 --> 00:06:38,799 Speaker 1: did they say, and what was their reaction to realizing 109 00:06:38,800 --> 00:06:41,279 Speaker 1: that their free will had been manipulated in that way? 110 00:06:41,480 --> 00:06:43,920 Speaker 1: If you look at reactions, it's a range of sort 111 00:06:43,960 --> 00:06:46,920 Speaker 1: of disbelief, right, I don't think this was what you're 112 00:06:46,920 --> 00:06:50,599 Speaker 1: saying it was, to, I can't believe this happened, to, Okay, 113 00:06:50,640 --> 00:06:52,520 Speaker 1: that happened, but I strongly believe in this, and I'm 114 00:06:52,560 --> 00:06:54,520 Speaker 1: gonna go and I'm going to advocate for my issues 115 00:06:54,560 --> 00:07:00,600 Speaker 1: somewhere else. That spectrum of difficulty makes it exactly clear why 116 00:07:00,800 --> 00:07:05,160 Speaker 1: we see actors use these techniques, because there are no 117 00:07:05,279 --> 00:07:08,320 Speaker 1: easy answers here. My assumption going into this was that 118 00:07:08,440 --> 00:07:12,560 Speaker 1: detecting misinformation would be the biggest challenge for Facebook, but 119 00:07:12,640 --> 00:07:15,680 Speaker 1: that's the easy part. It's after you identify the fakes 120 00:07:15,680 --> 00:07:19,560 Speaker 1: that the really tough questions begin. We know that, particularly 121 00:07:19,560 --> 00:07:22,720 Speaker 1: the government actors in this space, part of their information 122 00:07:22,760 --> 00:07:26,480 Speaker 1: dominance strategy is to make themselves appear bigger and more 123 00:07:26,480 --> 00:07:29,240 Speaker 1: powerful than they are. They want to seem like they're everywhere, 124 00:07:29,560 --> 00:07:33,760 Speaker 1: and it's really easy to see foreign government manipulation under 125 00:07:33,800 --> 00:07:37,040 Speaker 1: every rock. I think it's really important not to play 126 00:07:37,120 --> 00:07:41,120 Speaker 1: into the hands of these actors and sort of overplay 127 00:07:41,160 --> 00:07:43,760 Speaker 1: their own influence. This is a tension we struggle with. Whenever 128 00:07:43,840 --> 00:07:46,920 Speaker 1: we conduct a takedown of some of these operations, the 129 00:07:46,960 --> 00:07:49,480 Speaker 1: most attention it gets is when we take it down. 130 00:07:50,880 --> 00:07:54,040 Speaker 1: The entire situation puts Facebook in a catch twenty-two. 131 00:07:54,560 --> 00:07:57,200 Speaker 1: If they leave the content up, they're helping to promote 132 00:07:57,200 --> 00:08:00,840 Speaker 1: a foreign government's nefarious agenda. If they take it down, 133 00:08:00,960 --> 00:08:03,760 Speaker 1: the foreign government gets all this attention for being more 134 00:08:03,840 --> 00:08:07,800 Speaker 1: powerful and cleverer than they actually are. These decisions are 135 00:08:07,800 --> 00:08:12,640 Speaker 1: incredibly hard. Think of Charlottesville. Think of Pizzagate. Think 136 00:08:12,640 --> 00:08:15,800 Speaker 1: of Lane Davis, who stabbed his own father after an 137 00:08:15,840 --> 00:08:20,600 Speaker 1: argument over the conspiracy theory about liberal pedophiles. Fakes can kill.
138 00:08:21,200 --> 00:08:24,240 Speaker 1: And Facebook has recognized this. For a start, they hired 139 00:08:24,320 --> 00:08:27,120 Speaker 1: Nathaniel, a former cybercrime prosecutor at the U.S. 140 00:08:27,160 --> 00:08:30,680 Speaker 1: Department of Justice, and in March of this year, Mark 141 00:08:30,760 --> 00:08:35,320 Speaker 1: Zuckerberg announced a company-wide pivot towards privacy and encrypted messaging, 142 00:08:35,720 --> 00:08:40,520 Speaker 1: including services like WhatsApp, which they own. But David Kirkpatrick, 143 00:08:40,760 --> 00:08:44,520 Speaker 1: founder of Techonomy, notes that the pivot carries its own problems. 144 00:08:45,679 --> 00:08:48,400 Speaker 1: If you look at South Asia, where there's a lot 145 00:08:48,559 --> 00:08:54,400 Speaker 1: of ethnic discord and political violence, notably in India, Indonesia, 146 00:08:54,559 --> 00:08:59,080 Speaker 1: Myanmar, Sri Lanka, one of the primary ways that 147 00:08:59,080 --> 00:09:03,480 Speaker 1: that spreads is in group messages in WhatsApp. People in 148 00:09:03,520 --> 00:09:06,640 Speaker 1: the US don't typically use WhatsApp for group messages, but 149 00:09:06,720 --> 00:09:09,959 Speaker 1: in places like India and Indonesia they do. And these 150 00:09:09,960 --> 00:09:12,240 Speaker 1: groups aren't five or six people, your parents and 151 00:09:12,240 --> 00:09:14,880 Speaker 1: your brother and sister. These are like you've subscribed to 152 00:09:15,000 --> 00:09:19,560 Speaker 1: a political leader or a religious zealot. So this is 153 00:09:19,600 --> 00:09:22,640 Speaker 1: more like the dear leader being piped into your home, right. 154 00:09:22,720 --> 00:09:26,560 Speaker 1: So the problem has been almost more severe in those 155 00:09:26,920 --> 00:09:31,400 Speaker 1: systems than on Facebook itself, of fake news and ethnic 156 00:09:31,760 --> 00:09:38,120 Speaker 1: hatred being disseminated. Because WhatsApp is an encrypted service, so 157 00:09:38,280 --> 00:09:41,360 Speaker 1: the service itself can't even see what the messages are 158 00:09:41,400 --> 00:09:44,400 Speaker 1: that are being distributed. What's scary is it doesn't take 159 00:09:44,440 --> 00:09:47,840 Speaker 1: any technical sophistication or knowledge on the part of people 160 00:09:48,200 --> 00:09:53,240 Speaker 1: writing these messages, spreading this misinformation. They're just using WhatsApp. Yeah, 161 00:09:53,280 --> 00:09:56,559 Speaker 1: and these are just messaging apps and social media platforms. 162 00:09:56,600 --> 00:09:58,959 Speaker 1: But what they mean is that a single message can 163 00:09:59,000 --> 00:10:01,839 Speaker 1: spread like wildfire. And of course the history of new 164 00:10:01,880 --> 00:10:05,240 Speaker 1: communication technology tends to go hand in hand with violence. 165 00:10:05,559 --> 00:10:09,000 Speaker 1: When the printing press and books came to Europe, they 166 00:10:09,040 --> 00:10:12,439 Speaker 1: unleashed religious wars, but they also made the world literate. 167 00:10:12,520 --> 00:10:15,200 Speaker 1: And we've mentioned this before. Technology is usually dual use, 168 00:10:15,559 --> 00:10:18,160 Speaker 1: which relates back to deep fakes. Mostly when you read 169 00:10:18,200 --> 00:10:21,080 Speaker 1: about deep fakes, probably thanks in part to the fact 170 00:10:21,080 --> 00:10:24,320 Speaker 1: they're called deep fakes, the coverage is not very positive.
171 00:10:25,000 --> 00:10:27,520 Speaker 1: There have been more and more stories, though, about positive 172 00:10:27,600 --> 00:10:29,920 Speaker 1: uses for deep fakes. So when we come back, I'm 173 00:10:29,920 --> 00:10:31,559 Speaker 1: gonna tell you more about how I faked my own 174 00:10:31,640 --> 00:10:33,440 Speaker 1: voice and also some of the things that I learned 175 00:10:33,480 --> 00:10:44,360 Speaker 1: in the process. We started this conversation a few weeks ago, 176 00:10:44,960 --> 00:10:48,080 Speaker 1: and then you asked us to create this artificial voice 177 00:10:48,280 --> 00:10:52,040 Speaker 1: based on your identity. That's Jose Sotelo, the 178 00:10:52,080 --> 00:10:54,839 Speaker 1: co-founder of Lyrebird. They're the company who made 179 00:10:55,040 --> 00:10:57,880 Speaker 1: robot Kara and helped me prank my cousin, and they've 180 00:10:57,880 --> 00:11:01,479 Speaker 1: published a version of their tools online at lyrebird dot ai. 181 00:11:01,840 --> 00:11:04,640 Speaker 1: Here's how it works. I know it might sound a 182 00:11:04,640 --> 00:11:08,240 Speaker 1: bit like magic, but in reality, the way that our 183 00:11:08,240 --> 00:11:12,160 Speaker 1: algorithms work is basically they are just pattern-matching algorithms, 184 00:11:12,600 --> 00:11:17,880 Speaker 1: and so it's trying to figure out how to identify 185 00:11:18,040 --> 00:11:21,559 Speaker 1: the patterns in your voice by comparing it against thousands 186 00:11:21,640 --> 00:11:23,840 Speaker 1: of other voices, or I should say tens of thousands of 187 00:11:23,840 --> 00:11:27,040 Speaker 1: other voices, and trying to figure out what is it 188 00:11:27,080 --> 00:11:30,520 Speaker 1: that makes your voice unique. Once Jose's algorithms identified what 189 00:11:30,600 --> 00:11:33,880 Speaker 1: was unique about my voice relative to everything else, they had the 190 00:11:33,920 --> 00:11:36,640 Speaker 1: building blocks they needed to make a fake. Then we 191 00:11:36,679 --> 00:11:39,199 Speaker 1: sent Jose a set of sentences we wanted robot Kara 192 00:11:39,320 --> 00:11:42,040 Speaker 1: to say, and he used another set of algorithms to 193 00:11:42,120 --> 00:11:44,440 Speaker 1: turn the text into what we heard. The way they 194 00:11:44,440 --> 00:11:47,199 Speaker 1: do this is they use what's called a generative adversarial 195 00:11:47,360 --> 00:11:51,600 Speaker 1: network, or GAN, which is a system where one neural net 196 00:11:51,760 --> 00:11:54,559 Speaker 1: tries to trick another one a thousand times per second. 197 00:11:55,120 --> 00:11:58,040 Speaker 1: So each time the second network detects a fake, 198 00:11:58,440 --> 00:12:01,880 Speaker 1: the first one tries again. It basically learns from its mistakes, 199 00:12:01,880 --> 00:12:04,680 Speaker 1: and once it tricks its adversary, it's ready to show 200 00:12:04,720 --> 00:12:08,360 Speaker 1: its results. In our case, Lyrebird pits my fake 201 00:12:08,440 --> 00:12:11,199 Speaker 1: voice against my real voice until it sounds like this: 202 00:12:11,679 --> 00:12:17,199 Speaker 1: 'Sup, dawg, it's Kara. As this technology becomes more widely available, 203 00:12:17,559 --> 00:12:20,920 Speaker 1: so does the potential for abuse. And while Lyrebird develops 204 00:12:20,920 --> 00:12:24,439 Speaker 1: the technology, they don't take the ethics lightly. But Jose 205 00:12:24,600 --> 00:12:28,360 Speaker 1: has an entirely different fear.
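The adversarial training loop Jose describes can be made concrete with a small illustration. What follows is a minimal, hypothetical sketch in Python with PyTorch, not Lyrebird's actual system: the network sizes, the feature dimension, and the function name are stand-ins chosen only to show how a generator and a discriminator push against each other.

```python
# Minimal GAN training loop (illustrative only; not Lyrebird's code).
# The generator tries to produce voice-like feature frames; the
# discriminator (the "adversary") tries to tell real frames from fakes.
import torch
import torch.nn as nn

FEATURE_DIM = 80   # hypothetical: e.g. mel-spectrogram bins per frame
NOISE_DIM = 16

generator = nn.Sequential(
    nn.Linear(NOISE_DIM, 128), nn.ReLU(),
    nn.Linear(128, FEATURE_DIM),
)
discriminator = nn.Sequential(
    nn.Linear(FEATURE_DIM, 128), nn.ReLU(),
    nn.Linear(128, 1), nn.Sigmoid(),   # probability the input is real
)

g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCELoss()

def training_step(real_frames: torch.Tensor):
    batch = real_frames.size(0)
    fake_frames = generator(torch.randn(batch, NOISE_DIM))

    # 1) The discriminator learns to "detect the fake".
    d_opt.zero_grad()
    d_loss = bce(discriminator(real_frames), torch.ones(batch, 1)) + \
             bce(discriminator(fake_frames.detach()), torch.zeros(batch, 1))
    d_loss.backward()
    d_opt.step()

    # 2) The generator "tries again", learning from its mistakes:
    #    it is rewarded when the discriminator is fooled.
    g_opt.zero_grad()
    g_loss = bce(discriminator(fake_frames), torch.ones(batch, 1))
    g_loss.backward()
    g_opt.step()
    return d_loss.item(), g_loss.item()

# Example call with dummy data: training_step(torch.randn(32, FEATURE_DIM))
```

In a real voice-cloning pipeline the generator would also be conditioned on the text to be spoken and on an embedding of the target speaker, rather than on random noise alone. With that mechanism in mind, here is the fear Jose was referring to.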
We believe that the biggest 206 00:12:28,480 --> 00:12:31,760 Speaker 1: risk of this kind of technology comes from the fact 207 00:12:31,840 --> 00:12:35,120 Speaker 1: that not a lot of people know about it. I 208 00:12:35,160 --> 00:12:38,640 Speaker 1: believe that society is not ready for what's going to 209 00:12:38,679 --> 00:12:42,440 Speaker 1: happen when this technology becomes widespread, and so I really 210 00:12:42,480 --> 00:12:46,959 Speaker 1: want to make my best effort in trying to showcase 211 00:12:47,040 --> 00:12:49,440 Speaker 1: it to the public so that they are at least 212 00:12:49,440 --> 00:12:53,120 Speaker 1: prepared for what's coming. When people know a scheme exists, 213 00:12:53,160 --> 00:12:55,559 Speaker 1: they're less likely to be tricked by it. But if 214 00:12:55,559 --> 00:12:58,000 Speaker 1: you don't know deep fakes are possible, you're much more 215 00:12:58,000 --> 00:13:00,800 Speaker 1: likely to fall for them. Leslie might have been better equipped 216 00:13:00,840 --> 00:13:03,760 Speaker 1: to call my bluff had she known it was even possible. 217 00:13:04,160 --> 00:13:07,240 Speaker 1: But here's the thing: while there are inevitable misuses of 218 00:13:07,320 --> 00:13:10,400 Speaker 1: deep fakes, both behind us and on the horizon, there 219 00:13:10,400 --> 00:13:13,360 Speaker 1: are a number of extraordinary benefits of this technology, which 220 00:13:13,400 --> 00:13:15,760 Speaker 1: is why Jose is working on it. When people are 221 00:13:15,760 --> 00:13:18,040 Speaker 1: diagnosed with ALS, it's because they start to lose 222 00:13:18,120 --> 00:13:21,480 Speaker 1: their movement skills in, let's say, their hands or their feet, 223 00:13:21,679 --> 00:13:24,240 Speaker 1: and so they go to the doctor and then the 224 00:13:24,280 --> 00:13:26,679 Speaker 1: doctor tells them like, you know what, this could be 225 00:13:27,320 --> 00:13:31,559 Speaker 1: ALS, and this gets progressively worse. This was the case 226 00:13:31,600 --> 00:13:36,040 Speaker 1: for Pat Quinn, the co-founder of the Ice Bucket Challenge, 227 00:13:36,760 --> 00:13:41,319 Speaker 1: creating a real fight within the ALS community. This 228 00:13:41,559 --> 00:13:46,760 Speaker 1: is a public battle now. Pat was diagnosed with 229 00:13:46,920 --> 00:13:49,320 Speaker 1: ALS, and it ultimately took his ability to speak, 230 00:13:49,480 --> 00:13:52,560 Speaker 1: walk and use his hands. From the time they're 231 00:13:52,559 --> 00:13:56,120 Speaker 1: diagnosed until they lose their voice, they have some time, 232 00:13:56,800 --> 00:13:59,240 Speaker 1: and so the idea is that during this time they 233 00:13:59,280 --> 00:14:03,199 Speaker 1: will be able to record themselves, ideally in a really 234 00:14:03,240 --> 00:14:06,559 Speaker 1: high quality setting. Then based on these recordings, we will 235 00:14:06,600 --> 00:14:08,920 Speaker 1: be able to create an artificial copy of their voice 236 00:14:09,559 --> 00:14:12,120 Speaker 1: which they will be able to continue using for the 237 00:14:12,200 --> 00:14:15,000 Speaker 1: rest of their life. Lyrebird has partnered with the 238 00:14:15,000 --> 00:14:19,000 Speaker 1: ALS Association to create Project Revoice.
Just imagine 239 00:14:19,040 --> 00:14:22,840 Speaker 1: how it would feel for them, to, let's say, not 240 00:14:22,880 --> 00:14:25,760 Speaker 1: be able to tell their husband or their wife I 241 00:14:25,880 --> 00:14:28,560 Speaker 1: love you anymore, to tell this to their kids. And 242 00:14:28,640 --> 00:14:33,080 Speaker 1: so using this technology, they are able to keep this 243 00:14:33,280 --> 00:14:37,080 Speaker 1: really important part of their identities. Using the exact same 244 00:14:37,120 --> 00:14:40,200 Speaker 1: technology I used to create my deep fake, Lyrebird 245 00:14:40,320 --> 00:14:42,640 Speaker 1: was able to give Pat the ability to preserve his 246 00:14:42,720 --> 00:14:46,120 Speaker 1: voice for the rest of his life. It's a strange 247 00:14:46,160 --> 00:14:49,680 Speaker 1: feeling, saying the first words a second time. It's 248 00:14:49,800 --> 00:14:55,080 Speaker 1: like, you don't realize how powerful, how personal, your 249 00:14:55,200 --> 00:14:58,640 Speaker 1: voice really is until it's taken from you. My 250 00:14:58,800 --> 00:15:02,440 Speaker 1: voice is how I fight back against this very disease. Take 251 00:15:02,480 --> 00:15:08,400 Speaker 1: it from me: say something, listen to it, know your voice. 252 00:15:10,680 --> 00:15:13,840 Speaker 1: Since revoicing Pat, Lyrebird has received a number of emails 253 00:15:13,840 --> 00:15:16,640 Speaker 1: from ALS patients asking if it's possible for them to 254 00:15:16,680 --> 00:15:19,720 Speaker 1: do the same thing, preserve this part of themselves which 255 00:15:19,760 --> 00:15:22,440 Speaker 1: they know they're going to lose, and Jose has heard 256 00:15:22,440 --> 00:15:25,480 Speaker 1: from people who have lost family in other ways. For instance, 257 00:15:25,520 --> 00:15:28,680 Speaker 1: we have received, quite a lot actually, very emotional 258 00:15:28,760 --> 00:15:33,200 Speaker 1: emails from people telling some variation of this: My wife 259 00:15:33,240 --> 00:15:38,080 Speaker 1: died three months ago, and I have two children, age 260 00:15:38,200 --> 00:15:41,720 Speaker 1: four and six, and I would really, really love to 261 00:15:41,840 --> 00:15:44,200 Speaker 1: be able to tell them a goodnight story in 262 00:15:44,240 --> 00:15:46,480 Speaker 1: the voice of their mother, or to tell them, 263 00:15:46,520 --> 00:15:48,680 Speaker 1: in the mother's voice, I love you, I am proud 264 00:15:48,720 --> 00:15:53,560 Speaker 1: of you, be happy. The tools on lyrebird dot ai are 265 00:15:53,600 --> 00:15:57,080 Speaker 1: intentionally less advanced and meant to just spread awareness, but 266 00:15:57,240 --> 00:16:01,440 Speaker 1: Lyrebird's more bespoke tools open amazing possibilities for changing how 267 00:16:01,480 --> 00:16:03,960 Speaker 1: we deal with loss and grief. I would like to 268 00:16:03,960 --> 00:16:07,080 Speaker 1: ask you just one question, which is like, how would 269 00:16:07,120 --> 00:16:10,400 Speaker 1: you feel, let's say, about recording the voice of your 270 00:16:10,440 --> 00:16:13,200 Speaker 1: parents and keeping them? What do you think, would you 271 00:16:13,200 --> 00:16:14,720 Speaker 1: like to do this, or how do you feel 272 00:16:14,720 --> 00:16:17,480 Speaker 1: about that? It was interesting when Jose asked me, because 273 00:16:17,480 --> 00:16:19,880 Speaker 1: I had actually thought about it ever since I learned 274 00:16:19,880 --> 00:16:25,840 Speaker 1: about Lyrebird. When I was fifteen,
so fourteen years ago, 275 00:16:26,680 --> 00:16:31,160 Speaker 1: my dad died in a fatal car accident, and nobody 276 00:16:31,240 --> 00:16:34,000 Speaker 1: prepares for accidents, you know. One minute my dad walked 277 00:16:34,000 --> 00:16:37,080 Speaker 1: out the door, and forty-five minutes later the police 278 00:16:37,080 --> 00:16:39,280 Speaker 1: showed up at the same door to tell us what happened, 279 00:16:40,160 --> 00:16:43,960 Speaker 1: and so I never got to see or speak to 280 00:16:43,960 --> 00:16:49,280 Speaker 1: my dad ever again. Sometimes my therapist will ask me 281 00:16:49,360 --> 00:16:51,600 Speaker 1: if I think about what I would talk about with 282 00:16:51,640 --> 00:16:54,160 Speaker 1: my dad if he was still alive, and I always 283 00:16:54,160 --> 00:16:56,280 Speaker 1: say that, you know, I don't, I don't think about 284 00:16:56,320 --> 00:17:00,160 Speaker 1: that too much, because it's sad to think about, 285 00:17:00,200 --> 00:17:02,640 Speaker 1: because he's not actually around, and because I know 286 00:17:02,720 --> 00:17:05,639 Speaker 1: I can't talk to him. But it's also hard to 287 00:17:05,680 --> 00:17:08,560 Speaker 1: conceive of. You know, I can't recall off the top 288 00:17:08,600 --> 00:17:12,280 Speaker 1: of my head what he sounds like, and sometimes I'll 289 00:17:12,280 --> 00:17:14,359 Speaker 1: hear his voice when we watch home movies and it 290 00:17:14,359 --> 00:17:17,800 Speaker 1: always spooks me out. So the idea of having his 291 00:17:17,920 --> 00:17:21,520 Speaker 1: disembodied voice ask me things like, how do you like 292 00:17:21,600 --> 00:17:24,720 Speaker 1: working on this podcast? Or what's the most amazing thing 293 00:17:24,760 --> 00:17:28,119 Speaker 1: you've learned? Or even saying things like, I'm so proud 294 00:17:28,160 --> 00:17:31,040 Speaker 1: of you, do you know that? I'm not sure how 295 00:17:31,119 --> 00:17:35,600 Speaker 1: I'd react to his voice like that. Regardless, the thought 296 00:17:35,600 --> 00:17:38,280 Speaker 1: that it is something in the realm of possibility is 297 00:17:38,400 --> 00:17:43,760 Speaker 1: equal parts chilling and exciting. I actually think, given the chance, 298 00:17:43,960 --> 00:17:51,400 Speaker 1: I might do it. This is not a science fiction 299 00:17:51,760 --> 00:17:55,640 Speaker 1: thing or something that will exist years from now. It's 300 00:17:55,640 --> 00:17:58,600 Speaker 1: something that exists already that people can even go and use. 301 00:17:58,720 --> 00:18:02,359 Speaker 1: And as my cousin Leslie learned, these deep fakes are 302 00:18:02,359 --> 00:18:06,560 Speaker 1: already good enough to use on unsuspecting family members. Surprise! 303 00:18:07,800 --> 00:18:12,920 Speaker 1: This isn't Kara, this is artificial Kara. Oh my God, 304 00:18:13,280 --> 00:18:26,120 Speaker 1: my voice right now, it's AI. This is awful. When 305 00:18:26,119 --> 00:18:29,280 Speaker 1: we started reporting on deep fakes, I never anticipated how 306 00:18:29,359 --> 00:18:32,199 Speaker 1: moving the technology could be. I was more focused on 307 00:18:32,200 --> 00:18:35,560 Speaker 1: the dangers, and they are worth considering too. One person 308 00:18:35,640 --> 00:18:38,000 Speaker 1: who is out in front bringing awareness to the potential 309 00:18:38,119 --> 00:18:41,639 Speaker 1: harms of fake media is Danielle Citron.
She's a law 310 00:18:41,640 --> 00:18:44,159 Speaker 1: professor at the University of Maryland and the author of 311 00:18:44,200 --> 00:18:48,360 Speaker 1: Hate Crimes in Cyberspace. Machine learning technology and neural networks 312 00:18:48,400 --> 00:18:52,720 Speaker 1: can learn from your photos, and your voice that's taken from 313 00:18:52,760 --> 00:18:57,560 Speaker 1: recordings of your voice, can sufficiently learn enough about your 314 00:18:57,600 --> 00:19:00,760 Speaker 1: face and the way it moves and your voice 315 00:19:00,960 --> 00:19:05,120 Speaker 1: so that it can create really incredibly difficult-to-debunk 316 00:19:06,080 --> 00:19:09,000 Speaker 1: videos of you doing and saying things you never did. 317 00:19:10,240 --> 00:19:12,960 Speaker 1: Now we all know how dangerous the simple written word 318 00:19:13,040 --> 00:19:17,320 Speaker 1: can be. Danielle got interested in how fake video could 319 00:19:17,320 --> 00:19:21,400 Speaker 1: increase the forces of hate exponentially. There was a whole 320 00:19:21,440 --> 00:19:25,200 Speaker 1: Reddit thread devoted to deep fake sex videos of celebrities, 321 00:19:25,240 --> 00:19:28,919 Speaker 1: female celebrities like Emma Watson and Anne Hathaway and others. If 322 00:19:28,960 --> 00:19:31,440 Speaker 1: you went through the thread, which I did, you can 323 00:19:31,480 --> 00:19:35,280 Speaker 1: see the conversation moving beyond Emma Watson to my 324 00:19:35,320 --> 00:19:38,080 Speaker 1: ex-girlfriend or that woman I hated in high school, and 325 00:19:38,080 --> 00:19:40,439 Speaker 1: it was all this conversation about women. You 326 00:19:40,480 --> 00:19:42,959 Speaker 1: know, what I thought was, like, the evil of cyberstalking 327 00:19:43,080 --> 00:19:46,399 Speaker 1: was all based on crude doctored photos of someone naked, 328 00:19:46,520 --> 00:19:48,400 Speaker 1: but if you worked at it, you could figure it out. 329 00:19:48,920 --> 00:19:52,439 Speaker 1: Now we can put people into pornography in ways that 330 00:19:52,480 --> 00:19:56,520 Speaker 1: can devastate their careers. So, Kara, I do think it 331 00:19:56,560 --> 00:19:59,280 Speaker 1: says something that this new technology is being used to 332 00:19:59,280 --> 00:20:02,320 Speaker 1: target women. And a lot of these conversations are happening 333 00:20:02,320 --> 00:20:04,600 Speaker 1: on the same forums on Reddit where the incel 334 00:20:04,680 --> 00:20:07,359 Speaker 1: movement was born, right. So I think this is especially 335 00:20:07,440 --> 00:20:10,400 Speaker 1: important when we talk about famous women and their likeness. 336 00:20:11,440 --> 00:20:13,760 Speaker 1: A lot of men on the Internet want to see 337 00:20:13,760 --> 00:20:17,359 Speaker 1: their favorite actresses in positions that they wouldn't be able 338 00:20:17,400 --> 00:20:22,240 Speaker 1: to see those actresses in, and so with this technology, 339 00:20:22,320 --> 00:20:25,080 Speaker 1: it's quite easy to put someone's face on somebody else's 340 00:20:25,119 --> 00:20:29,040 Speaker 1: body without the consent of the actual actress. And actually SAG,
341 00:20:29,080 --> 00:20:32,440 Speaker 1: the Screen Actors Guild, held a panel a few weeks 342 00:20:32,440 --> 00:20:34,840 Speaker 1: ago to bring this up: that, like, yes, we're 343 00:20:34,880 --> 00:20:37,480 Speaker 1: talking about this in terms of democracy and our political 344 00:20:37,520 --> 00:20:39,679 Speaker 1: system and the upcoming election, but we also have to 345 00:20:39,680 --> 00:20:42,560 Speaker 1: talk about this in terms of the livelihood of women 346 00:20:42,720 --> 00:20:45,919 Speaker 1: who make money on their likeness and whose likeness is 347 00:20:45,960 --> 00:20:49,840 Speaker 1: now being misappropriated. Yeah, because it can destroy their careers 348 00:20:49,960 --> 00:20:53,200 Speaker 1: and silence them. There's actually a case in India where 349 00:20:53,280 --> 00:20:57,159 Speaker 1: people attempted to use deep fake pornography to intimidate and 350 00:20:57,359 --> 00:21:00,920 Speaker 1: silence a journalist called Rana Ayyub, and I spoke about 351 00:21:00,960 --> 00:21:05,600 Speaker 1: that case with Danielle. The Indian journalist had been 352 00:21:05,680 --> 00:21:09,960 Speaker 1: very critical of Hindu nationalist politics, and a deep 353 00:21:10,000 --> 00:21:13,639 Speaker 1: fake sex video sort of was spread basically to discredit 354 00:21:13,680 --> 00:21:19,040 Speaker 1: her, um, and spread through texting networks and went viral, 355 00:21:19,320 --> 00:21:23,000 Speaker 1: and she basically was devastated and went offline, stopped writing 356 00:21:23,560 --> 00:21:26,320 Speaker 1: for like three weeks. She's a journalist, this is what 357 00:21:26,440 --> 00:21:29,320 Speaker 1: she does for a living, right. So imagine that kind 358 00:21:29,320 --> 00:21:33,720 Speaker 1: of granular individual harm, um, and compare it with harm 359 00:21:33,920 --> 00:21:36,199 Speaker 1: to CEOs: the night before an IPO, a 360 00:21:36,240 --> 00:21:39,159 Speaker 1: deep fake is released that shows this person taking a 361 00:21:39,200 --> 00:21:41,320 Speaker 1: bribe or doing drugs or whatever, I'm making it up, 362 00:21:41,640 --> 00:21:45,000 Speaker 1: but that tanks the IPO. Right. This kind 363 00:21:45,040 --> 00:21:48,240 Speaker 1: of video manipulation used to be confined to places like Disney, 364 00:21:48,520 --> 00:21:51,320 Speaker 1: and the output was blockbuster movies that are fictional but 365 00:21:51,400 --> 00:21:55,240 Speaker 1: not fake. Now AI is being consumerized, and the tools 366 00:21:55,240 --> 00:21:59,000 Speaker 1: to create convincing video are spreading, and that means creating 367 00:21:59,000 --> 00:22:02,359 Speaker 1: the kind of chaos Danielle describes is also more and 368 00:22:02,440 --> 00:22:06,920 Speaker 1: more accessible. That threatens all of us. One person working 369 00:22:07,000 --> 00:22:10,040 Speaker 1: on the issue is Hany Farid of Dartmouth College, 370 00:22:10,119 --> 00:22:14,000 Speaker 1: who has been called the father of digital forensics. I'm 371 00:22:14,040 --> 00:22:18,000 Speaker 1: concerned that once we know you can create fake content, 372 00:22:18,640 --> 00:22:22,119 Speaker 1: there is nothing stopping anybody from saying that any video 373 00:22:22,240 --> 00:22:26,040 Speaker 1: is fake. Everybody has plausible deniability. So rewind to two 374 00:22:26,119 --> 00:22:29,600 Speaker 1: years ago, when the Access Hollywood tape came out of 375 00:22:29,640 --> 00:22:33,040 Speaker 1: President Trump saying what he does to women.
The response 376 00:22:33,040 --> 00:22:35,639 Speaker 1: from the campaign was not, this is fake. It was, 377 00:22:35,720 --> 00:22:39,160 Speaker 1: we apologize, this was locker room talk. They found ways 378 00:22:39,200 --> 00:22:43,840 Speaker 1: of trying to excuse it. If that was today, guaranteed 379 00:22:44,040 --> 00:22:45,640 Speaker 1: he would have said it was fake. And in fact, 380 00:22:45,680 --> 00:22:47,960 Speaker 1: a year ago, after having apologized for the 381 00:22:48,000 --> 00:22:51,160 Speaker 1: audio recording, he said it was fake. And so now 382 00:22:51,320 --> 00:22:55,400 Speaker 1: politicians have plausible deniability, and at a time when our 383 00:22:55,520 --> 00:22:58,600 Speaker 1: US president is demonizing the press and telling everybody that 384 00:22:58,640 --> 00:23:02,440 Speaker 1: you can't believe anything, that plausible deniability holds some weight. 385 00:23:02,560 --> 00:23:05,800 Speaker 1: And so I'm extremely concerned. Now, how do we distinguish 386 00:23:05,920 --> 00:23:08,040 Speaker 1: what's what? And that, I think, for a democracy is 387 00:23:08,040 --> 00:23:11,960 Speaker 1: going to be incredibly challenging. So when nothing is believable, 388 00:23:12,880 --> 00:23:15,680 Speaker 1: the mischief-doer can say it's a lie. Do you 389 00:23:15,680 --> 00:23:17,359 Speaker 1: know what I'm saying? Like the person who commits the 390 00:23:17,400 --> 00:23:21,720 Speaker 1: crime or does something and says something incriminatory can say, 391 00:23:21,840 --> 00:23:24,520 Speaker 1: that's a fake. So the more you educate people 392 00:23:24,560 --> 00:23:29,280 Speaker 1: about deep fakes, the more the evildoers can leverage that and say, well, 393 00:23:29,320 --> 00:23:34,520 Speaker 1: you can't believe anything, right. Danielle calls this the liar's dividend. 394 00:23:35,160 --> 00:23:38,080 Speaker 1: In a world where nothing can be trusted, everything can 395 00:23:38,080 --> 00:23:42,080 Speaker 1: be denied, and even documented bad deeds can be explained away. 396 00:23:42,119 --> 00:23:44,400 Speaker 1: This kind of thing is accelerated by deep fakes, though, 397 00:23:44,440 --> 00:23:46,919 Speaker 1: which is why I think there are some attempts to 398 00:23:47,040 --> 00:23:50,040 Speaker 1: correct it with law, with laws like the anti-deep 399 00:23:50,080 --> 00:23:54,880 Speaker 1: fakes law, or more formally the Malicious Deep Fake Prohibition Act 400 00:23:54,880 --> 00:23:58,280 Speaker 1: of 2018, which was introduced by this Republican senator from 401 00:23:58,320 --> 00:24:01,879 Speaker 1: Nebraska named Ben Sasse, and it basically aims to outlaw 402 00:24:02,080 --> 00:24:06,080 Speaker 1: fraud in connection with audiovisual records. But I don't 403 00:24:06,080 --> 00:24:08,520 Speaker 1: know if this law will pass. In any case, not 404 00:24:08,600 --> 00:24:11,199 Speaker 1: all deep fakes are malicious, and so we have to 405 00:24:11,200 --> 00:24:15,040 Speaker 1: be careful with laws which are too broad. As we 406 00:24:15,160 --> 00:24:17,880 Speaker 1: heard in your Lyrebird piece, there are some amazingly positive 407 00:24:17,880 --> 00:24:22,280 Speaker 1: applications of deep fake technology. Here's Hany Farid talking 408 00:24:22,320 --> 00:24:25,320 Speaker 1: about deep fakes and the movie business.
Can you imagine 409 00:24:25,320 --> 00:24:29,280 Speaker 1: a world where the actor can simply license their appearance 410 00:24:29,359 --> 00:24:30,920 Speaker 1: and they never have to show up on the set? 411 00:24:31,000 --> 00:24:32,560 Speaker 1: You say, look, here's a bunch of images of me. 412 00:24:32,800 --> 00:24:35,840 Speaker 1: Synthesize me doing whatever you want. I'm basically an animated 413 00:24:35,880 --> 00:24:38,399 Speaker 1: character for you. And then anybody can be in the movies. 414 00:24:38,440 --> 00:24:40,840 Speaker 1: You can imagine customized movies. Imagine I go to the 415 00:24:40,880 --> 00:24:42,760 Speaker 1: movie and say, look, I'd like to see this movie, 416 00:24:43,119 --> 00:24:45,480 Speaker 1: but with George Clooney and not Kevin Spacey in it. 417 00:24:45,800 --> 00:24:48,720 Speaker 1: Please synthesize that for me. Can we do that today 418 00:24:48,800 --> 00:24:52,200 Speaker 1: or tomorrow? No. But in theory, that is essentially where 419 00:24:52,200 --> 00:24:54,000 Speaker 1: we're going. So if you haven't seen, some 420 00:24:54,040 --> 00:24:56,040 Speaker 1: of these people are creating all these deep fake videos 421 00:24:56,040 --> 00:24:58,879 Speaker 1: of Nick Cage inserted into all these different movies. 422 00:24:59,400 --> 00:25:01,680 Speaker 1: And that's not a full-length movie, they're 423 00:25:01,680 --> 00:25:04,080 Speaker 1: doing it in clips, but that's essentially the trend, where 424 00:25:04,080 --> 00:25:05,840 Speaker 1: you can just put your favorite actor or actress into 425 00:25:05,880 --> 00:25:08,800 Speaker 1: whatever movie you want and just watch it. It's personalized movies. 426 00:25:09,800 --> 00:25:12,119 Speaker 1: I'm not gonna lie, I find it super weird that 427 00:25:12,200 --> 00:25:15,240 Speaker 1: Nicolas Cage has become the poster boy for having his 428 00:25:15,359 --> 00:25:18,520 Speaker 1: face deep-faked into various movies. I wonder, if you 429 00:25:18,560 --> 00:25:22,240 Speaker 1: actually asked Internet nerds why Nick Cage, what do you 430 00:25:22,240 --> 00:25:24,639 Speaker 1: think they would say? I have no idea. Well, he's kind 431 00:25:24,680 --> 00:25:26,760 Speaker 1: of already a meme, right? And he was 432 00:25:26,800 --> 00:25:29,480 Speaker 1: in Face/Off, where his face was switched with another 433 00:25:29,520 --> 00:25:31,520 Speaker 1: person's face, so he's always sort of been the poster 434 00:25:31,640 --> 00:25:34,840 Speaker 1: child for face swapping, you know. I think actually one 435 00:25:34,840 --> 00:25:37,360 Speaker 1: thing that I thought about is this idea of representation. 436 00:25:37,400 --> 00:25:40,040 Speaker 1: You know, if there's a movie or movies or series 437 00:25:40,119 --> 00:25:44,240 Speaker 1: like James Bond where the lead character has been historically white, 438 00:25:44,840 --> 00:25:48,840 Speaker 1: and you want to show your African American son James Bond, 439 00:25:49,280 --> 00:25:53,240 Speaker 1: it would be kind of cool to make James Bond black, right, 440 00:25:53,359 --> 00:25:55,880 Speaker 1: because then your child could be watching a movie where 441 00:25:55,960 --> 00:25:59,120 Speaker 1: James Bond looks like your child. Absolutely. And I think 442 00:25:59,160 --> 00:26:01,159 Speaker 1: one of the big problems in the movie business 443 00:26:01,160 --> 00:26:04,640 Speaker 1: and the media business in general is representation.
So more 444 00:26:04,680 --> 00:26:08,040 Speaker 1: people do have access to this technology now, but it 445 00:26:08,200 --> 00:26:11,720 Speaker 1: used to be that only a Hollywood special effects company 446 00:26:11,840 --> 00:26:15,600 Speaker 1: would have access to this technology. When you remove the gatekeepers, 447 00:26:15,920 --> 00:26:19,480 Speaker 1: you get these incredible explosions of culture, but you also 448 00:26:19,560 --> 00:26:22,760 Speaker 1: get real threats to the social fabric. And so in 449 00:26:22,760 --> 00:26:25,320 Speaker 1: the case of deep fakes, they're all very well 450 00:26:25,400 --> 00:26:28,119 Speaker 1: when they're labeled as fake or when we know they're fake, 451 00:26:28,640 --> 00:26:32,600 Speaker 1: but when they're posing as real, that's when we start 452 00:26:32,640 --> 00:26:35,879 Speaker 1: to be really under threat, I think, as a society. 453 00:26:35,920 --> 00:26:38,520 Speaker 1: But there are people working on this. As ever, it's cat 454 00:26:38,560 --> 00:26:41,199 Speaker 1: and mouse. When we come back, we'll talk about some 455 00:26:41,280 --> 00:26:50,159 Speaker 1: of the ways they're fighting back. When it comes to 456 00:26:50,200 --> 00:26:54,280 Speaker 1: deep fakes, Pandora's box is open, and as Jose argues, 457 00:26:54,320 --> 00:26:58,040 Speaker 1: there's no turning back the clock. The technology exists. So, 458 00:26:58,200 --> 00:27:01,040 Speaker 1: knowing deep fakes and fake news have become more sophisticated, 459 00:27:01,400 --> 00:27:04,240 Speaker 1: I wanted to find out how actual news organizations are 460 00:27:04,240 --> 00:27:07,200 Speaker 1: thinking about the problem. So I spoke with John Micklethwait, 461 00:27:07,440 --> 00:27:10,800 Speaker 1: editor-in-chief of Bloomberg News, and he actually started 462 00:27:10,960 --> 00:27:14,679 Speaker 1: by pointing out that fake news isn't new news. I 463 00:27:14,720 --> 00:27:16,840 Speaker 1: think that one crucial thing when you look at fake 464 00:27:16,920 --> 00:27:18,720 Speaker 1: news is that it's always been there. You know, the 465 00:27:18,760 --> 00:27:21,800 Speaker 1: first bit of fake news was the Trojan horse. Fake 466 00:27:21,880 --> 00:27:24,320 Speaker 1: news and propaganda have forever been some of the 467 00:27:24,440 --> 00:27:28,560 Speaker 1: more exotic weapons in global conflict. John points to another 468 00:27:28,600 --> 00:27:32,280 Speaker 1: example involving the famous British spy and author of James 469 00:27:32,359 --> 00:27:36,399 Speaker 1: Bond, Ian Fleming. Supposedly one of his great schemes was 470 00:27:36,480 --> 00:27:42,520 Speaker 1: to drop lots of jumbo-sized condoms over Germany, um, 471 00:27:42,520 --> 00:27:46,439 Speaker 1: and label them sort of British, small, on the outside, 472 00:27:46,520 --> 00:27:49,560 Speaker 1: with the aim, no doubt wrongly, 473 00:27:49,600 --> 00:27:53,639 Speaker 1: of destabilizing German manhood. My point is that there are 474 00:27:53,680 --> 00:27:56,600 Speaker 1: many, many ways in which you can do this. But 475 00:27:56,840 --> 00:28:00,159 Speaker 1: the most interesting thing to me about fake news is 476 00:28:00,160 --> 00:28:04,480 Speaker 1: that really in modern history it's tied very heavily to technology. 477 00:28:04,560 --> 00:28:06,840 Speaker 1: What tends to happen is a new technology comes along 478 00:28:07,400 --> 00:28:10,720 Speaker 1: which suddenly sets media free.
If we look to history 479 00:28:10,840 --> 00:28:13,920 Speaker 1: we can understand this moment better. We mentioned the early 480 00:28:14,000 --> 00:28:17,600 Speaker 1: printing press before and how it enabled explosions of ideology 481 00:28:17,680 --> 00:28:21,119 Speaker 1: and led to religious conflicts. Well, when the printing press 482 00:28:21,200 --> 00:28:24,520 Speaker 1: was industrialized in the nineteenth century, there was another fake 483 00:28:24,640 --> 00:28:28,160 Speaker 1: news boom. Go back to, I think, the beginning of the nineteenth century. 484 00:28:28,240 --> 00:28:30,399 Speaker 1: You have the invention of the steam press in London, 485 00:28:30,800 --> 00:28:33,720 Speaker 1: and what that does is it enables people to multiply by 486 00:28:33,760 --> 00:28:37,080 Speaker 1: ten the amount of paper that you can print. Suddenly, 487 00:28:37,080 --> 00:28:40,640 Speaker 1: all the way across Europe, and then in America, cheap 488 00:28:40,680 --> 00:28:44,120 Speaker 1: newspapers start springing up. Because you can distribute far more, 489 00:28:44,200 --> 00:28:47,280 Speaker 1: you can reach far more people, far more quickly. And 490 00:28:47,320 --> 00:28:49,560 Speaker 1: the most notorious of these was The New York Sun, 491 00:28:49,640 --> 00:28:53,040 Speaker 1: at one time, I think, the world's biggest-selling paper, 492 00:28:53,240 --> 00:28:56,360 Speaker 1: run by a man called Benjamin Day, and he would run some 493 00:28:56,480 --> 00:28:59,160 Speaker 1: stories like the moon was populated by people who were 494 00:28:59,160 --> 00:29:03,280 Speaker 1: half human, half bat. But what happened, and I think 495 00:29:03,280 --> 00:29:07,000 Speaker 1: this will happen again, is that consumers said, we don't 496 00:29:07,000 --> 00:29:09,680 Speaker 1: want to read that, we need facts. And so if 497 00:29:09,680 --> 00:29:12,600 Speaker 1: you look back at many of the big newspapers of 498 00:29:12,640 --> 00:29:14,840 Speaker 1: our time, the New York Times, The Economist, where I 499 00:29:14,920 --> 00:29:17,440 Speaker 1: used to work, many of these things came from that 500 00:29:17,480 --> 00:29:21,160 Speaker 1: particular period, because people paid more to get things they trusted. Well, 501 00:29:21,200 --> 00:29:24,520 Speaker 1: that is definitely happening again. In other words, most of 502 00:29:24,560 --> 00:29:28,280 Speaker 1: the high quality press today, the New York Times, The Economist, 503 00:29:28,440 --> 00:29:33,120 Speaker 1: which John also edited, came from consumer demand for trustworthy information. 504 00:29:33,760 --> 00:29:36,280 Speaker 1: And that same consumer demand may help us out of 505 00:29:36,280 --> 00:29:40,480 Speaker 1: today's predicament. But there is one key difference. Now we 506 00:29:40,560 --> 00:29:43,360 Speaker 1: have deep fakes. It's worth a lot of money to 507 00:29:43,400 --> 00:29:45,000 Speaker 1: a lot of people to try to fool us. So 508 00:29:45,080 --> 00:29:47,320 Speaker 1: you look at things like Twitter handles that aren't quite 509 00:29:47,320 --> 00:29:50,800 Speaker 1: the same, some mixture between humans and computers. You get 510 00:29:50,800 --> 00:29:54,760 Speaker 1: used to dealing with those. What is harder at the moment 511 00:29:54,880 --> 00:29:57,600 Speaker 1: is video. So to give you an example, I think 512 00:29:57,600 --> 00:29:59,160 Speaker 1: a year or so ago, there was an attack in 513 00:29:59,240 --> 00:30:01,840 Speaker 1: a subway in New York.
We could verify really quite 514 00:30:01,920 --> 00:30:05,680 Speaker 1: quickly that the subway attack had happened, but almost immediately 515 00:30:05,720 --> 00:30:08,280 Speaker 1: there was a picture on Twitter of one of the 516 00:30:08,320 --> 00:30:11,960 Speaker 1: alleged assailants lying in a pool of blood. Now, trying 517 00:30:12,000 --> 00:30:15,440 Speaker 1: to verify that that was true was much harder, and 518 00:30:15,480 --> 00:30:17,520 Speaker 1: it came down to things like working out whether that 519 00:30:17,600 --> 00:30:20,480 Speaker 1: was the correct subway floor. You can look at pixels, 520 00:30:20,560 --> 00:30:22,960 Speaker 1: you can look at all those different things. But yes, 521 00:30:23,200 --> 00:30:26,480 Speaker 1: verifying video is often harder than verifying facts. Do you 522 00:30:26,520 --> 00:30:29,440 Speaker 1: have any tools or technologies that you're licensing or spending 523 00:30:29,440 --> 00:30:31,440 Speaker 1: money on to do it? We spend a lot of 524 00:30:31,440 --> 00:30:34,040 Speaker 1: money on technology across all these fronts. With more and 525 00:30:34,080 --> 00:30:37,880 Speaker 1: more news coming directly from social media, large news organizations 526 00:30:37,920 --> 00:30:41,320 Speaker 1: like Bloomberg News need to be able to verify which 527 00:30:41,320 --> 00:30:44,760 Speaker 1: photos and videos are real and whether they actually relate 528 00:30:44,800 --> 00:30:47,680 Speaker 1: to the events they're investigating, which is why Hany 529 00:30:47,800 --> 00:30:51,440 Speaker 1: Farid is in such high demand. Suddenly, the need to 530 00:30:51,520 --> 00:30:56,520 Speaker 1: authenticate content has really global implications. Everything from our courts 531 00:30:56,640 --> 00:31:00,760 Speaker 1: to our national security, to our democratic elections, to citizens' 532 00:31:00,760 --> 00:31:03,480 Speaker 1: safety is starting to rely on our ability to tell 533 00:31:03,560 --> 00:31:05,640 Speaker 1: the real from the fake. And so I think this 534 00:31:05,760 --> 00:31:08,760 Speaker 1: field of forensics, this field of authentication, has never been 535 00:31:08,800 --> 00:31:11,959 Speaker 1: more important. And that's what Hany spends his days working 536 00:31:11,960 --> 00:31:15,920 Speaker 1: on at Dartmouth. He develops techniques to analyze and authenticate 537 00:31:16,040 --> 00:31:19,960 Speaker 1: digital media. Ahead of the elections, he's working on what 538 00:31:20,080 --> 00:31:23,920 Speaker 1: he calls a soft biometric tool to detect fake videos 539 00:31:24,000 --> 00:31:29,200 Speaker 1: of specific politicians, such as Bernie Sanders, Elizabeth Warren, and 540 00:31:29,280 --> 00:31:32,240 Speaker 1: Donald Trump. Um, I would say the game is going 541 00:31:32,320 --> 00:31:36,000 Speaker 1: to be that we never eliminate the ability to create 542 00:31:36,040 --> 00:31:38,960 Speaker 1: fake content, but what we do is we raise the bar. 543 00:31:39,520 --> 00:31:41,320 Speaker 1: We take it out of the hands of the amateurs, 544 00:31:41,320 --> 00:31:43,440 Speaker 1: we take it out of the hands of the average 545 00:31:43,480 --> 00:31:46,600 Speaker 1: person downloading some code, and we make it more difficult, 546 00:31:46,640 --> 00:31:48,920 Speaker 1: more time consuming, and more risky. And this is the 547 00:31:48,960 --> 00:31:52,040 Speaker 1: same thing that we do with counterfeit currency.
You can 548 00:31:52,080 --> 00:31:55,080 Speaker 1: still create counterfeit currency today, but it's really hard, still 549 00:31:55,120 --> 00:31:57,880 Speaker 1: a risk, but it's a more manageable risk. On the 550 00:31:57,920 --> 00:32:00,960 Speaker 1: subject of money, there are digital currencies which are much 551 00:32:00,960 --> 00:32:04,120 Speaker 1: more difficult to counterfeit than coins and banknotes. You've heard 552 00:32:04,120 --> 00:32:07,640 Speaker 1: of Bitcoin and Ethereum, which are enabled by blockchain, a 553 00:32:07,760 --> 00:32:12,080 Speaker 1: so-called distributed ledger. Information about transactions is shared between 554 00:32:12,120 --> 00:32:15,800 Speaker 1: all the users of the currency, rather than authenticated and 555 00:32:15,880 --> 00:32:19,680 Speaker 1: guarded by a bank. Sharing this kind of information across 556 00:32:19,680 --> 00:32:23,160 Speaker 1: a crowd of people with multiple backup copies has a 557 00:32:23,320 --> 00:32:26,480 Speaker 1: range of uses. One thing Hany is looking at is 558 00:32:26,600 --> 00:32:31,600 Speaker 1: using blockchain to authenticate images and videos at source. We're 559 00:32:31,640 --> 00:32:35,320 Speaker 1: gonna start seeing, um, the use of a different type 560 00:32:35,320 --> 00:32:38,000 Speaker 1: of camera. So there are now companies out there that 561 00:32:38,120 --> 00:32:42,040 Speaker 1: create what are called secure imaging pipelines, and so when 562 00:32:42,080 --> 00:32:46,400 Speaker 1: you record an image or video, they extract a unique 563 00:32:46,440 --> 00:32:50,120 Speaker 1: signature from that content, they cryptographically sign it, and they 564 00:32:50,160 --> 00:32:53,320 Speaker 1: put that on the blockchain. So that's basically a distributed 565 00:32:53,400 --> 00:32:56,400 Speaker 1: ledger that's very, very hard, if not impossible, to manipulate. 566 00:32:56,720 --> 00:32:59,719 Speaker 1: Perhaps staying ahead of the perpetrators and making fakes more 567 00:32:59,760 --> 00:33:02,400 Speaker 1: difficult is the best we can do. But 568 00:33:02,520 --> 00:33:06,080 Speaker 1: what about our usage? How much responsibility do we have 569 00:33:06,240 --> 00:33:09,840 Speaker 1: to navigate the web thoughtfully? And how much responsibility should 570 00:33:09,840 --> 00:33:13,080 Speaker 1: be on the platforms? We have Facebook, Twitter, we have 571 00:33:13,200 --> 00:33:17,760 Speaker 1: Yelp because they're not responsible for user-generated content. What's 572 00:33:17,800 --> 00:33:22,040 Speaker 1: interesting is that, like Nathaniel at Facebook, Danielle also sees 573 00:33:22,160 --> 00:33:26,320 Speaker 1: risks in overzealous moderation. If you put too much 574 00:33:26,560 --> 00:33:31,480 Speaker 1: responsibility on the platform, you will likely incentivize over-censorship. 575 00:33:31,840 --> 00:33:33,880 Speaker 1: So all the great things that we think about a 576 00:33:33,920 --> 00:33:37,080 Speaker 1: lot of these platforms, and especially the social media, the 577 00:33:37,160 --> 00:33:40,080 Speaker 1: Parkland survivors or Black Lives Matter, right, we don't want 578 00:33:40,080 --> 00:33:43,800 Speaker 1: to lose the facility and new enablements for organizing and speech.
579 00:33:44,280 --> 00:33:46,520 Speaker 1: So if you put too much liability on the platforms, 580 00:33:46,520 --> 00:33:50,440 Speaker 1: they're going to overreact to anything anyone complains about and 581 00:33:50,480 --> 00:33:53,840 Speaker 1: have very aggressive filters. So we might very well miss 582 00:33:53,880 --> 00:33:57,040 Speaker 1: Black Lives Matter, we might not have Parkland and never 583 00:33:57,200 --> 00:34:00,880 Speaker 1: see it, because you're gonna have overly aggressive censorship. 584 00:34:01,120 --> 00:34:05,240 Speaker 1: Here's Nathaniel again. Whenever people come together in a new medium, 585 00:34:05,400 --> 00:34:07,320 Speaker 1: you're going to have people that try to manipulate and 586 00:34:07,320 --> 00:34:09,720 Speaker 1: try to take advantage. I think one of the things 587 00:34:09,760 --> 00:34:12,440 Speaker 1: that's really fundamentally true of what we have done, when we 588 00:34:12,480 --> 00:34:15,799 Speaker 1: think about the Internet generally and social media as well, is 589 00:34:15,840 --> 00:34:18,799 Speaker 1: we've removed some of the traditional gatekeeping mechanisms that have 590 00:34:18,840 --> 00:34:22,000 Speaker 1: existed in the past, and that has meant that far 591 00:34:22,080 --> 00:34:25,040 Speaker 1: more people could engage, much more quickly and much more 592 00:34:25,120 --> 00:34:27,120 Speaker 1: vocally than ever before, and that has led to some 593 00:34:27,200 --> 00:34:29,560 Speaker 1: incredible things. If you think about the Me Too movement, 594 00:34:29,800 --> 00:34:32,120 Speaker 1: really, part of what drives it and enables it 595 00:34:32,160 --> 00:34:34,520 Speaker 1: is the ability to route around some of those gatekeepers, right? 596 00:34:35,120 --> 00:34:37,359 Speaker 1: But at the same time, you're also going to see 597 00:34:37,400 --> 00:34:39,520 Speaker 1: malicious actors try to misuse that. I think that is 598 00:34:39,520 --> 00:34:42,720 Speaker 1: a fundamental truth for any form of media. The question 599 00:34:42,840 --> 00:34:48,200 Speaker 1: is how do you enable authentic engagement while making the 600 00:34:48,239 --> 00:34:53,160 Speaker 1: types of manipulation that we see more difficult. If Facebook 601 00:34:53,200 --> 00:34:58,360 Speaker 1: and other platforms are too destructive of society, ultimately everyone loses, 602 00:34:58,840 --> 00:35:02,440 Speaker 1: even the technology companies and their shareholders. So how 603 00:35:02,440 --> 00:35:06,359 Speaker 1: do we move from understanding that to finding solutions? Here's 604 00:35:06,440 --> 00:35:12,439 Speaker 1: David Kirkpatrick again. If we are going to retain democracy, 605 00:35:12,960 --> 00:35:19,320 Speaker 1: we need technical systems, digital systems, technologies that more effectively, 606 00:35:19,719 --> 00:35:25,160 Speaker 1: persuasively, and compellingly distribute knowledge, so that we have 607 00:35:25,280 --> 00:35:30,040 Speaker 1: citizens that are capable of functioning in a democratic landscape 608 00:35:30,120 --> 00:35:35,440 Speaker 1: that is more complex, more rapidly changing, and ultimately more global. 609 00:35:35,719 --> 00:35:38,680 Speaker 1: And as far as Hany Farid is concerned, this has 610 00:35:38,719 --> 00:35:42,040 Speaker 1: become everyone's problem, so we all have a part to 611 00:35:42,120 --> 00:35:44,560 Speaker 1: play in solving it. I think two things are gonna 612 00:35:44,560 --> 00:35:46,840 Speaker 1: have to change.
So one is the technology to authenticate 613 00:35:46,840 --> 00:35:48,439 Speaker 1: it is going to have to get better. So whether 614 00:35:48,480 --> 00:35:50,680 Speaker 1: that's authenticating at the source or the types of things 615 00:35:50,680 --> 00:35:54,200 Speaker 1: that I do with authenticating content and operating that at scale, 616 00:35:54,280 --> 00:35:56,000 Speaker 1: that's going to have to get better. But I think 617 00:35:56,000 --> 00:35:58,120 Speaker 1: what's also going to have to change is how we, 618 00:35:58,239 --> 00:36:01,360 Speaker 1: as consumers of digital content, think about what we see. 619 00:36:01,680 --> 00:36:04,719 Speaker 1: We are going to have to become more critical, more reasoned. 620 00:36:05,000 --> 00:36:07,239 Speaker 1: We have to get out of our echo chambers. We 621 00:36:07,320 --> 00:36:09,960 Speaker 1: have to stop allowing social media to manipulate us in 622 00:36:09,960 --> 00:36:12,040 Speaker 1: the way that they do. So I think the solution 623 00:36:12,120 --> 00:36:14,520 Speaker 1: is at least two-pronged, and potentially three, with some 624 00:36:14,680 --> 00:36:17,960 Speaker 1: legislative relief down the line to really force the companies 625 00:36:18,000 --> 00:36:19,879 Speaker 1: to do better than they have been over the last 626 00:36:19,920 --> 00:36:23,000 Speaker 1: few years. So does the good outweigh the bad? I 627 00:36:23,040 --> 00:36:25,840 Speaker 1: don't know. We have to have a hard conversation. People 628 00:36:25,880 --> 00:36:29,680 Speaker 1: who work in infectious disease and physicists who develop weaponry, 629 00:36:30,000 --> 00:36:32,640 Speaker 1: they think about this all the time. We as technologists 630 00:36:32,719 --> 00:36:34,799 Speaker 1: have not quite thought about this as much in the 631 00:36:34,800 --> 00:36:37,359 Speaker 1: past because our field is so young. But I think 632 00:36:37,400 --> 00:36:39,759 Speaker 1: now, you know, it's time to wake up and start 633 00:36:39,840 --> 00:36:42,680 Speaker 1: asking those hard questions and having those conversations before it's 634 00:36:42,680 --> 00:36:47,920 Speaker 1: too late. Once again, we're being urged to wake up 635 00:36:47,960 --> 00:36:50,840 Speaker 1: from our sleepwalk, and we do have some answers: 636 00:36:50,840 --> 00:36:53,560 Speaker 1: at least when it comes to deep fakes, we 637 00:36:53,600 --> 00:36:56,560 Speaker 1: can make it akin to counterfeiting money. The people who 638 00:36:56,560 --> 00:36:59,320 Speaker 1: do it will get prosecuted, and programmers like Hany 639 00:36:59,400 --> 00:37:02,759 Speaker 1: will work on detection technology, but we still have to 640 00:37:02,800 --> 00:37:05,400 Speaker 1: hold the bills up to the light before we decide 641 00:37:05,440 --> 00:37:08,920 Speaker 1: whether to accept them. That's our job, that is, if 642 00:37:08,920 --> 00:37:12,040 Speaker 1: we're not too busy watching Nicolas Cage starring as Thelma 643 00:37:12,120 --> 00:37:16,719 Speaker 1: and Louise in Thelma and Louise. Even more complicated than 644 00:37:16,760 --> 00:37:20,600 Speaker 1: deep fakes is the concentration of power at companies like Facebook.
645 00:37:21,239 --> 00:37:23,560 Speaker 1: In the next episode, we visit a secret lab at 646 00:37:23,600 --> 00:37:27,800 Speaker 1: Google to understand what happens when technology companies start taking 647 00:37:27,800 --> 00:37:30,120 Speaker 1: on the role of the state, and we speak with 648 00:37:30,200 --> 00:37:33,400 Speaker 1: Lina Khan, who has proposed new regulation to balance the 649 00:37:33,440 --> 00:37:38,279 Speaker 1: power of big technology companies like Amazon. I'm Oz Woloshyn. See 650 00:37:38,280 --> 00:37:53,399 Speaker 1: you next time. Sleepwalkers is a production of I Heart 651 00:37:53,480 --> 00:37:58,120 Speaker 1: Radio and Unusual Productions. For the latest AI news, live interviews, 652 00:37:58,160 --> 00:38:01,120 Speaker 1: and behind-the-scenes footage, find us on Instagram at 653 00:38:01,120 --> 00:38:06,080 Speaker 1: Sleepwalkers Podcast, or at Sleepwalkers Podcast dot com. Sleepwalkers is 654 00:38:06,120 --> 00:38:09,440 Speaker 1: hosted by me, Oz Woloshyn, and co-hosted by me, Kara Price. 655 00:38:09,600 --> 00:38:12,560 Speaker 1: We're produced by Julian Weller, with help from Jacobo Penzo 656 00:38:12,680 --> 00:38:16,120 Speaker 1: and Taylor Chacogne. Mixing by Tristan McNeil and Julian Weller. 657 00:38:16,400 --> 00:38:20,120 Speaker 1: Recording assistance this episode from tofarrelf. Our story editor is 658 00:38:20,160 --> 00:38:24,640 Speaker 1: Matthew Riddle. Sleepwalkers is executive produced by me, Oz Woloshyn, and 659 00:38:24,719 --> 00:38:28,120 Speaker 1: Mangesh Hattikudur. For more podcasts from I Heart Radio, visit 660 00:38:28,160 --> 00:38:30,920 Speaker 1: the I Heart Radio app, Apple Podcasts, or wherever you 661 00:38:31,000 --> 00:38:32,240 Speaker 1: listen to your favorite shows.