Noah Feldman: Pushkin. From Pushkin Industries, this is Deep Background, the show where we explore the stories behind the stories in the news. I'm Noah Feldman. We used to say seeing is believing, but that was before the invention of deep fakes. Increasingly affordable technology can actually create manufactured audio and video that no human eye or ear will be able to distinguish from the real thing. Does that mean we can't trust our eyes anymore, that we'll have to develop a pervasive sense of insecurity about what's actually true? Or could it open the door to something a little healthier, like a modest skepticism, or a world where we will be more cautious about what we believe? To discuss these hard questions, I'm joined by Danielle Citron, a professor at Boston University School of Law. Danielle is the vice president of the Cyber Civil Rights Initiative, which is a nonprofit devoted to the protection of civil rights and liberties in the digital age. She works very closely with lawmakers, law enforcers, and big social media platforms. She's written extensively about deep fakes, and in June twenty nineteen she testified before the House Intelligence Committee on the subject. Danielle, I'm just so thrilled that you're here to talk to us about deep fakes. So let's just start with some definitions for those who are just encountering deep fakes for the first time over the last few months. What counts as a deep fake in your definition?

Danielle Citron: Deep fakes involve machine learning technology that manipulates, or fabricates out of sort of digital whole cloth, video and audio recordings that show people doing and saying things that they never did or said. So they seem really authentic and realistic, but they're totally false.

Noah Feldman: And they look really good, don't they?

Danielle Citron: You know, right now the technology is still pretty much developing, so they look good. They're not perfect.
If you look closely enough, you can sometimes see some of the flaws, even just with the human eye, and technologists can easily identify, for the most part, that there are some flaws that show it's not real. What's going to happen in the next six to nine months is that the technology is advancing so rapidly that soon enough, technologists think, it's going to be really hard, if not impossible, to detect or discern the difference between what's fake and what's real. And that's when we're going to run into serious trouble.

Noah Feldman: Yeah, I mean, when that happens, that means that any public figure, or not just a public figure, any person, could be depicted in a relatively convincing-looking video clip as saying or doing something that they never said or did. So we have to, one way or another, start preparing ourselves for that world, I guess. Is there any credible, realistic way just to ban this technology? I mean, could this just be a piece of technology that was flat out prohibited? I mean, I'm not allowed to have a surface-to-air missile in my backyard, so why do I have to be able to have this on my computer?

Danielle Citron: Right. No, it's a great question about, you know, the difference between single-use technologies, whose only function is essentially illegality in your hands, right, if you had a surface-to-air missile. Deep fakes are different, because, you know, we can make deep fakes for good. We can make deep fakes to create art. You know, there's a deep fake of President Obama teaching us about the phenomenon of deep fakes: "We're entering an era in which our enemies can make it look like anyone is saying anything at any point in time, even if they would never say those things." And so you can imagine so many interesting ways that deep fakes can help teach history. There's a man who has ALS, and he sort of spearheaded the use of the technology to create audio that matches his thoughts, so that others can hear him speak, because he's unable to speak now.
And there's a story about a man who's a quadriplegic who can't make love to his wife, and they used photographs of him to create a deep fake sex video of him, consensually, and his wife, so they can see themselves making love. So there are all sorts of ways that deep fakes can be pro-social. And so, you know, the question is, should we ban the technology? And I would say we're going to have some problems with banning it for all sorts of practical reasons, like the cat's out of the bag. I don't think we can ban it, and I don't think we should ban it.

Noah Feldman: It sounds like, in the world that you're describing, we get very quickly to a category that you've written a lot about, and that's a very complicated and contentious topic, and that's the topic of consent. You know, in the case of the person suffering from ALS that you mentioned, it's not just that they could create a video, but I suppose they could create an avatar, and then the man's wife could consensually have virtual sex with him, and they would be happy about that, and we would perceive that as a pro-social result. The problem arises when it's an image of a person taken without his or her consent. So are we, at least in this realm, the realm of, let's call it, sex and deep fakes, before we get to politics and deep fakes and other kinds of deep fakes, is this a realm where we need to fall back on the familiar, albeit imperfect, rules that we're used to using to regulate sexual conduct? I mean, sexual conduct itself is something that we see as pro-social, except if it's not consensual, then it's very much antisocial and it's rape and it's a crime.

Danielle Citron: It's true that with deep fakes, as with real-life physical interactions, so often where the rubber hits the road, and where it's legally and normatively significant, is where the person does not consent.
It's much in the way that we think of sexual interactions that are physical between two individuals, or three, or however many. When we're talking about video and audio, much as the way there's a big difference between porn and obscenity, right, or porn and non-porn, it all has to do with consent. So, pornography? Go for it, if that's what you want to do. But if you are made into a sex object without your consent, then, by my lights, it's something that we can regulate and should regulate.

Noah Feldman: So obviously this is an important and significant developing area of law, and I want to ask you what I think is the natural question about that, which is: how do you draw the line between forms of fiction that have always implicated taking a person's image or their identity, and which historically we've treated as protected by freedom of expression, from what you're setting out to prohibit? So Curtis Sittenfeld writes a novel, a pretty good novel, about Laura Bush, the former First Lady, and, you know, she's more or less appropriated her whole character for the purposes of the novel. And it's not a pornographic novel by any stretch of the imagination. But such a novel could include sex or other forms of intimacy, depicted literarily, and we think of that as protected speech in some way. How do you think about drawing that kind of line?

Danielle Citron: So, a few things. The first is that audio and video are different from text in important ways. Audio and video have this visceral hold on us. We assume that they're true because we trust what our eyes and ears are telling us, whereas how human beings respond to text is mediated in many respects through our thoughts. And so, unlike text, we react to audio and video in a way that bypasses thinking. We just immediately assume it is true. It has a different kind of impact than text does.
So I think that's the first difference between the novel, which imagines Laura Bush doing and saying things, versus video that represents to the world that it actually happened. Right? One is clearly fiction. It has to be mediated by our thoughts. We recognize it's being conveyed as fiction. And by contrast, audio and video in the deep fake context (and I'm not talking about audio and video that is clearly parody and that is being used in ways that make that clear; let's pretend there's a label on it that says this is a parody), we're talking about audio and video that's designed to hide the fact that it's false. It's being passed off as real, and the way we human beings perceive audio and video is as if it is a true representation of things that have happened in the world, as if it's real evidence. So that is, I think, one really important piece of why digital impersonations using audio and video not only have a powerful impact and therefore can have significant harms; it's that the creators are passing them off as real rather than as fiction. We see this play out with defamation. We can be as opinionated as we want to be, and everyone knows it's what we think. But when it comes to factual falsehoods that we're saying are true, we can punish those. So that's another piece of it, why there's a big difference between the deep fake that's passing itself off as real.
Soon we won't even be able to tell the difference between fake and real unless we do some really serious journalistic digging into where someone was at a particular time, versus the novel.

Noah Feldman: Danielle, there are several really fascinating things in what you just said, and I want to break it up into pieces, if we could. So I want to start with this question of the disclaimer, or how the information is presented. On the way you were describing it now, it sounds like, though, if someone produces a video where they take my image and then they do something with me that I don't want to have done, you know, they put me into a sex video or something like that, if they put a little note on it that says, disclaimer, this is a deep fake, it's not Feldman, then in principle it sounds like it maybe wouldn't be covered by the prohibitions that you're talking about. And my instinct is that that wouldn't be very satisfactory. If our concern is about the appropriation of my identity, then a disclaimer, even if it's a chyron running the whole time that the video is playing, I don't think it's going to make me feel that much better about the appropriation of my image, is it?

Danielle Citron: So, yeah, let me back up for a second. I was using defamation and the kinds of defamation harms, harms to reputation, that may be alleviated with labeling. And so the question is a good one, pressing me: would disclosure in and of itself be a remedy here? And I always think of it as kind of a half measure, and it doesn't remedy the invasion to, let's say, for example, sexual privacy. So when you take someone (the example that you gave, Noah, is one that I would immediately use as well), which is, you take someone's sexual identity and you insert their face into pornography.
And what you've done is you've appropriated their ability to express themselves sexually, in ways that they do not want to be seen or depicted. And that in itself is a harm. It's a harm that resonates both with autonomy and with the consequential emotional harms. And so you're right that even with the label there, in the case of an invasion of sexual privacy, there's harm to reputation, which may be somewhat solved by a label like, hey, this isn't real, friends, I'm not passing this off as a fact; that's the harm to reputation in defamation. But there are other harms, and that includes invasions of sexual privacy that labels cannot cure. And as well, if it's in a Google search of your name, a label ain't doing you any good, probably, because people won't see it, and there isn't a meaningful way to respond; given how we respond to this kind of video and audio, people are going to believe it's you.

Noah Feldman: That's the second thing that I really wanted to ask you about in your fascinating comment before, and that is this idea that audio and video are somehow different because we really believe something about them. And here, too, I think there are maybe two different parts. One is the idea that maybe there's something almost neurochemical about viewing audio and video that's different from other forms of representation. And I wonder if you really believe that. I'm not sure I believe that. I mean, if I watch a Jason Bourne movie, I know it's not real even though it's got audio and video in it, because of the context in which I'm experiencing it. So we're perfectly capable of being skeptical about truth in the cases of audio and video. And I'm not completely convinced that if you put us under a functional MRI scan, our brains would look so radically different in the two contexts.
And then the second possibility is that you were saying that we have a kind of cultural expectation that if it's on audio, or if it's on a video and it looks like me, that it's me doing it. And there I want to raise the topic for us to talk about of how that's going to change. It seems almost inevitable that that expectation, which we still have as of two thousand and nineteen, may be gone even by two thousand and twenty. So those are two questions in one. But maybe start with the first one: is there something unique about audio and video? And then we'll go on to whether our expectations are going to change.

Danielle Citron: So what is unique, and this is really important, I want to underscore, is the notion of context and content. When a deep fake is most effective, meaning when it convinces you that somebody's done something and said something they've never done, it's all about context and content. The Jason Bourne movie you mentioned: in that context and content, I know it's a movie. I go in understanding that it's all this alternative universe that I'm watching on screen. But in a context, and with content, in which the video is being passed off as, and looks, damn real, then people are going to take it as real. So let me just give an example. Rana Ayyub is a journalist in India, an investigative reporter, and her work has attracted lots of vitriol because she challenges people in power. And there was a deep fake sex video made of her. She looked at it and she knew intellectually it wasn't her, but immediately after seeing it she went and vomited, and she couldn't sleep or eat for, like, weeks, because seeing herself demeaned, engaging in a sex act she never engaged in, it hit her, she explains, in the gut. She could not shake the image of it in her head. And when thousands and thousands of people saw it, because it was shared, like half of the phones in India apparently had it, people believed it was her, and they confronted her offline. There was the suggestion, you know, people were urging others to confront her and have sex with her, to rape her. So in that context, the video was taken as true, and it moved people. It moved people to write to her; her inbox was so full of sex solicitations from men she didn't know, she was overwhelmed. I realize, you know, we're talking about how do we distinguish between circumstances. And you ask, rightfully so, you know, isn't it true now that we have our radar up and we know how to distinguish what's false and what's real? Well, in the case of a movie, we do; our radar is up and we have all the indicia that it's fake. But you know, with deep fakes that are really convincing, we won't have our radar up. And especially in certain contexts and with certain content, the whole point of it is to trick us into believing it's true.

Noah Feldman: It's an incredibly horrifying story about Rana Ayyub. I want to ask, though, about how we gradually change our expectations. So, you know, it's a really interesting subject in the history of photography, how people certainly believed that photographs were always real, not only in the late nineteenth century but well into the twentieth century. And two of the great examples that historians of photography like to use are the photographs of spirits, of ectoplasmic angels and fairies and sprites, which became super popular at the end of the nineteenth century. Arthur Conan Doyle, who wrote the Sherlock Holmes stories, was interested in them, and J. M. Barrie, who wrote Peter Pan, was interested in them. And lots of people were fascinated by these photographic images, which appeared to give actual form to these supernatural spirits.
And of course now we look at it and we say, well, that's obviously a fake photograph. The other example that they like to give is Stalinist Russia, and there's a famous example of a photograph of a whole group of senior Communist Party figures, and then one by one by one, as each was purged (they were all men), he was airbrushed, or, like I say, maybe they didn't have airbrushing, but he was removed by earlier technological means from the photograph, until in the end you basically just had Stalin on his own. And at every moment, you know, the Communist Party kept putting these photographs out there, and the public sort of, in some way, believed them, even if some people might have remembered that this photograph looked a little different than it had looked the last time. So those are both examples from the history of photography about how the technology could elicit the expectation of reality for some period of time, but then, as people got familiar with the technology, our expectations changed. And so I'm wondering if the tragic story that you describe for Rana Ayyub isn't something that will last only for as long as it takes for people around the world to realize what deep fakes are. And let's say it takes six months, or a year, or a couple of years for people to figure that out. But maybe we're on the cusp of a change where no one's really going to believe what they see or hear on audio or video anymore.

Danielle Citron: And I think that leads us to two really troubling places, like we're at a fork in the road. And the first is, if we just decide that we can't believe anything, then we're going to just believe what we want to believe, just forget the truth. You know, our confirmation biases: we believe information that accords with our worldviews. And so one possibility, in a world in which we are so skeptical of audio and video evidence, is that we simply throw our hands up and say, uh, I'm just gonna believe what I want to believe.
Truth be damned, right? That's one possibility, and that's one nightmare for the pursuit of truth.

Noah Feldman: And sometimes it seems like we're already in that world. If you watch Fox and CNN at the same time, or Fox and MSNBC, and you hear their alternative versions of reality, it sometimes seems like everyone's already gone down that rabbit hole.

Danielle Citron: Right. So that's one rabbit hole. That's one of my nightmares, you know. It's like Alice through the looking glass. That, to me, is one really bad path that, you're right, we are already on. And then the second challenge for truth, and you alluded to this earlier, is that we might also be in a space, and this could be liminal, it doesn't mean it's forever, where, and we've seen it already, politicians look at real evidence of wrongdoing and say, ah, you can't believe your eyes and ears, it's a fake. Think about what President Trump tried to do with the Access Hollywood tape. He sort of wises up to the idea of a deep fake after he admits to having, of course, been in the conversation with Billy Bush. More recently, he said, that wasn't me, I'm not sure that was me. He tried it out, you know. And Bobby Chesney, a professor at UT Austin and my co-author, and I call that the liar's dividend: the possibility that liars will leverage the phenomenon of deep fakes to run away from, and to escape accountability for, their wrongdoing.

Noah Feldman: Yeah, that liar's dividend seems almost inevitable unless there's some magic, you know, super technology that will enable us to distinguish and figure out, through careful forensic analysis, whether a particular audio or video clip is or is not a deep fake. So let me take a concrete, relatively recent example, and this is not a deep fake but the so-called cheap fake: the Nancy Pelosi video, which slowed down her speech and distorted her speech to make her sound like she was either disoriented or maybe drunk.
[Clip of the altered Nancy Pelosi video]: We want to give this president the opportunity to do something historic for our country.

Noah Feldman: In that case, what was the technology that enabled observers to prove that this was not actually Nancy Pelosi's voice? I mean, I guess it was partly finding the original video and audio where she sounded normal.

[Clip of the original Nancy Pelosi video]: We want to give this president the opportunity to do something historic for our country.

Danielle Citron: Yes. The key to uncovering that it was, like, a cheap or shallow fake was that there was existing audio of what really happened. And so once you could check the real audio and visuals next to the fabrication (it was a manipulation, really, not a fabrication), then you could say they played with it, they slowed down the speech. But what's challenging, and one of the worries that Bobby and I have about privacy, is that we don't have self-surveillance all the time. You know, Nancy Pelosi, when she's giving public remarks, somebody's taping it. But the everyday person doesn't have perfect surveillance.

Noah Feldman: One hopes, right. And, you know, carrying your phone around, you may actually de facto have perfect surveillance of yourself, whether you know it or not, or if you're speaking while Alexa is on. I mean, we usually think that's a problem, that you're constantly being surveilled wherever you go, but in your account it might actually be a good thing. I mean, it would be very easy to build a feature into your phone, I mean it's already in there, that listens to everything you say, and then you could always pull it out and say, well, as a matter of fact, I didn't say that.

Danielle Citron: And we might. I mean, what Bobby and I imagine in our work together is the possibility that we take that further. You know, our phones right now are basically GPS trackers, and they track everything we do and say with those phones, through text and through our sending photos. But they're not on in the sense of being microphone recording devices, unless somebody has installed a cyberstalking app on our phone. Right, so we're not...

Noah Feldman: Although that's not true if you have Hey Siri on there.

Danielle Citron: Oh please, didn't you turn that off on your phone?

Noah Feldman: Sorry. Well, I was going to until our conversation, because, I guess, you know, I thought to myself, I wouldn't want everything that I say to be recorded. But now I think to myself, if someone's going to make a fake video of me saying or doing something, maybe it wouldn't be so terrible for me to have a twenty-four-seven record of everything I've said and done.

Danielle Citron: Right. And what Bobby and I are worried about are market mechanisms. That, let's say, for people who have a very public persona, having a life-logging record of everything you do and say and every interaction, sort of like the novel The Circle, you know, by Dave Eggers, that possibility may not be so much a choice for people whose actions have high potential for harm for very powerful entities, like a company. So it may be, someday, you know, you have a company CEO for whom part of the deal of getting a very high salary is that you have to be under surveillance twenty-four seven, so that you can debunk the deep fake that, if timed just right, could throw off the IPO. And my worry is that, in the broader term, we may have an unraveling of privacy. It's a concept that Scott Peppet wisely wrote about, like, eight years ago: the notion that we'll see market pressures that would require us to give up our privacy for cheaper insurance, right, for our cars or health. We may see that same move. It's small now, you know, wearing a Fitbit for your insurance company, but it may then become something more pervasive in self-surveillance. And it is true, it would be able to debunk the deep fake that's timed just right in an attempt to hurt the CEO and to hurt the company. But it's got longer-term consequences that I want us all to think about before we rush to embrace what I think is coming: market solutions.
Because, you know, folks get in touch with both Bobby and me. Companies are now seeking our advice on what we should tell CEOs about what they should do to protect themselves.

Noah Feldman: And what are you telling them?

Danielle Citron: What we often say is, there's a range of possibilities, and one of them, of course, is this kind of life-logging service, and I want them to think about the broader implications of doing that, because I think it's a bad idea.

Noah Feldman: Do the technologists think that, in principle, it will not be possible to detect a true fabrication? Because, after all, what is digital audio and video? A bunch of zeros and ones. And if it's possible to make these things, maybe it's possible to mask the fact that they've been made. Or do the technologists believe that, as with much technology, there will always be someone one step ahead? So someone makes a really good deep fake; someone else develops a technology to detect that deep fake. Because that would seem to be a possible solution that would get you around the life-logging alternative, where you always have to be prepared to say, that never happened. That is to say, if there were some magic bullet technology, it would have to change over time, like in an arms race, so that it could actually detect that. Does that seem, I mean, what do the technologists say when you ask them about that? Do they think it's pointless? Do they think it's possible?

Danielle Citron: Right now, I wish I could say I had a firm answer. And normally I am pretty bullish about the potential role for technology to at least intervene in some of these problems. But my sense of the broader view of technologists is that they're skeptical, you know, about the possibility of a perfect tool of detection. And we certainly need, as human beings, better radar for fakery. We just believe what we want to believe, and we click and we share and we don't think. And we need to become better digital citizens in that way, and think before we share, think before we like, not be such immediate consumers and sharers of information. And we need to do a whole lot of education with the media, too, because the media is a source of amplification. You know, I know that the Wall Street Journal has begun educating journalists about the phenomenon of deep fakes and what they're going to do about it, and I imagine that, you know, the most reputable outlets are going to be doing the same. Because there isn't an easy technical way to figure it out, at least now there is, but soon enough there won't be. And so how we're going to figure out the truth is, God bless journalists, because most people won't have, you know, constant surveillance on themselves. Even if they have their GPS tracker or their phone, right, it won't have recorded the interaction between people unless they've done it consensually. And so it's journalists; it takes hard work, right, to interview people. But it's not like there's nothing to be done. So many people would just say, Danielle, you're a nihilist, you're saying deep fakes are just a crisis we can't tackle, and that's not true. It's a lot of the same old problems, but exponentially, you know, spread.

Noah Feldman: Well, thank you very much, Danielle. I'm super grateful to you, and I certify that this is a real conversation that we actually had and was not created by deep fakes, cheap fakes, or anything in between. Right?

Danielle Citron: Yep, I certify this is real audio.

Noah Feldman: Thank you, Danielle. That was really fun.

Danielle Citron: I really enjoyed it. Thank you, Noah, for having me on.

Noah Feldman: From talking to Danielle Citron, I'm convinced we're really only at the beginning of finding our collective solution to the challenges posed by deep fakes. Think of the example of photography, where we once believed that whatever we saw was true, and we gradually came to see things in a more complex way.
Should our response be to pass laws that regulate what kinds of images can be created, fake or real? Or should the answer be to rely on self-regulation or technological solutions? These are challenges that would have looked very, very different if we were in the midst of the invention of the technology of the photograph than they look today, at the distance of one hundred and fifty or one hundred and seventy-five years. So we're going to have to drill down on just how much deep fakes do fool people, and that's something that we're going to have to watch very closely going forward. The first couple of very popular deep fakes are going to lead to great confusion. But over time, we may observe that the public becomes acclimated to the potential uses of deep fakes and becomes more skeptical. As our skepticism rises, the need for regulation may well go down. Danielle remains extremely concerned about the sexual privacy of individuals that can be violated by deep fakes. That's an independent and serious concern, and it will require careful regulatory attention to balance human beings' interests in privacy and dignity against the countervailing concerns of freedom of expression.

Now, our sound of the week.

That was the Attorney General of Israel announcing the indictment of Prime Minister Benjamin "Bibi" Netanyahu on a range of corruption charges. Benjamin Netanyahu is not just another Israeli politician. He has served longer in the office of Prime Minister than any other person in Israel's history, including the legendary first Prime Minister of Israel, David Ben-Gurion. He's become a kind of permanent fixture in the minds of many people on the Israeli political scene. He's so hard to displace that, even though not only one but two recent elections led to his failing to get a meaningful majority, he's still the Prime Minister, because although he couldn't form a government, neither could his main opponents.
The idea that a sitting prime minister in a democracy, admittedly a struggling democracy in Israel's case, but a democracy nevertheless, will actually be put on trial for crimes in real time is a kind of extraordinary reality for Israeli politics, which are always full of surprises. Is this a good or a bad thing for the state of Israeli democracy? This could really go in one of two different ways. There is a pessimistic view of it, and it's hard not to sympathize with that pessimistic view. It says that Israel has some tendencies that are pushing it in the direction of other democracies, not excluding the United States under Donald Trump, but also including places like Poland and Hungary, where increasingly dominant single leaders serve in office for a long time and call on impulses of nationalism and populism to retain power. In this context, the suggestion that Netanyahu may actually also be guilty of crimes of corruption would tend to suggest that Israeli democracy is not going in a good direction. After all, what could be more embarrassing for a democracy than evidence that a sitting prime minister is guilty of charges like these? Yet there is an alternative picture, and it's this: instead of being tied in knots as a consequence of charges that a president has acted corruptly (sound familiar, anyone?), Israel actually seems to be taking on the question directly. An independent attorney general actually chose to charge the prime minister with a crime. This is after Netanyahu had announced his hopes that the Knesset, Israel's legislature, would actually pass a law rendering him immune from criminal prosecution in the middle of his term, a law that so far has not been enacted. In some ways, that could be seen as a big win for a democracy. After all, a democracy must be pretty strong on the dimension of fighting public corruption
if the attorney general can actually bring charges against a sitting prime minister, and if that prime minister is one of the most powerful in the history of the country. That's what you would ideally want to see in a democracy: the principle that nobody is above the law. That can't happen in the United States, where a sitting president, as a practical matter, can't be charged with a crime, because the Attorney General in the United States works for the president and is pretty unlikely to charge his boss. The outcome, whether this entire episode will turn out to be good or bad for Israeli democracy, is one that we will learn in the coming months. Of course, it's possible that Netanyahu could be charged, tried, and acquitted. It's also possible that, through some complex combination of results, he might actually avoid going to trial. Yet it's also simultaneously possible that he will be subject to judicial process like any other citizen, and the consequences of that development for Israeli democracy could be very far-reaching. This is a story that we'll keep on watching going forward. It's one that can only continue to be fascinating.

Deep Background is brought to you by Pushkin Industries. Our producer is Lydia Jean Kott, with engineering by Jason Gambrel and Jason Roskowski. Our showrunner is Sophie McKibben. Our theme music is composed by Luis Guerra. Special thanks to the Pushkin brass: Malcolm Gladwell, Jacob Weisberg, and Mia Lobel. I'm Noah Feldman. You can follow me on Twitter at Noah R. Feldman. This is Deep Background.