1 00:00:00,040 --> 00:00:02,640 Speaker 1: Thanks for tuning into Tech Stuff. If you don't recognize 2 00:00:02,680 --> 00:00:05,440 Speaker 1: my voice, my name is Oz Woloshyn, and I'm here 3 00:00:05,480 --> 00:00:08,600 Speaker 1: because the inimitable Jonathan Strickland has passed the baton to 4 00:00:08,720 --> 00:00:12,160 Speaker 1: Karah Preiss and myself to host Tech Stuff. The show 5 00:00:12,160 --> 00:00:15,040 Speaker 1: will remain your home for all things tech, and all 6 00:00:15,080 --> 00:00:20,400 Speaker 1: the old episodes will remain available in this feed. Welcome 7 00:00:20,400 --> 00:00:24,200 Speaker 1: to Tech Stuff. This is The Story. Every Wednesday, we 8 00:00:24,280 --> 00:00:26,880 Speaker 1: bring you an in-depth interview with someone at the 9 00:00:26,920 --> 00:00:30,600 Speaker 1: forefront of technology, or someone who can unlock a world 10 00:00:30,640 --> 00:00:34,000 Speaker 1: where tech is at its most fascinating. This week it's 11 00:00:34,040 --> 00:00:40,640 Speaker 1: Hany Farid. He's a professor of electrical engineering and 12 00:00:40,680 --> 00:00:45,040 Speaker 1: computer science at the University of California, Berkeley, with a 13 00:00:45,120 --> 00:00:51,640 Speaker 1: CSI-sounding specialization: digital forensics. His focus is on image 14 00:00:51,640 --> 00:00:55,480 Speaker 1: analysis and human perception, so he's the guy you call 15 00:00:55,920 --> 00:00:58,000 Speaker 1: when you need to know whether or not you're confronting 16 00:00:58,040 --> 00:01:01,720 Speaker 1: a deep fake, and many do. He's constantly talking to 17 00:01:01,840 --> 00:01:05,760 Speaker 1: journalists to help them determine what's real and what's fake online. 18 00:01:06,440 --> 00:01:08,959 Speaker 1: In his lab at UC Berkeley, he and his students 19 00:01:09,000 --> 00:01:13,319 Speaker 1: study the various ways misinformation is created and spread and 20 00:01:13,360 --> 00:01:15,480 Speaker 1: how it erodes trust in our institutions. 21 00:01:16,200 --> 00:01:16,880 Speaker 2: And one more thing. 22 00:01:17,440 --> 00:01:20,200 Speaker 1: Farid is the founder and chief science officer of 23 00:01:20,240 --> 00:01:24,320 Speaker 1: GetReal Labs, where he consults with businesses, news organizations, 24 00:01:24,360 --> 00:01:28,760 Speaker 1: and law enforcement to authenticate digital content. You might be 25 00:01:28,760 --> 00:01:31,600 Speaker 1: wondering how Farid got into this field. If so, 26 00:01:32,080 --> 00:01:32,840 Speaker 1: you're not alone. 27 00:01:33,120 --> 00:01:34,800 Speaker 2: Somebody said to me the other day, Oh, you were 28 00:01:34,800 --> 00:01:37,120 Speaker 2: so prescient. I'm like, no, we weren't. We were just 29 00:01:37,160 --> 00:01:37,960 Speaker 2: screwing around. 30 00:01:38,520 --> 00:01:42,920 Speaker 1: Farid first started pondering the implications of digital images back 31 00:01:42,920 --> 00:01:43,960 Speaker 1: in nineteen ninety seven. 32 00:01:44,280 --> 00:01:47,680 Speaker 2: This is really pre-digital, almost. Film was still the 33 00:01:47,720 --> 00:01:50,080 Speaker 2: dominant source of media that we took photographs on. The 34 00:01:50,120 --> 00:01:53,280 Speaker 2: Internet was nothing, right. There was no social media, and 35 00:01:53,360 --> 00:01:56,440 Speaker 2: everything was very nascent. You could see the trends, you 36 00:01:56,520 --> 00:01:59,240 Speaker 2: knew things.
Something was bubbling up with the Internet and 37 00:01:59,240 --> 00:02:02,400 Speaker 2: with digital technology. Farid was a postdoc at the time. 38 00:02:02,720 --> 00:02:05,840 Speaker 2: I was at the library getting a book, which now 39 00:02:05,960 --> 00:02:09,720 Speaker 2: just seems quaint, and I was waiting in line, and 40 00:02:09,760 --> 00:02:11,679 Speaker 2: there was a return cart, and on the return cart 41 00:02:11,800 --> 00:02:14,320 Speaker 2: was a big book called the Federal Rules of Evidence. 42 00:02:14,919 --> 00:02:17,200 Speaker 2: I'm not a legal scholar, I'm not a lawyer, but 43 00:02:17,240 --> 00:02:19,040 Speaker 2: I was bored and I flipped it open to a 44 00:02:19,120 --> 00:02:23,160 Speaker 2: random page and it was titled Introducing Photographs into Evidence 45 00:02:23,160 --> 00:02:25,760 Speaker 2: in a Court of Law. And I liked taking photographs. 46 00:02:25,760 --> 00:02:27,960 Speaker 2: I was working with digital images, but nothing to do 47 00:02:28,000 --> 00:02:29,679 Speaker 2: with this topic, and I thought, I wonder what the 48 00:02:29,760 --> 00:02:32,040 Speaker 2: rules are, and so I read it and there was 49 00:02:32,040 --> 00:02:35,640 Speaker 2: almost a footnote that said, there's this digital format and 50 00:02:35,720 --> 00:02:39,040 Speaker 2: we're going to treat digital the same way we treat analog. 51 00:02:40,080 --> 00:02:42,400 Speaker 2: And I just remember thinking, I don't know anything, but 52 00:02:42,520 --> 00:02:43,919 Speaker 2: that seems like a bad idea. 53 00:02:44,600 --> 00:02:47,560 Speaker 1: This passage really stuck with him, and for years he 54 00:02:47,600 --> 00:02:51,040 Speaker 1: couldn't stop thinking about the implications of a digital world, 55 00:02:51,760 --> 00:02:54,880 Speaker 1: the fact that digital manipulation would change our perception of 56 00:02:54,880 --> 00:03:00,320 Speaker 1: what's real because the photographic medium had fundamentally shifted. What 57 00:03:00,440 --> 00:03:03,600 Speaker 1: surprised him was that few others were taking note. 58 00:03:04,000 --> 00:03:07,040 Speaker 2: It's really unusual in an academic life where you start 59 00:03:07,080 --> 00:03:09,080 Speaker 2: thinking about a problem and you go into the academic 60 00:03:09,160 --> 00:03:13,000 Speaker 2: literature and there is nothing. It was just crickets, because 61 00:03:13,000 --> 00:03:15,440 Speaker 2: there was no reason to be thinking about the problem. 62 00:03:15,960 --> 00:03:19,160 Speaker 1: Two years later, as a professor of computer science at Dartmouth, 63 00:03:19,600 --> 00:03:22,799 Speaker 1: he was playing around in Photoshop, creating a comic image 64 00:03:22,800 --> 00:03:25,440 Speaker 1: of his friend, when he had an epiphany. 65 00:03:26,040 --> 00:03:31,079 Speaker 2: Mathematically, I just did something very interesting. I introduced pixels 66 00:03:31,320 --> 00:03:35,240 Speaker 2: that had been synthesized by Photoshop to make the image bigger, right, 67 00:03:35,280 --> 00:03:37,920 Speaker 2: because they didn't exist, and I remember thinking, oh, I 68 00:03:37,920 --> 00:03:39,160 Speaker 2: should be able to detect that. 69 00:03:40,240 --> 00:03:43,880 Speaker 1: In that moment, he started writing code and actually developed 70 00:03:43,920 --> 00:03:48,120 Speaker 1: programs to detect digital manipulation.
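(An aside for the technically curious: the intuition behind that epiphany is that pixels synthesized by interpolation, as in a Photoshop upscale, are unusually well predicted by a weighted combination of their neighbors, and that correlation repeats periodically across the image. Below is a minimal, illustrative sketch of that intuition in Python with NumPy. The function names and the simple least-squares neighbor predictor are assumptions made for illustration; this is not Farid's actual code or the detector described in his published work.)

```python
# Illustrative sketch only: interpolated (resampled) pixels tend to be well
# predicted by their neighbors, and that correlation recurs periodically.
import numpy as np

def neighbor_prediction_residuals(gray, radius=1):
    """Residuals of a least-squares neighbor predictor over a 2-D grayscale image.

    Small, periodically structured residuals hint at interpolated pixels.
    """
    gray = np.asarray(gray, dtype=float)
    h, w = gray.shape
    offsets = [(dy, dx) for dy in range(-radius, radius + 1)
                         for dx in range(-radius, radius + 1)
                         if not (dy == 0 and dx == 0)]
    # Each interior pixel and the stack of its neighbors.
    center = gray[radius:h - radius, radius:w - radius].ravel()
    neighbors = np.stack(
        [gray[radius + dy:h - radius + dy, radius + dx:w - radius + dx].ravel()
         for dy, dx in offsets], axis=1)
    # How well do neighbors linearly predict each pixel?
    coeffs, *_ = np.linalg.lstsq(neighbors, center, rcond=None)
    residual = np.abs(center - neighbors @ coeffs)
    return residual.reshape(h - 2 * radius, w - 2 * radius)

def periodicity_score(residual_map):
    """Strong peaks in the spectrum of the residual map suggest periodic interpolation."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(residual_map - residual_map.mean())))
    spectrum[spectrum.shape[0] // 2, spectrum.shape[1] // 2] = 0.0  # drop the DC term
    return float(spectrum.max() / (spectrum.mean() + 1e-9))
```

A large periodicity score on the residual map is the kind of crude hint that an image was resized or otherwise resampled; a real detector is considerably more careful than this sketch.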
The world woke up to 71 00:03:48,120 --> 00:03:50,240 Speaker 1: the importance of this work, and he started getting asked 72 00:03:50,280 --> 00:03:53,120 Speaker 1: to chime in on serious cases for the Associated Press, 73 00:03:53,440 --> 00:03:55,800 Speaker 1: for law enforcement, for national security. 74 00:03:56,240 --> 00:04:00,400 Speaker 2: And then twenty fifteen, sixteen, seventeen, AI hit and the 75 00:04:00,400 --> 00:04:03,920 Speaker 2: world exploded. But it exploded for a few reasons because one, 76 00:04:04,320 --> 00:04:06,680 Speaker 2: at least with Photoshop, there was a barrier to entry. 77 00:04:06,920 --> 00:04:09,160 Speaker 2: You had to actually know how to use Photoshop. But 78 00:04:09,200 --> 00:04:11,400 Speaker 2: then when AI came around, you just go to ChatGPT 79 00:04:11,480 --> 00:04:13,800 Speaker 2: and type, give me an image of X, right, and 80 00:04:13,840 --> 00:04:15,520 Speaker 2: give me an image of Y, give me a video 81 00:04:15,520 --> 00:04:16,960 Speaker 2: of this, give me an audio of this. And so 82 00:04:17,000 --> 00:04:21,080 Speaker 2: suddenly there's no barrier to entry. But more importantly, social 83 00:04:21,120 --> 00:04:24,600 Speaker 2: media dominates the landscape. We went from a few million 84 00:04:24,720 --> 00:04:28,000 Speaker 2: users to a few billion users, and so now not 85 00:04:28,080 --> 00:04:31,719 Speaker 2: only could people easily, with no barrier to entry, create 86 00:04:32,120 --> 00:04:36,400 Speaker 2: fake content, they could distribute it to the masses, and 87 00:04:37,160 --> 00:04:42,080 Speaker 2: it gets amplified because the algorithms amplify the most outrageous things. 88 00:04:42,440 --> 00:04:46,280 Speaker 2: People want things that conform to their worldview. We are hyperpartisan, 89 00:04:46,400 --> 00:04:51,480 Speaker 2: both here and abroad, and that was the perfect storm: create, distribute, amplify, 90 00:04:51,839 --> 00:04:56,040 Speaker 2: rinse and repeat. And so now, through the AI revolution, 91 00:04:56,680 --> 00:04:58,560 Speaker 2: it's bizarre what's happening. 92 00:04:59,120 --> 00:05:01,520 Speaker 1: We'll dive into the work Farid does on deep fakes 93 00:05:01,520 --> 00:05:03,960 Speaker 1: in a bit, but first I had to ask you 94 00:05:04,080 --> 00:05:10,960 Speaker 1: about something seemingly completely unrelated: death bots. So you were 95 00:05:11,040 --> 00:05:14,359 Speaker 1: quoted in this Atlantic article about death bots with the 96 00:05:14,400 --> 00:05:18,440 Speaker 1: headline "No One Is Ready for Digital Immortality." So maybe 97 00:05:18,560 --> 00:05:20,560 Speaker 1: it'll be good to define our terms. Like, what do 98 00:05:20,600 --> 00:05:23,080 Speaker 1: we mean by this idea of digital immortality? 99 00:05:23,720 --> 00:05:26,080 Speaker 2: Yeah, I don't know that it's a well established term. 100 00:05:26,120 --> 00:05:30,320 Speaker 2: But here's my definition: it's that your likeness, the way 101 00:05:30,360 --> 00:05:32,719 Speaker 2: you think, the way you talk, the way you look, 102 00:05:33,200 --> 00:05:37,000 Speaker 2: lives on for eternity in a digital form, through a 103 00:05:37,120 --> 00:05:41,600 Speaker 2: version of AI that embodies how I write, how I think, 104 00:05:41,720 --> 00:05:44,800 Speaker 2: how I talk, in order to interact with other people. 105 00:05:45,000 --> 00:05:47,359 Speaker 2: It's interactive, that's the key, and it's dynamic.
106 00:05:47,680 --> 00:05:50,120 Speaker 1: What got you interested in this topic and why did 107 00:05:50,160 --> 00:05:51,520 Speaker 1: you agree to be a source in the story? 108 00:05:51,960 --> 00:05:55,359 Speaker 2: So this is almost a philosophical and legal question, and 109 00:05:55,440 --> 00:05:58,320 Speaker 2: I'm neither of those things. But I got to say 110 00:05:58,440 --> 00:06:03,120 Speaker 2: I've been thinking a lot about it, technically, personally, philosophically. 111 00:06:03,880 --> 00:06:06,600 Speaker 2: Here's why I've been thinking about it. So one is, 112 00:06:06,640 --> 00:06:08,840 Speaker 2: I'm a professor. I've been a professor for twenty five years. 113 00:06:08,880 --> 00:06:11,480 Speaker 2: I love teaching. I love my students. I hate them 114 00:06:11,480 --> 00:06:15,000 Speaker 2: some days, but I usually love them. They're amazing and 115 00:06:15,040 --> 00:06:18,200 Speaker 2: weird and wonderful in many ways. So is there a 116 00:06:18,279 --> 00:06:21,880 Speaker 2: story here where I can keep teaching after I die? 117 00:06:22,560 --> 00:06:25,640 Speaker 2: Like, there's something sort of magical about that. I think 118 00:06:25,680 --> 00:06:27,480 Speaker 2: about it for my parents. Both my parents are now 119 00:06:27,480 --> 00:06:29,560 Speaker 2: in their late eighties. One of them will die first, 120 00:06:29,600 --> 00:06:31,599 Speaker 2: almost certainly. And what does it mean for the one 121 00:06:31,760 --> 00:06:35,360 Speaker 2: who's left? They've been together for fifty years. So there's parts 122 00:06:35,400 --> 00:06:38,039 Speaker 2: of it I think are wonderful, this idea that 123 00:06:38,200 --> 00:06:40,240 Speaker 2: one of my parents can wake up and open up 124 00:06:40,240 --> 00:06:42,479 Speaker 2: their iPad and have a conversation with the person that 125 00:06:42,480 --> 00:06:45,039 Speaker 2: they spent fifty years of their lives with. On the 126 00:06:45,080 --> 00:06:47,680 Speaker 2: other hand, if that happens early in life, is that 127 00:06:47,720 --> 00:06:50,479 Speaker 2: healthy for somebody? If a thirty year old loses their spouse, 128 00:06:50,560 --> 00:06:53,359 Speaker 2: is that good, that they never sort of physically move on? 129 00:06:53,880 --> 00:06:55,919 Speaker 2: I also think about it from a technical perspective: what 130 00:06:55,960 --> 00:06:58,520 Speaker 2: would that look like for somebody who's famous, where there's 131 00:06:58,520 --> 00:07:01,000 Speaker 2: a big digital footprint? I think we have all the 132 00:07:01,040 --> 00:07:04,039 Speaker 2: pieces to do that. We have the large language models, 133 00:07:04,480 --> 00:07:07,919 Speaker 2: we have voice, we have likeness, we have video, and 134 00:07:07,920 --> 00:07:11,040 Speaker 2: you're already seeing people do this, creating digital avatars of 135 00:07:11,160 --> 00:07:13,640 Speaker 2: both people who are with us and not with us, 136 00:07:13,960 --> 00:07:16,120 Speaker 2: so that you can interact with them. I can go 137 00:07:16,240 --> 00:07:18,440 Speaker 2: scrape every single piece of writing that Martin Luther King 138 00:07:18,520 --> 00:07:20,600 Speaker 2: Junior wrote. I can grab his speeches, I can grab 139 00:07:20,640 --> 00:07:22,200 Speaker 2: his likeness, I can grab his voice, and I could 140 00:07:22,240 --> 00:07:24,520 Speaker 2: create an avatar of him that I could interact with.
141 00:07:24,760 --> 00:07:27,200 Speaker 1: Well, it reminds me of your work on deep fakes 142 00:07:27,240 --> 00:07:29,840 Speaker 1: in some sense, because, as you said, exactly, all the 143 00:07:29,840 --> 00:07:33,800 Speaker 1: pieces are there, technically and otherwise. Yeah, but society's clearly 144 00:07:33,800 --> 00:07:34,239 Speaker 1: not ready. 145 00:07:34,880 --> 00:07:37,040 Speaker 2: I don't think we're ready. But look, a lot of things, 146 00:07:37,320 --> 00:07:39,720 Speaker 2: if you look at the last two, three, four or 147 00:07:39,760 --> 00:07:42,880 Speaker 2: five decades of technology, we weren't ready for, and we 148 00:07:42,920 --> 00:07:43,720 Speaker 2: became ready for it. 149 00:07:43,800 --> 00:07:43,960 Speaker 1: Right. 150 00:07:44,320 --> 00:07:46,400 Speaker 2: Look, you can go back to in vitro fertilization. When 151 00:07:46,400 --> 00:07:49,840 Speaker 2: it first started, people were freaked out by that. Completely 152 00:07:50,000 --> 00:07:52,160 Speaker 2: normal now, right. And by the way, this could also 153 00:07:52,200 --> 00:07:55,200 Speaker 2: be generational. I can imagine some of my students here 154 00:07:55,200 --> 00:07:59,400 Speaker 2: at UC Berkeley think, sure, who cares, right? And I'm 155 00:07:59,440 --> 00:08:01,000 Speaker 2: an older guy and I'm like, ah, that seems a 156 00:08:01,000 --> 00:08:03,560 Speaker 2: little weird. So this may just go away generationally, which 157 00:08:03,600 --> 00:08:04,600 Speaker 2: is usually how this happens. 158 00:08:04,640 --> 00:08:07,160 Speaker 1: By the way, do you think we'll see a fundamental 159 00:08:07,400 --> 00:08:09,440 Speaker 1: shift in our society in this case in terms of 160 00:08:10,400 --> 00:08:11,640 Speaker 1: how we think about death? 161 00:08:12,560 --> 00:08:16,920 Speaker 2: I think this idea of digital immortality is really profound. 162 00:08:17,600 --> 00:08:21,200 Speaker 2: And look, I don't know where this AI revolution is 163 00:08:21,200 --> 00:08:24,120 Speaker 2: going right now. I don't think anybody really does. But 164 00:08:24,200 --> 00:08:27,800 Speaker 2: something is happening. There is something here that is quite dramatic. 165 00:08:28,240 --> 00:08:30,320 Speaker 2: I think it's going to reshape society. I think it's 166 00:08:30,360 --> 00:08:32,520 Speaker 2: going to reshape education. I think it's going to reshape 167 00:08:32,520 --> 00:08:34,800 Speaker 2: the workforce. I think it's going to reshape a lot 168 00:08:34,800 --> 00:08:37,920 Speaker 2: of things. And I do think your likeness or your 169 00:08:37,920 --> 00:08:40,080 Speaker 2: being or your essence or whatever you want to call 170 00:08:40,120 --> 00:08:42,800 Speaker 2: that can live on, and you can interact with people. 171 00:08:43,000 --> 00:08:45,880 Speaker 2: You can continue to have a podcast after you die, 172 00:08:46,320 --> 00:08:47,840 Speaker 2: you can keep interviewing people. 173 00:08:49,320 --> 00:08:53,280 Speaker 1: When we come back: how deep fakes impact everyone, even 174 00:08:53,320 --> 00:09:02,000 Speaker 1: if you don't know it.
There's an interesting point of 175 00:09:02,040 --> 00:09:06,280 Speaker 1: intersection between death bots and your more core field 176 00:09:06,280 --> 00:09:10,240 Speaker 1: of study, and that's this Indian politician, a parliamentary candidate, 177 00:09:10,280 --> 00:09:14,760 Speaker 1: who created a video of his deceased father endorsing him 178 00:09:14,760 --> 00:09:15,760 Speaker 1: as his rightful heir. 179 00:09:16,080 --> 00:09:17,440 Speaker 2: Yeah. 180 00:09:17,520 --> 00:09:19,240 Speaker 1: I mean, this is kind of a worlds-collide moment 181 00:09:19,280 --> 00:09:22,599 Speaker 1: between misinformation, deep fakes, and digital immortality. 182 00:09:22,840 --> 00:09:25,640 Speaker 2: Yeah. Yeah. So for people who didn't see it, India 183 00:09:25,679 --> 00:09:28,520 Speaker 2: had an election this year, a big one, and you know, 184 00:09:29,200 --> 00:09:33,480 Speaker 2: a billion-plus people voting. It was chaotic, and a politician 185 00:09:33,480 --> 00:09:36,320 Speaker 2: did exactly this. His father was a well known politician, 186 00:09:36,360 --> 00:09:38,600 Speaker 2: and he created a digital recreation with his voice and 187 00:09:38,600 --> 00:09:40,480 Speaker 2: his likeness, and he was talking and endorsing his son. 188 00:09:41,040 --> 00:09:43,520 Speaker 2: So I have a couple of thoughts on that. Right now, 189 00:09:43,559 --> 00:09:46,559 Speaker 2: in this particular moment, as we're still grappling, I think 190 00:09:46,559 --> 00:09:50,080 Speaker 2: there should be two rules, which are consent and disclosure. 191 00:09:51,240 --> 00:09:52,920 Speaker 2: And it's really simple: like, if you're going to use 192 00:09:52,920 --> 00:09:55,840 Speaker 2: somebody's likeness, you should have consent, and if you're going 193 00:09:55,880 --> 00:09:59,000 Speaker 2: to distribute it, you should have disclosure. Now, consent is 194 00:09:59,000 --> 00:10:01,520 Speaker 2: difficult when somebody is dead. But if I want to 195 00:10:01,520 --> 00:10:04,880 Speaker 2: get an endorsement from somebody who's living, I have to 196 00:10:04,880 --> 00:10:08,280 Speaker 2: get their consent, yep. And if I distribute that, it 197 00:10:08,320 --> 00:10:11,880 Speaker 2: has to be very clearly labeled and disclosed as, this 198 00:10:11,920 --> 00:10:14,640 Speaker 2: is AI generated. I'll give you a really nice example 199 00:10:14,679 --> 00:10:15,880 Speaker 2: of this where it was sort of cool. It was 200 00:10:15,960 --> 00:10:19,800 Speaker 2: during the Olympics. One of the newscasters, well known, and 201 00:10:20,000 --> 00:10:22,559 Speaker 2: I'm just blanking on his name right now, was creating 202 00:10:22,600 --> 00:10:26,080 Speaker 2: AI generated personalized summaries. So my wife was watching the 203 00:10:26,080 --> 00:10:28,880 Speaker 2: Olympics and she would get these personalized summaries from the broadcaster. 204 00:10:29,600 --> 00:10:32,240 Speaker 2: So the content was personalized to her based on what 205 00:10:32,320 --> 00:10:36,079 Speaker 2: she was watching. And then the voice being generated was his, 206 00:10:36,920 --> 00:10:39,760 Speaker 2: and the script was being AI generated. Everything was with 207 00:10:39,840 --> 00:10:41,920 Speaker 2: his permission, and it was disclosed to her that it 208 00:10:41,960 --> 00:10:44,640 Speaker 2: was AI generated and summarized.
And I think that was 209 00:10:44,679 --> 00:10:47,960 Speaker 2: really well done in terms of the things that were 210 00:10:48,000 --> 00:10:50,240 Speaker 2: made clear about what you were getting and how it 211 00:10:50,280 --> 00:10:51,200 Speaker 2: was being delivered to you. 212 00:10:52,120 --> 00:10:54,840 Speaker 1: That's sort of a high watermark for how this stuff works 213 00:10:54,840 --> 00:10:57,040 Speaker 1: when it works well. Yeah. Do you think as a 214 00:10:57,040 --> 00:11:00,920 Speaker 1: society we're more likely to move toward that high water 215 00:11:01,040 --> 00:11:06,079 Speaker 1: mark through collective demand, or through regulation, or through some 216 00:11:06,120 --> 00:11:08,600 Speaker 1: decision from the tech overlords? Like, what gets us there, 217 00:11:08,679 --> 00:11:12,000 Speaker 2: more broadly? Yeah. I mean, there's nothing in the last 218 00:11:12,040 --> 00:11:14,880 Speaker 2: twenty years or twenty five years that gives me confidence 219 00:11:14,960 --> 00:11:16,800 Speaker 2: that our tech overlords are going to do the right thing. 220 00:11:17,520 --> 00:11:20,120 Speaker 2: They're going to do the thing that maximizes their profits. 221 00:11:20,120 --> 00:11:23,040 Speaker 2: And we know this. Let's stop pretending that Silicon 222 00:11:23,120 --> 00:11:24,880 Speaker 2: Valley is anything other than what it is. It's a modern 223 00:11:24,960 --> 00:11:27,480 Speaker 2: day Wall Street in some ways, by the way, even 224 00:11:27,559 --> 00:11:31,480 Speaker 2: more powerful, right, because they control information, not just money, 225 00:11:31,559 --> 00:11:34,800 Speaker 2: and that arguably is much more powerful. I don't think 226 00:11:34,840 --> 00:11:38,319 Speaker 2: this comes from consumers, because we're not customers, we're the product. 227 00:11:39,160 --> 00:11:42,480 Speaker 2: We as users, I should say, have almost no power 228 00:11:42,520 --> 00:11:45,920 Speaker 2: at all. And so the media, we tried, right? We 229 00:11:45,960 --> 00:11:49,040 Speaker 2: tried criticizing and embarrassing, and we tried dragging them in 230 00:11:49,040 --> 00:11:53,840 Speaker 2: front of Congress. Nothing effects change. So what does? Good regulation. 231 00:11:54,320 --> 00:11:56,560 Speaker 2: We've got to put guardrails on this. And look, there is 232 00:11:56,720 --> 00:12:00,120 Speaker 2: nothing in our physical world that 233 00:12:00,160 --> 00:12:04,400 Speaker 2: is not subject to regulation to make products safe and reasonable. 234 00:12:04,679 --> 00:12:07,240 Speaker 2: But somehow we've abandoned that for the last twenty five 235 00:12:07,320 --> 00:12:09,920 Speaker 2: years because it's the Internet. So I do think it's 236 00:12:09,960 --> 00:12:11,400 Speaker 2: going to have to come. I don't think it's going 237 00:12:11,480 --> 00:12:14,040 Speaker 2: to come from the US. It is coming from the UK, 238 00:12:14,200 --> 00:12:16,720 Speaker 2: it is coming from the EU, it is coming from Australia, 239 00:12:17,120 --> 00:12:19,160 Speaker 2: and I think those are going to be the leaders 240 00:12:19,160 --> 00:12:21,160 Speaker 2: in this space. And you saw this with GDPR, with 241 00:12:21,200 --> 00:12:24,040 Speaker 2: the privacy rules. In many ways, I don't think 242 00:12:24,040 --> 00:12:26,160 Speaker 2: it solved the privacy problem around the world, but it 243 00:12:26,240 --> 00:12:29,160 Speaker 2: moved the needle on the problem.
And the EU and 244 00:12:29,200 --> 00:12:32,280 Speaker 2: the UK have moved very aggressively on AI safety, on 245 00:12:32,360 --> 00:12:36,439 Speaker 2: digital safety, and on misuse of monopolies, and I think 246 00:12:36,440 --> 00:12:38,280 Speaker 2: it's going to have to come at that level. 247 00:12:38,840 --> 00:12:40,880 Speaker 1: I want to talk about some of the more personal 248 00:12:40,960 --> 00:12:44,040 Speaker 1: ways in which we can experience deep fakes. I think 249 00:12:44,040 --> 00:12:47,040 Speaker 1: a lot of people think maybe it only touches politicians 250 00:12:47,120 --> 00:12:50,120 Speaker 1: or celebrities. But there was an NPR story about a 251 00:12:50,160 --> 00:12:54,280 Speaker 1: case you worked on that involved a Baltimore teacher. Can 252 00:12:54,320 --> 00:12:55,440 Speaker 1: you talk about what happened there? 253 00:12:55,960 --> 00:12:58,640 Speaker 2: This case, I'm fascinated by it, and I still don't 254 00:12:58,640 --> 00:13:00,199 Speaker 2: think we've gotten to the end of it. Let me tell 255 00:13:00,240 --> 00:13:03,319 Speaker 2: your listeners, first of all, what the case is. A Baltimore 256 00:13:03,320 --> 00:13:06,520 Speaker 2: public school: audio of the principal saying things that were 257 00:13:06,640 --> 00:13:10,920 Speaker 2: racist was leaked, and it was leaked to some news outlet, 258 00:13:11,400 --> 00:13:13,360 Speaker 2: and it was bad. If you listen 259 00:13:13,440 --> 00:13:17,040 Speaker 2: to it, it's pretty bad. And the principal said, this 260 00:13:17,160 --> 00:13:21,960 Speaker 2: isn't me, this is AI generated. And we analyzed the audio. 261 00:13:22,120 --> 00:13:26,679 Speaker 2: Several labs analyzed the audio. There is alteration to the audio. 262 00:13:26,840 --> 00:13:29,800 Speaker 2: That is, we can hear and see that it's been 263 00:13:29,880 --> 00:13:34,440 Speaker 2: spliced together from five or six segments. But when we analyze 264 00:13:34,440 --> 00:13:37,560 Speaker 2: the individual segments, it is not one hundred percent clear 265 00:13:37,600 --> 00:13:39,760 Speaker 2: to us that it is AI generated. It could be 266 00:13:40,600 --> 00:13:42,640 Speaker 2: that he said these things, but they were sort of 267 00:13:42,720 --> 00:13:45,160 Speaker 2: stitched together in a way that put them out of context, 268 00:13:45,440 --> 00:13:48,000 Speaker 2: which would be deceptive. It could be that it's AI 269 00:13:48,040 --> 00:13:50,800 Speaker 2: generated and our tools simply didn't detect it. It could 270 00:13:50,840 --> 00:13:52,640 Speaker 2: be that this is a case of the liar's dividend, 271 00:13:52,640 --> 00:13:54,760 Speaker 2: where the principal really did say this, but he's claiming 272 00:13:54,800 --> 00:13:55,520 Speaker 2: he didn't say it. 273 00:13:55,640 --> 00:13:58,520 Speaker 1: Hany, can you explain exactly what the liar's dividend is? 274 00:13:58,960 --> 00:14:01,480 Speaker 2: The liar's dividend goes something like this. It says, when 275 00:14:01,520 --> 00:14:04,840 Speaker 2: you live in a world where anything can be manipulated, 276 00:14:04,880 --> 00:14:06,839 Speaker 2: any image can be fake, any audio can be fake, 277 00:14:06,880 --> 00:14:09,520 Speaker 2: any video can be fake, nothing has to be real, 278 00:14:09,840 --> 00:14:13,240 Speaker 2: I get to use the fact that fake things exist 279 00:14:13,280 --> 00:14:16,520 Speaker 2: as an excuse for what I've done.
But this case 280 00:14:16,559 --> 00:14:21,960 Speaker 2: is a really good example of how dangerous this technology 281 00:14:22,040 --> 00:14:25,160 Speaker 2: is, for two reasons. One is, with twenty to thirty 282 00:14:25,160 --> 00:14:28,920 Speaker 2: seconds of your voice, I don't need hours, I can 283 00:14:28,960 --> 00:14:31,880 Speaker 2: clone your voice. I can upload it to an AI 284 00:14:32,040 --> 00:14:34,200 Speaker 2: tool, and then I can type and 285 00:14:34,240 --> 00:14:37,680 Speaker 2: have you say anything I want. That means anybody with 286 00:14:37,800 --> 00:14:41,480 Speaker 2: twenty seconds of their voice available has a vulnerability. So 287 00:14:41,520 --> 00:14:45,200 Speaker 2: this is not just for movie stars and podcasters. This is everybody. 288 00:14:45,360 --> 00:14:49,840 Speaker 2: Number one. Number two is, anybody who's caught saying or 289 00:14:49,880 --> 00:14:53,440 Speaker 2: doing something that they don't want to take ownership of 290 00:14:53,560 --> 00:14:56,760 Speaker 2: can say it's fake. Yep, the dog ate my homework, 291 00:14:57,040 --> 00:14:59,840 Speaker 2: all right, this is easy. And so both of those 292 00:14:59,880 --> 00:15:03,720 Speaker 2: are problematic, because where's our shared sense of reality? It 293 00:15:03,800 --> 00:15:07,000 Speaker 2: used to be, when you had images and video, despite 294 00:15:07,000 --> 00:15:09,280 Speaker 2: the fact that there was Photoshop, despite the fact that 295 00:15:09,520 --> 00:15:13,240 Speaker 2: Hollywood could manipulate videos, we had a pretty reasonable 296 00:15:13,240 --> 00:15:16,040 Speaker 2: confidence in what we read and saw and heard. And 297 00:15:16,080 --> 00:15:18,880 Speaker 2: you can't say that anymore. This is why I spend 298 00:15:18,960 --> 00:15:21,640 Speaker 2: so much time talking to journalists and fact checkers and 299 00:15:21,760 --> 00:15:25,560 Speaker 2: lawyers and law enforcement. So on this particular case, it 300 00:15:25,640 --> 00:15:28,640 Speaker 2: really showed how this has trickled all the way down 301 00:15:28,920 --> 00:15:30,359 Speaker 2: to high school teachers. 302 00:15:30,600 --> 00:15:33,800 Speaker 1: Zooming out from the individuals to the collective: one of 303 00:15:33,840 --> 00:15:37,480 Speaker 1: the interesting things that happens is, whenever there's like a 304 00:15:37,520 --> 00:15:40,720 Speaker 1: world event that everyone's paying attention to, you get this 305 00:15:40,840 --> 00:15:44,280 Speaker 1: fire hose of fake images. I remember in the early 306 00:15:44,400 --> 00:15:47,720 Speaker 1: days of the conflict in Gaza, there was this aerial 307 00:15:47,840 --> 00:15:52,160 Speaker 1: image with what was supposed to be Palestinian tents spelling 308 00:15:52,200 --> 00:15:55,520 Speaker 1: out the words "help us." Or, you know, right after 309 00:15:55,560 --> 00:15:58,280 Speaker 1: the LA fires began, there were these images of the 310 00:15:58,320 --> 00:16:01,280 Speaker 1: Hollywood Sign on fire. I don't know how many people 311 00:16:01,320 --> 00:16:04,880 Speaker 1: believed these images were actually true or, in some ways, 312 00:16:04,880 --> 00:16:07,720 Speaker 1: what the harm is if they did. But what's going 313 00:16:07,760 --> 00:16:08,160 Speaker 1: on here? 314 00:16:08,880 --> 00:16:10,960 Speaker 2: So let's start with the LA fires. First of all, 315 00:16:11,200 --> 00:16:15,480 Speaker 2: many images coming out of those fires were fake.
What's 316 00:16:15,520 --> 00:16:18,200 Speaker 2: the harm? Well, this one's easy. If people believe there's 317 00:16:18,240 --> 00:16:21,600 Speaker 2: fire in this neighborhood, that is very bad. Fire departments 318 00:16:21,600 --> 00:16:23,440 Speaker 2: are going to get distracted. First responders are going to 319 00:16:23,440 --> 00:16:26,000 Speaker 2: get distracted. People are scared that their neighborhood is on fire. 320 00:16:26,000 --> 00:16:27,920 Speaker 2: They're going to get distracted. So I do think there 321 00:16:27,960 --> 00:16:30,480 Speaker 2: is real harm. I think in the Gaza images, also, 322 00:16:30,920 --> 00:16:34,360 Speaker 2: this is a complicated conflict, and we are all trying 323 00:16:34,360 --> 00:16:36,680 Speaker 2: to get our heads around this thing and figure it out, 324 00:16:36,920 --> 00:16:40,880 Speaker 2: and meanwhile people are fanning the flames, trying to push 325 00:16:40,880 --> 00:16:43,240 Speaker 2: a particular narrative on either side, and I don't think 326 00:16:43,280 --> 00:16:47,960 Speaker 2: that's healthy. Look, we can have serious discussions about how 327 00:16:48,000 --> 00:16:50,680 Speaker 2: to combat climate change, we can have serious discussions about 328 00:16:50,720 --> 00:16:53,640 Speaker 2: how to resolve the Israeli-Palestinian conflict. We can have 329 00:16:53,680 --> 00:16:56,640 Speaker 2: serious discussions about a lot of things, but we've got 330 00:16:56,680 --> 00:16:59,080 Speaker 2: to start with a set of facts. And when you 331 00:16:59,120 --> 00:17:03,160 Speaker 2: pollute the information ecosystem, we are at a loss. 332 00:17:03,560 --> 00:17:05,920 Speaker 2: You could say, okay, well, somebody believed the fake image 333 00:17:05,920 --> 00:17:08,960 Speaker 2: of the tents. Okay, who cares? But here's why you care. 334 00:17:09,440 --> 00:17:11,840 Speaker 2: Because then, when the real images come out showing human 335 00:17:11,920 --> 00:17:16,360 Speaker 2: rights violations, showing people being killed, people being bombed, how 336 00:17:16,359 --> 00:17:19,840 Speaker 2: do I believe it? When you pollute the information ecosystem, 337 00:17:20,119 --> 00:17:23,479 Speaker 2: everything is in doubt. And suddenly you have people who 338 00:17:23,520 --> 00:17:26,560 Speaker 2: are denying that anybody's died, you have people denying that 339 00:17:26,600 --> 00:17:28,960 Speaker 2: the fires exist, you have people denying that people are 340 00:17:29,040 --> 00:17:32,480 Speaker 2: dying from COVID, because this is how untrusting we have become. 341 00:17:33,040 --> 00:17:35,280 Speaker 2: And that I have a real problem with, because, look, 342 00:17:35,840 --> 00:17:39,359 Speaker 2: no matter what side of the political or ideological aisle 343 00:17:39,440 --> 00:17:42,360 Speaker 2: you are on, can we at least agree that if 344 00:17:42,359 --> 00:17:45,720 Speaker 2: we don't have a shared factual system, a shared sense 345 00:17:45,720 --> 00:17:49,720 Speaker 2: of reality, we do not have a society or a democracy? 346 00:17:49,760 --> 00:17:52,160 Speaker 2: We can't be arguing about whether one plus one is two. 347 00:17:52,480 --> 00:17:54,960 Speaker 2: And I would argue that this problem started well before 348 00:17:55,080 --> 00:17:58,440 Speaker 2: deep fakes.
Social media is the one that is amplifying 349 00:17:58,920 --> 00:18:04,160 Speaker 2: and encouraging this type of behavior, because it engages users, 350 00:18:04,520 --> 00:18:08,400 Speaker 2: drives ads, drives attention, drives profits. The problem is not 351 00:18:08,480 --> 00:18:11,800 Speaker 2: just the creation side, it's the distribution side, and that, 352 00:18:12,040 --> 00:18:14,359 Speaker 2: I would argue, is the bigger problem here than the 353 00:18:14,400 --> 00:18:14,880 Speaker 2: deep fake. 354 00:18:16,920 --> 00:18:19,679 Speaker 1: Coming up: Hany Farid on what it takes to 355 00:18:19,720 --> 00:18:29,639 Speaker 1: identify a deep fake. Stay with us. When we first spoke, 356 00:18:29,720 --> 00:18:33,240 Speaker 1: it was just five years ago, in twenty nineteen. The 357 00:18:33,280 --> 00:18:36,760 Speaker 1: big question at the time was, is there going to 358 00:18:36,760 --> 00:18:42,919 Speaker 1: be a causal piece of fake media that measurably sways 359 00:18:43,040 --> 00:18:46,399 Speaker 1: the outcome of an election? And some people say the 360 00:18:46,400 --> 00:18:47,920 Speaker 1: answer to that is no. I mean, The New Yorker ran 361 00:18:47,920 --> 00:18:51,399 Speaker 1: a piece in twenty twenty three saying, basically, you know, that 362 00:18:51,440 --> 00:18:55,199 Speaker 1: the deep fakes haven't had that cataclysmic effect that some 363 00:18:55,240 --> 00:18:58,920 Speaker 1: people thought they would. The Atlantic ran a story recently 364 00:18:59,000 --> 00:19:01,919 Speaker 1: under the headline AI's fingerprints were all over the election, 365 00:19:02,040 --> 00:19:05,480 Speaker 1: but deep fakes and misinformation weren't the main issue. And 366 00:19:05,720 --> 00:19:08,359 Speaker 1: kind of the point of both pieces was that 367 00:19:08,840 --> 00:19:10,919 Speaker 1: what deep fakes are really being used for is to 368 00:19:10,960 --> 00:19:14,439 Speaker 1: create memes and satire rather than to directly trick people. 369 00:19:14,960 --> 00:19:17,600 Speaker 1: And the second point was, quote, to growing numbers of people, 370 00:19:18,200 --> 00:19:21,040 Speaker 1: everything is fake now except what they know or, rather, feel. 371 00:19:21,600 --> 00:19:21,959 Speaker 2: Yeah. 372 00:19:22,000 --> 00:19:27,600 Speaker 1: So has this been less explosively destructive than people thought 373 00:19:27,640 --> 00:19:29,560 Speaker 1: it would be? Or are The New Yorker and The 374 00:19:29,560 --> 00:19:31,520 Speaker 1: Atlantic slightly missing the point, in your view? 375 00:19:32,359 --> 00:19:35,720 Speaker 2: I agree and disagree with them. I agree that there 376 00:19:35,760 --> 00:19:39,760 Speaker 2: was no single atomic bomb that got dropped, such that 377 00:19:39,800 --> 00:19:41,560 Speaker 2: you can draw a line from A to B saying 378 00:19:41,560 --> 00:19:44,400 Speaker 2: this changed an election. But nobody thought that was going 379 00:19:44,480 --> 00:19:45,879 Speaker 2: to be the case. So I think that's a little 380 00:19:45,880 --> 00:19:47,320 Speaker 2: bit of a straw man argument. 381 00:19:47,640 --> 00:19:47,800 Speaker 1: Right. 382 00:19:47,880 --> 00:19:50,680 Speaker 2: Okay, here's the other reason I disagree.
Go talk to 383 00:19:50,720 --> 00:19:53,480 Speaker 2: the people in Slovakia, because what they will tell you 384 00:19:54,320 --> 00:19:58,320 Speaker 2: is that forty-eight hours before the election, there were two candidates, 385 00:19:58,359 --> 00:20:00,600 Speaker 2: a pro-NATO and a pro-Putin candidate, and the pro 386 00:20:00,720 --> 00:20:04,199 Speaker 2: NATO candidate was up four points. A deep fake of 387 00:20:04,240 --> 00:20:06,320 Speaker 2: the pro-NATO candidate was released saying, we're going to 388 00:20:06,400 --> 00:20:09,560 Speaker 2: rig the election, and two days later the pro-Putin 389 00:20:09,640 --> 00:20:12,800 Speaker 2: candidate won by four points. There was an eight-point 390 00:20:12,920 --> 00:20:16,400 Speaker 2: swing in the polls in forty-eight hours. Now, were 391 00:20:16,440 --> 00:20:19,119 Speaker 2: the polls wrong? Possibly. Did it have anything to do 392 00:20:19,200 --> 00:20:22,480 Speaker 2: with the deep fake? Don't know. But this could have 393 00:20:22,600 --> 00:20:25,160 Speaker 2: been the first example, just a couple of years ago, 394 00:20:25,600 --> 00:20:28,520 Speaker 2: of where a deep fake was a tipping point. So 395 00:20:29,000 --> 00:20:31,560 Speaker 2: I'm not sure I'd buy that story. I think this 396 00:20:31,640 --> 00:20:34,880 Speaker 2: is more about death by a thousand cuts than by 397 00:20:35,119 --> 00:20:38,800 Speaker 2: dropping an atomic bomb. I think that when you keep 398 00:20:38,840 --> 00:20:44,040 Speaker 2: polluting the information ecosystem, everybody loses trust, because you don't 399 00:20:44,119 --> 00:20:46,120 Speaker 2: trust NPR, you don't trust The New York Times, you don't 400 00:20:46,119 --> 00:20:48,320 Speaker 2: trust CNN. Who do you trust? Well, you trust 401 00:20:48,320 --> 00:20:50,879 Speaker 2: the guy who's yelling at you telling you what to believe, right, 402 00:20:50,880 --> 00:20:53,680 Speaker 2: because you've sort of given up. Yeah. And I would 403 00:20:53,720 --> 00:20:56,720 Speaker 2: say that, you know, fundamentally, is that a deep 404 00:20:56,720 --> 00:20:59,280 Speaker 2: fake problem? No, I think that's a social media problem. 405 00:20:59,560 --> 00:21:01,840 Speaker 2: I think that's a traditional media problem. I think that's a 406 00:21:01,880 --> 00:21:05,240 Speaker 2: polarization problem. I think it's the nature of politics today, 407 00:21:05,240 --> 00:21:07,600 Speaker 2: both here and abroad, because we have politicians who are 408 00:21:07,600 --> 00:21:10,480 Speaker 2: just outright lying to us now. So, can you point 409 00:21:10,520 --> 00:21:14,000 Speaker 2: specifically to deep fakes? No, but 410 00:21:14,200 --> 00:21:16,199 Speaker 2: I do think it was an accelerant. I do think 411 00:21:16,240 --> 00:21:20,840 Speaker 2: it contributed to our general distrust and then our inability 412 00:21:21,240 --> 00:21:24,800 Speaker 2: to hear things that go against our worldview, and I 413 00:21:24,840 --> 00:21:28,040 Speaker 2: do think that that effected change. I do think you 414 00:21:28,200 --> 00:21:31,920 Speaker 2: can't look at the landscape of what Facebook and Twitter 415 00:21:32,080 --> 00:21:35,200 Speaker 2: and YouTube and TikTok, how they control the information ecosystem 416 00:21:35,240 --> 00:21:38,359 Speaker 2: for the vast majority of Americans, how they have promoted 417 00:21:38,520 --> 00:21:42,479 Speaker 2: false information, both traditionally false and deep fake false.
You 418 00:21:42,520 --> 00:21:44,040 Speaker 2: can't look at that and say that has had no 419 00:21:44,160 --> 00:21:47,520 Speaker 2: impact on the way we think. I think that's probably wrong. 420 00:21:47,800 --> 00:21:50,320 Speaker 1: So, you mentioned you've been at this for some time, 421 00:21:50,480 --> 00:21:55,760 Speaker 1: since opening that legal textbook all those years ago. Could 422 00:21:55,800 --> 00:22:00,399 Speaker 1: you have imagined how much trust in society has eroded? 423 00:22:00,520 --> 00:22:02,440 Speaker 1: And where did you see it kind of happening along 424 00:22:02,440 --> 00:22:05,400 Speaker 1: the way? So the answer is no, I didn't see 425 00:22:05,400 --> 00:22:08,040 Speaker 2: this coming. And in the early days, the liar's dividend 426 00:22:08,119 --> 00:22:10,840 Speaker 2: didn't exist. When there was film 427 00:22:10,960 --> 00:22:13,680 Speaker 2: and audio of you saying and doing something, nobody said 428 00:22:13,680 --> 00:22:15,000 Speaker 2: it was fake. And by the way, here's how you 429 00:22:15,040 --> 00:22:18,840 Speaker 2: know I'm right. Go back to twenty sixteen, when then 430 00:22:18,920 --> 00:22:21,960 Speaker 2: candidate Trump got caught on the Access Hollywood tape 431 00:22:22,400 --> 00:22:25,439 Speaker 2: saying, what, that he grabs women in places that I 432 00:22:25,480 --> 00:22:28,200 Speaker 2: won't mention on this podcast. And when he got called 433 00:22:28,240 --> 00:22:31,080 Speaker 2: on it, he didn't say it was fake. He apologized. 434 00:22:31,480 --> 00:22:33,879 Speaker 2: Three months later, when he was now in office, he 435 00:22:34,000 --> 00:22:36,520 Speaker 2: said it was fake. That was the moment when I 436 00:22:36,520 --> 00:22:39,479 Speaker 2: realized this was a real thing. So it was actually 437 00:22:39,520 --> 00:22:42,400 Speaker 2: fairly recently, because up until then the tech wasn't good enough, 438 00:22:42,440 --> 00:22:45,439 Speaker 2: and frankly, nobody had thought about it. But once Trump 439 00:22:45,560 --> 00:22:49,200 Speaker 2: normalized that if you don't like information, call it fake news, 440 00:22:49,680 --> 00:22:52,800 Speaker 2: suddenly this became the mantra. AI was still pretty nascent, 441 00:22:53,280 --> 00:22:57,000 Speaker 2: but now it's actually plausible deniability. Now it's actually 442 00:22:57,040 --> 00:22:58,800 Speaker 2: not an unreasonable thing. And if you go back and 443 00:22:58,840 --> 00:23:01,679 Speaker 2: look at that Access Hollywood tape, you never see him talking. 444 00:23:01,920 --> 00:23:06,440 Speaker 2: It's just audio. And so if that was released today, yeah, 445 00:23:06,600 --> 00:23:08,440 Speaker 2: we'd have to think pretty carefully about whether it was 446 00:23:08,480 --> 00:23:08,960 Speaker 2: real or not. 447 00:23:09,520 --> 00:23:11,879 Speaker 1: Your vocation in some ways is to talk about this and 448 00:23:11,880 --> 00:23:14,320 Speaker 1: bring attention to it in the media. But your business 449 00:23:14,359 --> 00:23:19,600 Speaker 1: is also to bring some technological solutions to the detection problem. 450 00:23:19,640 --> 00:23:21,840 Speaker 2: Is that right? Yeah, yeah. So I will tell you, 451 00:23:22,480 --> 00:23:25,000 Speaker 2: I say this only half jokingly, I started the company 452 00:23:25,000 --> 00:23:27,159 Speaker 2: just because I couldn't keep up with the demand. I 453 00:23:27,240 --> 00:23:28,800 Speaker 2: just needed people to help me do this.
454 00:23:28,760 --> 00:23:31,720 Speaker 1: Because that's the best way to start a company, I think. 455 00:23:31,880 --> 00:23:34,479 Speaker 2: Yeah. I'm like, guys, I used to get one call 456 00:23:34,520 --> 00:23:35,960 Speaker 2: a week, and then it was one a day, and now 457 00:23:35,960 --> 00:23:37,920 Speaker 2: it's ten a day, and pretty soon it's gonna be 458 00:23:37,920 --> 00:23:40,040 Speaker 2: one hundred a day. I can't. I honestly can't keep up. But, 459 00:23:40,119 --> 00:23:44,400 Speaker 2: more seriously and less snarky, if you will, you know, we 460 00:23:44,440 --> 00:23:46,560 Speaker 2: really need to get a handle on this problem. And 461 00:23:46,600 --> 00:23:48,159 Speaker 2: I think there's a couple of places we want to 462 00:23:48,200 --> 00:23:51,080 Speaker 2: help organizations get a handle on it. So clearly, media outlets: 463 00:23:51,160 --> 00:23:54,280 Speaker 2: clearly you have to help the big news wires and 464 00:23:54,320 --> 00:23:57,439 Speaker 2: the major news agencies when they are dealing with breaking 465 00:23:57,480 --> 00:24:01,080 Speaker 2: news of LA fires and Gaza and inaugurations and whatever. 466 00:24:01,440 --> 00:24:03,439 Speaker 2: They've got to know what the hell's going on. We 467 00:24:03,520 --> 00:24:05,639 Speaker 2: have to help them. We clearly have to help law 468 00:24:05,720 --> 00:24:09,680 Speaker 2: enforcement and national security agencies reason about a very complicated world, 469 00:24:09,720 --> 00:24:12,520 Speaker 2: from evidence in a court of law to things with 470 00:24:12,560 --> 00:24:16,919 Speaker 2: geopolitical implications. We have to help organizations. We are seeing 471 00:24:17,119 --> 00:24:20,720 Speaker 2: massive frauds being perpetrated on Fortune five hundred companies. We 472 00:24:20,800 --> 00:24:24,760 Speaker 2: are seeing imposter hiring. We are seeing people attack companies 473 00:24:24,800 --> 00:24:27,959 Speaker 2: with fake audio and video of CEOs to damage their 474 00:24:27,960 --> 00:24:31,439 Speaker 2: stock price. We want to help individuals, right, deal with 475 00:24:31,480 --> 00:24:35,120 Speaker 2: this stuff when they are getting information: how do they trust it? 476 00:24:35,600 --> 00:24:38,840 Speaker 2: And so we are developing a suite of tools that 477 00:24:38,920 --> 00:24:43,280 Speaker 2: would authenticate content, images, audio, and video to help people 478 00:24:43,359 --> 00:24:46,280 Speaker 2: make decisions. And it's not a value judgment. We're not 479 00:24:46,320 --> 00:24:48,280 Speaker 2: saying this is good or bad or neither. In fact, 480 00:24:48,320 --> 00:24:50,680 Speaker 2: we're not even saying if it's true or false. We 481 00:24:50,760 --> 00:24:53,600 Speaker 2: are simply saying, is this an authentic photo, image, or 482 00:24:53,680 --> 00:24:56,600 Speaker 2: video, or is it not? It's a pretty simple question 483 00:24:56,680 --> 00:24:59,679 Speaker 2: with a very, very complicated and difficult answer. And by 484 00:24:59,720 --> 00:25:02,200 Speaker 2: the way, it's not an if, it's a when, 485 00:25:02,960 --> 00:25:04,680 Speaker 2: it's a when that happens that you have to start 486 00:25:04,720 --> 00:25:07,920 Speaker 2: thinking about this, because it will happen, right, because anybody 487 00:25:07,920 --> 00:25:11,159 Speaker 2: can create these fakes.
Now, somebody doesn't like their 488 00:25:11,200 --> 00:25:13,200 Speaker 2: seat on an airline, they're going to go off and 489 00:25:13,240 --> 00:25:15,280 Speaker 2: attack your company by creating a fake image or a 490 00:25:15,320 --> 00:25:16,720 Speaker 2: video or an audio, and they're going to try to 491 00:25:16,760 --> 00:25:19,000 Speaker 2: hurt you. And it's frankly not that hard to do. 492 00:25:19,560 --> 00:25:21,720 Speaker 1: And the prototype elements of what you're working on, what 493 00:25:21,880 --> 00:25:23,560 Speaker 1: is the technology that enables it? 494 00:25:24,160 --> 00:25:25,720 Speaker 2: Yeah, I'm going to tell you a little bit about it, 495 00:25:25,760 --> 00:25:27,679 Speaker 2: but not all of it, because, you know, in the 496 00:25:27,720 --> 00:25:31,199 Speaker 2: cybersecurity world you have to be a little careful. But 497 00:25:31,400 --> 00:25:34,160 Speaker 2: underneath it is, I've been doing this for twenty five years. 498 00:25:34,160 --> 00:25:37,040 Speaker 2: We have developed a suite of different technologies that look 499 00:25:37,119 --> 00:25:40,520 Speaker 2: at content from many different perspectives. We think about the 500 00:25:40,680 --> 00:25:44,000 Speaker 2: entire content creation process. So let's take an image as 501 00:25:44,040 --> 00:25:46,520 Speaker 2: an example. What happens with an image? You start out 502 00:25:46,600 --> 00:25:50,199 Speaker 2: here in the physical three-dimensional world. Light moves and 503 00:25:50,280 --> 00:25:52,879 Speaker 2: hits the front of a lens. It passes through an 504 00:25:52,880 --> 00:25:55,720 Speaker 2: optical train. It hits an electronic sensor, where it gets 505 00:25:55,760 --> 00:26:00,080 Speaker 2: converted from light, photons, analog, to digital. It goes through 506 00:26:00,119 --> 00:26:02,880 Speaker 2: a series of post-processing steps. It gets compressed into 507 00:26:02,880 --> 00:26:06,800 Speaker 2: a file. It gets uploaded to social media, it gets 508 00:26:06,840 --> 00:26:10,320 Speaker 2: downloaded onto my desk, and then my job begins. And 509 00:26:10,400 --> 00:26:13,880 Speaker 2: what we do is we insert ourselves into every part 510 00:26:13,920 --> 00:26:17,160 Speaker 2: of that process, the physical world, the optics, the electronic sensor, 511 00:26:17,240 --> 00:26:21,480 Speaker 2: the post-processing, the packaging, and we build mathematical models 512 00:26:21,720 --> 00:26:24,760 Speaker 2: so that we can say this is physically plausible, this is 513 00:26:24,760 --> 00:26:27,560 Speaker 2: physically implausible, this is consistent with a natural image, this 514 00:26:27,640 --> 00:26:30,199 Speaker 2: is consistent with an AI generated image. And we have 515 00:26:30,240 --> 00:26:33,800 Speaker 2: this suite of tools, and then collectively, those come together 516 00:26:33,880 --> 00:26:36,720 Speaker 2: to tell a story about our belief that that piece 517 00:26:36,720 --> 00:26:38,040 Speaker 2: of content is authentic or not. 518 00:26:38,560 --> 00:26:41,639 Speaker 1: What degree of conviction do you have on any given 519 00:26:41,640 --> 00:26:43,600 Speaker 1: piece of content that you can verify whether or not it 520 00:26:43,640 --> 00:26:43,960 Speaker 1: is real? 521 00:26:44,320 --> 00:26:46,920 Speaker 2: First of all, great question, and I don't think it's 522 00:26:46,960 --> 00:26:49,239 Speaker 2: going to surprise you that the answer is complicated.
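(A brief aside to make that pipeline concrete: one way to picture the architecture Farid describes is as a set of independent forensic checks, each scoring one stage of the imaging pipeline for plausibility, whose results are combined into an overall judgment that can also abstain. The Python sketch below is illustrative only; the class names, the 0-to-1 score convention, and the thresholds are assumptions, not GetReal Labs' actual software.)

```python
# Illustrative sketch: combine per-stage forensic checks into a verdict that
# prefers "inconclusive" over a confident wrong call.
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class ForensicCheck:
    # One stage of the imaging pipeline, e.g. "lighting/geometry", "sensor noise",
    # "compression history". The score callable returns plausibility in [0, 1].
    name: str
    score: Callable[[bytes], float]

def assess(image_bytes: bytes, checks: List[ForensicCheck],
           low: float = 0.35, high: float = 0.65) -> Tuple[str, List[Tuple[str, float]]]:
    """Average per-stage plausibility scores and map them to a hedged verdict."""
    scores = [(check.name, check.score(image_bytes)) for check in checks]
    average = sum(s for _, s in scores) / len(scores)
    if average >= high:
        verdict = "consistent with an authentic image"
    elif average <= low:
        verdict = "consistent with manipulation or AI generation"
    else:
        verdict = "inconclusive"
    return verdict, scores
```

The abstention branch mirrors the point Farid makes next: on low-quality content, saying "I don't know" beats getting it wrong.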
I mean, 523 00:26:49,240 --> 00:26:50,760 Speaker 2: I'd like to be able to tell you ninety nine 524 00:26:50,760 --> 00:26:52,880 Speaker 2: point seven percent. And by the way, anybody who tells 525 00:26:52,920 --> 00:26:54,840 Speaker 2: you ninety nine point seven doesn't know what they're talking about. 526 00:26:54,880 --> 00:26:58,159 Speaker 2: And here's why: it depends. So, for example, if you 527 00:26:58,280 --> 00:27:02,880 Speaker 2: give me a high-resolution, twelve-megapixel image that's 528 00:27:02,960 --> 00:27:06,280 Speaker 2: high quality, we can say a lot. If you give 529 00:27:06,320 --> 00:27:08,560 Speaker 2: me an image that's three hundred by three hundred pixels 530 00:27:08,600 --> 00:27:11,560 Speaker 2: and has gone through five levels of compression and resizing 531 00:27:11,600 --> 00:27:15,439 Speaker 2: and uploading and downloading, it's really, really hard. So it 532 00:27:15,480 --> 00:27:18,400 Speaker 2: depends on the content. So there's a number of factors 533 00:27:18,440 --> 00:27:21,040 Speaker 2: that play in, but the obvious ones are this: if 534 00:27:21,040 --> 00:27:23,719 Speaker 2: you have a high-quality, high-resolution piece of content, 535 00:27:24,200 --> 00:27:27,240 Speaker 2: we're pretty good at this, and that level of confidence 536 00:27:27,280 --> 00:27:30,639 Speaker 2: and ability goes down as the quality of the content degrades. 537 00:27:30,720 --> 00:27:33,200 Speaker 2: It's like a physical DNA sample. You find a pile 538 00:27:33,240 --> 00:27:36,600 Speaker 2: of blood, your DNA sample is good. You find a 539 00:27:36,640 --> 00:27:40,400 Speaker 2: tiny little half a drop of blood, not so good. Look, 540 00:27:40,480 --> 00:27:42,520 Speaker 2: anybody who knows anything about the space knows there are 541 00:27:42,600 --> 00:27:45,160 Speaker 2: days where you say, I don't know. I would much, 542 00:27:45,240 --> 00:27:48,879 Speaker 2: much rather say I don't know than get it wrong. 543 00:27:49,320 --> 00:27:52,200 Speaker 1: So we've talked about a regulation solution, you're working 544 00:27:52,240 --> 00:27:56,000 Speaker 1: on a product solution. What about the average person who 545 00:27:56,000 --> 00:27:59,439 Speaker 1: is listening to this podcast? What is the way to 546 00:27:59,520 --> 00:28:01,920 Speaker 1: protect yourself in this changing environment? 547 00:28:02,160 --> 00:28:04,679 Speaker 2: This is easy. I really like this question, because the 548 00:28:04,680 --> 00:28:07,000 Speaker 2: answer to everything is hard. This one's easy. 549 00:28:07,200 --> 00:28:10,000 Speaker 2: Get off of social media. Stop getting your news from 550 00:28:10,040 --> 00:28:12,639 Speaker 2: social media. That's it. You're not going to become an 551 00:28:12,720 --> 00:28:15,200 Speaker 2: armchair analyst. You're not going to become a digital forensic expert. 552 00:28:15,240 --> 00:28:17,600 Speaker 2: You're not going to become a misinformation expert. You can't 553 00:28:17,600 --> 00:28:19,040 Speaker 2: do that, you can't do it at scale. But here's 554 00:28:19,040 --> 00:28:21,400 Speaker 2: what you can do: stop getting your goddamn news from 555 00:28:21,400 --> 00:28:26,119 Speaker 2: social media. Hany, thank you, great talking to you. 556 00:28:26,160 --> 00:28:28,800 Speaker 2: I can't believe it's been five years.
Okay, let's do 557 00:28:28,880 --> 00:28:30,840 Speaker 2: this again in five years and see where 558 00:28:30,840 --> 00:28:32,800 Speaker 2: we are, and maybe it'll be my avatar that'll be 559 00:28:32,800 --> 00:28:33,280 Speaker 2: talking with you then. 560 00:28:33,359 --> 00:28:38,440 Speaker 1: That's it for this week of Tech Stuff: 561 00:28:38,440 --> 00:28:42,520 Speaker 1: The Story. I'm Oz Woloshyn. This episode was produced by Eliza Dennis, 562 00:28:42,600 --> 00:28:46,880 Speaker 1: Victoria Dominguez, and Lizzie Jacobs. It was executive produced by me, 563 00:28:47,360 --> 00:28:50,800 Speaker 1: Karah Preiss, and Kate Osborne for Kaleidoscope and Katrina 564 00:28:50,880 --> 00:28:54,720 Speaker 1: Norvell for iHeart Podcasts. Jack Insley mixed this episode, and 565 00:28:54,800 --> 00:28:57,960 Speaker 1: Kyle Murdoch wrote our theme song. Join us on Friday for a 566 00:28:58,000 --> 00:29:02,120 Speaker 1: special crossover episode with the podcast Part-Time Genius. We'll 567 00:29:02,160 --> 00:29:05,200 Speaker 1: be talking to Brian Merchant, author of Blood in the Machine, 568 00:29:05,520 --> 00:29:09,320 Speaker 1: about being a Luddite. Please rate, review, and reach 569 00:29:09,360 --> 00:29:12,600 Speaker 1: out to us at Tech Stuff podcast at gmail dot com. 570 00:29:12,720 --> 00:29:13,800 Speaker 1: We're excited to hear from you.