Speaker 1: Welcome to Stuff You Should Know, a production of iHeartRadio's How Stuff Works. Hey, and welcome to the podcast. I'm Josh Clark. There's Charles W. "Chuck" Bryant. Here's our brand new producer from now on, Josh. Brother Josh. Have you said your last name, Josh? We don't do that. Okay. Sometimes we say Jerry's. Jerry quit? She really did, but she did it like that, kind of quiet, silent. Wait for her. She just stopped showing up. Yeah, Jerry didn't quit, everyone. We don't think. We're not... it's not entirely certain. Until I see her sitting in that chair, then I'm assuming she's quit. What if, Chuck, she sent us a video of herself saying, I quit, I'm so sick of you guys, I'm done with this forever. Would you believe it then? But the lips didn't quite match up. Then it would be a deepfake. It would be a deepfake. I saw a tweet, I don't remember who it was, but it was maybe Ryan Lizza or somebody complaining, saying, why do we call these things deepfakes? And somebody schooled them on it. It was kind of nice to watch. Who is he? He said Ryan Lizza. I think he's like a CNN correspondent, journalist. Uh, so, first of all, we want to issue a CYA here, uh, that maybe any kids shouldn't listen to this one. We're talking about some really dark, harmful stuff. Yeah, really despicable, gross stuff. The only thing I can think of that would be worse than covering this would be to do one on, like, snuff films. I kept thinking of that while I was reading this. I don't know, man. I don't want to pit types of despicable media against one another, but, um, I think revenge porn might have a leg up on this, which this sort of is as well. Right, it's definitely a close cousin of it at least. Yeah, but this one ain't for kids. Uh.
Speaker 1: And I was shocked and dismayed, because I didn't know about this. And when I saw it's a podcast on fake videos, I thought, well, how fun, because I love those videos of David Beckham kicking soccer balls into cans from a hundred yards out on the beach. That's for real. Uh, no. Yeah, and it's just coincidence he's holding a Pepsi can up. I saw it with my own eyes. I thought that's what this is about. I was like, oh, those are fun. It is, kind of, but then I wanted to take a bath after this. I can understand. So we should probably start out, after the CYA, by saying what a deepfake is. A deepfake, D-E-E-P-F-A-K-E, all one word, is a type of video where somebody is saying or doing something that they never actually said or did. Which, you might say, okay, this is nothing new. This has been around for a while. Like, people have doctored photos and videos and stuff like that for basically as long as there have been videos. CGI, sure. This is different. This is in the same ballpark, but this is an entirely different league. Like, this league plays on Saturday and Sunday afternoons, not Tuesday night, you know what I mean. Like, this is something totally... just let it simmer for a little while and you'll be like, wow, that's a really good analogy. Uh, this is just, it's different. It has a lot of the same in principle, but because they are so realistic, and they're getting more and more realistic by the day, they actually, in a lot of people's minds, pose a threat not just to individual people, as we'll see, but possibly to society at large, say a lot of people who are really worried about this kind of stuff. Yeah, and we're not talking about, I'm assuming, the fake lip reading thing. That's deepfake, right? Or is that just no manipulation of video whatsoever,
and that's just people using their voices? So what that is is... yeah, it's just somebody pretending, like they're just fake lip reading and then doing a voiceover. They're not manipulating the video at all. Okay. No, they're just doing a really bad job of lip reading, which is hysterical. They are hilarious. I would put those up with the G.I. Joe PSAs. Pork chop sandwiches. Like, those are just all-time classics. Can watch them anytime and still laugh. Have you ever seen the G.I. Joe action figures on, like, dead roadkill? It's sad because it's a dead animal, but there'll be, like, you know, a dead squirrel in the road, and someone will pose, like, a G.I. Joe action figure with his, like, foot on its head, like it's a trophy. Yeah, big game. It's kind of funny. Yeah, I can see that. Let's put it this way, it's as funny as you can make a picture of a dead animal that, I assume, got hit by a car. You hope. But maybe they are, like, killing squirrels. Just after reading this, I don't doubt anything. And it makes me hate the Internet even more. All right, so let's get into this a little bit. Okay, Chuck, calm down. We're not allowed to have personal positions on these things, so this is totally a neutral thing. Okay. So, um, there's this really interesting Gizmodo article that talked about the history of, kind of, um, not necessarily deepfakes, but altering videos, like presenting a doctored video as reality. And apparently there was a long tradition of it at the beginning of cinema, where people got their news from newsreels, like you'd actually go to a movie theater to see the news, because you were just an early twentieth-century yokel living in Kansas or something like that. Yeah. And after reading this bit, I thought that was a very Gizmodo way to say, here's one not-so-interesting fact that really has not much to do with this. Oh, I love it, really.
Speaker 1: I personally selected it and put it in here, and I thought it was kind of funny. I think it's great. Uh, yeah, they used to fake, uh, real-life events and recreate them. Don't try to backpedal. And that has nothing to do with deepfakes. It does, because one of the big problems, or threats, from deepfakes is it's a way of seeing what you think is news, but it's not. It's a sham. It's recreated. Yeah. The difference I see is they were recreating real news events, just like, here, you didn't see it, so this is what it may have looked like, but they were passing it off as real. Therein lies the tragedy of all times. I thought it was a very thin Gizmodo... whatever, we'll edit this part out. Webster's defines deepfake... I like it. I put it in there specifically because I thought it was good. That's all right. Okay, so we'll take another take. Okay, why don't we just talk about deepfakes? So, Chuck, let's talk about deepfakes. We can just cut there. So deepfakes actually are super new. Yeah. And the reason they're called deepfakes is because in late two thousand seventeen, I think November, this guy, who was a redditor, a guy who posts on Reddit. That's your first warning sign. Not necessarily, Reddit's pretty sharp and smart and has got some good ideas going on. Of all the social media platforms, I'd throw my two cents in with Reddit. But there was a redditor called deepfake, D-E-E-P-F-A-K-E, all one word, and he said, hey, world, look at what I figured out how to do. And he started posting pornography, but with celebrities' faces transposed onto it. And he said, this is just my hobby, but here's how I did it. And he said that he used, and I'm assuming it's a him, I don't know if it's a man or a woman, I'm gonna go with a man, um, and he said, I just used Keras and TensorFlow.
Speaker 1: And these are a couple of, um, basically open-source AI programs that this guy was smart enough to figure out how to use to train, to create, um, these videos where you take a celebrity's face and put it on a clip from a porn movie, and it looks like the celebrity is doing what you're seeing. And at first it was kind of hokey and not very... it was very obviously not real. Yeah, I think the scary part was how quickly and easily it could be done. Motherboard, who we used to write for every now and then. Remember that? I tried to forget. They tried to forget, for sure. Yeah, they hit us up, what, like, I feel like seven or eight years ago. I'm trying to forget and you're making it really hard. Said, you guys want to write some blogs for Motherboard? We said sure, so we did. We wrote ten. Yeah, you can probably go find those on the internet if you want to learn how to, like, drive a stick shift or something. The fine people at Motherboard scrubbed those from the internet. Let's hope so. So, uh, this deepfake character figures this out. Another guy released downloadable desktop software that said, here, you can do this awful thing too. Right, within like two months of deepfake coming out and saying, look what I did and here's how I did it, somebody said, that's a really good idea, I'm going to turn it into an app and give it to everybody. That's right. And now, uh, people can, you know... at this time, and this was a very short time ago... and it's really come a long way in the past, whatever, not even two years, because it was late 2017, early 2018 when it really first popped up. Yeah. So this thing was downloaded a hundred thousand times in the first month alone, and some people used it for fun stuff, like, uh, putting Nick Cage in movies he wasn't in. Yeah, those are called derpfakes. Yeah, they've all got fun names, don't they?
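(For the curious: the way that original face-swap setup is usually described is an autoencoder trick rather than anything exotic. One shared encoder learns faces in general, and a separate decoder is trained per person, so encoding person A's face and decoding it with person B's decoder produces the swap. Below is a minimal sketch of that idea using Keras, one of the libraries named above; the layer sizes, variable names, and loss are illustrative assumptions, not the code anyone actually released.)

```python
# Sketch of the shared-encoder / per-person-decoder face-swap idea (illustrative sizes).
import tensorflow as tf
from tensorflow.keras import layers, models

FACE_SHAPE = (64, 64, 3)   # small aligned face crops (assumed size)
LATENT = 256               # size of the compressed face representation

def make_encoder():
    # Shared: learns a generic "what a face looks like" representation.
    return models.Sequential([
        layers.Flatten(input_shape=FACE_SHAPE),
        layers.Dense(LATENT, activation="relu"),
    ])

def make_decoder():
    # One per person: learns to redraw that specific person from the representation.
    return models.Sequential([
        layers.Dense(64 * 64 * 3, activation="sigmoid", input_shape=(LATENT,)),
        layers.Reshape(FACE_SHAPE),
    ])

encoder = make_encoder()
decoder_a = make_decoder()   # trained only on person A's faces
decoder_b = make_decoder()   # trained only on person B's faces

auto_a = models.Sequential([encoder, decoder_a])
auto_b = models.Sequential([encoder, decoder_b])
auto_a.compile(optimizer="adam", loss="mae")
auto_b.compile(optimizer="adam", loss="mae")

# Training: each autoencoder just reconstructs its own person, e.g.
#   auto_a.fit(faces_a, faces_a, ...)  and  auto_b.fit(faces_b, faces_b, ...)
# The swap: encode a frame of person A, then decode it with person B's decoder.
#   swapped_face = decoder_b(encoder(frame_of_person_a))
```

(The shared encoder is what bridges the two identities; each decoder only ever learns to draw its own person, which is why the result keeps B's face but A's pose and expression.)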
Speaker 1: Dude, Nicolas Cage as Yoda is patently, objectively hilarious. I didn't like that one. I thought the, I don't know, the Raiders of the Lost Ark thing was interesting, I guess, but none of them made me laugh. Like, maybe I just don't have that kind of sense of humor. But yeah, I never was like, oh my god, that's hysterical, it's Nick Cage's face. I understand where you come from. I don't think I was, like, you know, in stitches or anything like that, but it's pretty great. Okay, okay. It's just not my thing. You're not a Gizmodo reader, are you? No, none of this is my thing. But that doesn't mean we can't report on it. However, once it started happening, it became pretty clear pretty quickly that this could be a bad thing in the future, and not just for putting your ex-girlfriend's face on a sex video, you know, saying look what she did. You could put a world leader up there and, uh, really cause a lot of problems. Yes, hypothetically you could. And that's really, as we'll see... this new technology, this deepfake technology, poses at least two immediately obvious risks, and they're hyper-individualized and hyper-macro, societal risks. But they both stem from the same thing, from the same root, or same seed, to keep the metaphor going and on track. So let's talk about the technology behind this, because this stuff is just totally fascinating. Surely you agree. It is AI. It was created by a guy named Ian Goodfellow, just this particular type of AI. He didn't make the deepfake stuff, no, no, no. But basically, what this model... you know, everyone knows AI is basically when you teach a machine to start teaching itself, it starts learning on its own, which is a little creepy. Um. But the model that they're using these days is called an artificial neural net, which is, uh, machine learning.
Speaker 1: And basically, what they've done in this case is, all you have to do is show something a lot of data for it to start to be able to recognize that data when you aren't showing it that data. Yeah, and it learns on its own. The classic example is, um, AI that can pick out pictures of cats. And it's easy enough, but you don't tell the AI, here's what a cat is, find pictures of cats in this data set. It's, here's a bunch of stuff, figure out what a cat is, and they get really good at picking it out. You can also turn it the opposite way, once you have an AI trained on identifying cats, and get it to produce pictures of cats, but they're usually terrible and often very, very bizarre, like anyone would look at it and be like, a human didn't make this. It's just off in some really obvious ways. And what Ian Goodfellow figured out was a way around that problem. Yeah. So, uh, I'm not sure I agree with his wording here, um, but we'll say what he calls it. Uh, he set up two teams, and one is a generator and one is a discriminator, and he calls it a generative adversarial network. So basically his contention is that these two are adversarial. I saw it as more managerial in nature. Okay, bureaucratic. Yeah, I mean, isn't that what it felt like to you? The discriminator is like, yeah, I'm gonna need you to come in on Saturdays. It kind of felt like that. So you've got these two networks and they're both trained on the same data set, but the generator is the one that's producing these fake cats, and then there's a discriminator, or what I like to call a manager, saying, these look good, these don't look so good. Right.
Speaker 1: The other way, the way that Goodfellow has proposed it, is that the discriminator is going through and looking at these generated pictures and trying to figure out if it's real, it comes from the data set, or if it's fake, the generator created it. And based on the feedback that the manager gives the generator, um, the generator is going to adjust its parameters so that it gets better and better at putting out more realistic pictures of cats. Yeah. I don't get the adversarial part, unless it gets mean in how it delivers that message. The reason they call it adversarial, the way I saw it, is, like, um, it's like an art forger and a detective, or an appraiser is a better way to put it. The art forger's putting out forged art, and the appraiser's like, this is fake, this is fake, this is fake. Well, I'm not sure about this one. I don't know if this one is fake. This is real, this is real, this is real. And then at that point, the generator has become adept at fooling an AI that's trained to identify pictures of cats, creating pictures of cats that don't exist. Okay. It's adversarial. The generator's trying to fool the discriminator, and the discriminator's trying to thwart the generator. That's the adversarial part. But in the end they're really on the same team. Yeah, okay, I guess that's where it loses me. You have a really positive image of corporate America. What does that have to do with anything? The manager's on the same team as everybody. Come on, get on board. Get on the trolley, Josh. So, if you want to look up Mona Lisa talking, there was a video this year from Samsung that showed how you could do this, and all the stuff is on YouTube.
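(The generator-versus-discriminator setup described above can be written down pretty compactly. Here is a minimal sketch in Keras, assuming small flattened grayscale images; the layer sizes, names, and training loop are illustrative only, not any particular deepfake tool.)

```python
# Minimal sketch of the generator ("forger") vs. discriminator ("appraiser") idea.
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_DIM = 64 * 64      # a flattened 64x64 grayscale image (assumed size)
NOISE_DIM = 100        # random input the generator turns into an image

# The generator: turns random noise into a fake image.
generator = models.Sequential([
    layers.Dense(256, activation="relu", input_shape=(NOISE_DIM,)),
    layers.Dense(IMG_DIM, activation="sigmoid"),
])

# The discriminator: guesses whether an image is real (1) or generated (0).
discriminator = models.Sequential([
    layers.Dense(256, activation="relu", input_shape=(IMG_DIM,)),
    layers.Dense(1, activation="sigmoid"),
])
discriminator.compile(optimizer="adam", loss="binary_crossentropy")

# Combined model: freeze the discriminator so only the generator learns through it.
discriminator.trainable = False
gan = models.Sequential([generator, discriminator])
gan.compile(optimizer="adam", loss="binary_crossentropy")

def train_step(real_images, batch_size=32):
    noise = tf.random.normal((batch_size, NOISE_DIM))
    fake_images = generator.predict(noise, verbose=0)
    # Discriminator practice: label real images 1, generated images 0.
    discriminator.train_on_batch(real_images, tf.ones((batch_size, 1)))
    discriminator.train_on_batch(fake_images, tf.zeros((batch_size, 1)))
    # Adversarial step: push the generator to make the frozen
    # discriminator answer "real" (1) for its forgeries.
    gan.train_on_batch(noise, tf.ones((batch_size, 1)))
```

(The adversarial part is the last line of train_step: the generator is updated so the frozen discriminator calls its forgeries real, while the discriminator is separately updated to catch them.)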
Speaker 1: If you want to see Nick Cage as Indiana Jones, which is pretty funny, or Yoda, which is hilarious, or if you want to see Mona Lisa talking, it looks fairly realistic, like, hey, they brought that painting to life. Right, yeah, yeah. And if you scroll down a little bit, they did one with Marilyn Monroe too. They brought her to life. They did. Interesting. Well, I mean, this is like just a setup for TV commercials. Yes, they've already done stuff like this, right. This is like Fred Astaire dancing with a Dirt Devil, or whatever that was. What this could bring is creating entirely new movies and bringing back dead actors and actresses, or I guess just actors now these days. Right. Uh, well, I mean, you've seen some of them. Are you talking about the de-aging, or people just creating, like, bringing back someone that's been long dead? No, no, I'm saying, like, you call actors and actresses just actors these days. Oh, that part. Uh, you can do whatever you want. Okay. Uh, motion picture performers, maybe even television performers. But bringing them back and giving them, like... they could star in something entirely new, because it's so realistic and lifelike. Yeah. Um, they're not at that point yet, because they're just now getting to where the de-aging looks decent, depending on who it is. Uh, like the Sam Jackson stuff in Captain Marvel, uh, my friend, looked pretty good. It looked amazing. Yeah. And the Will Smith stuff in this new Ang Lee movie looks really good. What's that one? It's an Ang Lee movie where he plays some sort of assassin that... uh, Aladdin? Yeah, that's it. Have you seen that? It was good? No, I have no interest. Um, he goes back to kill... with the younger version of himself, or the young one's trying to kill the older one, is what it is. Man, it's a lot like Looper, uh, sort of, but it looks pretty good. Like, it looks like young Will Smith. Slightly uncanny, but not as bad as...
I think some people are easier than others, like the Michael Douglas stuff in Ant-Man and the Marvel stuff is kind of creepy looking. I haven't seen that. I mean, I've seen parts of it, but I didn't notice that they were de-aging Michael Douglas. They took Michael Douglas back, in scenes, to like the seventies. It just doesn't look great. But anyway, that's sort of off track, um, but not really. I mean, it's kind of a similar type of stuff, I guess. No more than that Gizmodo article, to start. Yeah, that's a good point. Um, but the whole reason, we should point out, that people are doing this stuff with celebrities and stuff like that is just because there's more data out there. It's a lot easier when you have a gazillion pictures of Brad Pitt on the internet to do a fake Pitt. Yeah, because the data set that the AI is trained on is just more and more robust. And the more pictures there are, the more angles the AI has seen Brad Pitt from, you know, looking around, and so it can recreate these faces, because the AI has seen, like, every possible pose or expression or whatever Brad Pitt's ever made. But the thing is, that Mona Lisa and Marilyn Monroe thing that Samsung showed, they showed that you could make a pretty convincing deepfake with just one picture, one pose. Right. So that's a big deal. But again, the bigger the data set, the better. And that's why, like you said, celebrities and, um, world leaders were the earliest targets. But over time, with the advent of other software, and the fact that people now post tons of stuff about themselves and pictures of themselves on social media, um, it's become easier and easier to make a deepfake video of anybody. There's, like, whole software that scrapes social media accounts for every picture and video that's been posted. Right. Yeah, for sure. Um, and then there's, um, other sites and other apps that say, oh, this picture of this person
you're targeting, your classmate or whatever, they probably have a pretty good match with this porn star, so go find videos of this porn star. And then the next thing you know, you run it through that deepfake app that came out, and you've got yourself a deepfake video, and you've officially become a bad person. All right, that's a good place to take a break, and, uh, we'll talk more about these bad people right after this. All right, so you mentioned before the break, uh, this person's face that I just stole off the internet fits this porn actor's body, which is a consideration if you're making that, because to look right, they have to bear a passing resemblance. I think that's right. Okay, so I was just about to say, so, yeah. So what they're doing now is they're pairing these applications with facial recognition software to make this a lot easier. And that's what most of this is about, is like, let's just see how easy we can make this and how much we can democratize this, where any schmo can take any single photo and do the worst things possible with it, but also how convincing they've become as well. Yeah, I mean, that's another big change. It's looking better and better, quicker and quicker, which is pretty scary. Did you see the Obama one? Had it not been for Jordan Peele's voice obviously being not Obama, I would have been like, wow, this is really convincing. Really? Yeah. See, I didn't think the lips matched up at all. Oh, I thought it looked pretty close. Yeah. So what we're talking about is, Jordan Peele did basically a demonstration video to raise awareness about how awful this is, by doing it himself, and did a video of Obama, like, you know, referring to Trump as a curse word, a dipstick, and basically saying, like, hey, this is Obama, and what, you know, people are doing. He basically is describing what's happening as you're watching it.
Speaker 1: And I thought it looked kind of fake. He's describing a deepfake through a deepfake, and Jordan Peele did it in conjunction with BuzzFeed and another production company. But in their defense, they were making this in, like, early 2018, like April 2018, and since then even more technology has come out that is dedicated to matching the movement of the mouth to whatever words you want the person to say. Yeah, and you can also, like, use only parts of it, so it's even more convincing. So, like, if Obama had a lead-in that actually worked, you could just keep that in there and then take out certain words, and you can manipulate it however you want to. Right, and the AI can go through and find, like, phonemes and stuff like that to make the new words that the person never said. It's becoming extremely easy. Let's just put it like this: it's becoming extremely easy, and it's, um, widely available for anybody to make a video of somebody doing something or saying something that they never did or never said, and to make it convincing enough that you may believe it at first. Yeah. Which, like we said, you know, the obvious scariest implications that aren't just of the personal variety are in politics, when you could create real fake news that actually puts people in jeopardy, or puts the entire world in jeopardy, by, like, announcing a nuclear strike or something like that. Right. Yeah, Marco Rubio, in... I can't remember when it was, but, um, within the last year or two, basically said that deepfakes are the modern equivalent of a nuclear bomb, that you could threaten America to the same degree with a deepfake. I think that is a little hyperbolic, for sure. And we're not the only ones. Yeah, there are other people that say, people that know what they're talking about, not just, you know, shlubs like us, but other people that say, like, hey, listen, this is probably not like a nuclear bomb going off. Um, we should keep our eye on it.
Speaker 1: But there are other, bigger fish to fry when it comes to stuff like this, for sure. And then there are other people who are saying, well, that's not to discount, like, the real problem it poses. Right. Like, we're already in a very polarized position in this country. Um, so the idea of having realistic, um, indistinguishable-from-reality videos of, like, world leaders or senators or whoever saying whatever is not going to help things at all. It's not going to bring everyone together, like, look at this hilarious deepfake. It's just going to erode that trust that is necessary for a democracy to thrive. Um. And to take it to its logical conclusion, this one researcher put it like this: eventually we're going to lose our ability to agree on what is shared objective reality. And at that point, what we would face is what's called the death of truth, like there is no such thing anymore. And on one hand, that's horrible. It's a horrible idea, the idea that nothing's real because there's such a thing as deepfakes, and anybody could make something like this. But on the other hand, you can kind of say... Nick Cage as Yoda, right? Exactly. On the other hand, though, you can say, the fact that people know that deepfakes are out there means that it's gonna be easier and easier to be like, that's obviously not real, it's just too unbelievable. So it may actually make us more discriminating and more discerning of the news than we are today. Yeah, that's the only thing that salvaged my brain from this track of talking about this today, was like, well, we'll go tell our listeners, at least be on the lookout, be wary, take everything with a grain of salt, because we're already in a place where, like, you don't even need some deepfake video, like, it's happened all over the place.
Speaker 1: You can see, uh, something that's photoshopped, or a real photo that someone just writes a false story about. Yeah, that's a good one. Um, you can just come up with a false narrative from a picture, where there's a guy on the street and he's lying there bleeding, and you can just say, uh, this person was attacked yesterday by a group of angry Trumpers, or Antifa on the other side, and it'll get passed around twenty million times, and then the retraction gets seen by hardly anyone. Exactly. And that, that's not a deepfake. No, that's just low-hanging fruit. That's a lo-fi fake. Imagine inserting into that climate, and this is what we're talking about, like, video, where you're looking at the person and seeing with your own eyes what they're saying. And a lot of people who aren't... like, I thought the Obama video looked pretty fake, you thought it looked pretty real. Everyone's eye is different and ear is different. Like, a lot of people will believe anything they see like this. Right. And we'll talk about, like, how to discern deepfakes in a second. But we're getting to the point, people seem to be in wide agreement, that very soon it will be up to digital forensic scientists to determine whether a video is authentic or not. And that's because you or I will not be able to distinguish it from reality. Yeah, and I imagine that every country will have their own team that will be hard at work doing that stuff. And, well, they already do. Yeah, have since the end of two thousand seventeen. Yeah, or at least they're scrambling to catch up. Because when the video comes out of, um, you know, the leader of North Korea saying, we want to drop bombs on America at two o'clock this afternoon, that's going to send our DARPA team scrambling to try and disprove this thing before we push the button. Right. It's like WarGames. It is, but way, way worse. Yeah.
Speaker 1: So, just to reiterate one more time, the one thing that you and I can do, and the one thing that you guys out there listening can do, to keep society from eroding is to know that deepfake videos are very real, and just about anybody with enough computing power and patience to make one can make one. And the very fact that those things exist should make you question anything you see or hear with your own eyes that seems unbelievable or sensational. Unfortunately, I think the Stuff You Should Know crowd is pretty savvy, so we're sort of preaching to the choir here. Yeah, but maybe they can go preach to their older relatives on Facebook. Or, exactly, take this to Thanksgiving dinner and just explain it to folks. Uh, we should talk about porn a little more. Should we take a break first? Sure. Are you okay with that? Yeah. I feel bad now. No? Okay. Well, we'll wait and talk about porn in about sixty seconds. All right, Chuck, you promised talking about porn. Yeah, this, uh... in this research, it says one of the defenses people make in favor of deepfake porn is that it doesn't actually harm anyone. Is anyone actually saying that? Yeah. A lot of people who... I shouldn't say a lot. I've seen at least quotes from people who make this stuff saying, like, this is the media drumming up a moral panic. Like, what's the problem here? What's the issue? It's not like they're going and hacking into, like, a star's iCloud account, getting naked pictures of them and then distributing that, and fooling people into thinking this is really a private naked picture of a celebrity, like they've done that to them. I think they would say they're just creating some fantasy thing that's not even real. It doesn't exist. I'm not defending it, I'm just telling you what the other side is saying. Yeah.
Speaker 1: Well, and that's the perfect example of why these are very bad people, because it is harmful, um, to everyone involved: to the person whose face you're using, to the adult film actor who did a scene and wants credit, and... Well, yeah, I mean, regardless of how you feel about that stuff, someone did something and got paid a wage to do so, and now it's being ripped off, and there are real people's faces involved and real bodies involved and real lives. It's, you know, it's not a moral panic, but, you know, it's not like we need to march this to the top of Capitol Hill right now. Well, that's funny, because Congress held hearings on it this year. Well, yeah, but I have a feeling it's a little bit more about the political ramifications than putting your ex-girlfriend's face on a porn body. Oh, yeah, yeah, yeah, I see what you mean. Although, they could, you know, they could put your governor's face on a porn body and get them removed from office. You know, this video was just dug up, look at this, uh, look at your governor. Yeah, look what he's doing. For sure. But even take that down to the less political level, like you were saying, you could ruin somebody's marriage if it was shaky or on the rocks before. Sure. Hey, here's the sex tape of your husband or your wife. You know, yes, blackmail's another one too. Um, there's a lot of ramifications of this, and it seems like the more you dig into it, the more it becomes clear that really the big threat from this is to the individual whose face is used on the deepfake porn. Right, they could do a video of us holding hands walking down the street. Right, or they could just use that video of us doing that, the one that exists already. And I'm glad you picked up on that one. Um, there was, just, jeez, just a couple of weeks ago, because I saw this one when I googled this, under News, so it's very recent.
Speaker 1: There was an app that we won't name that, uh, undressed... basically, what you could do is just take a picture of any woman, plug it into this app, and it would, uh, show her, um, what she would look like nude. Not her body, but it would just do it so fast and so realistically that you could nude up some woman with the touch of a button. Yeah, and, like, it would replace her clothes in the picture with a nude body. Right, the birthday suit. Right, birthday suit, that's what I was looking for. And it's just as awful as you think. Uh, and the creator actually even shut it down within, like, three or four days. Yeah, but, like, what was this guy thinking? Like, this is a great idea... oh, people have a problem with this? Well, I'll shut it down. Like, really? In his defense, he's probably, like, fourteen. Well, I guess that's a good point. Uh, even if you plugged in a picture of a man, it would show a woman's nude body. And you know what that means. That means that... And the person who created this app says, well, I just did that because there are way more pictures of naked women on the internet, and I was gonna do a man's version, but I had to go to baseball practice and never got a chance to. Yeah. That's, uh, that's pretty amazing. And of course that person is anonymous, right? As far as I know, yeah. Which means that they really must be fourteen, because they weren't unmasked on the internet, despite the outrage against this. You're probably right. I wonder if it's just Pharma Bro. That guy... I just hate him. He's still in jail. Is he really? Good. So that's a good, um, segue into what you can do if this happens to you. Right, there's a lot of outrage against this kind of thing on the internet.
So if this, if you 583 00:33:11,040 --> 00:33:14,280 Speaker 1: are targeted and you end up in a deep fake 584 00:33:14,360 --> 00:33:20,680 Speaker 1: porn clip or video or whatever, um, you could drum 585 00:33:20,760 --> 00:33:24,520 Speaker 1: up some moral outrage on the Internet or go to 586 00:33:24,600 --> 00:33:27,800 Speaker 1: the site that it's being hosted on directly and say, hey, 587 00:33:28,080 --> 00:33:29,640 Speaker 1: I know this is messed up. You know this is 588 00:33:29,680 --> 00:33:32,760 Speaker 1: messed up. Please take this down. I didn't consent to this. 589 00:33:32,760 --> 00:33:35,400 Speaker 1: This is an invasion of my privacy. Get rid of 590 00:33:35,440 --> 00:33:38,400 Speaker 1: this video. Porn websites are good about that, actually, yep, 591 00:33:38,720 --> 00:33:40,440 Speaker 1: they don't want that stuff on there. No, And it's 592 00:33:40,480 --> 00:33:45,680 Speaker 1: not just porn websites. Like, um, Pornhub, Reddit, Gfycat, 593 00:33:45,800 --> 00:33:49,720 Speaker 1: um, some other sites have banned all kinds of deep 594 00:33:49,760 --> 00:33:53,320 Speaker 1: fake videos, and apparently Gfycat, I think it's spelled g 595 00:33:53,640 --> 00:33:55,920 Speaker 1: f y c a t, I have no idea. I don't either. I've never heard 596 00:33:55,920 --> 00:33:58,320 Speaker 1: of it until I started researching this. But this site 597 00:33:58,360 --> 00:34:01,920 Speaker 1: actually created an AI that's trained to spot deep fake 598 00:34:02,040 --> 00:34:04,760 Speaker 1: videos and remove them from the site, which is a 599 00:34:04,800 --> 00:34:07,800 Speaker 1: big tool that they need to be sharing with everybody else. 600 00:34:08,440 --> 00:34:11,600 Speaker 1: But if you can contact the site and say, hey, man, 601 00:34:11,800 --> 00:34:14,399 Speaker 1: take this down, this is me, they will probably take 602 00:34:14,400 --> 00:34:16,920 Speaker 1: it down just because of the fact that everybody knows this is 603 00:34:16,960 --> 00:34:20,280 Speaker 1: really messed up. Yeah, they got enough. They have plenty 604 00:34:20,400 --> 00:34:23,200 Speaker 1: of videos to work from that are real. They 605 00:34:23,200 --> 00:34:26,120 Speaker 1: don't need this stuff. But there's no laws that say 606 00:34:26,160 --> 00:34:29,360 Speaker 1: they have to take it down, are there? Well not yet. Uh. 607 00:34:29,360 --> 00:34:33,520 Speaker 1: There's this guy, Hany Farid. He studies digital forensics at 608 00:34:33,600 --> 00:34:37,400 Speaker 1: Dartmouth, and they are hard at work. Like again, this 609 00:34:37,560 --> 00:34:40,600 Speaker 1: just started, you know, very recently, so all of this 610 00:34:40,680 --> 00:34:44,480 Speaker 1: stuff they're just like scrambling to get ahead of as 611 00:34:44,520 --> 00:34:47,279 Speaker 1: far as uh sniffing this stuff out. Well, the whole 612 00:34:47,320 --> 00:34:49,719 Speaker 1: world was caught off guard by this. Oh yeah, like 613 00:34:49,760 --> 00:34:51,319 Speaker 1: this guy was just like, hey, look what I can do, 614 00:34:51,640 --> 00:34:53,759 Speaker 1: and I'm going to change the world here. Nic Cage 615 00:34:53,920 --> 00:34:57,160 Speaker 1: is so funny. Oh wait, what is Nic Cage doing 616 00:34:57,200 --> 00:35:00,719 Speaker 1: as Yoda? Oh my god. So professionals, there are 617 00:35:00,800 --> 00:35:03,680 Speaker 1: some pretty uh easy to spot things if you're a pro, 618 00:35:04,200 --> 00:35:07,560 Speaker 1: unless it's just really bad.
Um, your average layman can 619 00:35:07,600 --> 00:35:09,719 Speaker 1: spot those. But if you're a pro, you're gonna look 620 00:35:09,760 --> 00:35:14,759 Speaker 1: for like bad compression, um stuff that like you know, 621 00:35:14,960 --> 00:35:19,439 Speaker 1: lighting that looks off. Yeah. Not blinking is a big one. Yeah, 622 00:35:19,480 --> 00:35:23,879 Speaker 1: like Michael Caine don't blink. Maybe he's just a big 623 00:35:24,320 --> 00:35:28,399 Speaker 1: deep fake his whole career. Man, that'd be something. Um, 624 00:35:28,520 --> 00:35:30,359 Speaker 1: sound is a big thing, like wait, wait, hold on, hold on, 625 00:35:30,400 --> 00:35:32,279 Speaker 1: I want to say why blinking is not a thing. 626 00:35:32,400 --> 00:35:34,600 Speaker 1: Oh sure, because it's fascinating. I mean it probably wasn't 627 00:35:34,640 --> 00:35:36,520 Speaker 1: to you because you didn't like that Gizmodo article. 628 00:35:37,080 --> 00:35:41,160 Speaker 1: But um, the reason why not blinking is the thing 629 00:35:41,280 --> 00:35:45,279 Speaker 1: in deep fakes is because deep fake AI that is 630 00:35:45,360 --> 00:35:49,799 Speaker 1: trained on data sets is probably being shown photos of 631 00:35:49,880 --> 00:35:53,520 Speaker 1: somebody not blinking, so they don't learn that people blink, 632 00:35:53,560 --> 00:35:56,080 Speaker 1: so they don't... The AI doesn't know to add blinking 633 00:35:56,360 --> 00:35:59,719 Speaker 1: when it renders the new face on this video. But 634 00:35:59,760 --> 00:36:01,439 Speaker 1: all they can do is just say, all right, well 635 00:36:01,600 --> 00:36:04,160 Speaker 1: now then we'll program it to blink. Right. That's the 636 00:36:04,160 --> 00:36:06,919 Speaker 1: big problem, Chuck. It's like everything they can spot. In fact, 637 00:36:06,960 --> 00:36:09,400 Speaker 1: when they list out all the things like look for 638 00:36:09,480 --> 00:36:12,520 Speaker 1: blocking, compression, fixed pattern noise, I'm sure there's some 639 00:36:13,239 --> 00:36:16,200 Speaker 1: deep faker that's like check, check, check, thanks for the 640 00:36:16,239 --> 00:36:18,760 Speaker 1: list of stuff we need to work on, and they're there. 641 00:36:18,800 --> 00:36:21,320 Speaker 1: It's still maybe, like you were saying, at a point 642 00:36:21,360 --> 00:36:24,080 Speaker 1: where possibly you or I could look at a shoddy 643 00:36:24,160 --> 00:36:27,480 Speaker 1: video and be like, yeah, I see this, this, and 644 00:36:27,520 --> 00:36:29,439 Speaker 1: this is a little wrong, like there is a little 645 00:36:29,480 --> 00:36:33,560 Speaker 1: bit of compression um relics or remnants or whatever, like 646 00:36:33,600 --> 00:36:35,680 Speaker 1: they're not blinking, the shadows are off a little bit. But 647 00:36:35,719 --> 00:36:37,799 Speaker 1: there's also plenty of videos where you do need to 648 00:36:37,800 --> 00:36:40,919 Speaker 1: be like a digital forensic scientist to find them 649 00:36:41,000 --> 00:36:43,120 Speaker 1: or an AI to find it.
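To make the blinking tell concrete, here is a minimal sketch in Python. It assumes you already have one eye-openness value per frame (an eye aspect ratio from some facial-landmark detector such as dlib or MediaPipe); the threshold numbers and function names are illustrative only, not any real product's API, and a serious detector would need far more than this.

```python
# Illustrative sketch only: flag a clip whose subject blinks far less often than
# a real person would, the early deep-fake tell described above.
# The ear_per_frame list is a hypothetical input: one eye-aspect-ratio value per
# video frame, produced elsewhere by a facial-landmark detector.

def count_blinks(ear_per_frame, closed_threshold=0.21, min_closed_frames=2):
    """Count blink events: runs of consecutive frames where the eye looks closed."""
    blinks = 0
    closed_run = 0
    for ear in ear_per_frame:
        if ear < closed_threshold:
            closed_run += 1
        else:
            if closed_run >= min_closed_frames:
                blinks += 1
            closed_run = 0
    if closed_run >= min_closed_frames:  # clip ended mid-blink
        blinks += 1
    return blinks

def looks_suspicious(ear_per_frame, fps=30.0, min_blinks_per_minute=6.0):
    """People blink roughly 15-20 times a minute; far fewer is a red flag."""
    minutes = len(ear_per_frame) / fps / 60.0
    if minutes == 0:
        return False
    rate = count_blinks(ear_per_frame) / minutes
    return rate < min_blinks_per_minute

# Toy usage: a 60-second clip at 30 fps containing a single blink gets flagged.
ears = [0.3] * 1800
ears[900:903] = [0.15, 0.12, 0.15]
print(looks_suspicious(ears))  # True
```

The point is only that the heuristic is easy to state, which is also why, as the hosts note, a deep fake maker can just as easily train blinking back in.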
Yeah, you can also 650 00:36:43,200 --> 00:36:47,520 Speaker 1: use your ear holes, because, you know, look at the 651 00:36:47,600 --> 00:36:49,760 Speaker 1: room that the person is in and would it sound 652 00:36:49,840 --> 00:36:51,840 Speaker 1: like that in a room like that? And that's a 653 00:36:51,880 --> 00:36:55,280 Speaker 1: good one. One of the things audio specialists look at is like, uh, 654 00:36:55,320 --> 00:36:58,799 Speaker 1: you know, if you have Obama in a concert hall 655 00:36:58,880 --> 00:37:02,080 Speaker 1: speaking and it sounds like it's in someone's closet. Or us. Yeah, 656 00:37:02,320 --> 00:37:05,520 Speaker 1: it sounds like a can, like our earliest episodes. Yeah, exactly, 657 00:37:05,520 --> 00:37:08,799 Speaker 1: that's pretty, you know, pretty strong indicator. It is. So 658 00:37:08,880 --> 00:37:13,239 Speaker 1: there are things you can do. Um. When BuzzFeed tweeted 659 00:37:13,320 --> 00:37:17,439 Speaker 1: that Jordan Peele Obama deep fake, they included a list 660 00:37:17,480 --> 00:37:20,359 Speaker 1: of things to do to spot a deep fake. UM, 661 00:37:20,800 --> 00:37:22,480 Speaker 1: I wonder how many times you said deep fake in 662 00:37:22,480 --> 00:37:30,239 Speaker 1: this episode. Don't jump to conclusions. Consider the source. It's 663 00:37:30,280 --> 00:37:33,000 Speaker 1: a big one. That's what's gonna guide us through here. 664 00:37:33,520 --> 00:37:36,080 Speaker 1: People are like, this is Jordan Peele, I can trust that. No. 665 00:37:36,160 --> 00:37:37,880 Speaker 1: But I mean, like if you go onto a site, 666 00:37:38,000 --> 00:37:40,040 Speaker 1: I I can spot like a fake news site a 667 00:37:40,080 --> 00:37:43,560 Speaker 1: mile away, Like you can just tell it's just there's 668 00:37:43,600 --> 00:37:46,399 Speaker 1: it's off, it's uncanny. It's our sense of uncanny that's 669 00:37:46,400 --> 00:37:49,200 Speaker 1: gonna guide us through this. Yeah, you can 670 00:37:49,200 --> 00:37:51,520 Speaker 1: always tell because the screen is black and the text is 671 00:37:51,560 --> 00:37:56,719 Speaker 1: fluorescent green in Comic Sans. Um. And then another one 672 00:37:56,800 --> 00:37:59,719 Speaker 1: is check where this thing is and isn't. This is 673 00:37:59,800 --> 00:38:02,120 Speaker 1: kind of like the opposite tip we always give people 674 00:38:02,120 --> 00:38:04,680 Speaker 1: where if you see the same thing in basically the 675 00:38:04,680 --> 00:38:07,960 Speaker 1: same wording throughout the internet, you should question that. If 676 00:38:07,960 --> 00:38:10,080 Speaker 1: you see a deep fake video in only a couple 677 00:38:10,120 --> 00:38:13,160 Speaker 1: of places and it's news, but you don't see 678 00:38:13,200 --> 00:38:16,720 Speaker 1: it on like ABC or CNN or Fox News or wherever. 679 00:38:17,160 --> 00:38:19,560 Speaker 1: If you don't see it on like a reputable news site, 680 00:38:19,680 --> 00:38:23,000 Speaker 1: you should probably question it. Yeah, Donald Trump threatens nuclear war. 681 00:38:23,200 --> 00:38:26,839 Speaker 1: We have the video from slappy dot com, right, it's 682 00:38:26,880 --> 00:38:29,680 Speaker 1: probably a good indicator, slappy dot com. I'm sure the 683 00:38:29,680 --> 00:38:34,080 Speaker 1: good people there are like, we make, we make hamburger button picking up. 684 00:38:34,560 --> 00:38:36,839 Speaker 1: I should probably check and see what that is. 685 00:38:37,400 --> 00:38:40,440 Speaker 1: Everyone else is right now. What else?
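The consider-the-source and check-where-it-is-and-isn't tips above amount to a blunt corroboration check. A rough sketch, assuming you have collected the URLs where the clip is circulating; the outlet list and the example URL are placeholders, not recommendations.

```python
# Minimal sketch of the "check where it is and isn't" tip: if a sensational clip
# only shows up on sites you've never heard of and on none of the outlets you
# trust, treat it with suspicion. TRUSTED_OUTLETS and the sighting URL below are
# illustrative placeholders, not an endorsement or a real API.
from urllib.parse import urlparse

TRUSTED_OUTLETS = {"abcnews.go.com", "cnn.com", "foxnews.com", "reuters.com", "apnews.com"}

def corroborated(urls_where_video_appears):
    """Return True if at least one hosting domain is on the trusted list."""
    for url in urls_where_video_appears:
        domain = urlparse(url).netloc.lower()
        if domain.startswith("www."):  # so "www.cnn.com" matches "cnn.com"
            domain = domain[4:]
        if domain in TRUSTED_OUTLETS:
            return True
    return False

sightings = ["https://slappy.example.com/trump-threatens-nuclear-war-video"]
if not corroborated(sightings):
    print("No reputable outlet is carrying this clip - probably question it.")
```

Nothing here verifies the video itself; it only encodes the skepticism the hosts describe: a bombshell clip that no reputable outlet carries deserves a second look.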
Look closely at 686 00:38:40,480 --> 00:38:43,239 Speaker 1: their mouth. Yeah. Uh. And then here's kind of 687 00:38:43,239 --> 00:38:46,279 Speaker 1: a no brainer, like slow it down, slow the video down, 688 00:38:46,360 --> 00:38:48,840 Speaker 1: slow your roll, and like really look at it closely, 689 00:38:49,239 --> 00:38:51,640 Speaker 1: if you see, like, because that's where you're gonna see 690 00:38:51,640 --> 00:38:56,280 Speaker 1: like strange lighting changes and stuff. Um, but it's all legal, 691 00:38:57,040 --> 00:38:58,759 Speaker 1: it is, so we were kind of we're kind of 692 00:38:58,760 --> 00:39:00,759 Speaker 1: talking about that, like the best way to get a 693 00:39:00,840 --> 00:39:04,040 Speaker 1: video taken down is to contact the website. UM just 694 00:39:04,080 --> 00:39:07,359 Speaker 1: be like, bro, come on, this is awful. Um. There 695 00:39:07,400 --> 00:39:11,120 Speaker 1: are no laws that protect you directly. But a lot 696 00:39:11,160 --> 00:39:13,840 Speaker 1: of people are saying, well, we've got revenge porn laws 697 00:39:13,880 --> 00:39:16,440 Speaker 1: that are starting to pop up around the country. It's 698 00:39:16,480 --> 00:39:20,960 Speaker 1: a very short, short trip from revenge porn to 699 00:39:21,200 --> 00:39:27,160 Speaker 1: deep fake porn. It's virtually the same thing. It's involuntary pornography. Um, 700 00:39:27,160 --> 00:39:30,720 Speaker 1: it's even more involuntary because with revenge porn, the person 701 00:39:30,800 --> 00:39:33,560 Speaker 1: even posed for the picture or whatever initially, for whatever 702 00:39:33,640 --> 00:39:36,839 Speaker 1: context or reason, with no intention for it to get out. 703 00:39:37,640 --> 00:39:41,560 Speaker 1: With deep fake porn, this person never even posed or 704 00:39:41,600 --> 00:39:44,319 Speaker 1: engaged in this act or anything like that. So it's 705 00:39:44,360 --> 00:39:48,279 Speaker 1: even, even in a way, maybe even worse than revenge porn, 706 00:39:48,280 --> 00:39:52,440 Speaker 1: which feels like bitter acid in my mouth to say. Um. 707 00:39:52,880 --> 00:39:55,640 Speaker 1: So you can make a case, though, that these revenge 708 00:39:55,640 --> 00:39:59,040 Speaker 1: porn statutes that protect people could be extended to this 709 00:39:59,120 --> 00:40:04,400 Speaker 1: as well. But that's a that's for personal stuff. For 710 00:40:04,680 --> 00:40:08,040 Speaker 1: like national stuff or a public figure or something like that, 711 00:40:08,560 --> 00:40:11,680 Speaker 1: especially when it comes to politics. You could make a 712 00:40:11,719 --> 00:40:14,239 Speaker 1: really strong case that these deep fake videos, even the 713 00:40:14,280 --> 00:40:18,520 Speaker 1: most misleading, nefarious deep fake video you can imagine, would 714 00:40:18,520 --> 00:40:21,879 Speaker 1: be protected under the First Amendment. Yeah, I could see 715 00:40:21,880 --> 00:40:27,799 Speaker 1: a satire defense being mounted in the future. Uh. Like, 716 00:40:27,880 --> 00:40:30,000 Speaker 1: you know, what's the difference between doing a really good 717 00:40:30,040 --> 00:40:33,719 Speaker 1: deep fake and doing an animated cartoon like South Park, 718 00:40:34,280 --> 00:40:38,040 Speaker 1: which shows people saying and doing things they wouldn't do either. Uh, 719 00:40:38,080 --> 00:40:42,319 Speaker 1: it is very slippery and thorny and a very fine line.
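The slow-it-down advice above boils down to stepping through a clip one still at a time and staring at mouths, shadows, and lighting. A minimal sketch using OpenCV (the opencv-python package), assuming a local video file; the filename is hypothetical.

```python
# Rough sketch of the "slow it down" tip: dump every Nth frame of a clip to disk
# so you can examine mouths, shadows, and lighting changes one still at a time.
# Requires opencv-python; "suspect_clip.mp4" is a hypothetical path.
import cv2

def dump_frames(video_path, out_prefix="frame", every_nth=5):
    cap = cv2.VideoCapture(video_path)
    saved = 0
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:  # end of video or unreadable file
            break
        if index % every_nth == 0:
            cv2.imwrite(f"{out_prefix}_{index:05d}.png", frame)
            saved += 1
        index += 1
    cap.release()
    return saved

if __name__ == "__main__":
    print(dump_frames("suspect_clip.mp4"), "frames written for close inspection")
```

Dumping stills also makes it easier to compare lighting and shadow direction across frames side by side, which is where sloppy composites tend to give themselves away.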
720 00:40:42,400 --> 00:40:44,439 Speaker 1: But even if the person who makes the deep fake 721 00:40:44,520 --> 00:40:47,440 Speaker 1: says, No, I did not mean this as satire. It 722 00:40:47,560 --> 00:40:49,759 Speaker 1: was meant to be misleading, and I wanted to see 723 00:40:49,760 --> 00:40:53,520 Speaker 1: what effects it had. Sure they didn't shout fire in 724 00:40:53,520 --> 00:40:56,640 Speaker 1: a crowded theater, so they could probably still get away 725 00:40:56,680 --> 00:40:59,480 Speaker 1: with it under the First Amendment. Yeah, it's interesting to 726 00:40:59,520 --> 00:41:03,080 Speaker 1: see where this is just gonna go. Hopefully right down the toilet. Nope, 727 00:41:03,960 --> 00:41:06,480 Speaker 1: it's just gonna get there. It's gonna get more and 728 00:41:06,520 --> 00:41:10,560 Speaker 1: more realistic, and we're gonna end up, inadvertently, um, falling 729 00:41:10,600 --> 00:41:14,960 Speaker 1: into the simulation. That's what's gonna happen, Chuck. Prepare for it. 730 00:41:15,040 --> 00:41:18,000 Speaker 1: That's great. Okay, just try to put a smile on 731 00:41:18,040 --> 00:41:22,719 Speaker 1: your face regardless. That's smiling. If you want to know more, 732 00:41:23,120 --> 00:41:27,400 Speaker 1: if you want to know more about deep fakes, um, 733 00:41:27,440 --> 00:41:30,160 Speaker 1: it's so hot right now. Just go on the internet 734 00:41:30,200 --> 00:41:33,000 Speaker 1: and you can read all sorts of news articles about it. 735 00:41:33,360 --> 00:41:40,000 Speaker 1: Since I said that, it's time for listener mail. I 736 00:41:40,000 --> 00:41:42,160 Speaker 1: think the first thing that turned me off was the name. 737 00:41:43,200 --> 00:41:46,000 Speaker 1: Anytime I see something that's like not a real word, 738 00:41:46,000 --> 00:41:48,640 Speaker 1: but they're like squeezed together two words and it's all 739 00:41:48,719 --> 00:41:53,440 Speaker 1: lowercase or something or oh gosh, yeah, it's just the 740 00:41:53,480 --> 00:41:57,600 Speaker 1: worst of the Internet. It's terrible. All right. Hey, guys, 741 00:41:57,680 --> 00:42:01,879 Speaker 1: just finished the Neanderthal episode and you guys mentioned that their 742 00:42:01,960 --> 00:42:05,520 Speaker 1: language could have some remnants in our modern languages. 743 00:42:05,960 --> 00:42:10,040 Speaker 1: That was a really good one. I automatically remembered that during How 744 00:42:10,080 --> 00:42:12,840 Speaker 1: Swearing Works, you guys mentioned that a different part of 745 00:42:12,880 --> 00:42:16,280 Speaker 1: our brains activates when hearing or using swear words. So maybe, 746 00:42:16,360 --> 00:42:19,640 Speaker 1: just maybe, that small percentage of our Neanderthalian 747 00:42:20,239 --> 00:42:23,160 Speaker 1: DNA activates when we stub our toes or 748 00:42:23,200 --> 00:42:26,800 Speaker 1: hit our shins to unleash our primitive and original language. 749 00:42:27,080 --> 00:42:30,000 Speaker 1: How about that? I like this person. Anyways, loved the podcast, guys. 750 00:42:30,040 --> 00:42:33,359 Speaker 1: Grateful for the knowledge and entertainment. And thank you Jerry, 751 00:42:33,480 --> 00:42:36,040 Speaker 1: or should we just say thank you Josh, Josh T, 752 00:42:36,960 --> 00:42:40,880 Speaker 1: for keeping the quality of these podcasts awesome. We're not 753 00:42:40,920 --> 00:42:46,600 Speaker 1: thanking Jerry anymore. Just from my overpriced apartment in Redlands, California. 754 00:42:46,800 --> 00:42:51,600 Speaker 1: That is Falcon.
No, that's his last name. Wow, the 755 00:42:51,719 --> 00:42:55,400 Speaker 1: end is silent though, so it's Falcone. Thanks a lot, Falcon. 756 00:42:55,480 --> 00:42:58,439 Speaker 1: We appreciate you. Um, that was a great name. Thanks 757 00:42:58,480 --> 00:43:01,520 Speaker 1: for swooping in with that idea. Mm hm. And uh, 758 00:43:01,920 --> 00:43:04,440 Speaker 1: I'm sorry for everybody for that. If you want to 759 00:43:04,440 --> 00:43:06,880 Speaker 1: get in touch with us like Falcone did, you 760 00:43:06,960 --> 00:43:09,840 Speaker 1: can tweet to us. We're at s y s K podcast, 761 00:43:09,960 --> 00:43:13,960 Speaker 1: I'm at joshuam Clark. We're on Instagram, We're on Facebook. 762 00:43:14,280 --> 00:43:18,839 Speaker 1: Where else are we, Chuck? Uh, We're on every deep fake. 763 00:43:19,440 --> 00:43:22,920 Speaker 1: We might be on Gfycat, who knows. Gif? Uh. You 764 00:43:22,960 --> 00:43:26,360 Speaker 1: can also send us a good old fashioned email, spank 765 00:43:26,400 --> 00:43:28,440 Speaker 1: it on the bottom after you wrap it up, of course, 766 00:43:28,480 --> 00:43:30,839 Speaker 1: and send it off to Stuff podcast at i heart 767 00:43:30,920 --> 00:43:37,080 Speaker 1: radio dot com. Stuff You Should Know is a production 768 00:43:37,120 --> 00:43:39,840 Speaker 1: of iHeart Radio's How Stuff Works. For more podcasts from 769 00:43:39,880 --> 00:43:42,680 Speaker 1: iHeart Radio, visit the iHeart Radio app, Apple Podcasts, 770 00:43:42,760 --> 00:43:44,440 Speaker 1: or wherever you listen to your favorite shows.