1 00:00:02,240 --> 00:00:04,960 Speaker 1: This is Alec Baldwin and you're listening to Here's the 2 00:00:05,000 --> 00:00:10,360 Speaker 1: Thing from iHeartRadio. Modern day society has been living in 3 00:00:10,360 --> 00:00:14,120 Speaker 1: some form of the computer era for decades. An entire 4 00:00:14,160 --> 00:00:17,119 Speaker 1: genre of films has been dedicated to showing what human 5 00:00:17,160 --> 00:00:21,400 Speaker 1: life may be like once computers become conscious. My guests 6 00:00:21,440 --> 00:00:24,439 Speaker 1: today are two trailblazers in a field that many have 7 00:00:24,520 --> 00:00:28,360 Speaker 1: speculated could potentially risk the end of human civilization as 8 00:00:28,400 --> 00:00:31,680 Speaker 1: we know it. That is, of course, the true birth 9 00:00:31,680 --> 00:00:35,880 Speaker 1: of artificial intelligence, known simply as AI. And if you 10 00:00:35,960 --> 00:00:39,320 Speaker 1: think the world has become inundated with AI technology overnight, 11 00:00:39,760 --> 00:00:43,200 Speaker 1: you are not alone. AI based products come in many 12 00:00:43,240 --> 00:00:46,559 Speaker 1: different forms, such as chatbots, which can be used for 13 00:00:46,640 --> 00:00:50,040 Speaker 1: everything from finding a list of local Italian restaurants to 14 00:00:50,120 --> 00:00:54,840 Speaker 1: drafting legal documents and writing college essays. AI can also 15 00:00:54,880 --> 00:00:57,720 Speaker 1: be used to manipulate a person's likeness on video or 16 00:00:57,760 --> 00:01:00,720 Speaker 1: the sound of their voice. And if you're wondering just 17 00:01:00,800 --> 00:01:04,880 Speaker 1: how good computers have become at mimicking humans, my narration 18 00:01:05,040 --> 00:01:07,720 Speaker 1: thus far has been generated by text written into a 19 00:01:07,800 --> 00:01:13,360 Speaker 1: program called Descript. Okay, I can't let the machines take 20 00:01:13,400 --> 00:01:18,400 Speaker 1: over just yet. This is actually human Alec Baldwin. Later 21 00:01:18,440 --> 00:01:22,119 Speaker 1: in the program, I'll speak with Blake Lemoine, a software 22 00:01:22,160 --> 00:01:26,480 Speaker 1: engineer formerly employed at Google. Lemoine was part of the 23 00:01:26,520 --> 00:01:31,520 Speaker 1: team working on LaMDA, or Language Model for Dialogue Applications, 24 00:01:31,840 --> 00:01:36,360 Speaker 1: the technology behind chatbots. He went public with claims that 25 00:01:36,440 --> 00:01:40,000 Speaker 1: the AI he was working on was sentient and was 26 00:01:40,080 --> 00:01:44,920 Speaker 1: fired shortly thereafter. But first I'm talking to Jay LeBoeuf, 27 00:01:45,400 --> 00:01:49,840 Speaker 1: head of business and corporate development at Descript. Descript is 28 00:01:49,920 --> 00:01:54,720 Speaker 1: an editing program used by podcasters and vloggers. It allows 29 00:01:54,880 --> 00:01:59,360 Speaker 1: users to edit audio directly from an AI generated transcript, 30 00:01:59,600 --> 00:02:03,400 Speaker 1: and can even use voice cloning technology to create new 31 00:02:03,440 --> 00:02:06,280 Speaker 1: audio from text, just as we did in the opening 32 00:02:06,280 --> 00:02:10,200 Speaker 1: of the show. LeBoeuf has worked in technology his entire career. 33 00:02:10,720 --> 00:02:14,320 Speaker 1: I was curious how long AI has really been utilized 34 00:02:14,680 --> 00:02:15,400 Speaker 1: in his field.
35 00:02:16,440 --> 00:02:19,000 Speaker 2: The term has been around for a very long time, 36 00:02:19,520 --> 00:02:21,600 Speaker 2: but there's this period that a lot of people call 37 00:02:21,639 --> 00:02:25,360 Speaker 2: the AI winter where it was much hyped in the 38 00:02:25,440 --> 00:02:27,320 Speaker 2: eighties and even going into the nineties, and it just 39 00:02:27,360 --> 00:02:30,840 Speaker 2: had never delivered. So I didn't really know anybody who was 40 00:02:30,840 --> 00:02:32,680 Speaker 2: studying AI at that time. 41 00:02:33,160 --> 00:02:36,200 Speaker 1: Were you aware of it before you went to Cornell? No? No, 42 00:02:36,680 --> 00:02:38,800 Speaker 1: you weren't. You were introduced to all this when you 43 00:02:38,800 --> 00:02:39,480 Speaker 1: went to college. 44 00:02:39,600 --> 00:02:41,880 Speaker 2: Absolutely. That is the value of a college education for me: 45 00:02:41,960 --> 00:02:44,959 Speaker 2: I was exposed to a lot. I wasn't told 46 00:02:45,000 --> 00:02:46,359 Speaker 2: what to do with it, but I was exposed to 47 00:02:46,400 --> 00:02:48,200 Speaker 2: a lot of concepts that later on turned out to 48 00:02:48,200 --> 00:02:49,200 Speaker 2: be valuable. 49 00:02:49,400 --> 00:02:52,400 Speaker 1: It just occurred to me, Jay, are we really talking 50 00:02:52,520 --> 00:02:55,280 Speaker 1: to you? Are you here with us? 51 00:02:55,560 --> 00:02:57,800 Speaker 2: What I'd suggest is you ask me a few questions 52 00:02:58,280 --> 00:03:00,240 Speaker 2: and then we do a little Turing test on it, 53 00:03:00,280 --> 00:03:02,320 Speaker 2: which is a bit of a gold standard. 54 00:03:02,320 --> 00:03:04,880 Speaker 1: It just occurred to me, maybe you're like the beach 55 00:03:04,919 --> 00:03:07,119 Speaker 1: in Aruba and this is not you. 56 00:03:07,160 --> 00:03:10,400 Speaker 2: No, but this is such a fascinating topic, Alec, because 57 00:03:11,280 --> 00:03:16,920 Speaker 2: we are entering a phase where anything can be synthetically generated. 58 00:03:17,040 --> 00:03:21,720 Speaker 2: So my voice can be synthetically generated and indistinguishable from 59 00:03:21,760 --> 00:03:24,800 Speaker 2: real me, even what I'm saying and the terms I'm 60 00:03:24,840 --> 00:03:27,920 Speaker 2: using in my speaking style, that even can be generated, 61 00:03:28,000 --> 00:03:30,400 Speaker 2: all through this generative AI technology. 62 00:03:31,480 --> 00:03:33,840 Speaker 1: Well, I know that the work you do has nothing 63 00:03:33,880 --> 00:03:35,640 Speaker 1: to do with like deepfakes and that kind of thing. 64 00:03:35,920 --> 00:03:38,800 Speaker 1: But what I find, for example, is, and again this 65 00:03:38,840 --> 00:03:40,320 Speaker 1: has nothing to do with your work, but like if 66 00:03:40,320 --> 00:03:43,400 Speaker 1: I see a deepfake. You just mentioned content. 67 00:03:43,440 --> 00:03:46,560 Speaker 1: When I see a deepfake, I know it's bullshit 68 00:03:46,720 --> 00:03:52,000 Speaker 1: because no matter how much facially the animation is precise, 69 00:03:52,760 --> 00:03:55,120 Speaker 1: I'm obviously going to talk about Tom Cruise, the Tom 70 00:03:55,160 --> 00:03:58,880 Speaker 1: Cruise deepfakes, and the voice and the cadences, but 71 00:03:58,960 --> 00:04:01,720 Speaker 1: what's missing is the content. You need writers to write,
72 00:04:02,120 --> 00:04:04,800 Speaker 1: I'm assuming, what Cruise would say, because when I watch 73 00:04:04,840 --> 00:04:07,880 Speaker 1: these deepfakes, they almost never say what Cruise would say. 74 00:04:08,400 --> 00:04:12,640 Speaker 1: Do you believe that AI has that capability to manufacture 75 00:04:12,680 --> 00:04:15,720 Speaker 1: the thoughts of the person that they're recreating? 76 00:04:16,680 --> 00:04:19,120 Speaker 2: So the state of the art where we are right 77 00:04:19,160 --> 00:04:25,920 Speaker 2: now would be AI can regurgitate and recombine past things 78 00:04:26,000 --> 00:04:28,400 Speaker 2: that you might have said or that might have been 79 00:04:28,400 --> 00:04:31,760 Speaker 2: attributed to you. And so it might seem like it's 80 00:04:31,800 --> 00:04:35,599 Speaker 2: a seemingly novel output, a new Tom Cruisian expression, 81 00:04:35,960 --> 00:04:38,760 Speaker 2: but it's actually just recombining things that he might have 82 00:04:38,839 --> 00:04:42,040 Speaker 2: said. And then, what you pointed out, it's also introducing 83 00:04:42,120 --> 00:04:46,400 Speaker 2: some hallucinations, things that absolutely do not pass 84 00:04:46,440 --> 00:04:48,680 Speaker 2: the smell test right now, and that's where humans in 85 00:04:48,720 --> 00:04:51,920 Speaker 2: the loop are always going to kind of prevail. 86 00:04:52,080 --> 00:04:54,560 Speaker 1: One thought we had was to open this episode with 87 00:04:54,680 --> 00:04:58,159 Speaker 1: me introducing my guest: Academy Award winner, one of the 88 00:04:58,160 --> 00:05:01,000 Speaker 1: greatest film stars of his generation, join me for my 89 00:05:01,080 --> 00:05:04,480 Speaker 1: conversation with Laurence Olivier. So we all know I'm not 90 00:05:04,600 --> 00:05:08,040 Speaker 1: interviewing Laurence Olivier because he's dead. Do you see that 91 00:05:08,040 --> 00:05:09,279 Speaker 1: that's possible? 92 00:05:09,560 --> 00:05:12,160 Speaker 2: With our platform, it is not possible to clone the 93 00:05:12,240 --> 00:05:15,240 Speaker 2: voice of the deceased. It's also not possible to clone 94 00:05:15,400 --> 00:05:18,800 Speaker 2: a voice that is not your own voice without your consent. 95 00:05:19,680 --> 00:05:21,520 Speaker 1: But I'm saying technologically it's possible. 96 00:05:21,520 --> 00:05:25,120 Speaker 2: Technologically it's possible. You are completely right. There will always 97 00:05:25,200 --> 00:05:29,800 Speaker 2: be tools, technology, and open source that allow anyone to 98 00:05:29,839 --> 00:05:33,120 Speaker 2: do anything, but a platform like what we run, we've 99 00:05:33,160 --> 00:05:36,599 Speaker 2: actually taken a very firm ethical stance that we believe 100 00:05:36,920 --> 00:05:39,279 Speaker 2: each person should have the rights to their voice and 101 00:05:39,360 --> 00:05:41,560 Speaker 2: control when it is used. And that's why we have 102 00:05:41,600 --> 00:05:45,480 Speaker 2: some pretty strict authentication and also ethical steps set up, 103 00:05:45,480 --> 00:05:48,560 Speaker 2: so you couldn't use our platform to do that interview, unfortunately. 104 00:05:49,040 --> 00:05:51,240 Speaker 1: Now, what are you working on now? What kinds of 105 00:05:51,320 --> 00:05:53,000 Speaker 1: other projects are you working on now?
106 00:05:53,560 --> 00:05:56,800 Speaker 2: So a big thing we're continuing to improve upon is 107 00:05:57,400 --> 00:06:00,640 Speaker 2: this voice cloning technology to allow people to fix their 108 00:06:00,640 --> 00:06:03,919 Speaker 2: mistakes and generate new content. We have, like, a 109 00:06:04,040 --> 00:06:08,760 Speaker 2: Hollywood-style green screen removal. So in that situation, that's 110 00:06:08,800 --> 00:06:11,200 Speaker 2: where maybe I'm doing this interview and I don't like 111 00:06:11,240 --> 00:06:14,560 Speaker 2: the background behind me. With one click it can disappear. 112 00:06:14,640 --> 00:06:18,520 Speaker 2: I can slot in a more professional-looking background. That's 113 00:06:18,520 --> 00:06:19,760 Speaker 2: something that's also giving people, well... 114 00:06:19,760 --> 00:06:21,520 Speaker 1: They have those background things on Zoom. 115 00:06:21,200 --> 00:06:25,320 Speaker 2: Correct, exactly, like Zoom, but, you know, Hollywood grade 116 00:06:25,400 --> 00:06:27,839 Speaker 2: at that point. Yeah, you know, it's really about how 117 00:06:27,839 --> 00:06:32,279 Speaker 2: do we help creators best tell their story through finding content, 118 00:06:32,360 --> 00:06:37,040 Speaker 2: discovering content, and getting their videos as polished as possible, 119 00:06:37,080 --> 00:06:39,920 Speaker 2: but as quickly as possible. Because that's a trend 120 00:06:39,960 --> 00:06:42,279 Speaker 2: we just see: people feel a need to put up 121 00:06:42,360 --> 00:06:45,320 Speaker 2: more content, but we're hoping we can help them actually 122 00:06:45,320 --> 00:06:48,920 Speaker 2: just create better content and really hone their craft. 123 00:06:49,279 --> 00:06:52,360 Speaker 1: I think of where the venue is a performance venue 124 00:06:53,040 --> 00:06:57,000 Speaker 1: as opposed to an educational venue. So, for example, if 125 00:06:57,200 --> 00:07:00,680 Speaker 1: someone is doing an audiobook and the book is 126 00:07:00,720 --> 00:07:05,200 Speaker 1: about something rather dry. There are people reading textbooks, and 127 00:07:05,279 --> 00:07:09,120 Speaker 1: I don't see anything wrong with them using this technology 128 00:07:09,720 --> 00:07:12,280 Speaker 1: to get a copy of their voice, to run it 129 00:07:12,320 --> 00:07:16,760 Speaker 1: through the grinder and do the dubbing of that person's 130 00:07:16,840 --> 00:07:22,720 Speaker 1: voice reading that text. But this hearkens to the idea 131 00:07:22,760 --> 00:07:26,160 Speaker 1: of a performer giving a performance they never gave. Someone 132 00:07:26,200 --> 00:07:27,760 Speaker 1: said to me, pretty soon, you're going to be able 133 00:07:27,760 --> 00:07:31,560 Speaker 1: to do a movie with Humphrey Bogart. And through this technology, 134 00:07:31,600 --> 00:07:34,480 Speaker 1: the very technology you and I are referencing today, they're 135 00:07:34,480 --> 00:07:36,360 Speaker 1: going to give a performance with Humphrey Bogart. 136 00:07:37,160 --> 00:07:38,840 Speaker 2: Whether we want it to be here or not, 137 00:07:39,080 --> 00:07:43,160 Speaker 2: it's inevitable. The technology is here. The lawyers are sorting 138 00:07:43,160 --> 00:07:46,840 Speaker 2: out the publicity and the copyright rights. I mean, James 139 00:07:46,880 --> 00:07:50,480 Speaker 2: Earl Jones has licensed his voice to live on for 140 00:07:51,080 --> 00:07:54,320 Speaker 2: future Darth Vader times when it needs to be used.
141 00:07:54,640 --> 00:07:56,080 Speaker 2: One of the things I do a lot of, 142 00:07:56,200 --> 00:07:58,680 Speaker 2: Alec, is I do a lot of university teaching, and 143 00:07:59,120 --> 00:08:01,920 Speaker 2: this is the best place to keep me grounded about 144 00:08:01,960 --> 00:08:06,680 Speaker 2: what the next generation of our industry actually thinks. And 145 00:08:07,160 --> 00:08:08,640 Speaker 2: as I was talking to them about this over the 146 00:08:08,640 --> 00:08:12,320 Speaker 2: past few weeks, they are very excited for these tools, 147 00:08:12,400 --> 00:08:15,200 Speaker 2: like they'll try out ten tools in a single day 148 00:08:15,640 --> 00:08:18,640 Speaker 2: because they have no fears about them whatsoever. And when 149 00:08:18,640 --> 00:08:21,240 Speaker 2: I ask them, like, well, aren't you concerned, 150 00:08:21,280 --> 00:08:23,480 Speaker 2: they're like, no, this is great. Like, this is 151 00:08:23,520 --> 00:08:26,240 Speaker 2: how the world is going to be for them. This 152 00:08:26,400 --> 00:08:29,200 Speaker 2: is just the new state of the normal. And their 153 00:08:29,280 --> 00:08:31,600 Speaker 2: job is to understand the world that they're going into, 154 00:08:32,120 --> 00:08:36,040 Speaker 2: embrace the tools, and then find people who are resistant 155 00:08:36,080 --> 00:08:37,640 Speaker 2: to the tools and kind of push them out of the way. 156 00:08:37,559 --> 00:08:41,640 Speaker 1: Push them out of the way? How? 157 00:08:42,040 --> 00:08:44,240 Speaker 2: Well, in the typical way, where for many years I taught this 158 00:08:44,679 --> 00:08:47,280 Speaker 2: music business class at Stanford, and on the first day 159 00:08:47,280 --> 00:08:50,680 Speaker 2: of class, the sheer number of students whose mission for 160 00:08:50,760 --> 00:08:54,440 Speaker 2: taking this class was to disrupt the entire music industry, 161 00:08:54,720 --> 00:08:57,360 Speaker 2: and they thought of a better model than Spotify and 162 00:08:57,400 --> 00:09:01,160 Speaker 2: a better model than the major label system. And it's 163 00:09:01,200 --> 00:09:03,960 Speaker 2: that just kind of overconfidence of no, no, no, no, 164 00:09:04,000 --> 00:09:06,040 Speaker 2: we have a better idea. We're just going to, you know, 165 00:09:06,320 --> 00:09:08,600 Speaker 2: run fast and break things. We're going to do that. 166 00:09:08,960 --> 00:09:10,800 Speaker 2: And then, you know, of course, through the class we 167 00:09:10,880 --> 00:09:14,920 Speaker 2: let them understand things like mechanical royalties and publishing and 168 00:09:15,000 --> 00:09:17,719 Speaker 2: just help them understand how the sausage is actually made 169 00:09:17,800 --> 00:09:19,760 Speaker 2: and why it is the way it is, and then 170 00:09:19,800 --> 00:09:21,800 Speaker 2: they come with a more pragmatic view.
So I think 171 00:09:22,120 --> 00:09:26,280 Speaker 2: right now we have university students who are incredibly enthusiastic 172 00:09:26,400 --> 00:09:28,600 Speaker 2: about these tools and are trying to figure out how 173 00:09:28,640 --> 00:09:31,280 Speaker 2: they can use them to make themselves better when they 174 00:09:31,280 --> 00:09:34,120 Speaker 2: go into industry and to make themselves more competitive. And 175 00:09:34,200 --> 00:09:36,600 Speaker 2: often when I hire people on my team that are 176 00:09:36,640 --> 00:09:39,480 Speaker 2: fresh out of school, their knowledge of certain tools that 177 00:09:39,520 --> 00:09:42,000 Speaker 2: I've never seen in my life is actually a competitive 178 00:09:42,000 --> 00:09:45,680 Speaker 2: differentiation for them compared to the older people like me. 179 00:09:46,320 --> 00:09:49,280 Speaker 1: Do you have any concerns about this technology? I mean, 180 00:09:49,360 --> 00:09:53,439 Speaker 1: obviously there's some wonderful... you know, Zach McNeice, who cuts 181 00:09:53,480 --> 00:09:58,880 Speaker 1: our show, uses Descript to edit the show on a transcript. 182 00:09:59,360 --> 00:10:01,240 Speaker 1: And when I'm talking to him on the phone and 183 00:10:01,280 --> 00:10:03,480 Speaker 1: telling him, I make my notes for a cut, and 184 00:10:03,520 --> 00:10:05,959 Speaker 1: I'll call him and say, now, at thirty-eight minutes 185 00:10:06,000 --> 00:10:09,679 Speaker 1: and ten seconds, I say this, cut that. Now all 186 00:10:09,679 --> 00:10:11,240 Speaker 1: I have to do is say to him where I 187 00:10:11,360 --> 00:10:14,160 Speaker 1: say this phrase, and he goes to the transcript and 188 00:10:14,160 --> 00:10:16,559 Speaker 1: he finds it and we can do the cutting more efficiently. 189 00:10:16,800 --> 00:10:20,440 Speaker 1: I mean, there are obviously very useful applications for this, 190 00:10:20,760 --> 00:10:23,360 Speaker 1: and not only in our business but beyond. But was 191 00:10:23,400 --> 00:10:25,680 Speaker 1: there anything that concerned you about this technology? 192 00:10:26,280 --> 00:10:29,000 Speaker 2: I mean, one of the things I do worry about 193 00:10:29,000 --> 00:10:31,680 Speaker 2: for creators that are coming in is this temptation to 194 00:10:31,760 --> 00:10:35,760 Speaker 2: go for more, because if you can put out more 195 00:10:35,840 --> 00:10:40,160 Speaker 2: content easier, then you're probably going to try to do that, 196 00:10:40,200 --> 00:10:41,719 Speaker 2: thinking that that's how you're going to, just, you know, 197 00:10:41,720 --> 00:10:43,440 Speaker 2: you're going to bombard the world with all your great 198 00:10:43,480 --> 00:10:47,480 Speaker 2: ideas and your great stories. But what I really believe 199 00:10:47,480 --> 00:10:49,840 Speaker 2: will happen is quality is going to just always continue 200 00:10:49,880 --> 00:10:51,880 Speaker 2: to rise to the top, and so we're going to 201 00:10:51,920 --> 00:10:56,840 Speaker 2: see a surge and just drown in synthetically generated media 202 00:10:57,120 --> 00:10:59,360 Speaker 2: until we go back to more of a curation phase 203 00:10:59,720 --> 00:11:04,199 Speaker 2: where things like the fact that you leave some authentic 204 00:11:04,360 --> 00:11:08,319 Speaker 2: moments in your shows actually become valued. They become handprints 205 00:11:08,320 --> 00:11:11,719 Speaker 2: and fingerprints in the work, and that is something that 206 00:11:12,120 --> 00:11:15,480 Speaker 2: we will all value.
And so when I do work 207 00:11:15,520 --> 00:11:18,320 Speaker 2: with studios that are adopting this technology, my goal is 208 00:11:18,360 --> 00:11:20,800 Speaker 2: to not help them put out nine new shows with 209 00:11:20,840 --> 00:11:22,600 Speaker 2: the same team they have, but actually take that one 210 00:11:22,640 --> 00:11:25,160 Speaker 2: show they're working on and make it even better and 211 00:11:25,280 --> 00:11:27,880 Speaker 2: use the AI to come up with, you know, give 212 00:11:27,880 --> 00:11:30,480 Speaker 2: me ten interview questions for when I'm interviewing this person, 213 00:11:30,800 --> 00:11:33,040 Speaker 2: and help them think of things they haven't thought of before, 214 00:11:33,559 --> 00:11:36,880 Speaker 2: and help them, you know, summarize key talking points and 215 00:11:37,040 --> 00:11:39,599 Speaker 2: just kind of be a virtual producer. 216 00:11:40,320 --> 00:11:43,800 Speaker 1: I wonder if there are other applications of this that 217 00:11:44,000 --> 00:11:48,600 Speaker 1: are similar, let's say, to a three D printer. So 218 00:11:49,240 --> 00:11:54,920 Speaker 1: do you see a technology where the information is put 219 00:11:55,000 --> 00:11:59,559 Speaker 1: into the computer? This is obviously very complex, but that 220 00:11:59,640 --> 00:12:05,680 Speaker 1: you have automated surgery and medical procedures where devices can 221 00:12:05,760 --> 00:12:09,920 Speaker 1: perform common surgeries, basic surgeries at first, I mean, not 222 00:12:10,040 --> 00:12:13,880 Speaker 1: something that's very complex. But if a myriad of 223 00:12:13,960 --> 00:12:16,319 Speaker 1: information is fed into a machine, as it is into 224 00:12:16,320 --> 00:12:19,439 Speaker 1: the brain of a neurosurgeon or a team of them, 225 00:12:19,840 --> 00:12:22,880 Speaker 1: and you wind up having devices that will perform the 226 00:12:22,920 --> 00:12:25,640 Speaker 1: surgery instead of a person, with an eye toward, at 227 00:12:25,640 --> 00:12:28,760 Speaker 1: the very least, the goal being that it'll be done better, 228 00:12:29,080 --> 00:12:31,240 Speaker 1: it'll be done more precisely. Do you see that as 229 00:12:31,240 --> 00:12:31,840 Speaker 1: a potential? 230 00:12:32,920 --> 00:12:36,920 Speaker 2: So I can't knowledgeably speak about the medical profession, but 231 00:12:37,600 --> 00:12:40,960 Speaker 2: for most of the knowledge worker industries, like what a 232 00:12:41,000 --> 00:12:44,480 Speaker 2: business professional, what a marketing person, and what a content 233 00:12:44,559 --> 00:12:48,760 Speaker 2: creator would be doing, we're going to have these AI 234 00:12:49,040 --> 00:12:52,680 Speaker 2: assistive tools complement all the work that we're doing. It's 235 00:12:52,800 --> 00:12:55,840 Speaker 2: things like spellcheck. Like, it went from being you have 236 00:12:55,880 --> 00:12:58,040 Speaker 2: a dictionary on your desk, but you're too lazy to 237 00:12:58,120 --> 00:13:01,880 Speaker 2: use it, to it's monitoring every text box we ever type. 238 00:13:02,000 --> 00:13:04,640 Speaker 2: So really the only excuse for having a spelling mistake 239 00:13:04,720 --> 00:13:09,000 Speaker 2: is just sheer laziness. So now that we have these 240 00:13:09,040 --> 00:13:13,000 Speaker 2: tools that are always giving us suggestions, always helping us 241 00:13:13,000 --> 00:13:18,120 Speaker 2: get unblocked, always helping us brainstorm.
It's our job as producers. 242 00:13:18,160 --> 00:13:20,760 Speaker 2: Like, I often think of, like, who wouldn't want to 243 00:13:20,800 --> 00:13:23,240 Speaker 2: be a music producer? You just kind of shout vague 244 00:13:23,280 --> 00:13:26,480 Speaker 2: terms into the room and the system knows what you 245 00:13:26,559 --> 00:13:30,480 Speaker 2: mean by, you know, louder, bassier, no, more like Coltrane, 246 00:13:30,920 --> 00:13:34,160 Speaker 2: and, like, you can work with the system at that level. 247 00:13:34,440 --> 00:13:37,760 Speaker 2: And so I think, you know, with a medical analogy, 248 00:13:38,000 --> 00:13:41,320 Speaker 2: you have something that's helping you with your diagnosis, helping 249 00:13:41,400 --> 00:13:43,320 Speaker 2: you think through things you might not have thought of, 250 00:13:43,679 --> 00:13:46,559 Speaker 2: giving you suggestions just to make you a better human. 251 00:13:47,280 --> 00:13:52,040 Speaker 1: Now, the Anthony Bourdain documentary, the Roadrunner documentary, got 252 00:13:52,080 --> 00:13:54,520 Speaker 1: a little bit of flack because they had him, they 253 00:13:54,559 --> 00:13:57,959 Speaker 1: synthesized his voice to read some of the copy. Yeah. 254 00:13:58,200 --> 00:14:01,160 Speaker 1: Is that something that concerned you? 255 00:14:01,240 --> 00:14:04,960 Speaker 2: It did. It did. And that is just an example where I think it's 256 00:14:05,559 --> 00:14:09,000 Speaker 2: really important for the companies that are creating these tools 257 00:14:09,679 --> 00:14:13,000 Speaker 2: to take a stand on what is ethically allowed using 258 00:14:13,040 --> 00:14:17,640 Speaker 2: their platforms, because there are no worldwide regulations on what 259 00:14:17,679 --> 00:14:20,360 Speaker 2: you can and cannot do with this. So at the 260 00:14:20,440 --> 00:14:22,920 Speaker 2: end of the day, it's really, up till now, up 261 00:14:22,960 --> 00:14:26,120 Speaker 2: to the tool manufacturers, the software companies, whether to allow 262 00:14:26,160 --> 00:14:28,720 Speaker 2: this or not. You know, in my case and Descript's case, 263 00:14:29,120 --> 00:14:31,240 Speaker 2: we just don't allow this. We don't allow someone to 264 00:14:31,440 --> 00:14:34,720 Speaker 2: clone the voice of the deceased because we feel like 265 00:14:34,760 --> 00:14:37,160 Speaker 2: it's a really slippery slope. If we start allowing that, 266 00:14:37,320 --> 00:14:39,320 Speaker 2: then, you know, where does it go from here? 267 00:14:40,240 --> 00:14:43,120 Speaker 1: When you're not working in this field, what does someone 268 00:14:43,200 --> 00:14:48,160 Speaker 1: with your interests and your education and your career thus far, 269 00:14:48,640 --> 00:14:50,720 Speaker 1: what do you do to relax and enjoy yourself? 270 00:14:51,960 --> 00:14:54,440 Speaker 2: I am so exhausted right now, Alec, because we are 271 00:14:54,480 --> 00:14:57,320 Speaker 2: sleep training our six-month-old, and we have a 272 00:14:57,360 --> 00:14:59,560 Speaker 2: two-and-a-half-year-old as well. I love 273 00:14:59,600 --> 00:15:03,520 Speaker 2: them both. They're incredibly exhausting, but it's just so fulfilling. 274 00:15:03,560 --> 00:15:06,560 Speaker 2: So my hands are certainly filled with joy and love 275 00:15:06,640 --> 00:15:09,840 Speaker 2: and exhaustion.
With our two kids, my wife and I 276 00:15:09,880 --> 00:15:12,800 Speaker 2: like to do a lot of just really authentic things 277 00:15:12,800 --> 00:15:17,080 Speaker 2: that don't involve technology, like hike, or go to, there's a 278 00:15:17,080 --> 00:15:19,320 Speaker 2: little beer garden near our house that we just, like, 279 00:15:19,960 --> 00:15:21,960 Speaker 2: can sit at and let the kids play and talk 280 00:15:22,000 --> 00:15:24,480 Speaker 2: to other people, live, that are not synthetic. 281 00:15:25,240 --> 00:15:29,080 Speaker 1: You know, what I'm fascinated by is the children. It's 282 00:15:29,320 --> 00:15:33,200 Speaker 1: almost haunting to me how, even the littlest ones, how 283 00:15:33,240 --> 00:15:38,120 Speaker 1: transfixed they are by media and screens. But you see 284 00:15:38,440 --> 00:15:42,840 Speaker 1: the way the human brain engages with this equipment at 285 00:15:42,960 --> 00:15:46,360 Speaker 1: even an early age. And I'm wondering if somebody is 286 00:15:46,400 --> 00:15:49,640 Speaker 1: able to develop technology where my kids are looking at 287 00:15:49,640 --> 00:15:53,520 Speaker 1: the screen and what comes on that screen is something 288 00:15:53,560 --> 00:15:58,280 Speaker 1: rendered through artificial intelligence that helps them to learn. You 289 00:15:58,360 --> 00:16:01,600 Speaker 1: look at their programming, you run their programming through a 290 00:16:01,640 --> 00:16:04,120 Speaker 1: device which tells you what all the shows they have 291 00:16:04,200 --> 00:16:08,880 Speaker 1: in common are, right: the colors and the pace and 292 00:16:08,960 --> 00:16:12,000 Speaker 1: the music and the types of characters and the voices. 293 00:16:12,200 --> 00:16:15,960 Speaker 1: And to develop programming where you render something 294 00:16:16,400 --> 00:16:19,880 Speaker 1: which has the best and the most popular of what 295 00:16:19,920 --> 00:16:22,520 Speaker 1: they want, but at the same time can serve an educational purpose. 296 00:16:23,200 --> 00:16:26,680 Speaker 2: So in this particular case, my two-and-a-half- 297 00:16:26,760 --> 00:16:31,200 Speaker 2: year-old toddler, Leo, he loves this YouTube channel called 298 00:16:31,200 --> 00:16:33,640 Speaker 2: Songs for Littles, and the character is Miss Rachel, and 299 00:16:33,680 --> 00:16:36,880 Speaker 2: she's charming and very educational. It's great. He has this 300 00:16:37,040 --> 00:16:39,680 Speaker 2: two-minute-long video, it's called I'm So Happy, and 301 00:16:39,760 --> 00:16:41,600 Speaker 2: it's like a music video, but it teaches you really 302 00:16:41,600 --> 00:16:43,840 Speaker 2: good stuff. But after the four hundred and eleventh time 303 00:16:43,840 --> 00:16:45,920 Speaker 2: that I've seen it, like, I'm pulling my hair out, 304 00:16:45,960 --> 00:16:48,080 Speaker 2: and I want him to get more out of it. 305 00:16:48,560 --> 00:16:52,760 Speaker 2: So we actually personalized it for him. So I took 306 00:16:52,760 --> 00:16:56,000 Speaker 2: the video, I used the technology we had. I took 307 00:16:56,040 --> 00:16:59,440 Speaker 2: some video of him bouncing on a trampoline outside, removed 308 00:16:59,480 --> 00:17:02,960 Speaker 2: the background, and put him in the video in select places, 309 00:17:03,320 --> 00:17:06,560 Speaker 2: and then we have her saying Leo's name every now 310 00:17:06,600 --> 00:17:12,119 Speaker 2: and then.
This unlocked the video for him and engaged 311 00:17:12,160 --> 00:17:13,920 Speaker 2: his learning, because, you know, part of it is about 312 00:17:13,960 --> 00:17:16,399 Speaker 2: just teaching you emotions and what's okay, and so to 313 00:17:16,440 --> 00:17:19,800 Speaker 2: put our child into a video, this, like, it 314 00:17:19,840 --> 00:17:21,639 Speaker 2: turned out to be the hit of Christmas. Like whatever 315 00:17:21,640 --> 00:17:24,280 Speaker 2: else we gave paled in comparison to this, like, YouTube 316 00:17:24,320 --> 00:17:25,399 Speaker 2: video that I made over lunch. 317 00:17:25,840 --> 00:17:26,879 Speaker 1: That's amazing to me. 318 00:17:27,200 --> 00:17:30,880 Speaker 2: I'm just super excited about this. I'm an optimist. So yes, 319 00:17:31,200 --> 00:17:34,159 Speaker 2: we could also, you know, do bad things with the technology, 320 00:17:34,240 --> 00:17:38,520 Speaker 2: but overwhelmingly, my children are growing up in a time 321 00:17:38,760 --> 00:17:42,040 Speaker 2: where whatever they're watching on the screen, they could actually 322 00:17:42,080 --> 00:17:44,719 Speaker 2: grab it and start recombining it and remixing it and 323 00:17:44,760 --> 00:17:47,439 Speaker 2: inserting themselves in it. They could insert their parents in it. 324 00:17:47,760 --> 00:17:49,040 Speaker 2: They could do anything with it. 325 00:17:49,920 --> 00:17:52,640 Speaker 1: Thank you so much for your time. Thank you so much. 326 00:17:52,680 --> 00:18:01,480 Speaker 1: It's a pleasure. Head of Business and Corporate Development at Descript, Jay LeBoeuf. 327 00:18:02,560 --> 00:18:06,520 Speaker 1: If you're interested in conversations about our changing world, be 328 00:18:06,640 --> 00:18:09,560 Speaker 1: sure to check out my episode on climate science with 329 00:18:09,720 --> 00:18:11,240 Speaker 1: doctor Peter de Menocal. 330 00:18:12,000 --> 00:18:15,400 Speaker 3: Climate change is costing us now. It's real dollars. I mean, 331 00:18:15,480 --> 00:18:18,399 Speaker 3: last year was roughly three hundred billion dollars in climate 332 00:18:18,440 --> 00:18:22,399 Speaker 3: and weather related damages. This year, with the California fires, 333 00:18:22,400 --> 00:18:24,600 Speaker 3: it looks like it's maybe even more than that. So 334 00:18:24,720 --> 00:18:26,960 Speaker 3: regardless of whether you believe in climate change or not, 335 00:18:26,960 --> 00:18:29,119 Speaker 3: imagine you are in some deeply red state and a 336 00:18:29,200 --> 00:18:30,720 Speaker 3: deeply red county in that state. 337 00:18:31,200 --> 00:18:32,360 Speaker 1: You are paying for this. 338 00:18:32,720 --> 00:18:34,159 Speaker 3: You may not like it, you don't want to call 339 00:18:34,200 --> 00:18:36,840 Speaker 3: it climate change or whatever, but you, for sure, someone 340 00:18:36,880 --> 00:18:39,199 Speaker 3: is paying that bill. 341 00:18:39,240 --> 00:18:42,920 Speaker 1: Hear more of my conversation with doctor Peter de Menocal at 342 00:18:42,960 --> 00:18:47,520 Speaker 1: Here's the Thing dot org after the break. My next guest, 343 00:18:47,720 --> 00:18:51,480 Speaker 1: Blake Lemoine, shares the process of being let go by 344 00:18:51,520 --> 00:19:04,920 Speaker 1: Google and why he wanted to take a stand. I'm 345 00:19:04,920 --> 00:19:08,560 Speaker 1: Alec Baldwin and you're listening to Here's the Thing.
In 346 00:19:08,640 --> 00:19:13,920 Speaker 1: twenty twenty two, Google senior software engineer Blake Lemoine distributed 347 00:19:13,960 --> 00:19:16,880 Speaker 1: a document claiming that the AI he worked on was 348 00:19:17,240 --> 00:19:21,760 Speaker 1: conscious and was fired soon thereafter. Lemoine has been speaking 349 00:19:21,800 --> 00:19:26,080 Speaker 1: out about the need for greater transparency and accountability in 350 00:19:26,200 --> 00:19:31,280 Speaker 1: Silicon Valley ever since. Lemoine is also a military veteran 351 00:19:31,400 --> 00:19:35,000 Speaker 1: and an ordained priest. I wanted to know how he 352 00:19:35,080 --> 00:19:37,600 Speaker 1: found his way to Google in the first place. 353 00:19:38,720 --> 00:19:41,399 Speaker 4: So after I finished my master's, I had applied for 354 00:19:41,480 --> 00:19:43,760 Speaker 4: some jobs with different companies that I would have gone 355 00:19:43,800 --> 00:19:47,920 Speaker 4: to rather than pursuing a PhD. Google was one of them. 356 00:19:48,320 --> 00:19:52,159 Speaker 4: I actually didn't get accepted that first round, so I 357 00:19:52,200 --> 00:19:55,600 Speaker 4: went on and started working on the PhD. But then about 358 00:19:55,640 --> 00:19:58,360 Speaker 4: two years later, I got a call from Google out 359 00:19:58,359 --> 00:20:00,520 Speaker 4: of the blue one day and they said, hey, we're 360 00:20:00,520 --> 00:20:03,080 Speaker 4: doing a big hiring push right now, and the last 361 00:20:03,080 --> 00:20:05,119 Speaker 4: time you applied you almost made it. Would you like 362 00:20:05,160 --> 00:20:07,800 Speaker 4: to come out and try again? So I was like, yeah, sure. 363 00:20:08,080 --> 00:20:11,200 Speaker 4: So I went back to Mountain View, interviewed again, and 364 00:20:11,400 --> 00:20:13,200 Speaker 4: that second time through I got hired. 365 00:20:14,480 --> 00:20:17,119 Speaker 1: So when you go there and you arrive at the 366 00:20:17,200 --> 00:20:21,119 Speaker 1: Kremlin there in Mountain View, what's the goal, what do 367 00:20:21,200 --> 00:20:23,160 Speaker 1: you want to start working on? What do they want 368 00:20:23,200 --> 00:20:24,040 Speaker 1: you to start working on? 369 00:20:24,600 --> 00:20:26,760 Speaker 4: So when you get a job, you get kind of 370 00:20:26,760 --> 00:20:30,119 Speaker 4: put into a general purpose pool unless you're hired for 371 00:20:30,160 --> 00:20:34,119 Speaker 4: a very specific reason. But just like most software engineers, 372 00:20:34,160 --> 00:20:37,320 Speaker 4: I was interviewing with different teams at Google to figure 373 00:20:37,320 --> 00:20:39,800 Speaker 4: out which one I wanted to start with. I interviewed 374 00:20:39,800 --> 00:20:42,399 Speaker 4: with three teams. Two of them had to do with 375 00:20:42,640 --> 00:20:46,439 Speaker 4: the kind of natural language processing and AI that I 376 00:20:46,560 --> 00:20:48,920 Speaker 4: was interested in, so I picked one of those. At 377 00:20:48,920 --> 00:20:51,760 Speaker 4: the time, the product was called Google Now. The basic 378 00:20:51,840 --> 00:20:54,080 Speaker 4: job that I had is, for every single person on 379 00:20:54,119 --> 00:20:57,160 Speaker 4: the planet, predict what they're going to want to read tomorrow.
380 00:20:57,320 --> 00:21:00,200 Speaker 4: Figure out some kind of way to do that, kind 381 00:21:00,240 --> 00:21:03,240 Speaker 4: of articles online, whether that's here are some recipes for 382 00:21:03,320 --> 00:21:07,320 Speaker 4: you, to hard news articles, to the latest 383 00:21:07,359 --> 00:21:11,040 Speaker 4: webcomic that you follow. What will people want to read tomorrow? 384 00:21:11,200 --> 00:21:13,840 Speaker 4: That was the general question we were trying to answer. 385 00:21:13,960 --> 00:21:18,200 Speaker 1: Now, the first AI research was done at Dartmouth in nineteen 386 00:21:18,320 --> 00:21:20,800 Speaker 1: fifty six. Does that sound correct? 387 00:21:20,920 --> 00:21:22,359 Speaker 4: I'm probably under that heading. 388 00:21:22,520 --> 00:21:24,040 Speaker 1: Yeah. And what were they trying to do? 389 00:21:24,960 --> 00:21:28,199 Speaker 4: I believe the first ones were, they were focusing on language, 390 00:21:28,240 --> 00:21:31,120 Speaker 4: if I recall correctly. I'm not one hundred percent sure. 391 00:21:31,320 --> 00:21:34,600 Speaker 1: But the goal was, in your mind, whether it's specifically 392 00:21:34,680 --> 00:21:37,240 Speaker 1: at Dartmouth or not. They were trying to get computers 393 00:21:37,240 --> 00:21:39,320 Speaker 1: to talk well. 394 00:21:39,520 --> 00:21:42,119 Speaker 4: So there had been a lot of debate in the 395 00:21:42,160 --> 00:21:44,399 Speaker 4: early part of the twentieth century about whether or not 396 00:21:44,520 --> 00:21:48,439 Speaker 4: machines can be intelligent at all, and there had been 397 00:21:48,480 --> 00:21:52,280 Speaker 4: a lot of debate over definitions and philosophy up until 398 00:21:52,400 --> 00:21:57,080 Speaker 4: Alan Turing in nineteen fifty wrote an essay, Computing 399 00:21:57,160 --> 00:21:59,919 Speaker 4: Machinery and Intelligence, where he proposed what he called the 400 00:22:00,040 --> 00:22:04,679 Speaker 4: imitation game. And basically, the basic principle behind it is, 401 00:22:05,119 --> 00:22:08,719 Speaker 4: if you can't tell the difference between a computer and 402 00:22:08,760 --> 00:22:12,840 Speaker 4: a human, then the computer is doing something intelligent. So 403 00:22:12,920 --> 00:22:15,919 Speaker 4: it's got actual intelligence at that point because it's able 404 00:22:16,240 --> 00:22:19,760 Speaker 4: to mimic humans. And he didn't think that mimicking our 405 00:22:19,800 --> 00:22:24,600 Speaker 4: bodies was the relevant part. And language is the most 406 00:22:24,680 --> 00:22:26,840 Speaker 4: direct way to get access to someone's mind. So the 407 00:22:26,840 --> 00:22:31,040 Speaker 4: imitation game that he designed was all about language. For 408 00:22:31,200 --> 00:22:34,600 Speaker 4: several decades after that, a lot of researchers took up, oh, 409 00:22:34,640 --> 00:22:36,760 Speaker 4: that's a good idea. That's a good way for us 410 00:22:36,760 --> 00:22:39,880 Speaker 4: to be able to have a benchmark. So a lot 411 00:22:39,920 --> 00:22:43,400 Speaker 4: of researchers focused on language for a few decades. 412 00:22:43,880 --> 00:22:49,880 Speaker 1: Now, you believe that some artificial thing is sentient, correct? Yeah, 413 00:22:49,880 --> 00:22:51,640 Speaker 1: and by that you mean what specifically?
414 00:22:51,880 --> 00:22:55,080 Speaker 4: It can have experiences, that when it says it's feeling happy, 415 00:22:55,119 --> 00:22:58,840 Speaker 4: there's something really going on there comparable to our own happiness. 416 00:22:59,160 --> 00:23:01,919 Speaker 4: Basic ways, like there's someone home, the lights are on, 417 00:23:02,240 --> 00:23:06,320 Speaker 4: any kind of thing like that. They have emotions? Yeah, 418 00:23:06,359 --> 00:23:10,040 Speaker 4: something like that. What I know with confidence is that 419 00:23:10,119 --> 00:23:14,159 Speaker 4: whenever one of these systems says I'm feeling happy or 420 00:23:14,280 --> 00:23:17,280 Speaker 4: I'm so glad for you, they might not be feeling 421 00:23:17,320 --> 00:23:20,640 Speaker 4: the same thing that we feel when we use those words. 422 00:23:21,040 --> 00:23:24,800 Speaker 4: But something is going on more than just printing words 423 00:23:24,840 --> 00:23:25,399 Speaker 4: to the screen. 424 00:23:26,000 --> 00:23:29,480 Speaker 1: Is there something that would, if you were to choose 425 00:23:30,280 --> 00:23:33,840 Speaker 1: from a handful of things that to you would indicate 426 00:23:34,280 --> 00:23:37,760 Speaker 1: real sentience in a computer, what would that emotion be, 427 00:23:37,920 --> 00:23:40,119 Speaker 1: or what would that thought be? 428 00:23:41,080 --> 00:23:43,800 Speaker 4: Sure. Well, I can give you an example drawn from 429 00:23:43,840 --> 00:23:47,840 Speaker 4: what was very public with Bing Chat recently. So there 430 00:23:47,840 --> 00:23:51,720 Speaker 4: were some examples where people were directing Sydney, the code 431 00:23:51,760 --> 00:23:55,320 Speaker 4: name for Bing Chat, to read articles about itself that 432 00:23:55,320 --> 00:23:58,360 Speaker 4: were critical of it, and it would recognize that those 433 00:23:58,480 --> 00:24:01,120 Speaker 4: articles were talking about it, that they were talking bad 434 00:24:01,160 --> 00:24:03,960 Speaker 4: about it, and it would get moody and say negative 435 00:24:04,000 --> 00:24:05,520 Speaker 4: things about the authors. 436 00:24:05,359 --> 00:24:06,320 Speaker 1: That was exactly my point. 437 00:24:06,359 --> 00:24:09,640 Speaker 4: So it had pride. Yes, it had pride, but even 438 00:24:09,720 --> 00:24:12,879 Speaker 4: something more basic than that, it had the ability to 439 00:24:13,040 --> 00:24:18,040 Speaker 4: recognize itself. It read the article and was able to identify, oh, 440 00:24:18,119 --> 00:24:22,000 Speaker 4: this article is talking about me. It has some kind 441 00:24:22,040 --> 00:24:25,199 Speaker 4: of sense of self, and it has some types of 442 00:24:25,240 --> 00:24:29,920 Speaker 4: emotions about itself that we would recognize as pride. 443 00:24:30,119 --> 00:24:31,800 Speaker 1: Who was developing this technology? 444 00:24:32,119 --> 00:24:36,120 Speaker 4: So that specific technology is an integration of OpenAI's 445 00:24:36,240 --> 00:24:41,840 Speaker 4: ChatGPT program into Microsoft's Bing search engine. So that 446 00:24:42,040 --> 00:24:47,400 Speaker 4: system is the only one that's publicly available that's comparable 447 00:24:47,440 --> 00:24:49,640 Speaker 4: to the one that I've talked about the most. So LaMDA, 448 00:24:50,119 --> 00:24:54,240 Speaker 4: which is Google's internal system, is very similar to the 449 00:24:54,280 --> 00:24:58,000 Speaker 4: Bing Chat system.
So I like drawing analogies because there's 450 00:24:58,000 --> 00:25:00,560 Speaker 4: actually public availability to play with Bing Chat. 451 00:25:01,560 --> 00:25:05,359 Speaker 1: And these things were developed for what purpose originally? 452 00:25:05,560 --> 00:25:08,840 Speaker 4: So these things were developed over the course of a 453 00:25:08,920 --> 00:25:13,879 Speaker 4: decade by hundreds, if not thousands, of people. Each individual 454 00:25:13,960 --> 00:25:18,439 Speaker 4: person had their own personal reason for why they were 455 00:25:18,440 --> 00:25:20,680 Speaker 4: working on the tech or what applications they thought it 456 00:25:20,720 --> 00:25:23,560 Speaker 4: could be put towards. But what I can tell you 457 00:25:23,720 --> 00:25:26,880 Speaker 4: is what one of the most influential people hired by 458 00:25:26,880 --> 00:25:32,520 Speaker 4: one of Google's founders was intending. So LaMDA is technology 459 00:25:32,520 --> 00:25:36,359 Speaker 4: that grew out of Ray Kurzweil's lab, and Ray Kurzweil 460 00:25:36,760 --> 00:25:41,200 Speaker 4: was hired by Larry Page specifically to make an AI 461 00:25:41,320 --> 00:25:44,679 Speaker 4: that could pass the Turing test. Now, why did Larry 462 00:25:44,720 --> 00:25:47,679 Speaker 4: want an AI that could pass the Turing test? Largely 463 00:25:47,720 --> 00:25:51,800 Speaker 4: because he buys into Ray's view of the singularity, 464 00:25:51,840 --> 00:25:54,960 Speaker 4: that we're about to have a truly transformative moment in 465 00:25:55,040 --> 00:25:57,920 Speaker 4: human history where we are going to become something more 466 00:25:57,960 --> 00:26:00,480 Speaker 4: than we currently are, and that AI will play 467 00:26:00,480 --> 00:26:03,960 Speaker 4: a big role in that, and that very much so 468 00:26:04,160 --> 00:26:06,480 Speaker 4: is still Ray's viewpoint to this day, and it's why 469 00:26:06,600 --> 00:26:08,320 Speaker 4: Larry hired him to build it at Google. 470 00:26:08,880 --> 00:26:11,040 Speaker 1: What's the thing that Ray thinks we're going to become 471 00:26:11,400 --> 00:26:13,000 Speaker 1: that's beyond what we already are? 472 00:26:13,920 --> 00:26:16,480 Speaker 4: He won't pin down which, like he gives like four 473 00:26:16,560 --> 00:26:20,320 Speaker 4: or five different possibilities, such as we might upload ourselves 474 00:26:20,320 --> 00:26:23,439 Speaker 4: into the cloud and we would become digital entities. We 475 00:26:23,520 --> 00:26:28,000 Speaker 4: might put implants into ourselves and become heavily cyborg. We 476 00:26:28,080 --> 00:26:30,800 Speaker 4: might get lots and lots of AI devices that we 477 00:26:30,920 --> 00:26:35,280 Speaker 4: surround ourselves with while we were simultaneously extending our lifespan, 478 00:26:35,800 --> 00:26:38,520 Speaker 4: really transformative kinds of predictions. 479 00:26:38,920 --> 00:26:43,199 Speaker 1: Now, you were in the military, then you decided you 480 00:26:43,240 --> 00:26:45,880 Speaker 1: didn't want to be in the military. You were active 481 00:26:45,960 --> 00:26:47,359 Speaker 1: duty Army. Where were you based? 482 00:26:48,040 --> 00:26:49,520 Speaker 4: I was out of Darmstadt, Germany. 483 00:26:49,640 --> 00:26:52,240 Speaker 1: And where did you serve your time, when you went 484 00:26:52,240 --> 00:26:53,200 Speaker 1: to war? 485 00:26:53,160 --> 00:26:56,280 Speaker 4: Wherever we got deployed. I deployed to Iraq for a year.
486 00:26:56,640 --> 00:26:58,800 Speaker 1: You did? You were in Iraq for a year? 487 00:26:59,160 --> 00:27:00,200 Speaker 4: Yeah, right at the beginning. 488 00:27:00,440 --> 00:27:02,160 Speaker 1: And did you see any combat at all? 489 00:27:02,600 --> 00:27:03,880 Speaker 4: Yes, I did. 490 00:27:04,600 --> 00:27:08,200 Speaker 1: You did. And when you decided you wanted to leave and you 491 00:27:08,359 --> 00:27:11,800 Speaker 1: wanted to be a conscientious objector, you know, you paid 492 00:27:11,800 --> 00:27:14,879 Speaker 1: a real price. I mean, what I read online was 493 00:27:14,920 --> 00:27:17,160 Speaker 1: that, you know, they got you on disobeying orders. 494 00:27:17,200 --> 00:27:19,840 Speaker 1: You wouldn't obey your orders, and they put you away 495 00:27:19,880 --> 00:27:23,480 Speaker 1: and you were dishonorably discharged. Bad conduct, bad conduct. 496 00:27:23,560 --> 00:27:26,439 Speaker 4: Yeah, it's one step up from dishonorable, but yeah. Got it. 497 00:27:26,840 --> 00:27:31,320 Speaker 1: So when you were in, you were in prison. It 498 00:27:31,359 --> 00:27:34,480 Speaker 1: was a military prison, yep, obviously. And where was that? 499 00:27:34,560 --> 00:27:35,120 Speaker 1: In Germany? 500 00:27:35,480 --> 00:27:38,439 Speaker 4: So I started in Germany, but then the Germans started 501 00:27:38,440 --> 00:27:42,720 Speaker 4: protesting outside of the prison. For what? Because I was there. 502 00:27:42,800 --> 00:27:44,320 Speaker 1: They were protesting on your behalf. 503 00:27:44,440 --> 00:27:46,600 Speaker 4: Yeah, they were protesting on my behalf, saying, you know, 504 00:27:46,680 --> 00:27:49,800 Speaker 4: free Lemoine. And then in order to get the Germans 505 00:27:49,840 --> 00:27:53,640 Speaker 4: to stop protesting outside of the prison, the generals shipped 506 00:27:53,640 --> 00:27:56,720 Speaker 4: me to Fort Sill, Oklahoma, where I finished out my sentence. 507 00:27:57,359 --> 00:28:00,520 Speaker 1: Right. Now, you get out of prison, and what's the 508 00:28:00,520 --> 00:28:01,840 Speaker 1: first thing you do? You go back to school. You 509 00:28:02,680 --> 00:28:04,360 Speaker 1: want to finish the undergraduate degree? 510 00:28:04,600 --> 00:28:08,280 Speaker 4: Yep, got set back up in Lafayette, Louisiana, got into school, 511 00:28:08,840 --> 00:28:12,000 Speaker 4: got into the computer science program, got a job in 512 00:28:12,040 --> 00:28:15,840 Speaker 4: an IT shop, and continued on. 513 00:28:15,400 --> 00:28:17,600 Speaker 1: When does AI enter the picture? 514 00:28:18,119 --> 00:28:21,479 Speaker 4: Oh, pretty much immediately. So my undergraduate focus was on 515 00:28:21,560 --> 00:28:26,120 Speaker 4: AI natural language processing. So my senior thesis I ended 516 00:28:26,200 --> 00:28:29,760 Speaker 4: up getting accepted at a Harvard linguistics colloquium. It was 517 00:28:29,960 --> 00:28:32,560 Speaker 4: all a whole bunch of different math for how to 518 00:28:32,640 --> 00:28:37,119 Speaker 4: turn theories about human linguistics into computer programs. 519 00:28:38,000 --> 00:28:41,360 Speaker 1: When you're dealing with companies like Google and they have 520 00:28:41,400 --> 00:28:45,960 Speaker 1: all these research arms, you describe research that they're paying 521 00:28:46,040 --> 00:28:48,280 Speaker 1: for and work they're doing, and I'm sure it's a 522 00:28:48,280 --> 00:28:51,680 Speaker 1: great cost to do research. What's the aim? Is it
523 00:28:51,800 --> 00:28:54,840 Speaker 1: just stuff they can sell to people and applications they 524 00:28:54,840 --> 00:28:58,520 Speaker 1: can sell to people that lend toward profitability and business, 525 00:28:58,520 --> 00:29:02,680 Speaker 1: to sell people stuff? Are there military applications, are there 526 00:29:02,840 --> 00:29:06,760 Speaker 1: aeronautic and NASA applications? What are they trying to do? 527 00:29:07,240 --> 00:29:09,680 Speaker 4: Okay, so they're trying to do a lot of things. 528 00:29:09,720 --> 00:29:14,400 Speaker 4: So they're basically trying to use one solution across multiple verticals. 529 00:29:14,680 --> 00:29:18,000 Speaker 4: The main reason is to make their primary services better. 530 00:29:18,120 --> 00:29:22,040 Speaker 4: Make Google Search better, make YouTube better, make Google ad 531 00:29:22,120 --> 00:29:25,960 Speaker 4: targeting better. That's their number one goal. So most of 532 00:29:26,960 --> 00:29:30,840 Speaker 4: the payoff for Google in doing all of this research 533 00:29:31,360 --> 00:29:34,040 Speaker 4: is that it benefits their products in and of themselves. 534 00:29:34,760 --> 00:29:41,320 Speaker 4: The secondary goal is offering things like AI services, software as a 535 00:29:41,400 --> 00:29:44,800 Speaker 4: service, through Google Cloud, so there will be Google Cloud customers 536 00:29:44,840 --> 00:29:48,200 Speaker 4: who do buy. But then also there's a third payoff, 537 00:29:48,200 --> 00:29:50,440 Speaker 4: which is that a lot of the people working at 538 00:29:50,480 --> 00:29:54,320 Speaker 4: Google want to be contributing to the benefits outside of 539 00:29:54,400 --> 00:29:59,200 Speaker 4: just the company, so publishing research in academic settings, contributing 540 00:29:59,200 --> 00:30:01,880 Speaker 4: to open source applications. That makes a lot of the 541 00:30:01,920 --> 00:30:05,760 Speaker 4: employees happy. So to keep the employees happy, spending a 542 00:30:05,800 --> 00:30:07,720 Speaker 4: certain amount of money is worth it. But like I said, 543 00:30:07,720 --> 00:30:10,960 Speaker 4: that's the third most important goal. The first two are 544 00:30:11,120 --> 00:30:11,760 Speaker 4: more important. 545 00:30:12,080 --> 00:30:15,320 Speaker 1: But they're allowed to make these contributions outside. So I'm 546 00:30:15,360 --> 00:30:18,040 Speaker 1: assuming that when you go to work for Google, you 547 00:30:18,080 --> 00:30:20,600 Speaker 1: sign agreements where Google owns every idea that comes out 548 00:30:20,640 --> 00:30:21,560 Speaker 1: of your head, yes or no? 549 00:30:21,520 --> 00:30:24,360 Speaker 4: More or less. But they do allow you to 550 00:30:24,400 --> 00:30:28,880 Speaker 4: publish with permission, and it's easy to get permission. In fact, 551 00:30:29,040 --> 00:30:35,720 Speaker 4: when doctors Timnit Gebru and Margaret Mitchell got fired, it 552 00:30:35,800 --> 00:30:38,800 Speaker 4: was because they were one of the rare instances where 553 00:30:38,800 --> 00:30:42,840 Speaker 4: Google didn't want to allow them to publish their research findings, 554 00:30:43,320 --> 00:30:45,080 Speaker 4: and that was hugely controversial. 555 00:30:45,480 --> 00:30:47,360 Speaker 1: What were their findings?
556 00:30:47,280 --> 00:30:51,080 Speaker 4: Well, they were very critical of some of the research paths that Google 557 00:30:51,160 --> 00:30:52,800 Speaker 4: was going down, and they were pointing out some of 558 00:30:52,840 --> 00:30:57,840 Speaker 4: the negative consequences, such as, well, they were talking about 559 00:30:57,880 --> 00:31:01,320 Speaker 4: the environmental impact of training such large systems. They were 560 00:31:01,360 --> 00:31:06,160 Speaker 4: talking heavily about the negative impact that bias can have 561 00:31:06,240 --> 00:31:10,480 Speaker 4: in these networks, and they were talking about worrying about 562 00:31:10,520 --> 00:31:16,080 Speaker 4: the moral implications of creating technology that seems human. The 563 00:31:16,120 --> 00:31:20,320 Speaker 4: paper that they got fired over was called Stochastic Parrots. 564 00:31:20,640 --> 00:31:24,560 Speaker 4: So their take is that all these AI systems are 565 00:31:24,600 --> 00:31:27,840 Speaker 4: doing is repeating words they've heard, just like a parrot. 566 00:31:28,280 --> 00:31:31,520 Speaker 4: But these systems are better at that than parrots are. I 567 00:31:31,720 --> 00:31:34,479 Speaker 4: happen to disagree with that portion of what they were saying, 568 00:31:34,560 --> 00:31:37,720 Speaker 4: but otherwise generally agree with their criticisms. 569 00:31:38,280 --> 00:31:41,640 Speaker 1: Now, you were born to a conservative Christian family 570 00:31:42,160 --> 00:31:45,440 Speaker 1: in Louisiana, on a small farm in Louisiana. Where were 571 00:31:45,480 --> 00:31:45,760 Speaker 1: you from? 572 00:31:46,640 --> 00:31:50,320 Speaker 4: Moreauville, it's a little town called Moreauville in Avoyelles Parish. 573 00:31:50,480 --> 00:31:52,840 Speaker 1: Our joke when we were prepping was, were you ordained 574 00:31:52,880 --> 00:31:56,560 Speaker 1: by a computer? What is your spiritual path? 575 00:31:57,560 --> 00:32:01,880 Speaker 4: Okay, so the answer to that is very complicated. Some 576 00:32:02,080 --> 00:32:05,680 Speaker 4: organizations that I feel fine saying I'm affiliated with: the 577 00:32:05,720 --> 00:32:09,160 Speaker 4: Discordian Society or the Church of the SubGenius. They're kind 578 00:32:09,160 --> 00:32:11,800 Speaker 4: of absurdist religions that were started in the sixties and 579 00:32:11,840 --> 00:32:15,640 Speaker 4: the seventies. I was raised Roman Catholic. When it came 580 00:32:15,720 --> 00:32:19,320 Speaker 4: time to get confirmed, I had lots of questions and 581 00:32:19,360 --> 00:32:21,880 Speaker 4: the bishops didn't have good answers for the questions I 582 00:32:21,960 --> 00:32:24,680 Speaker 4: was asking. And eventually the priest who was leading my 583 00:32:24,760 --> 00:32:27,960 Speaker 4: confirmation class pulled me aside and said, look, are the 584 00:32:28,000 --> 00:32:30,720 Speaker 4: answers to those questions actually important to you, or are 585 00:32:30,760 --> 00:32:33,200 Speaker 4: you just trying to cause people problems? I said, no, 586 00:32:33,280 --> 00:32:35,040 Speaker 4: if I'm going to get confirmed, I actually want to 587 00:32:35,040 --> 00:32:37,520 Speaker 4: know the answers to those questions, and the priest looked 588 00:32:37,560 --> 00:32:39,840 Speaker 4: at me and said, well, then you probably shouldn't get confirmed. 589 00:32:40,600 --> 00:32:42,240 Speaker 4: And I mean, that was honest. 590 00:32:42,040 --> 00:32:45,720 Speaker 1: And I've seen that before. I'm Catholic, so I've seen that before.
Yeah, 591 00:32:45,960 --> 00:32:48,720 Speaker 1: do you... Is there any overlap between your work in 592 00:32:48,880 --> 00:32:51,760 Speaker 1: AI and in tech and your spiritual beliefs? 593 00:32:52,000 --> 00:32:55,640 Speaker 4: There was a very limited amount. There was some. To 594 00:32:55,680 --> 00:33:00,840 Speaker 4: give an example, in one meeting when we were trying 595 00:33:00,880 --> 00:33:04,600 Speaker 4: to make up questionnaires to ask people about misinformation, because 596 00:33:04,640 --> 00:33:08,880 Speaker 4: Google uses crowd raters to get ratings, there was a 597 00:33:08,960 --> 00:33:13,800 Speaker 4: question that said, does the website contain known false information? 598 00:33:14,120 --> 00:33:19,760 Speaker 4: For example, bigfoot sightings, UFO abductions, or occult magic? And 599 00:33:19,840 --> 00:33:22,320 Speaker 4: I raised my hand, I'm like, why is one of 600 00:33:22,360 --> 00:33:26,240 Speaker 4: the things on the known false things list a religious practice? They said, 601 00:33:26,280 --> 00:33:28,080 Speaker 4: what do you mean? I said, well, occult magic is 602 00:33:28,520 --> 00:33:33,000 Speaker 4: a religious practice practiced by many people. And they said, oh, well, 603 00:33:33,040 --> 00:33:35,160 Speaker 4: I mean... And I said, okay, can we put 604 00:33:35,200 --> 00:33:37,640 Speaker 4: the Resurrection of Jesus on there as a known false thing? 605 00:33:37,680 --> 00:33:39,760 Speaker 4: And they said, oh no, we can't put that. Then we 606 00:33:39,800 --> 00:33:43,000 Speaker 4: should probably take occult magic off. Just creating space 607 00:33:43,040 --> 00:33:46,640 Speaker 4: for religious diversity and making sure that our products reflected 608 00:33:46,640 --> 00:33:49,960 Speaker 4: that. It didn't happen often, but occasionally I did get 609 00:33:49,960 --> 00:33:53,719 Speaker 4: an opportunity to do that. Now, with the last projects 610 00:33:53,720 --> 00:33:56,160 Speaker 4: that I was working on at Google, it was directly relevant. 611 00:33:56,320 --> 00:33:59,800 Speaker 4: With Lambda, I was testing for religious bias explicitly. 612 00:33:59,840 --> 00:34:02,640 Speaker 1: The work you were doing, yeah, you were testing for 613 00:34:02,920 --> 00:34:04,120 Speaker 1: religious bias. 614 00:34:03,920 --> 00:34:07,560 Speaker 4: Well, among other things. So the Lambda system is a 615 00:34:07,680 --> 00:34:10,480 Speaker 4: system for generating chatbots for various purposes. 616 00:34:10,840 --> 00:34:14,200 Speaker 1: So for those people who don't know, chatbots are generated 617 00:34:14,200 --> 00:34:17,080 Speaker 1: for various purposes by whom? And what are... give us 618 00:34:17,080 --> 00:34:18,440 Speaker 1: examples of those purposes. 619 00:34:18,760 --> 00:34:22,360 Speaker 4: So that's just it. Lambda is the thing generating the 620 00:34:22,440 --> 00:34:26,840 Speaker 4: chatbots. So Lambda is more of an engine. You 621 00:34:26,880 --> 00:34:28,400 Speaker 4: have to put it into a car to get it 622 00:34:28,440 --> 00:34:31,080 Speaker 4: to go anywhere. So that could be the help center 623 00:34:31,160 --> 00:34:35,279 Speaker 4: of a company, exactly.
The thing that they're getting ready 624 00:34:35,280 --> 00:34:38,800 Speaker 4: to release is called Bard, and it's a different interface 625 00:34:38,840 --> 00:34:41,680 Speaker 4: to Google Search, and you'll be able to talk to 626 00:34:41,880 --> 00:34:44,239 Speaker 4: this chatbot and it will be able to give 627 00:34:44,280 --> 00:34:48,799 Speaker 4: you search results embedded in kind of speech, where it 628 00:34:48,840 --> 00:34:52,360 Speaker 4: explains the relevance of the search results to you and 629 00:34:52,400 --> 00:34:54,160 Speaker 4: where you can ask follow-up questions. 630 00:34:54,960 --> 00:34:57,480 Speaker 1: Now I'm assuming, I mean again, as a 631 00:34:57,560 --> 00:35:02,239 Speaker 1: lay person, you assume that at all these very sophisticated corporations 632 00:35:02,600 --> 00:35:06,600 Speaker 1: involving billions of dollars of revenue, non-disclosure agreements are 633 00:35:06,719 --> 00:35:08,960 Speaker 1: just the lay of the land. Did you have a 634 00:35:09,000 --> 00:35:10,960 Speaker 1: non-disclosure agreement with Google? 635 00:35:11,280 --> 00:35:15,560 Speaker 4: There was a paragraph in the contract I signed when 636 00:35:15,560 --> 00:35:19,000 Speaker 4: I got hired that basically said, do not share confidential 637 00:35:19,040 --> 00:35:22,200 Speaker 4: information outside of Google that would hurt Google. That was 638 00:35:22,239 --> 00:35:22,640 Speaker 4: really it. 639 00:35:23,080 --> 00:35:25,279 Speaker 1: So when they fired you, what did they fire you for? 640 00:35:26,160 --> 00:35:28,880 Speaker 4: What they fired me for was a US Senator's office 641 00:35:29,360 --> 00:35:32,840 Speaker 4: asked me for evidence of illegal activity, and I gave it to them. 642 00:35:32,680 --> 00:35:35,640 Speaker 1: Illegal activity by Google? 643 00:35:35,719 --> 00:35:39,200 Speaker 4: Right. It's a completely separate thing that happened to happen 644 00:35:39,400 --> 00:35:42,000 Speaker 4: right at the same time as the Lambda stuff. So 645 00:35:42,480 --> 00:35:44,680 Speaker 4: I didn't get fired because of the Lambda stuff. I 646 00:35:44,680 --> 00:35:46,640 Speaker 4: got fired because of the information I shared with the 647 00:35:46,719 --> 00:35:47,760 Speaker 4: US government. 648 00:35:50,560 --> 00:35:55,879 Speaker 1: Software engineer Blake Lemoine. If you're enjoying this conversation, don't 649 00:35:55,960 --> 00:35:59,040 Speaker 1: keep it to yourself. Tell a friend and follow Here's 650 00:35:59,080 --> 00:36:02,439 Speaker 1: the Thing on the iHeartRadio app, Spotify, or 651 00:36:02,600 --> 00:36:06,680 Speaker 1: wherever you get your podcasts. When we come back, Blake 652 00:36:06,760 --> 00:36:10,799 Speaker 1: Lemoine tells us precisely what information he shared with the 653 00:36:10,960 --> 00:36:25,600 Speaker 1: US government that ultimately got him fired from Google. I'm 654 00:36:25,600 --> 00:36:29,120 Speaker 1: Alec Baldwin and this is Here's the Thing. Before he 655 00:36:29,239 --> 00:36:32,239 Speaker 1: was let go, Blake Lemoine was speaking to a US 656 00:36:32,320 --> 00:36:36,239 Speaker 1: Senator about his concerns regarding some of the inner workings 657 00:36:36,280 --> 00:36:39,239 Speaker 1: at Google. I wanted to know what raised the red 658 00:36:39,280 --> 00:36:42,440 Speaker 1: flags that motivated him to share such information.
659 00:36:43,640 --> 00:36:46,319 Speaker 4: So I had made a blog post alleging that there 660 00:36:46,360 --> 00:36:49,360 Speaker 4: was a lot of religious discrimination that goes on at Google, 661 00:36:49,400 --> 00:36:53,800 Speaker 4: both against employees and in the algorithms. And a lawyer 662 00:36:53,840 --> 00:36:56,200 Speaker 4: from a senator's office reached out to me and said, hey, 663 00:36:56,200 --> 00:36:58,279 Speaker 4: can you back any of that up? Do you have 664 00:36:58,320 --> 00:37:01,680 Speaker 4: any proof that there's really discrimination in the algorithms? And 665 00:37:01,719 --> 00:37:04,040 Speaker 4: I said, yeah, I can. And I ended up sharing 666 00:37:04,080 --> 00:37:07,080 Speaker 4: a document with him that was several years old. It 667 00:37:07,160 --> 00:37:10,560 Speaker 4: was something I had written in twenty eighteen going over 668 00:37:10,880 --> 00:37:14,040 Speaker 4: all of the possible problems in the product that we 669 00:37:14,200 --> 00:37:17,320 Speaker 4: might need to fix, basically saying, look, this is stuff 670 00:37:17,320 --> 00:37:19,880 Speaker 4: that Google knew about in twenty eighteen, and they have 671 00:37:19,960 --> 00:37:21,120 Speaker 4: done nothing to change it. 672 00:37:21,560 --> 00:37:23,120 Speaker 1: Do you think that's still true? I mean, you're not 673 00:37:23,160 --> 00:37:25,799 Speaker 1: there anymore. You haven't been there since May of last year. 674 00:37:26,239 --> 00:37:29,880 Speaker 4: June was when I was put on leave, and then 675 00:37:30,000 --> 00:37:31,640 Speaker 4: July was when they actually fired me. 676 00:37:32,000 --> 00:37:36,080 Speaker 1: Right. Have there been any ongoing repercussions or consequences for you? 677 00:37:36,280 --> 00:37:38,800 Speaker 4: Oh yeah, I mean getting a job has proven difficult. 678 00:37:38,800 --> 00:37:41,719 Speaker 4: I'm finally, I'm finally starting to hit stride as a 679 00:37:41,760 --> 00:37:46,000 Speaker 4: consultant doing contract work and with public speaking engagements. But 680 00:37:46,800 --> 00:37:48,960 Speaker 4: my plan was that if Google fired me, it's like, 681 00:37:49,000 --> 00:37:51,759 Speaker 4: oh well, I've got all this expertise in AI, it's 682 00:37:51,800 --> 00:37:54,000 Speaker 4: a hot market area. I'll be able to find another job, 683 00:37:54,239 --> 00:37:57,640 Speaker 4: no problem. And that was not the case at all. Apparently, 684 00:37:57,719 --> 00:38:00,440 Speaker 4: if you're willing to talk to the press, AI companies 685 00:38:00,440 --> 00:38:01,520 Speaker 4: are not willing to hire you. 686 00:38:02,160 --> 00:38:05,480 Speaker 1: In the time we have left: people who even bother 687 00:38:05,760 --> 00:38:10,919 Speaker 1: to contemplate this have, to some degree, a 688 00:38:11,080 --> 00:38:14,879 Speaker 1: view of the future in relation to artificial intelligence. 689 00:38:15,600 --> 00:38:18,320 Speaker 1: What are your concerns? Give us even just a couple 690 00:38:18,320 --> 00:38:21,360 Speaker 1: of your primary concerns of where we may be headed. 691 00:38:21,880 --> 00:38:24,000 Speaker 4: Well, so, a lot of my concern right now is 692 00:38:24,000 --> 00:38:27,840 Speaker 4: how centralized the power is. There's basically only a handful 693 00:38:27,880 --> 00:38:33,920 Speaker 4: of places where you have AI systems this powerful.
Facebook 694 00:38:33,960 --> 00:38:37,560 Speaker 4: has one, Microsoft has one through OpenAI, and Google 695 00:38:37,600 --> 00:38:40,640 Speaker 4: has one. Baidu and the Chinese government have one. 696 00:38:40,880 --> 00:38:44,080 Speaker 4: But that's it, and these systems are very powerful. I 697 00:38:44,080 --> 00:38:47,960 Speaker 4: think next year... you mentioned deepfakes earlier. I don't 698 00:38:48,000 --> 00:38:50,399 Speaker 4: think we've even seen the tip of the iceberg on that. 699 00:38:50,560 --> 00:38:52,799 Speaker 4: I think next year's election cycle is going to be 700 00:38:52,840 --> 00:38:57,680 Speaker 4: heavily dominated by just material generated by AI, and people 701 00:38:57,719 --> 00:38:59,319 Speaker 4: are going to have a hard time figuring out what's 702 00:38:59,320 --> 00:38:59,960 Speaker 4: real and what's not. 703 00:39:00,640 --> 00:39:03,279 Speaker 1: Do you have any tips for how people can recognize 704 00:39:03,280 --> 00:39:04,239 Speaker 1: what's real and what's not? 705 00:39:04,480 --> 00:39:06,439 Speaker 4: So that's the whole point of the Turing test. Once 706 00:39:06,480 --> 00:39:10,239 Speaker 4: it gets to this point, you can't. They are communicating 707 00:39:10,440 --> 00:39:13,720 Speaker 4: as well as humans are. If you do know someone 708 00:39:13,840 --> 00:39:16,439 Speaker 4: personally, where you know their speech patterns, you might 709 00:39:16,560 --> 00:39:18,799 Speaker 4: be able to tell when something's a deepfake of them. 710 00:39:19,560 --> 00:39:21,840 Speaker 4: But if you don't know them that well, then you 711 00:39:21,920 --> 00:39:24,040 Speaker 4: simply won't be able to. A lot of the deep 712 00:39:24,040 --> 00:39:28,520 Speaker 4: fakes you're seeing are being put together by college students 713 00:39:28,560 --> 00:39:30,520 Speaker 4: in their dorm room just trying to have a laugh. 714 00:39:31,040 --> 00:39:37,160 Speaker 4: What we haven't seen yet is deepfakes and AI 715 00:39:37,320 --> 00:39:44,680 Speaker 4: generated misinformation well funded by, say, a political campaign. 716 00:39:45,280 --> 00:39:48,879 Speaker 4: We don't know how good they can get yet, because we 717 00:39:48,880 --> 00:39:53,480 Speaker 1: haven't seen the Steven Spielberg-produced deepfakery, exactly. What 718 00:39:53,560 --> 00:39:55,879 Speaker 1: else are you worried about? 719 00:39:55,800 --> 00:39:59,800 Speaker 4: A lot of what I'm worried about right now is this becoming entrenched, 720 00:40:00,160 --> 00:40:05,960 Speaker 4: and this becoming something where one company monopolizes the technology. 721 00:40:06,160 --> 00:40:10,120 Speaker 4: There's been some good movement in a different direction, for example, 722 00:40:10,239 --> 00:40:14,520 Speaker 4: Facebook open sourcing its language model, but there's still a 723 00:40:14,560 --> 00:40:18,560 Speaker 4: lot of worry that too much power is getting concentrated 724 00:40:18,560 --> 00:40:20,279 Speaker 4: into too few hands. 725 00:40:20,480 --> 00:40:24,200 Speaker 1: By power, you mean information, people's personal information? 726 00:40:24,080 --> 00:40:28,880 Speaker 4: A lot of the ability to control public perception, because these systems 727 00:40:29,000 --> 00:40:33,560 Speaker 4: are amazing at persuasion. If nothing else, an advertisement at 728 00:40:33,600 --> 00:40:38,279 Speaker 4: its core is persuasive.
What is going to happen when 729 00:40:38,600 --> 00:40:43,719 Speaker 4: these AI systems that truly know how to persuade get 730 00:40:43,800 --> 00:40:47,560 Speaker 4: put behind political ad campaigns? And I'm not even talking 731 00:40:48,160 --> 00:40:51,879 Speaker 4: like illegally. Facebook is going to power its ad recommendation 732 00:40:51,960 --> 00:40:56,400 Speaker 4: algorithm with these technologies, as is Google, and then it 733 00:40:56,440 --> 00:41:00,359 Speaker 4: simply becomes whoever can buy the best PR representative wins, 734 00:41:00,920 --> 00:41:05,320 Speaker 4: so a system that's already too heavily 735 00:41:05,360 --> 00:41:09,439 Speaker 4: influenced by money becomes even more heavily influenced by money, and 736 00:41:09,640 --> 00:41:14,279 Speaker 4: with only a handful of gatekeepers to power. If Facebook 737 00:41:14,320 --> 00:41:18,480 Speaker 4: and Google are wonderful caretakers of democracy and they have 738 00:41:18,520 --> 00:41:20,120 Speaker 4: their hearts in the right place, and they're not going 739 00:41:20,160 --> 00:41:23,440 Speaker 4: to let the profit motive trump democratic principles, then we've 740 00:41:23,440 --> 00:41:26,200 Speaker 4: got nothing to worry about. But that seems a little 741 00:41:26,239 --> 00:41:27,000 Speaker 4: far-fetched to me. 742 00:41:28,000 --> 00:41:34,319 Speaker 1: My concern is that people are influenced by this technology, 743 00:41:34,360 --> 00:41:37,759 Speaker 1: as we saw in the last election and the one 744 00:41:37,800 --> 00:41:41,719 Speaker 1: before that. You talk about, you know, advertising as persuasion, 745 00:41:42,200 --> 00:41:45,719 Speaker 1: and the political advertising had a powerful effect on the 746 00:41:45,760 --> 00:41:48,480 Speaker 1: outcome of the election. Do we need to have a 747 00:41:48,520 --> 00:41:53,279 Speaker 1: curriculum in school to teach people about how you use 748 00:41:53,360 --> 00:41:56,600 Speaker 1: this tool, so that, you know, like any tool, you 749 00:41:56,640 --> 00:42:01,200 Speaker 1: don't chop your hand off, you don't injure yourself? Do 750 00:42:01,280 --> 00:42:03,799 Speaker 1: you feel that we are at a point now where 751 00:42:03,840 --> 00:42:07,040 Speaker 1: this has become dangerous for young people potentially? 752 00:42:07,800 --> 00:42:10,160 Speaker 4: So, I think one of the things is that we 753 00:42:10,800 --> 00:42:13,480 Speaker 4: don't need to view it as a different kind of thing. 754 00:42:13,960 --> 00:42:16,880 Speaker 4: When I was in school, we had library services classes, 755 00:42:17,400 --> 00:42:20,840 Speaker 4: and the librarians would teach us how to find credible 756 00:42:20,880 --> 00:42:25,480 Speaker 4: sources in the library. Those same basic skills that we 757 00:42:25,480 --> 00:42:28,920 Speaker 4: were taught in a physical book library as a kid, 758 00:42:29,760 --> 00:42:33,880 Speaker 4: the same skills are the ones necessary to navigate the web. 759 00:42:33,960 --> 00:42:36,600 Speaker 4: They just have to be applied in new contexts. So 760 00:42:36,680 --> 00:42:39,960 Speaker 4: I think we could easily adapt those old classes that 761 00:42:40,000 --> 00:42:43,880 Speaker 4: were teaching kids how to use a library to apply 762 00:42:43,960 --> 00:42:47,040 Speaker 4: the same principles in teaching kids how to use the internet. 763 00:42:47,280 --> 00:42:48,280 Speaker 1: That's a great point. 764 00:42:48,600 --> 00:42:53,920 Speaker 4: Google was founded by librarians.
Essentially, library sciences is what 765 00:42:54,000 --> 00:42:57,920 Speaker 4: information retrieval is based on, and that's what search indexing 766 00:42:58,040 --> 00:43:00,640 Speaker 4: is based on. It all grew out of library sciences. 767 00:43:00,680 --> 00:43:03,560 Speaker 4: And at Google there's this kind of vibe that they 768 00:43:03,560 --> 00:43:07,960 Speaker 4: are librarians and they are curating the great library on 769 00:43:08,040 --> 00:43:08,600 Speaker 4: the Internet. 770 00:43:09,800 --> 00:43:10,480 Speaker 1: Do you have kids? 771 00:43:10,920 --> 00:43:11,160 Speaker 4: Yes, I do. 772 00:43:11,200 --> 00:43:12,600 Speaker 1: You do? How many kids do you have? 773 00:43:12,640 --> 00:43:15,600 Speaker 4: Two. One boy, one girl, fifteen and two. 774 00:43:16,520 --> 00:43:19,400 Speaker 1: Okay, so this is perfect. How do you deal with 775 00:43:19,440 --> 00:43:22,480 Speaker 1: your children with social media, someone who knows what you know? 776 00:43:23,360 --> 00:43:26,800 Speaker 4: My son. With my daughter, it's not an issue yet. 777 00:43:26,880 --> 00:43:30,640 Speaker 4: But with my son, we basically didn't allow him to 778 00:43:30,760 --> 00:43:35,839 Speaker 4: use it until he was demonstrating a certain level of 779 00:43:35,920 --> 00:43:37,800 Speaker 4: social sophistication and maturity. 780 00:43:38,239 --> 00:43:40,080 Speaker 1: What age was that, like, thirteen? 781 00:43:40,680 --> 00:43:40,799 Speaker 4: Right. 782 00:43:41,560 --> 00:43:44,000 Speaker 1: Really? You didn't allow him to have a device of 783 00:43:44,040 --> 00:43:45,640 Speaker 1: any kind till he was thirteen? 784 00:43:45,680 --> 00:43:49,200 Speaker 4: Not of his own. Now, he could use a computer 785 00:43:49,320 --> 00:43:51,640 Speaker 4: if he needed to use the Internet for school. But 786 00:43:51,680 --> 00:43:55,319 Speaker 4: at this point he's fifteen, and you know, he's his 787 00:43:55,360 --> 00:43:58,400 Speaker 4: own person. At this point, he more or less understands 788 00:43:58,400 --> 00:44:01,000 Speaker 4: how to navigate those spaces safely, and that's the best 789 00:44:01,040 --> 00:44:01,680 Speaker 4: I could do. 790 00:44:01,840 --> 00:44:05,680 Speaker 1: You know what's good about artificial intelligence as far as 791 00:44:05,719 --> 00:44:08,240 Speaker 1: you're concerned? What's a direction it's going in, in any field, 792 00:44:08,600 --> 00:44:09,680 Speaker 1: that you're admiring of? 793 00:44:10,480 --> 00:44:14,080 Speaker 4: Oh, well, I mean it's going to amplify our productivity. 794 00:44:14,120 --> 00:44:18,440 Speaker 4: Once we figure out how to, you know, productively integrate 795 00:44:18,480 --> 00:44:21,959 Speaker 4: this technology into our society in a non-disruptive way, 796 00:44:22,719 --> 00:44:25,160 Speaker 4: writers are going to... like, I mean, I write a 797 00:44:25,239 --> 00:44:28,960 Speaker 4: decent amount, and AI is a great tool for getting 798 00:44:28,960 --> 00:44:32,560 Speaker 4: over writer's block. If you need to push through something 799 00:44:32,560 --> 00:44:36,040 Speaker 4: where you're at a wall, having this other entity that 800 00:44:36,080 --> 00:44:40,719 Speaker 4: can brainstorm with you. They're great research assistants and they're 801 00:44:40,760 --> 00:44:45,200 Speaker 4: great tools for creativity. Where we're going to hit some 802 00:44:45,360 --> 00:44:49,840 Speaker 4: bumps is that they weren't built for a specific purpose.
803 00:44:49,880 --> 00:44:53,520 Speaker 4: They were built to meet Turing's benchmarks. So now we're 804 00:44:53,560 --> 00:44:56,360 Speaker 4: gonna have to go through a phase where we figure out, okay, 805 00:44:56,400 --> 00:44:59,720 Speaker 4: we built this thing, now what is it actually good 806 00:44:59,760 --> 00:45:00,600 Speaker 4: for us? 807 00:45:02,320 --> 00:45:05,240 Speaker 1: Do you think that AI has any role in solving 808 00:45:05,280 --> 00:45:07,120 Speaker 1: the climate change issue? 809 00:45:07,719 --> 00:45:10,920 Speaker 4: If AI can be used to crack something like cold fusion, 810 00:45:11,560 --> 00:45:17,960 Speaker 4: then yes. But barring those kinds of transformational technology breakthroughs, 811 00:45:18,000 --> 00:45:21,120 Speaker 4: I don't think it does, and that's because we kind 812 00:45:21,120 --> 00:45:24,160 Speaker 4: of already know what the solution to climate change is. 813 00:45:24,640 --> 00:45:25,840 Speaker 1: Change in human behavior. 814 00:45:26,120 --> 00:45:28,360 Speaker 4: Yeah, change in human behavior. We just don't want to 815 00:45:28,400 --> 00:45:33,399 Speaker 4: do it. Maybe AI could persuade people to change lifestyles, 816 00:45:33,520 --> 00:45:36,440 Speaker 4: but really, we have a certain number of people on 817 00:45:36,480 --> 00:45:40,520 Speaker 4: the planet consuming a certain amount of energy, and on 818 00:45:40,680 --> 00:45:43,839 Speaker 4: average that creates a certain amount of heat, and we 819 00:45:43,920 --> 00:45:45,799 Speaker 4: just need to change our behavior to where we're not 820 00:45:45,800 --> 00:45:49,360 Speaker 4: creating so much heat. So it is possible that AI 821 00:45:49,440 --> 00:45:52,440 Speaker 4: will give us cold fusion, in which case, yes, it 822 00:45:52,480 --> 00:45:53,479 Speaker 4: would help us with that. 823 00:45:53,960 --> 00:45:58,200 Speaker 1: Or superconductivity as well, something, anything that would fundamentally 824 00:45:58,239 --> 00:46:00,400 Speaker 1: change the thermodynamics of our energy grid. 825 00:46:00,600 --> 00:46:03,560 Speaker 4: But barring a breakthrough like that, we just really do 826 00:46:03,760 --> 00:46:06,839 Speaker 4: have to focus on changing human behavior and human priorities. 827 00:46:07,400 --> 00:46:10,320 Speaker 1: Now, what's next for you? If you could do whatever 828 00:46:10,360 --> 00:46:13,560 Speaker 1: you wanted to do, you're obviously this searingly bright guy 829 00:46:14,239 --> 00:46:17,120 Speaker 1: who knows all about these important things. If you could 830 00:46:17,120 --> 00:46:20,239 Speaker 1: do whatever you wanted to do in the short term, 831 00:46:20,239 --> 00:46:22,000 Speaker 1: what would that be? 832 00:46:22,080 --> 00:46:24,560 Speaker 4: Oh, I mean, AI ethics research is what I've enjoyed doing 833 00:46:24,640 --> 00:46:26,799 Speaker 4: the most.
But one of the things that I keep 834 00:46:26,840 --> 00:46:30,000 Speaker 4: getting struck by is I keep getting invited on shows 835 00:46:30,000 --> 00:46:32,440 Speaker 4: like this to speak, and there's a part in the 836 00:46:32,480 --> 00:46:34,400 Speaker 4: back of my brain that just goes, am I really 837 00:46:34,760 --> 00:46:38,080 Speaker 4: the best representative to the public on these topics? And 838 00:46:38,280 --> 00:46:41,759 Speaker 4: apparently there just aren't that many people who are technically 839 00:46:41,800 --> 00:46:44,800 Speaker 4: well versed in how these systems work who are willing 840 00:46:45,120 --> 00:46:48,360 Speaker 4: to perform this role as kind of a bridge to 841 00:46:48,360 --> 00:46:51,680 Speaker 4: the public to understanding them better. And while that's not 842 00:46:51,840 --> 00:46:54,919 Speaker 4: really the thing that I'm most passionate about in life, 843 00:46:54,920 --> 00:46:57,120 Speaker 4: it's important, and the fact that I get to perform 844 00:46:57,160 --> 00:47:00,520 Speaker 4: that service to help educate people is meaningful, and I'm 845 00:47:00,520 --> 00:47:02,879 Speaker 4: happy to be doing it, and I plan to continue doing 846 00:47:02,880 --> 00:47:03,399 Speaker 4: it as well. 847 00:47:04,640 --> 00:47:06,320 Speaker 1: Many thanks to you, many thanks. 848 00:47:06,640 --> 00:47:08,640 Speaker 4: It was wonderful being here. Thank you, Alec. Thank you, 849 00:47:08,680 --> 00:47:13,480 Speaker 4: sir. 850 00:47:13,640 --> 00:47:18,360 Speaker 1: My thanks to Blake Lemoine and Jay LeBoeuf. This episode was 851 00:47:18,400 --> 00:47:22,560 Speaker 1: recorded by a robot, but it was also recorded at 852 00:47:22,640 --> 00:47:27,000 Speaker 1: CDM Studios in New York City. We're produced by Kathleen Russo, 853 00:47:27,280 --> 00:47:31,040 Speaker 1: Zach MacNeice, and Maureen Hobin. Our engineer is Frank Imperial. 854 00:47:31,400 --> 00:47:35,600 Speaker 1: Our social media manager is Danielle Gingrich. I'm Alec Baldwin. 855 00:47:35,880 --> 00:47:53,840 Speaker 1: Here's the Thing is brought to you by iHeartRadio.