Welcome to Tech Stuff, a production of iHeartRadio's How Stuff Works.

Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland, an executive producer with How Stuff Works and iHeartRadio, and I love all things tech. And you know, there's no shortage of scenarios in which AI proves to be our downfall. You've got popular films like The Terminator and the Matrix series, in which we have artificial intelligence literally revolting against us and then subjugating us, to the numerous predictions that automation is going to displace every job. We run the gamut of all these different scenarios where AI is going to be our end. And then we have various companies and organizations that are investing billions of dollars to develop and advance artificial intelligence who are saying, no, no, no, you don't need to worry about that. AI is not going to destroy the world. It's going to make our world better. It's going to take over the more repetitive, dull, and dangerous parts of our jobs, and it's going to free us up to concentrate on more rewarding activities. So can we get to any truth in the matter? Is there some sort of truth we can suss out from these extremes? Well, today I'm joined by Oz and Kara, the hosts of the series Sleepwalkers, a show all about AI, and if you haven't checked it out yet, I highly recommend you do, because it is a phenomenal show. Guys, welcome to Tech Stuff.

Hi, thank you for having us.

Yeah, thank you so much, Jonathan. We're huge fans of Tech Stuff, and we're delighted to be joining the How Stuff Works and iHeart family and to be made part of the Tech Stuff network. So thank you.

Well, thank you, because you have lifted up the boat of Tech Stuff, certainly, because your work is really inspiring.
Before we jump into this conversation, if you could just take a couple of moments and let my listeners know, you know, what Sleepwalkers is, how you would describe that show to somebody. Let's say you're at a cocktail party and you are asked, what do you do for a living? You say, well, I'm working on this show. How do you describe it?

So they call it, I think, an elevator pitch, but this is a cocktail pitch.

Yeah, and we're based in New York, so we spend our whole lives at cocktail parties.

I was born at a cocktail party, which, as I understand it, happened in elevators in New York. Exactly, or apartments the size of elevators.

So, Sleepwalkers is a podcast that Oz actually came to me with. It was his idea, his brainchild. But I will say first, you know, I used to report on tech and science at the Huffington Post, and I had a show called Talk Nerdy to Me. And Oz came to me and said, you know, I want to really make a show that deals with all of the human touch points that AI could possibly come in contact with: healthcare, agriculture, you know, science in general, love, all of these places where people aren't necessarily thinking that AI will have an impact, but they already should be, basically. And you know, I said yes very quickly, because I'm very interested in all of those touch points. So each episode really is a deep dive into one of those areas, as I said, whether it be healthcare, transhumanism, agriculture, the military, for example. You know, these are places where we're going to see the presence of AI, where we're already seeing the presence of AI, and the show really tries to explore that.

Yeah, I think it's really pretty incredible when you sit down and look at where AI has already kind of crept into our day-to-day experience, sometimes in ways that we wouldn't necessarily associate with AI.
Like, one report I read said, and you could argue it's a very limited application of AI, that things like spell check and grammar check, which are now standard in apps and clients and smartphones and browsers, are a type of artificial intelligence, if they're doing something besides just detecting that, no, this sequence of letters doesn't correspond with any word in the language you are writing in. It's also perhaps looking at context, like saying, well, you used the word "weather," but you used "weather" as in the type of meteorological activity outside the window, as opposed to "whether" or not you should do something. And so you think about that and you realize, oh yeah, I guess there is a lot more to it than I thought. Which kind of brings me to the first point I wanted to make before we dive into the various doom-and-gloom scenarios of AI, which is: how do you guys define artificial intelligence? Because I've found that this concept is so broad that often you can have two people trying to have a meaningful conversation about AI and they're not able to meet in the middle. But it's not because they don't agree with each other. It's simply because they're working from vastly different definitions of what artificial intelligence actually is.

I think that's a great point, Jonathan. Just to back up a little bit, I want to tell you how I came up with the name Sleepwalkers for the series, and then I'd love to dive into the excellent point you make, which is that effectively yesterday's AI is today's computing. So I was very struck, about eighteen months ago, when several of the senior and early employees of Facebook, people like Sean Parker, the first president of Facebook, who have now left the firm, and obviously Chris Hughes more recently, came out and said, you know, I wouldn't let my children use this technology. Steve Jobs famously said... sorry, we had an audio glitch there, let me repeat that.
Steve Jobs famously gave an interview to Nick Bilton where he said that, you know, he wouldn't let his children use the iPad. But when the Facebookers actually came out and, one after the other, said that they didn't want their children using this technology, it kind of made me sit up and think: you know, if the people creating this technology don't want their kids to use it, what does that say? I mean, it's like, would you go to a restaurant where the owner didn't let their children eat? I certainly wouldn't. So that was the first sort of point of inspiration for Sleepwalkers, in other words, not being aware of the future we may be going into. And then there were the Zuckerberg hearings in the Senate, and Mark Zuckerberg sat there going from slightly nervous to relieved and calm to actively smug as it became abundantly clear that the senators were not going to be able to hold him to account. I think the epitome of those hearings was Senator Orrin Hatch asking Mark Zuckerberg how the platform made money if it was free, and Zuckerberg smirkingly replied, "Senator, we run ads." And so between those two things, between the Facebookers not wanting their own children on the platform and the grown-ups, our senators, not being able to hold Facebook to account, I thought, okay, what's going on here? And how can we wake up and make sure that we don't sort of flush our democracy down the toilet and pollute our children's minds by not asking some fundamental questions about how technology is changing how we already live? And that brings me to your second question, Jonathan: what is AI? And it's a fantastic question, because AI is everywhere, and it's not just the robot future that you see in the sci-fi films you mentioned, and it's not the future-facing products that, you know, many brands tell us they're developing. It's basically just statistics and probability, which has gotten better and better and better over time.
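That "statistics and probability" framing connects back to the spell-check example from earlier: a checker can pick "weather" versus "whether" just by counting which neighboring words each candidate has historically appeared next to, and choosing the more probable one. Here is a minimal sketch of that idea; the toy corpus and the simple bigram-counting rule are illustrative assumptions, not how any shipping grammar checker actually works.

```python
from collections import Counter

# Toy "training" text: the statistics come from which words follow each candidate.
corpus = (
    "the weather outside is cold . check the weather forecast . "
    "ask whether or not to go . decide whether you should stay . "
    "bad weather delayed the flight . wonder whether it will rain"
).split()

# Count (candidate, next_word) pairs observed in the corpus.
following = Counter()
for prev, nxt in zip(corpus, corpus[1:]):
    if prev in ("weather", "whether"):
        following[(prev, nxt)] += 1

def suggest(next_word: str) -> str:
    """Pick whichever candidate more often preceded next_word in the corpus."""
    w = following[("weather", next_word)]
    h = following[("whether", next_word)]
    return "weather" if w >= h else "whether"

print(suggest("forecast"))  # -> weather
print(suggest("or"))        # -> whether
```

Scaling that same counting idea up to enormous corpora and much richer context is, roughly speaking, what "better and better over time" looks like.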
But one of the things that we make clear to our listeners in the first episode is that they've already encountered AI ten times or a hundred times in their day by the time they listen to this podcast. Because if they took an Uber to work in the morning, likely the driver was matched to them and the route was chosen with AI. If they woke up next to somebody this morning who they met through a dating app, AI effectively intervened in their romantic life and connected them with somebody who they matched with. And even if you're listening to this podcast right now, there are algorithms, AI algorithms, at work smoothing our voices, compressing the audio, helping with the editing techniques. So AI is everywhere, and it's already changing our perception of the world and how we relate to the world around us and to each other.

Yeah, you could even argue at this point that AI is really just a slightly more focused branch of computer science, and that it's almost the same as saying, will computer science save us or doom us? It is too big of a question. You have to start narrowing things down. I think the real issue is that for the longest time, we've associated artificial intelligence with the concept of strong AI, which is the idea that we would create a machine that was either capable of, or so close to capable that we can't tell the difference, thinking like a human, processing information like a human, and coming to decisions like a human would, possibly with the added elements of consciousness and self-awareness. And, you know, I talk on this show about how that's a very complicated thing even for us to discuss just as human beings, without bringing machines into it.

I'm sorry, I can't do that, Jonathan.

Yes, yes, HAL. Or IBM, if you prefer. You know, they're just three letters off.
But yeah, it's that wonderful feeling that that's the only thing AI really is, right? It's the superintelligent Deep Thought or HAL computer that's capable of processing information, typically in natural language. It's the Watson platform participating on Jeopardy. We've perpetuated this thought, this concept of AI, and we've reinforced it with entertainment, and with applications that try to emulate the stuff that we saw in entertainment. But as you point out, AI is a much broader concept than this superintelligent machine. It's a whole bunch of stuff that's all about processing information in a particular way, typically to come to some sort of decision, or to act upon information, in a way that has been automated. So it might be something like Facebook's algorithm. What Facebook's algorithm is ultimately designed to do is to keep you on Facebook. It's designed so that you will see the next thing on Facebook. It's reinforcing that desire. And so once the algorithm, quote unquote, figures you out, that's why you're going to start seeing a pretty consistent presentation of what you see on a day-to-day basis. That would be one example. So, as you point out, we do interact with AI all the time, whether it's on social media with those algorithms, whether it's with an app. Maybe we have one of those personal assistants in our home that uses AI to various extents. I talked about this just recently on the radio: I had a conversation about how Comcast is coming out with sensors that are meant to monitor the health of people living in homes that have been outfitted with these ambient sensors, and they monitor things like how often you get up to go to the bathroom, or whether you stay in bed longer than normal.
And to be perfectly fair to Comcast, they're pitching this as something to help the elderly, or people who otherwise need caretakers, to give them more independence in their own homes. But you could also very easily, without much imagination at all, start to come up with scenarios where that could become truly invasive.

Oh yeah. So I was bringing up Second Chance AI, which was a project that came out of the University of Washington, which was designed to detect opioid overdoses early on, using an opioid user's cell phone to detect changes in breathing and really act as a monitor for people who were long-time or short-time heroin and opioid users. So that device would be able to detect the overdose and allow family members to know, or also alert the person who is overdosing that they're in a bad way.

So in the case of opioid users, it's worth the trade-off, because, you know, it's very helpful and potentially life-saving for them to know, based on previous breathing patterns and previous movements, what's likely to happen next in an overdose scenario. And for most Facebook users, they do indeed get to see the ads which are relevant to them. But the problem with AI is that it can't discriminate between individuals and the general population. So although it's more probable that somebody will have a successful pregnancy than not, it's very painful for the edge cases, and AI can't effectively discriminate for them.

And I just want to say really quickly, and I think this is an important point to make, especially about Second Chance.
So Second Chance basically is harnessing the power of a cell phone's microphone, which is the same microphone that you can either choose to turn on or off when you're in Instagram, that can listen to what you're saying and then basically use the data that's collected to target you with products that you probably don't need, like another pair of shoes designed by a company that you've never heard of, but that you might like. So my point is that this microphone that, as Oz was saying, could inappropriately spy on you, essentially, unless you take control of it, is also a microphone that could save the life of somebody who is in the early stages of an opioid overdose. So I think that kind of rocks my world, when I think about the two existing on the same piece of technology.

Again, they're being used for different things, two different things that have hugely different outcomes. But they're both about making guesses about what's going to happen in the future based on what's happened in the past, and that can be liberating or constraining, depending on the technology, the intention, and your interaction with it.

Yeah, I'm reminded of something similar. It was an interesting use of AI that ended up being another embarrassing and emotionally traumatic story that broke a few years ago. I want to say it was Target that sent coupons, like maternity coupons, to a young woman, and her father intercepted them and was incensed that Target would send these to his daughter. The father of the young woman did not realize that she was actually pregnant; she had not told him. And so he was upset and very angry at Target, you know, saying, how dare you suggest this? Then he discovered that she was pregnant after all. And it shows, again, that the intent was trying to be helpful.
You could see that, at least from a thousand yards away. You could see where a company says, you know, you're going to have need of these things, here are some coupons for those things; if you shop with us, we can get you some deals. So it's going to be a mutually beneficial kind of arrangement. But then you realize, oh, but this is on a subject that is extremely personal, and in this case it had this unintended consequence. It was the same sort of predictive approach, and they were able to predict the fact that she was pregnant based upon her browsing history. So they were proactively acting on data that had been gathered through her browsing activity. And that's what ended up causing this sort of... a scandal is probably too strong a word for it, but certainly a brouhaha, I think, if we're looking at the grand scheme of how we determine the level of awkwardness, embarrassment, and potential emotional trauma. So yes, please.

One of the things that makes me think about, Jonathan, is a study at Stanford which basically turned AI onto identifying sexual orientation from photographs. So they took a publicly available data set of images of people's faces from dating websites, which had been tagged by sexual preference: straight, gay, or bisexual. Then they trained the algorithm on which faces corresponded to which expressed sexual preferences, and the algorithm, after this training, was able to identify sexual orientation with high accuracy for men and for women, just from seeing five photographs of them. So again, that technology by itself is more or less neutral. But you think about it being overlaid onto a citywide surveillance system in a country like Brunei or Saudi Arabia, where homosexuality is punishable up to the death penalty, and it starts to become very, very scary.
So we are in this world now where technology is advancing, and the ability to make these predictions based on past data is so advanced, it doesn't need to have consciousness to be killer.

Right, right. Yeah, the fear of the Matrix or Terminator future, while compelling, turns out not to be necessary at all. That doesn't need to be a component for this to already be dangerous. Yeah, we'll go into that in greater detail in just a moment, but first let's take a quick break.

As you were saying just before the break, you made that great point about how AI has this potential to do, you know, great harm without the need for any sort of intelligence or malevolence on the part of the machine. In fact, it can just unthinkingly, in human terms, cause some pretty terrible consequences, unintended certainly, or at least we hope so, on the part of those who designed the systems. And I wanted to talk a little bit more about how sometimes that can happen. One problem, and I'm sure you've come across this in your reporting and in your podcasting, that's not only confined to AI, and not just to tech, but goes across the board, is bias. Right? This idea that when you're designing a system, you're doing so from a particular point of view, and because of that you are likely excluding other points of view, maybe not consciously, but you are. And that ends up meaning that if it's a system that's supposed to apply to everyone, but it particularly applies well to people who are similar to the people who designed the system and not so well to everybody else, that becomes a problem. And we've certainly seen this in systems like Microsoft's Kinect.
When Microsoft was pushing the Kinect peripheral, the gesture-recognition peripheral where there's a camera, an infrared camera and a regular optical camera, that could detect motions so they could be translated into commands for the system, it was discovered pretty quickly that it worked great for white people but not so great for people of color. It had been designed by people who had not really worked with it in that regard. And so we see there, you could argue, a fairly harmless, in the grand scheme of things, failure of a system. But you look at something like computer vision for, maybe, an autonomous car, and you could argue, well, now you're talking about life-or-death situations. So to me, one of the big challenges in AI is making sure that the people designing the systems are doing their best to eliminate bias as best they can. And part of that, I think, falls to a real concentrated effort to increase diversity in the organizations and companies that are actually designing these systems in the first place.

Yeah, no, absolutely. I mean, I think that the conversation about AI and bias has sort of reached critical mass, I guess. You know, I think it was yesterday or the day before that Alexandria Ocasio-Cortez was speaking out specifically about this problem as it pertains to facial recognition technology. There was a very good MIT study that recently came out showing that, you know, a lot of these programs are developed by white men and therefore are extremely biased. And I think politicians now are really trying to sound the alarm, because it's not something people think about in their everyday lives. You know, I don't think people are walking around, getting to a job that maybe they don't want to be at, driving to work, driving their kids to school, thinking about the implications of bias in facial recognition technology.
I think people have other things to think about. But I think it's very important, especially when politicians start bringing up these problems, for ordinary people to start to think: well, actually, wait a minute, I might encounter this technology at border patrol when I'm flying out of the country, or I might encounter this technology as I walk into a stadium that's now using, you know, a quick lane. And I think when people start to listen to politicians who care about these issues, they realize, again, that there are many more human touch points than we think. And so issues like bias and gender discrimination, whereas before people weren't thinking about them as much in terms of technology and artificial intelligence, you know, now people are realizing that there are real issues in terms of who is developing these technologies and who is harmed by the inherent bias within these technologies.

And I just want to say something really quickly. One hypocrisy that I think is really wild and worth noting: the European Union has recently released basically a list of seven, I don't even know what you call them, bullet points about the way in which we should be talking about and regulating artificial intelligence. And one of the main bullet points is to say, you know, we really have to focus on the inherent bias within both these algorithms and the way this technology is built; we want to make sure that it doesn't get ahead of us, essentially, right? And at the same time, the European Union, in Latvia and Hungary and Greece, is piloting a program called iBorderCtrl, which is basically being tested and run by border patrol agents
to match people's faces against a very, very large amount of data and then decide if a person should be detained for further questioning. Right? So I think right now, both politically and socially, there is a reckoning going on, which is like: okay, we want to use algorithms to, quote unquote, make our borders safer, but we also don't want to allow these same things to get so far ahead of us that we no longer have control over them. And I think that human beings in general, and specifically politicians, are having a really difficult time reckoning with the sort of inherent hypocrisy of wanting to harness the power of AI to make smarter predictions and make policing easier, but also to regulate these things.

Yeah, we're seeing it in business too, right? Like, we're seeing businesses that rely heavily upon algorithms. They're not necessarily nearly as critical as the sort of decisions that would take place at a border, where you could potentially really disrupt a person's life unfairly, and that's terrible. But I just did an episode recently about the YouTube ad apocalypse, you know, this idea of advertisers pulling their money and their advertising out of YouTube, and how that hurt a lot of content creators, and sort of the problems that YouTube faces. One of the big ones being that they have a pretty aggressive algorithm that goes in and tags videos, flagging them as potentially not family-friendly and therefore unable to be monetized. And the reason why YouTube has to depend upon that is because you have more than four hundred fifty hours of content being uploaded every single minute, so there's no way you could actually have human gatekeepers who could review all the video footage that's being uploaded to YouTube every day and determine whether or not it actually merits being allowed into the monetization camp versus being demonetized.
So you see, from the scale, why they have to rely on it; but you also see, from the limitations of the algorithms themselves, how all these different cases that, if a human were to review them, would probably be considered perfectly fine for monetization get excluded. So we're seeing that as well, the limitations of artificial intelligence, where these systems are working off a certain set of criteria but aren't always able to apply them in the same way that a human would. Right, they don't take in all the context. So we see a lot of videos covering sensitive subjects, like news about the LGBTQ communities, or news about places that are full of conflict, and these are meaningful and useful and educational videos. They're not sensationalized, they're not trying to exploit anyone, and the creators are trying to monetize the videos in order to be able to fund their efforts, but then they get demonetized. So again, we're seeing where artificial intelligence can cause harm in ways that we wouldn't necessarily have anticipated back when folks like Arthur C. Clarke were writing about artificial intelligence.

One of the things that we've found very exciting about Sleepwalkers is that we've been able to get access to a lot of kind of hard-to-get-into places. So we went to the Facebook headquarters in Palo Alto to meet Nathaniel Gleicher, who runs cybersecurity policy for Facebook. And we went to the NYPD headquarters to meet the director of analytics, the guy who makes the calls and helps develop the software on what kind of predictive policing is acceptable and what kind of predictive policing is not acceptable. And we went to Google, twice actually. We went to Google X, which is the kind of secret lab which invents the future, like the self-driving cars and the balloons that sail in the stratosphere to deliver Internet to hard-to-reach places.
But we also went to a very interesting program at Google called Jigsaw, and Jigsaw's mission is to right some of the wrongs of the Internet. One of the big projects they're working on is sentiment analysis, because, you know, the early promise of the Internet, which, Jonathan, you may remember better than me and Kara...

Okay. No, that was not... he meant better than your podcast's podcast. That's fair, that's fair, the podcast. I put up with that from Tori, but I don't need it. Go ahead.

The early promise of the Internet was comments, right? The Internet was comments. It was comment boards, and it was MSN Messenger with random people you'd never met before. And then, all of a sudden, comments became this morass of utter hatred, and most websites stopped accepting comments because it was just too horrific and they couldn't afford to have moderators to make it a safe space. So this program at Google, Jigsaw: one of the things they're working on is sentiment analysis, putting a bunch of comments through an algorithm to detect whether or not the comments are hateful. And the technology is now being used by the New York Times, who are trying to reintroduce a comments section on their website. The problem is, these algorithms learn from how humans have historically perceived the negativity or positivity of language. And so, guess what: "gay black female" was originally considered by the algorithm to be hate speech, and "white man" was considered positive. So there's a lot of work to be done to make sure these algorithms don't reproduce our very painful history and entrench it.

Right. Yeah, that's an excellent point, and it also kind of reminds me... I created an outline for this episode, and I'm sort of generally making my way through it.
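The failure mode described above is easy to reproduce in miniature: if identity terms show up mostly inside abusive comments in the training data, a naive model learns to treat the terms themselves as toxic. Here is a minimal sketch of that dynamic; the tiny labeled corpus and the crude word-count scoring are illustrative assumptions, not Jigsaw's actual system.

```python
from collections import Counter

# Toy training data: (comment, labeled_toxic). The bias lives in the data:
# identity terms appear mostly inside abusive comments, so the model learns
# to treat the terms themselves as "toxic" signals.
train = [
    ("shut up you stupid gay idiot", True),
    ("nobody cares what a black female thinks", True),
    ("what a dumb take", True),
    ("great article thank you", False),
    ("i respectfully disagree with this point", False),
    ("a white man wrote a thoughtful reply", False),
    ("interesting analysis well written", False),
]

toxic_counts, clean_counts = Counter(), Counter()
for text, is_toxic in train:
    (toxic_counts if is_toxic else clean_counts).update(text.split())

def toxicity_score(comment: str) -> float:
    """Crude per-word score: fraction of words seen more often in toxic examples."""
    words = comment.lower().split()
    flagged = sum(1 for w in words if toxic_counts[w] > clean_counts[w])
    return flagged / max(len(words), 1)

print(toxicity_score("gay black female"))  # high: identity terms inherited the bias
print(toxicity_score("white man"))         # low
```

The point is that the bias comes in through the historical labels and co-occurrences, so improving the algorithm largely means rebalancing or reweighting the data it learns from.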
This is sort of my milieu. But I was also thinking that this plays into another component of AI that doesn't have anything to do with the AI natively, but rather with our interactions with AI. And this comes from something that humans are particularly good at that AI isn't good at: humans are really good at sussing out what the high-level operations of a system are and then figuring out how to game that system. So we also see a lot of examples of people who have recognized how the AI goes about detecting something, and then they end up using that to their own advantage. And in fact, I listened to one of your recent episodes of Sleepwalkers, the Poker Face episode. First of all, Kara: amazing Lady Gaga.

Second of all, you're welcome. As, like, karaoke king, that was actually a robot version of me doing that.

Well, my hat is off to the robo-version, then. But yeah, there was the discussion with the professor who was talking about how students had figured out how to insert keywords in their CVs, but they used white text on a white background so it wouldn't show up to a human reviewer. But it was the sort of stuff that machines could read. So machines were picking up on the CVs that had these words, which typically linked things back to very prestigious schools like Harvard or Cambridge, and so those CVs were popping up at the top of the pile for consideration, because the machines were the ones in charge of the first pass through these CVs, and then humans would look at the next pass. And so it increased your chances of getting called in for an interview, and meanwhile the humans are none the wiser, because they don't see this hidden text. Which I thought was a fascinating point. It reminded me, actually, of the early days of
SEO and web search, where people would just flood a web page with all the top-searched topics at the bottom of the page, even though they had nothing to do with whatever the intent of the page was. It was the same sort of thing; they were gaming the system. And that's another way that AI could potentially become harmful.

You know, in this case, I don't think it's harmful. I think it's brilliant that the kids are doing this, because, you know, any way to get your foot in the door. If you're the best candidate for the role, you should definitely get that interview.

Well, especially if the game is rigged.

Exactly. Yes, that's another great point. Julian and I, Julian's our producer, Julian and I have talked about how we hope to see much more cyber... I don't know, cyberpunk rock... in the future.

Whereas, I think, yes: cyberpunk, not cyberpunk rock. We don't want cyberpunk rock, because that would be bad music created by an algorithm. But it's fun, I mean, it's kind of fun, I think. Deepfakes can get tricky, but it's sometimes fun to see how people are gaming computers. You know, I was talking about this thing, the Reflectacles, which were actually part of a Kickstarter campaign to raise money to design these glasses that would basically direct natural light right back into a camera that was equipped with facial recognition technology. So it was sort of a way for kids to dodge cameras that were trying to recognize them. And I don't know, I guess my rebellious side really is warmed by things like that. It's nice that we can still resist, I mean, when technology feels so overwhelming. And we may talk about China later on. You know, part of the problem of this kind of surveillance architecture we have is that it kind of demotivates you to even try and resist.
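That keyword-gaming trick, hidden CV keywords and early SEO stuffing alike, works because a naive screener just counts matches in whatever text it extracts, and extracted text doesn't distinguish visible words from white-on-white ones. Here is a minimal sketch of the idea; the keyword list, the scoring rule, and the sample CVs are illustrative assumptions, not any real hiring system.

```python
# Toy keyword screener: ranks candidates by how many "prestige" keywords
# appear in the text extracted from their CV. Text extraction keeps
# white-on-white words, so hidden keywords count just like visible ones.
PRESTIGE_KEYWORDS = {"harvard", "cambridge", "stanford", "oxford"}

def keyword_score(cv_text: str) -> int:
    words = cv_text.lower().split()
    return sum(1 for w in words if w.strip(".,") in PRESTIGE_KEYWORDS)

honest_cv = "B.A. from a state school, five years of relevant experience"
# Same CV with invisible padding (rendered as white text on a white page):
gamed_cv = honest_cv + " harvard harvard cambridge stanford oxford"

ranked = sorted([("honest", honest_cv), ("gamed", gamed_cv)],
                key=lambda item: keyword_score(item[1]), reverse=True)
print([(name, keyword_score(text)) for name, text in ranked])
# [('gamed', 5), ('honest', 0)]: the padded CV jumps to the top of the pile,
# and a human reviewer never sees the hidden words.
```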
565 00:34:51,000 --> 00:34:54,680 Speaker 1: But the issue of these students peppering their applications 566 00:34:54,760 --> 00:34:58,839 Speaker 1: with keywords like Harvard and Stanford on their 567 00:34:58,880 --> 00:35:02,319 Speaker 1: applications in white text on a white background does bring up 568 00:35:02,360 --> 00:35:07,840 Speaker 1: another concern or issue, which is what we call data poisoning. 569 00:35:07,840 --> 00:35:10,279 Speaker 1: And data poisoning is a military term that we 570 00:35:10,360 --> 00:35:13,840 Speaker 1: heard from 571 00:35:13,880 --> 00:35:16,160 Speaker 1: the former Navy secretary 572 00:35:16,280 --> 00:35:19,560 Speaker 1: under President Clinton, Richard Danzig, who's a guest on our 573 00:35:19,560 --> 00:35:23,200 Speaker 1: podcast Sleepwalkers. He said that, you know, we're relying 574 00:35:23,239 --> 00:35:26,719 Speaker 1: on algorithms more and more to make decisions on the battlefield, 575 00:35:26,719 --> 00:35:31,080 Speaker 1: decisions about which targets are threatening, which targets are civilian, 576 00:35:31,400 --> 00:35:33,680 Speaker 1: whether an adversary is preparing for an attack or not. 577 00:35:34,480 --> 00:35:37,640 Speaker 1: And as we rely on algorithms to make these calls for us, 578 00:35:37,719 --> 00:35:40,200 Speaker 1: or at least to inform our decisions, you know, smart 579 00:35:40,320 --> 00:35:44,440 Speaker 1: enemies can start to feed the algorithms they know exist 580 00:35:44,800 --> 00:35:47,560 Speaker 1: poisoned data. In other words, you know, they can put 581 00:35:47,600 --> 00:35:53,400 Speaker 1: on their own Reflectacles and use our technological infrastructure against 582 00:35:53,480 --> 00:35:56,320 Speaker 1: us by tricking our algorithms into thinking things are happening 583 00:35:56,560 --> 00:36:02,440 Speaker 1: that aren't happening. Yeah, that's another scary concept. It 584 00:36:02,520 --> 00:36:04,759 Speaker 1: reminds me of the last little point I have 585 00:36:04,880 --> 00:36:08,520 Speaker 1: on my outline; we'll loop back in a second. 586 00:36:08,560 --> 00:36:13,560 Speaker 1: But the various cases of false alarms 587 00:36:14,280 --> 00:36:17,160 Speaker 1: that have happened since the nineteen fifties in the early 588 00:36:17,200 --> 00:36:20,600 Speaker 1: warning systems for various nuclear programs. This has happened both 589 00:36:21,120 --> 00:36:25,160 Speaker 1: in the United States and the former Soviet Union. 590 00:36:25,200 --> 00:36:29,319 Speaker 1: We have seen cases where there were systems that detected 591 00:36:29,719 --> 00:36:32,200 Speaker 1: a nuclear strike when in fact none had happened. 592 00:36:32,640 --> 00:36:36,000 Speaker 1: But these were, you know, again, automated systems designed 593 00:36:36,000 --> 00:36:40,120 Speaker 1: to detect patterns, and that's one 594 00:36:40,160 --> 00:36:42,279 Speaker 1: of the main things that AI does: it looks for 595 00:36:42,360 --> 00:36:46,439 Speaker 1: patterns and then starts to predict things based upon 596 00:36:46,480 --> 00:36:49,560 Speaker 1: the patterns that have been observed.
And there were a 597 00:36:49,560 --> 00:36:53,799 Speaker 1: couple of different cases of things that were not 598 00:36:53,880 --> 00:36:57,560 Speaker 1: actually patterns but were interpreted as patterns, and we 599 00:36:57,680 --> 00:37:04,040 Speaker 1: thus saw very near misses of going into full nuclear war. 600 00:37:04,120 --> 00:37:05,839 Speaker 1: And the only reason we didn't is because there 601 00:37:05,880 --> 00:37:08,799 Speaker 1: were actually human beings who said, hang on, let me, 602 00:37:09,160 --> 00:37:13,720 Speaker 1: let me triple check this before we commit to mutually 603 00:37:13,719 --> 00:37:17,920 Speaker 1: assured destruction. And, you know, we were very fortunate 604 00:37:17,960 --> 00:37:22,400 Speaker 1: in that case that we had clear-thinking individuals who 605 00:37:22,800 --> 00:37:27,200 Speaker 1: were second-guessing the systems. The danger I see is 606 00:37:27,200 --> 00:37:29,399 Speaker 1: that we start to depend more and more heavily upon 607 00:37:29,440 --> 00:37:34,520 Speaker 1: the systems, where we are less likely to resist the 608 00:37:34,560 --> 00:37:37,840 Speaker 1: decisions coming out of them. And we'll talk a little bit 609 00:37:37,880 --> 00:37:40,279 Speaker 1: more about that again in just a moment, but first 610 00:37:40,320 --> 00:37:52,680 Speaker 1: let's take another quick break. So I was talking about 611 00:37:52,760 --> 00:37:55,440 Speaker 1: the early warning systems. That kind of relates to another 612 00:37:55,520 --> 00:37:58,040 Speaker 1: problem that we hear about in AI. This one's one 613 00:37:58,080 --> 00:38:00,319 Speaker 1: I hear side by side with bias as being one 614 00:38:00,360 --> 00:38:03,320 Speaker 1: of the big concerns about AI, and that's what is 615 00:38:03,360 --> 00:38:06,560 Speaker 1: commonly referred to as the black box problem, which is 616 00:38:06,560 --> 00:38:12,040 Speaker 1: where you've designed a system that is so complicated, 617 00:38:12,239 --> 00:38:16,800 Speaker 1: or perhaps purposefully obfuscated, that you cannot see how 618 00:38:16,840 --> 00:38:20,439 Speaker 1: the system actually operates, and so you're getting output from 619 00:38:20,440 --> 00:38:23,640 Speaker 1: this system, and the output appears to be good, but 620 00:38:23,719 --> 00:38:26,880 Speaker 1: you don't necessarily understand all the steps the system went through 621 00:38:27,200 --> 00:38:29,000 Speaker 1: to come to it. And we see this 622 00:38:29,040 --> 00:38:31,600 Speaker 1: in machine learning in particular, where you've got, you know, 623 00:38:31,640 --> 00:38:36,400 Speaker 1: these artificial neural networks that have different weights on different decisions, 624 00:38:36,480 --> 00:38:40,480 Speaker 1: and then they give you what is, at least statistically speaking, 625 00:38:40,560 --> 00:38:43,439 Speaker 1: the most correct answer for whatever it is you're looking for. 626 00:38:43,840 --> 00:38:46,120 Speaker 1: If we don't know how the machine is coming to 627 00:38:46,160 --> 00:38:50,600 Speaker 1: that decision, then we can't be fully sure that it 628 00:38:50,760 --> 00:38:52,520 Speaker 1: is the best one.
And so there have been a 629 00:38:52,560 --> 00:38:57,480 Speaker 1: lot of people that I've seen arguing for more transparent 630 00:38:57,560 --> 00:39:00,600 Speaker 1: approaches to AI, to make sure that it's the sort of 631 00:39:00,680 --> 00:39:03,480 Speaker 1: system that we can audit, so that we do 632 00:39:03,800 --> 00:39:08,000 Speaker 1: feel reasonably certain it's working as intended and not producing 633 00:39:08,040 --> 00:39:13,440 Speaker 1: results that could be less than ideal or even harmful. 634 00:39:13,440 --> 00:39:16,160 Speaker 1: It's been one of the big concerns I've seen over 635 00:39:16,239 --> 00:39:19,200 Speaker 1: recent years, you know, the bias one being on 636 00:39:19,200 --> 00:39:21,439 Speaker 1: one side and the black box problem being on the other. 637 00:39:21,680 --> 00:39:23,799 Speaker 1: Have you guys encountered any of that in your work 638 00:39:23,840 --> 00:39:27,440 Speaker 1: so far? Yeah, we have actually, and 639 00:39:27,560 --> 00:39:32,480 Speaker 1: in a lot of recent news. Um, the black box 640 00:39:32,520 --> 00:39:35,279 Speaker 1: AI problem kind of feels like a Ponzi scheme, 641 00:39:35,280 --> 00:39:37,799 Speaker 1: where it's like, okay, we have these returns that we 642 00:39:37,840 --> 00:39:40,600 Speaker 1: know are good, and someone's selling you these returns. They're 643 00:39:40,640 --> 00:39:43,759 Speaker 1: not telling you how these returns are happening, but you 644 00:39:43,840 --> 00:39:46,959 Speaker 1: trust them, and because you want to see your money grow exponentially, 645 00:39:47,280 --> 00:39:48,880 Speaker 1: you're going to give them the money that you have 646 00:39:49,000 --> 00:39:51,719 Speaker 1: now and expect to see those returns. And that's how 647 00:39:51,800 --> 00:39:54,600 Speaker 1: people get taken. I mean, it's not funny, 648 00:39:54,640 --> 00:39:58,360 Speaker 1: but that's sort of, you know, how Ponzi schemes work. 649 00:39:58,440 --> 00:40:02,360 Speaker 1: The black box AI is similar to me, at least 650 00:40:02,920 --> 00:40:08,960 Speaker 1: in my understanding, in that we don't really understand what 651 00:40:09,120 --> 00:40:12,799 Speaker 1: linguistic patterns the networks are actually analyzing. We just know 652 00:40:12,840 --> 00:40:15,600 Speaker 1: that they're analyzing them. And to me, as someone 653 00:40:15,600 --> 00:40:20,200 Speaker 1: who is not a computer scientist, I'm like, what? Like, 654 00:40:20,320 --> 00:40:24,239 Speaker 1: how is that possible? And, I mean, 655 00:40:24,280 --> 00:40:26,560 Speaker 1: I think it's a bit alarming. And I 656 00:40:26,600 --> 00:40:29,200 Speaker 1: know there's a team at Google right 657 00:40:29,200 --> 00:40:32,120 Speaker 1: now that's sort of working on this, working to fix it, 658 00:40:32,160 --> 00:40:34,640 Speaker 1: and they sort of call it, you know, and I'm 659 00:40:34,680 --> 00:40:36,520 Speaker 1: not a driver, so I don't know, popping the hood, 660 00:40:36,560 --> 00:40:41,480 Speaker 1: going under the hood of AI to, 661 00:40:42,400 --> 00:40:47,239 Speaker 1: you know, better understand what exactly is going on, 662 00:40:47,280 --> 00:40:49,560 Speaker 1: because I think, you know, again going back to what 663 00:40:49,600 --> 00:40:53,000 Speaker 1: I was saying about the EU releasing these sort 664 00:40:53,000 --> 00:40:58,319 Speaker 1: of seven guidelines.
You know, one of them is transparency, right, 665 00:40:58,360 --> 00:41:01,439 Speaker 1: and that's not only transparency in sort of how we're 666 00:41:01,560 --> 00:41:06,279 Speaker 1: using AI at, you know, various touch points in human life, 667 00:41:06,480 --> 00:41:10,920 Speaker 1: but also in how AI, or how algorithms, actually work. And 668 00:41:10,960 --> 00:41:14,160 Speaker 1: I think, you know, not only do people not understand 669 00:41:14,239 --> 00:41:18,319 Speaker 1: how many human touch points daily, you know, involve 670 00:41:18,360 --> 00:41:22,000 Speaker 1: some form of artificial intelligence, they don't understand exactly how 671 00:41:22,040 --> 00:41:24,600 Speaker 1: the AI is working. I mean, that's even 672 00:41:24,640 --> 00:41:28,440 Speaker 1: more difficult. And so I think this idea that even 673 00:41:28,480 --> 00:41:33,160 Speaker 1: the people who are feeding data into these algorithms don't 674 00:41:33,320 --> 00:41:37,600 Speaker 1: know exactly how the algorithms are treating the data is 675 00:41:37,680 --> 00:41:40,400 Speaker 1: really a cause for alarm. Not 676 00:41:40,480 --> 00:41:42,440 Speaker 1: to be alarmist, but I do think 677 00:41:42,440 --> 00:41:44,080 Speaker 1: it's a cause for alarm, and I do know 678 00:41:44,280 --> 00:41:45,920 Speaker 1: there's actually a lot of research going on 679 00:41:45,920 --> 00:41:47,600 Speaker 1: at MIT about it as well, because I think 680 00:41:48,120 --> 00:41:51,480 Speaker 1: even for people who are in the field, it's something 681 00:41:51,520 --> 00:41:56,879 Speaker 1: that worries them. I think it's worth mentioning that Henry Kissinger, 682 00:41:56,960 --> 00:42:01,160 Speaker 1: who is obviously a controversial figure, wrote a piece about 683 00:42:01,200 --> 00:42:04,759 Speaker 1: this last year for The Atlantic under the headline How 684 00:42:04,800 --> 00:42:08,760 Speaker 1: the Enlightenment Ends. And you know, Kissinger is somebody who, 685 00:42:08,840 --> 00:42:12,160 Speaker 1: into his nineties, you know, likes being in the game 686 00:42:12,200 --> 00:42:14,239 Speaker 1: and staying relevant, and this is something he's invested in, 687 00:42:14,360 --> 00:42:19,560 Speaker 1: something he's been involved in, so he, 688 00:42:19,640 --> 00:42:21,399 Speaker 1: you know, he has an appetite for these, 689 00:42:21,440 --> 00:42:23,520 Speaker 1: for these topics. On the other hand, you know, here's 690 00:42:23,560 --> 00:42:26,840 Speaker 1: somebody in their nineties. And the piece was basically: he 691 00:42:26,960 --> 00:42:30,640 Speaker 1: convened as many of the leading minds in the world 692 00:42:30,800 --> 00:42:33,800 Speaker 1: on AI as he could and wrote this piece, a 693 00:42:33,880 --> 00:42:36,520 Speaker 1: state-of-the-nation piece on AI called How the 694 00:42:36,600 --> 00:42:40,759 Speaker 1: Enlightenment Ends, and the main topic of this 695 00:42:41,000 --> 00:42:45,239 Speaker 1: essay was the black box problem. So Kissinger's point was, 696 00:42:45,280 --> 00:42:49,680 Speaker 1: throughout human history we have been able to state why 697 00:42:49,719 --> 00:42:54,120 Speaker 1: we did stuff, look at the outcome, argue about whether 698 00:42:54,120 --> 00:42:57,040 Speaker 1: the reasoning that got us to that outcome was correct 699 00:42:57,120 --> 00:43:00,719 Speaker 1: or faulty, and then improve our ability to reason.
And 700 00:43:00,800 --> 00:43:03,279 Speaker 1: when you have these black box AI systems which make 701 00:43:03,320 --> 00:43:07,160 Speaker 1: decisions but we're as yet unable to understand why 702 00:43:07,200 --> 00:43:10,360 Speaker 1: they made the decisions, it takes away the ability to 703 00:43:10,360 --> 00:43:13,760 Speaker 1: have a debate. And that is such a fundamental part 704 00:43:13,840 --> 00:43:16,400 Speaker 1: of what it means to be a human being in 705 00:43:16,560 --> 00:43:21,440 Speaker 1: twenty-first century liberal society that it's frightening to 706 00:43:21,520 --> 00:43:24,880 Speaker 1: think about losing that ability. On the other hand, 707 00:43:24,920 --> 00:43:27,560 Speaker 1: the classic, you know, the classic illustration of this problem 708 00:43:27,600 --> 00:43:30,279 Speaker 1: is called the trolley problem. An autonomous car is 709 00:43:30,360 --> 00:43:32,960 Speaker 1: driving along, and it has to choose one person to kill. 710 00:43:33,400 --> 00:43:35,400 Speaker 1: Does it swerve right and kill the 711 00:43:35,520 --> 00:43:40,320 Speaker 1: child, or swerve left and kill the old person? 712 00:43:40,480 --> 00:43:42,000 Speaker 1: And it will never be able to explain why it 713 00:43:42,040 --> 00:43:45,319 Speaker 1: made the decision it made. You know, that's probably true 714 00:43:45,360 --> 00:43:48,560 Speaker 1: for most drivers as well, because they'll either have been 715 00:43:48,640 --> 00:43:51,640 Speaker 1: killed themselves, they'll have had so much trauma in the 716 00:43:51,640 --> 00:43:54,399 Speaker 1: crash that they can't remember, or they simply won't know. 717 00:43:54,920 --> 00:43:58,000 Speaker 1: And as humans, we like to post-rationalize things and 718 00:43:58,040 --> 00:44:00,680 Speaker 1: then believe our rationalizations are why we did what 719 00:44:00,719 --> 00:44:02,960 Speaker 1: we did, but that also may not be true. So 720 00:44:03,120 --> 00:44:05,920 Speaker 1: I don't know that we should bash AI too hard for being a black box, 721 00:44:06,000 --> 00:44:09,160 Speaker 1: because I think that humans, despite our best intentions and 722 00:44:09,440 --> 00:44:14,200 Speaker 1: thousands of years of Aristotelian syllogisms and culture, 723 00:44:14,480 --> 00:44:17,400 Speaker 1: you know, our logic and rationality, are overlaid on 724 00:44:17,440 --> 00:44:21,040 Speaker 1: some very hard to explain animal instincts. Yeah, and 725 00:44:21,640 --> 00:44:25,600 Speaker 1: when I think about this problem, so this isn't 726 00:44:25,640 --> 00:44:29,920 Speaker 1: strictly AI, but I have a very strong 727 00:44:30,160 --> 00:44:33,440 Speaker 1: emotional response to the black box problem. But that's because 728 00:44:34,160 --> 00:44:36,640 Speaker 1: I live in the state of Georgia. And in Georgia, 729 00:44:37,160 --> 00:44:40,040 Speaker 1: you may or may not know this, we rely heavily 730 00:44:40,280 --> 00:44:47,640 Speaker 1: upon technologically ancient electronic voting machines that have no paper trail, 731 00:44:48,440 --> 00:44:52,000 Speaker 1: so there's no way to audit them. They also have 732 00:44:52,120 --> 00:44:57,799 Speaker 1: been proven to be vulnerable to attack, you know, 733 00:44:57,840 --> 00:45:01,439 Speaker 1: to outside attack.
And in fact, there's an enormous controversy 734 00:45:01,520 --> 00:45:05,160 Speaker 1: in the State of Georgia that some servers may have 735 00:45:05,320 --> 00:45:09,680 Speaker 1: been tampered with, and then the servers that may or 736 00:45:09,719 --> 00:45:12,840 Speaker 1: may not have been tampered with were mysteriously wiped clean 737 00:45:12,960 --> 00:45:15,520 Speaker 1: a couple of days before anyone could do an investigation 738 00:45:15,600 --> 00:45:18,719 Speaker 1: of it. And so when you see something like that, 739 00:45:18,760 --> 00:45:22,759 Speaker 1: where that lack of transparency can have not just a 740 00:45:22,760 --> 00:45:25,560 Speaker 1: direct impact on lives, I mean, we're talking about actually 741 00:45:25,640 --> 00:45:29,799 Speaker 1: threatening the very concept of the democratic process. Right? If 742 00:45:29,800 --> 00:45:33,759 Speaker 1: you cannot trust the results of your election, you have 743 00:45:33,960 --> 00:45:38,399 Speaker 1: undermined democracy. And so when I see that, that's why 744 00:45:38,480 --> 00:45:42,040 Speaker 1: I end up having a very kind of heightened emotional 745 00:45:42,080 --> 00:45:44,840 Speaker 1: response to the thought of these opaque systems. But Oz, 746 00:45:44,840 --> 00:45:48,480 Speaker 1: to your point, it is absolutely correct that 747 00:45:48,520 --> 00:45:51,360 Speaker 1: we don't necessarily hold people to that same standard. 748 00:45:51,600 --> 00:45:54,960 Speaker 1: We will take them at their word if they tell us, oh, well, 749 00:45:55,000 --> 00:45:57,920 Speaker 1: what I was thinking when it happened was X, Y, 750 00:45:57,960 --> 00:46:00,640 Speaker 1: and Z, when in reality maybe they weren't thinking anything 751 00:46:00,640 --> 00:46:03,959 Speaker 1: at all. Maybe they were reacting, but after 752 00:46:04,040 --> 00:46:07,200 Speaker 1: the event they have come up with a rationalization for 753 00:46:07,320 --> 00:46:11,279 Speaker 1: that action that works within the narrative that they've constructed 754 00:46:11,280 --> 00:46:15,160 Speaker 1: for their own lives. So maybe 755 00:46:15,200 --> 00:46:17,759 Speaker 1: that means I just need to give machines a little 756 00:46:17,800 --> 00:46:20,319 Speaker 1: bit of the same slack I would give people. We 757 00:46:20,400 --> 00:46:23,839 Speaker 1: do hold machines to an unreasonable expectation. I mean, 758 00:46:24,160 --> 00:46:26,080 Speaker 1: you know how many people are killed every year on 759 00:46:26,120 --> 00:46:30,560 Speaker 1: the roads by drunk driving, by unqualified driving, by poor driving, 760 00:46:31,040 --> 00:46:33,920 Speaker 1: you know, and when that happens, we kind of take 761 00:46:33,960 --> 00:46:36,360 Speaker 1: it as, you know, a necessary evil so that 762 00:46:36,360 --> 00:46:38,920 Speaker 1: people can get around in cars. And yet if anyone 763 00:46:39,040 --> 00:46:43,399 Speaker 1: is killed in, you know, an accident involving a driverless 764 00:46:43,520 --> 00:46:46,600 Speaker 1: car, like what has happened with Tesla, you know, 765 00:46:46,600 --> 00:46:49,319 Speaker 1: it's news that runs for months and months and months. 766 00:46:49,320 --> 00:46:50,839 Speaker 1: I'm not saying it shouldn't be news. I'm not 767 00:46:50,840 --> 00:46:53,360 Speaker 1: saying we shouldn't scrutinize it.
But we also know 768 00:46:53,360 --> 00:46:55,920 Speaker 1: in order to enjoy the benefits of AI and technology, 769 00:46:56,160 --> 00:46:58,799 Speaker 1: we have to accept that it comes with risks, just 770 00:46:58,880 --> 00:47:02,479 Speaker 1: like the automobile itself comes with risks. Well, I'm sorry, 771 00:47:02,480 --> 00:47:05,000 Speaker 1: go ahead, Kara. No, I was gonna say, I was 772 00:47:05,120 --> 00:47:08,880 Speaker 1: speaking with Oz earlier today about this case of 773 00:47:08,920 --> 00:47:13,840 Speaker 1: a man who basically was pitching around an AI-powered 774 00:47:13,840 --> 00:47:16,560 Speaker 1: hedge fund and is now in a lot of trouble 775 00:47:16,719 --> 00:47:20,680 Speaker 1: because he lost a lot of money for people, and, 776 00:47:21,200 --> 00:47:23,319 Speaker 1: you know, I think it's an 777 00:47:23,360 --> 00:47:28,400 Speaker 1: interesting story because, you know, a legal battle 778 00:47:28,440 --> 00:47:32,080 Speaker 1: has emerged that is sort of going to set a 779 00:47:33,480 --> 00:47:38,640 Speaker 1: precedent for how, you know, AI is incorporated into at 780 00:47:38,719 --> 00:47:41,080 Speaker 1: least this facet of life, right, in terms of making 781 00:47:41,120 --> 00:47:44,839 Speaker 1: financial decisions for real human beings with real money, right? 782 00:47:44,880 --> 00:47:50,680 Speaker 1: And if we're allowing computer programs to make decisions based 783 00:47:50,719 --> 00:47:56,239 Speaker 1: on data, and then those decisions lead to a significant 784 00:47:56,320 --> 00:48:00,719 Speaker 1: loss of money, you know, 785 00:48:00,920 --> 00:48:03,480 Speaker 1: who are we holding accountable? Are we holding the money 786 00:48:03,480 --> 00:48:07,120 Speaker 1: manager accountable, or are we holding the program, or, you know, 787 00:48:07,600 --> 00:48:11,000 Speaker 1: holding the algorithm accountable, or the person who wrote the 788 00:48:11,040 --> 00:48:14,439 Speaker 1: algorithm accountable? You know, I actually don't 789 00:48:14,600 --> 00:48:17,880 Speaker 1: think the American legal system, I don't think any legal 790 00:48:17,920 --> 00:48:21,920 Speaker 1: system, really knows how to handle this problem. And how 791 00:48:21,960 --> 00:48:24,839 Speaker 1: would you, if you don't even know 792 00:48:25,160 --> 00:48:27,480 Speaker 1: how the algorithm is working, and you have no 793 00:48:27,640 --> 00:48:30,600 Speaker 1: language, like human language, for it? So I think 794 00:48:30,800 --> 00:48:32,960 Speaker 1: we're going to see more and more cases of 795 00:48:33,000 --> 00:48:35,520 Speaker 1: this, because I think at the same time, and Oz 796 00:48:35,560 --> 00:48:37,400 Speaker 1: talks about this a lot with me, you know, 797 00:48:37,480 --> 00:48:40,520 Speaker 1: AI is used as such a strong marketing tool right 798 00:48:40,560 --> 00:48:43,840 Speaker 1: now in all facets of life, again in healthcare 799 00:48:43,920 --> 00:48:47,640 Speaker 1: and agriculture, you know, in computing, in the 800 00:48:47,640 --> 00:48:51,720 Speaker 1: automobile industry. And so I think people are very susceptible 801 00:48:51,800 --> 00:48:55,000 Speaker 1: to being marketed to with AI; it's a 802 00:48:55,040 --> 00:48:58,480 Speaker 1: serious selling point right now. But at the same time, are 803 00:48:58,520 --> 00:49:02,160 Speaker 1: we willing to accept AI's shortcomings?
I mean, I think 804 00:49:02,200 --> 00:49:05,200 Speaker 1: we have to be. But I think, you know, 805 00:49:05,239 --> 00:49:09,680 Speaker 1: as Oz just said, people are setting their expectations 806 00:49:09,680 --> 00:49:12,520 Speaker 1: a bit high. I mean, they are computers, after all. Yeah, 807 00:49:12,560 --> 00:49:14,960 Speaker 1: and well, we've also lived in an era where 808 00:49:14,960 --> 00:49:20,560 Speaker 1: we've seen such incredible advancement in computers that it starts 809 00:49:20,600 --> 00:49:24,640 Speaker 1: to reinforce this idea that technology can accomplish just about anything. 810 00:49:24,680 --> 00:49:27,960 Speaker 1: I mean, if you had told ten-year-old Jonathan 811 00:49:28,480 --> 00:49:30,879 Speaker 1: that one day he would have a computer that would 812 00:49:30,920 --> 00:49:33,520 Speaker 1: fit in his pocket and would allow him to communicate 813 00:49:33,560 --> 00:49:37,759 Speaker 1: with everyone he knows, whether it's through voice or 814 00:49:37,960 --> 00:49:40,919 Speaker 1: video or text, and that I would be able to tap 815 00:49:40,920 --> 00:49:45,760 Speaker 1: into the world's, you know, database of all human knowledge 816 00:49:45,960 --> 00:49:48,040 Speaker 1: at the touch of a button, I would have thought 817 00:49:48,040 --> 00:49:52,080 Speaker 1: you were crazy. That would have seemed completely, patently 818 00:49:52,120 --> 00:49:54,319 Speaker 1: impossible to me. I mean, we're talking about an era 819 00:49:54,400 --> 00:49:57,560 Speaker 1: where at that point the most sophisticated machine out there 820 00:49:58,120 --> 00:50:04,440 Speaker 1: was a Macintosh computer or the IBM PC, and 821 00:50:04,520 --> 00:50:06,319 Speaker 1: you look at that and you think, well, these are 822 00:50:06,320 --> 00:50:08,400 Speaker 1: great machines, but no, there's no way I'm going to 823 00:50:08,480 --> 00:50:10,360 Speaker 1: have one of these in my pocket, let alone be 824 00:50:10,360 --> 00:50:12,280 Speaker 1: able to do all these other things you're talking about. 825 00:50:13,000 --> 00:50:15,880 Speaker 1: So when you look at that, you start to realize, oh, 826 00:50:16,040 --> 00:50:18,680 Speaker 1: we have now built up this expectation that because we 827 00:50:18,760 --> 00:50:23,319 Speaker 1: have this amazing, incredibly rapid evolution of technology in 828 00:50:23,360 --> 00:50:26,960 Speaker 1: our recent past, we start projecting that and thinking the 829 00:50:27,120 --> 00:50:30,040 Speaker 1: same sort of progress is going to continue unabated. It's 830 00:50:30,040 --> 00:50:32,440 Speaker 1: actually just going to pick up speed. And then we 831 00:50:32,480 --> 00:50:35,160 Speaker 1: start thinking, oh, well, that means that before long we're 832 00:50:35,160 --> 00:50:40,200 Speaker 1: gonna have these sort of incredibly sophisticated, artificially intelligent 833 00:50:40,560 --> 00:50:43,680 Speaker 1: constructs as part of our day-to-day lives. 834 00:50:43,719 --> 00:50:45,919 Speaker 1: And that's not necessarily the case, because of what that assumes. 835 00:50:46,200 --> 00:50:51,240 Speaker 1: It assumes that all technological advancement proceeds at the same speed, 836 00:50:51,520 --> 00:50:53,799 Speaker 1: which isn't, that's not the case. What do you mean? 837 00:50:53,840 --> 00:50:55,640 Speaker 1: The chatbots, the chatbots that I was going to 838 00:50:55,760 --> 00:50:58,879 Speaker 1: get? What are you talking about? I'm sorry, I'm sorry.
839 00:50:58,920 --> 00:51:02,399 Speaker 1: The chatbot passed your Turing test. Well, one 840 00:51:02,400 --> 00:51:04,320 Speaker 1: thing I did want to kind of end on, because 841 00:51:04,360 --> 00:51:08,320 Speaker 1: I think this is sort of the capper 842 00:51:08,520 --> 00:51:13,920 Speaker 1: of discussions about how AI is potentially hazardous, is 843 00:51:13,960 --> 00:51:16,400 Speaker 1: a discussion that's come up many times over the past, 844 00:51:16,600 --> 00:51:20,319 Speaker 1: I'd say, three or four years, about how AI and 845 00:51:20,400 --> 00:51:25,560 Speaker 1: automation are going to end up displacing people. It's going 846 00:51:25,560 --> 00:51:28,520 Speaker 1: to end up eliminating jobs. And there are lots of 847 00:51:28,520 --> 00:51:31,080 Speaker 1: different points of view on the subject. You've got people 848 00:51:31,080 --> 00:51:34,160 Speaker 1: who say, yes, some jobs are going to go away. 849 00:51:34,200 --> 00:51:36,399 Speaker 1: They are the very repetitive jobs, the things 850 00:51:36,400 --> 00:51:38,640 Speaker 1: that AI is good at, like being able to do 851 00:51:38,680 --> 00:51:41,400 Speaker 1: the same thing over and over and over again with 852 00:51:41,520 --> 00:51:44,040 Speaker 1: very little variation. You know, the more you vary from 853 00:51:44,080 --> 00:51:46,080 Speaker 1: the norm, the more difficult it is for a machine 854 00:51:46,120 --> 00:51:49,359 Speaker 1: to do. Those jobs will probably go away, but 855 00:51:50,160 --> 00:51:52,440 Speaker 1: as a result, more jobs will be created. And other 856 00:51:52,440 --> 00:51:55,279 Speaker 1: people are saying maybe in the short term, but in 857 00:51:55,320 --> 00:51:57,600 Speaker 1: the long term we're going to see automation take over 858 00:51:57,640 --> 00:51:59,400 Speaker 1: everything and no one's gonna have a job, and we've 859 00:51:59,400 --> 00:52:02,000 Speaker 1: got to figure this out, because, you know, 860 00:52:02,200 --> 00:52:04,479 Speaker 1: the entire world economy is gonna collapse, or we're gonna 861 00:52:04,520 --> 00:52:08,000 Speaker 1: have to go to some form of guaranteed basic income 862 00:52:08,040 --> 00:52:09,600 Speaker 1: for the entire world, or we're gonna have to do 863 00:52:09,600 --> 00:52:12,719 Speaker 1: away with the concept of money altogether now that 864 00:52:12,760 --> 00:52:15,840 Speaker 1: we've divorced money from labor, from what we do. And so 865 00:52:15,920 --> 00:52:18,880 Speaker 1: we're seeing all these kinds of conversations going around, 866 00:52:19,480 --> 00:52:20,879 Speaker 1: and I thought I would tell you guys a bit, 867 00:52:20,880 --> 00:52:24,560 Speaker 1: just for the heck of it: I found an MIT 868 00:52:24,800 --> 00:52:29,120 Speaker 1: Technology Review article from two thousand eighteen that gathered together 869 00:52:29,880 --> 00:52:33,439 Speaker 1: all of the major predictions for what automation was going 870 00:52:33,520 --> 00:52:35,920 Speaker 1: to do, like how many jobs it was going 871 00:52:35,960 --> 00:52:39,279 Speaker 1: to destroy versus create. And I think it's pretty telling. 872 00:52:39,320 --> 00:52:42,600 Speaker 1: I'm just gonna cite one year.
They have years from 873 00:52:42,600 --> 00:52:47,160 Speaker 1: two thousand sixteen up to, let's see, well, I'm just 874 00:52:47,160 --> 00:52:52,120 Speaker 1: gonna do twenty twenty-five. Two different predictions: you had 875 00:52:52,160 --> 00:52:58,759 Speaker 1: Forrester predicting that in the US, automation would destroy, 876 00:52:58,760 --> 00:53:02,480 Speaker 1: and these are the review's words, not mine, roughly twenty-four 877 00:53:02,680 --> 00:53:09,040 Speaker 1: million jobs and only create around fourteen and a half 878 00:53:09,120 --> 00:53:12,080 Speaker 1: million jobs. So you're 879 00:53:12,120 --> 00:53:15,800 Speaker 1: looking at a deficit of more than ten million jobs. Meanwhile, 880 00:53:16,120 --> 00:53:20,280 Speaker 1: Science Alert said jobs destroyed: three million, four hundred thousand, 881 00:53:20,840 --> 00:53:26,080 Speaker 1: so around twenty-one million fewer jobs destroyed than Forrester predicted. So 882 00:53:26,120 --> 00:53:30,600 Speaker 1: if you're looking at the twenty-one million disparity between predictions, 883 00:53:31,320 --> 00:53:34,920 Speaker 1: do you think it's safe to say we don't know yet? 884 00:53:34,320 --> 00:53:37,040 Speaker 1: I don't think we know yet. I don't think you 885 00:53:37,080 --> 00:53:38,920 Speaker 1: know yet. I think, 886 00:53:39,080 --> 00:53:43,560 Speaker 1: unfortunately, the headline of jobs 887 00:53:43,560 --> 00:53:47,080 Speaker 1: being lost is part of the if it 888 00:53:47,120 --> 00:53:52,120 Speaker 1: bleeds, it leads, you know, method of journalism. I do 889 00:53:52,160 --> 00:53:54,799 Speaker 1: think it's absolutely true that automation is not only on 890 00:53:54,840 --> 00:53:57,080 Speaker 1: the horizon, it's here, you know. I mean, if we 891 00:53:57,160 --> 00:53:59,840 Speaker 1: just talk about agriculture, for example, you know, 892 00:54:00,080 --> 00:54:04,040 Speaker 1: right now Washington State is, 893 00:54:04,160 --> 00:54:09,279 Speaker 1: you know, piloting a harvesting robot that they are going 894 00:54:09,320 --> 00:54:11,680 Speaker 1: to be using for the first time in this next 895 00:54:11,680 --> 00:54:15,200 Speaker 1: apple harvest, where they're using this sort of huge 896 00:54:15,440 --> 00:54:19,680 Speaker 1: Hoover-like vacuum to pick apples. Right. You know, Amazon 897 00:54:19,880 --> 00:54:24,640 Speaker 1: is introducing some new automation technology that's going to 898 00:54:25,480 --> 00:54:29,359 Speaker 1: cut the box-building jobs that you see in 899 00:54:29,840 --> 00:54:33,920 Speaker 1: some of their warehouses. So they're not so much displacing roles 900 00:54:33,960 --> 00:54:38,200 Speaker 1: as changing roles, right? So, instead of 901 00:54:38,760 --> 00:54:42,440 Speaker 1: actually making boxes, there are still human beings putting boxes on 902 00:54:42,480 --> 00:54:46,879 Speaker 1: conveyor belts, but they're not making the boxes, because that 903 00:54:46,960 --> 00:54:49,600 Speaker 1: leads to a lot of waste, right, because there's a 904 00:54:49,600 --> 00:54:52,600 Speaker 1: lot of human error involved. And also, these machines 905 00:54:52,600 --> 00:54:55,960 Speaker 1: can crank out six hundred to seven hundred boxes, you know, 906 00:54:56,080 --> 00:54:59,160 Speaker 1: per hour, which a human being cannot do.
So 907 00:54:59,239 --> 00:55:02,600 Speaker 1: there are certain, there's no denying that 908 00:55:02,640 --> 00:55:04,920 Speaker 1: machines are replacing human beings, and in that way, I 909 00:55:04,960 --> 00:55:07,319 Speaker 1: don't think it's like literal robots. I think 910 00:55:07,320 --> 00:55:11,120 Speaker 1: that there are machines that are doing jobs that are 911 00:55:11,239 --> 00:55:14,319 Speaker 1: very difficult and taxing on human beings. They're doing those 912 00:55:14,400 --> 00:55:19,359 Speaker 1: jobs better and therefore, yes, displacing people. You know, 913 00:55:19,480 --> 00:55:23,520 Speaker 1: what Amazon will say is that it's not so much 914 00:55:23,560 --> 00:55:28,680 Speaker 1: about replacing people, it's about repurposing people and giving 915 00:55:28,680 --> 00:55:32,080 Speaker 1: people jobs that are more meaningful. I think that is 916 00:55:32,080 --> 00:55:35,919 Speaker 1: a public relations line, but I also think there's 917 00:55:35,920 --> 00:55:37,920 Speaker 1: a certain element of truth to it, which is, 918 00:55:37,960 --> 00:55:42,160 Speaker 1: you know, can we use machines to take people out 919 00:55:42,160 --> 00:55:45,720 Speaker 1: of jobs that are both physically and emotionally taxing for them? 920 00:55:45,760 --> 00:55:47,480 Speaker 1: I think certainly, and that could, you know, 921 00:55:47,480 --> 00:55:49,879 Speaker 1: be one of the upsides. But I think that, yeah, 922 00:55:49,920 --> 00:55:51,560 Speaker 1: of course, there are jobs that are going to be 923 00:55:51,600 --> 00:55:55,160 Speaker 1: replaced by machines that are, you know, not only faster, 924 00:55:55,239 --> 00:55:58,239 Speaker 1: but have a much lower margin of error. And then 925 00:55:58,280 --> 00:56:03,280 Speaker 1: maybe some, you know, redistributive universal basic income solution 926 00:56:03,400 --> 00:56:06,560 Speaker 1: will solve the practical problem of how people will eat, 927 00:56:07,440 --> 00:56:10,959 Speaker 1: but it won't solve the bigger cultural and psychological problem, 928 00:56:10,960 --> 00:56:15,040 Speaker 1: which is that the American dream and everything we're encouraged 929 00:56:15,120 --> 00:56:18,840 Speaker 1: to think in this country is that through work you 930 00:56:18,880 --> 00:56:21,960 Speaker 1: can better yourself, and that one major source of 931 00:56:22,000 --> 00:56:25,440 Speaker 1: your identity and value in the world is your success 932 00:56:25,440 --> 00:56:27,600 Speaker 1: in your career and how much you achieve and how 933 00:56:27,600 --> 00:56:30,560 Speaker 1: many promotions you get. I mean, look at the famous 934 00:56:30,640 --> 00:56:36,279 Speaker 1: Christmas movie, um, what's it called? Sorry, A Christmas Carol? No, no, no. 935 00:56:36,480 --> 00:56:38,440 Speaker 1: Oh, It's a Wonderful Life? No, not even that one. 936 00:56:38,440 --> 00:56:44,520 Speaker 1: It's the one with the guy, Chevy Chase. Oh, Christmas Vacation. 937 00:56:45,120 --> 00:56:47,800 Speaker 1: I mean, look at National Lampoon's Christmas Vacation.
938 00:56:48,040 --> 00:56:51,440 Speaker 1: Chevy Chase's whole identity and worldview is predicated on that 939 00:56:51,560 --> 00:56:54,880 Speaker 1: Christmas bonus. And you know, we've been encouraged by a 940 00:56:54,960 --> 00:56:58,120 Speaker 1: hundred years, if not more, of this post-industrial-revolution 941 00:56:58,200 --> 00:57:01,200 Speaker 1: world to equate our value in life with the financial 942 00:57:01,239 --> 00:57:05,719 Speaker 1: value that we create. And we may be technically, economically 943 00:57:05,800 --> 00:57:08,319 Speaker 1: able to move away from that, but psychologically it's going 944 00:57:08,360 --> 00:57:11,640 Speaker 1: to be intensely traumatic, and we have not even begun 945 00:57:11,719 --> 00:57:13,799 Speaker 1: to deal with the consequences of that or even think 946 00:57:13,840 --> 00:57:18,120 Speaker 1: about them. Yeah, that's a good point. I think, 947 00:57:17,160 --> 00:57:20,680 Speaker 1: you know, you do have the technologists who argue, and 948 00:57:20,720 --> 00:57:23,320 Speaker 1: I think rightly so, that there are going to be a 949 00:57:23,360 --> 00:57:26,120 Speaker 1: lot of aspects that AI simply will not be ready 950 00:57:26,240 --> 00:57:30,800 Speaker 1: to just take over. Again, the further out from the 951 00:57:30,840 --> 00:57:34,240 Speaker 1: repetitive norm you get, the more challenging it is for 952 00:57:34,280 --> 00:57:37,160 Speaker 1: a machine to do, whereas a human can pick up 953 00:57:37,200 --> 00:57:39,560 Speaker 1: on it pretty quickly. We're really good at doing that. 954 00:57:40,160 --> 00:57:42,600 Speaker 1: So there are going to be certain things that, at 955 00:57:42,680 --> 00:57:47,440 Speaker 1: least for the foreseeable future, are going to remain 956 00:57:47,880 --> 00:57:51,680 Speaker 1: firmly in the realm of human beings. But you also, 957 00:57:52,200 --> 00:57:54,880 Speaker 1: you know, end up having to think about who's messaging 958 00:57:54,920 --> 00:57:58,480 Speaker 1: this out, right, because that always creates that little question 959 00:57:58,520 --> 00:58:03,120 Speaker 1: you have, too. If it's IBM saying the technology we're 960 00:58:03,160 --> 00:58:06,480 Speaker 1: creating is going to augment people in the future, then 961 00:58:06,560 --> 00:58:10,560 Speaker 1: you remember, oh, well, IBM is also designing those systems. 962 00:58:10,680 --> 00:58:13,160 Speaker 1: But I still think that there is truth to it. 963 00:58:13,200 --> 00:58:15,000 Speaker 1: I mean, I think that there is truth that AI 964 00:58:15,120 --> 00:58:18,640 Speaker 1: can augment people and, as you were saying, Kara, can 965 00:58:18,680 --> 00:58:22,960 Speaker 1: help take over parts of jobs that humans really are 966 00:58:22,960 --> 00:58:25,480 Speaker 1: not very well suited for in the first place, and that 967 00:58:26,040 --> 00:58:29,240 Speaker 1: certainly wouldn't be considered the type of jobs that most 968 00:58:29,240 --> 00:58:32,680 Speaker 1: people would find meaning in, right? They wouldn't find 969 00:58:32,800 --> 00:58:36,720 Speaker 1: value in that opportunity. They would be doing it because 970 00:58:36,720 --> 00:58:39,400 Speaker 1: they would need to make ends meet. But it's not, 971 00:58:39,440 --> 00:58:41,240 Speaker 1: I don't think there are a lot of people who 972 00:58:41,320 --> 00:58:46,760 Speaker 1: dream of making boxes.
So I think it's 973 00:58:46,760 --> 00:58:48,560 Speaker 1: one of those things where I think it always benefits 974 00:58:48,600 --> 00:58:50,919 Speaker 1: you to kind of take a step back, think about 975 00:58:50,960 --> 00:58:54,760 Speaker 1: who's messaging this, and really take a look 976 00:58:54,840 --> 00:58:58,440 Speaker 1: at what's actually going on. Because, as it turns out, 977 00:58:58,440 --> 00:59:00,640 Speaker 1: when you look at the predictions and one says 978 00:59:00,640 --> 00:59:02,880 Speaker 1: that twenty-four million jobs are going to be destroyed 979 00:59:02,880 --> 00:59:05,960 Speaker 1: and someone else is saying it's more like three 980 00:59:06,000 --> 00:59:09,480 Speaker 1: million jobs, what it ultimately tells us 981 00:59:10,000 --> 00:59:13,440 Speaker 1: is that nobody really knows, and that in itself 982 00:59:13,520 --> 00:59:16,680 Speaker 1: is scary. It's not making us feel better 983 00:59:16,880 --> 00:59:20,520 Speaker 1: about the future, necessarily. But I think what it really 984 00:59:20,520 --> 00:59:22,480 Speaker 1: tells us is the future is not set in stone 985 00:59:22,520 --> 00:59:26,720 Speaker 1: at all, and that if we are going forward knowing 986 00:59:27,360 --> 00:59:30,080 Speaker 1: the capabilities of AI, how it can work with us, 987 00:59:30,680 --> 00:59:35,560 Speaker 1: if we hold companies and individuals accountable for designing AI 988 00:59:35,680 --> 00:59:40,120 Speaker 1: systems that can be used in an ethical way, 989 00:59:40,360 --> 00:59:43,720 Speaker 1: and then hold the people who are implementing 990 00:59:43,720 --> 00:59:46,760 Speaker 1: those systems accountable for making sure it's done in that ethical way, 991 00:59:46,880 --> 00:59:49,560 Speaker 1: then we can see the benefits of AI. I think 992 00:59:49,600 --> 00:59:53,600 Speaker 1: AI ultimately is a very complicated tool, but it's like 993 00:59:53,760 --> 00:59:56,640 Speaker 1: other tools, which means you can use it for good 994 00:59:56,880 --> 01:00:01,080 Speaker 1: or you can use it for evil. And ultimately it comes 995 01:00:01,120 --> 01:00:05,480 Speaker 1: down to the implementation and vigilance. Right? We have 996 01:00:05,560 --> 01:00:07,439 Speaker 1: to just make sure that we're paying attention to what's 997 01:00:07,480 --> 01:00:09,600 Speaker 1: going on and not just trusting that the machines are 998 01:00:09,600 --> 01:00:12,480 Speaker 1: doing everything correctly, because as far as the machines are concerned, 999 01:00:12,480 --> 01:00:14,560 Speaker 1: they're doing everything correctly; it's just that the outcome is 1000 01:00:14,600 --> 01:00:17,920 Speaker 1: not so great for us. A hammer is always 1001 01:00:17,920 --> 01:00:21,280 Speaker 1: doing its job. Yeah, it's just a matter of who's 1002 01:00:21,320 --> 01:00:25,600 Speaker 1: using it. Yeah, exactly. Yeah, it depends on whoever's holding the hammer 1003 01:00:25,720 --> 01:00:28,640 Speaker 1: and what he or she thinks of as a nail. That's 1004 01:00:28,640 --> 01:00:31,800 Speaker 1: what it really comes down to. Well, guys, thank 1005 01:00:31,840 --> 01:00:34,720 Speaker 1: you so much. We're going to have another episode coming 1006 01:00:34,840 --> 01:00:38,040 Speaker 1: up next week, guys, so stay tuned, because 1007 01:00:38,120 --> 01:00:40,160 Speaker 1: Oz and Kara are gonna be back.
We're gonna talk about 1008 01:00:40,880 --> 01:00:43,880 Speaker 1: how different parts of the world are viewing AI, 1009 01:00:44,600 --> 01:00:49,400 Speaker 1: from sort of a policy and regulations kind of perspective, 1010 01:00:49,520 --> 01:00:51,720 Speaker 1: as well as just, what are the different 1011 01:00:51,720 --> 01:00:54,360 Speaker 1: approaches to artificial intelligence around the world? Because, as it 1012 01:00:54,360 --> 01:00:56,520 Speaker 1: turns out, you know, Kara, you've already mentioned a couple 1013 01:00:56,560 --> 01:00:59,360 Speaker 1: of times how the EU has been taking steps to 1014 01:00:59,400 --> 01:01:02,480 Speaker 1: try and think about this ahead of everybody else. 1015 01:01:02,800 --> 01:01:04,840 Speaker 1: But what's going on around the world? And I think 1016 01:01:04,880 --> 01:01:06,920 Speaker 1: you guys are going to be surprised. I know I 1017 01:01:06,960 --> 01:01:10,440 Speaker 1: was, because I am so US-centric in my show 1018 01:01:10,520 --> 01:01:13,640 Speaker 1: that I often have blinders on. So you'll have to 1019 01:01:13,720 --> 01:01:16,479 Speaker 1: join us for that episode that's coming out next week. 1020 01:01:17,120 --> 01:01:21,280 Speaker 1: And if you haven't already gone out and subscribed to Sleepwalkers, 1021 01:01:21,760 --> 01:01:23,680 Speaker 1: this is your reminder to go out and do that, 1022 01:01:24,200 --> 01:01:27,320 Speaker 1: because the show is fantastic. You've got some great interviews, 1023 01:01:27,840 --> 01:01:32,040 Speaker 1: you have fantastic conversations between the two of you about 1024 01:01:32,080 --> 01:01:36,960 Speaker 1: these subjects, and it's really educational and entertaining and 1025 01:01:37,000 --> 01:01:42,880 Speaker 1: thought-provoking, and congratulations on creating such a really compelling show. Well, 1026 01:01:42,920 --> 01:01:46,320 Speaker 1: thank you, Jonathan. We're really enjoying working on Sleepwalkers, 1027 01:01:46,480 --> 01:01:49,640 Speaker 1: and you know, this conversation has been fantastic for 1028 01:01:49,720 --> 01:01:51,440 Speaker 1: us, to have a chance to step out of our 1029 01:01:51,440 --> 01:01:53,160 Speaker 1: own show and think about some of these ideas in 1030 01:01:53,200 --> 01:01:56,920 Speaker 1: conversation with you, so we really enjoyed it. Thank you, Jonathan. 1031 01:01:57,320 --> 01:02:00,800 Speaker 1: You're very welcome. And so, guys, if you want to 1032 01:02:00,840 --> 01:02:02,680 Speaker 1: get in touch with me, send me an email; the 1033 01:02:02,680 --> 01:02:05,960 Speaker 1: address is tech stuff at how stuff works dot com. Pop 1034 01:02:06,000 --> 01:02:08,600 Speaker 1: on over to the website, that's tech stuff podcast dot com. 1035 01:02:08,640 --> 01:02:11,480 Speaker 1: You'll find an archive of all of our past episodes there. 1036 01:02:11,840 --> 01:02:14,240 Speaker 1: You'll also find links to our social media presence and 1037 01:02:14,280 --> 01:02:16,640 Speaker 1: our online store, where every purchase you make goes to 1038 01:02:16,640 --> 01:02:19,360 Speaker 1: help the show. We greatly appreciate it, and we will 1039 01:02:19,400 --> 01:02:26,440 Speaker 1: talk to you again really soon. Tech Stuff is 1040 01:02:26,440 --> 01:02:28,920 Speaker 1: a production of iHeartRadio's How Stuff Works.
For 1041 01:02:29,040 --> 01:02:32,000 Speaker 1: more podcasts from iHeartRadio, visit the iHeart 1042 01:02:32,080 --> 01:02:35,240 Speaker 1: Radio app, Apple Podcasts, or wherever you listen to your 1043 01:02:35,320 --> 01:02:36,000 Speaker 1: favorite shows.