Welcome to TechStuff, a production of iHeartRadio's HowStuffWorks.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with HowStuffWorks and iHeartRadio, and I love all things tech. And Oz and Kara from Sleepwalkers have come back to join me yet again. Spoiler alert: we actually haven't left anywhere. We just sat here the whole time and we're recording two episodes back to back. But if you haven't heard our previous discussion, which is kind of a high-level discussion about AI and the potential dangers, the various messaging we've received about AI, the warning signs and the promises, you should listen to that episode first. If you haven't subscribed to Sleepwalkers, you should absolutely do that too, because the show is amazing. And today we're going to talk a little bit more about how different parts of the world are treating AI, whether it's from a government perspective or a business perspective. The technological development of AI: where is that actually happening the fastest? And the answer to that might surprise you if you haven't been paying close attention to news around the world. We're going to dive into all of that. So, without any further ado, welcome back to the show, Oz and Kara.

Thank you so much, Jonathan.

Hi, Jonathan, again.

Hi again. Yeah. And while we were between shows, we were just, you know, talking about Lady Gaga and Shakespeare, as you are wont to do. This is kind of what we technologists, we technology podcasters, tend to revel in when we're not on mic, or at least not recording. But I wanted to talk first about where we see really aggressive movement forward on technology and AI. Where are we seeing the most development in AI? Because a lot of people think of Silicon Valley as sort of the place, the breeding ground for all technologies.
But as it turns out, that's really a very narrow view, and it's ignoring a giant superpower that is pouring a lot of resources into AI development.

Right. China.

Yeah, sorry. China is leading the charge on AI. And one of the guests we had on the show, on Sleepwalkers, is a guy called Kai-Fu Lee, who was part of the team at Apple that developed the technology behind Siri in the nineties, went on to run Google China, and is now one of the biggest technology investors in China through a fund called Sinovation Ventures, which is behind all sorts of different technology ventures in China, including several unicorns, billion-dollar companies, one of which is called Megvii, which does facial recognition technology. Kai-Fu Lee recently wrote a book called AI Superpowers: China, Silicon Valley, and the New World Order. New World Order is quite a resonant phrase, shall we say. But the thesis of the book is simply that China is doing AI a lot better, a lot more aggressively, and with a lot more promise than we are in the US. And that's for two reasons. Number one, China, unlike the US, is a centralized country, with a centralized government that has said absolutely, with no hesitation, AI is our biggest priority. In 2017, the Communist Party released what they called the New Generation Artificial Intelligence Development Plan, and the first paragraph read: AI has become a new focus of international competition. AI is a strategic technology that will lead the future. The world's major developed countries are taking the development of AI as a major strategy to enhance national competitiveness and protect national security. So China has been on this for two years, I mean, for longer than two years, but they're taking it immensely seriously, and we're not. We did have our own presidential executive order in April of this year, but it didn't come with any funding.
Kai-Fu's second point, which I'm sure we'll get onto, is that China also has a much richer data set, which is the power behind the throne of AI.

Yeah. Something that I think a lot of Americans in particular aren't aware of is that when you look over at China and China's efforts to own AI, essentially, I mean, they're saying: we are going to be established as the primary source of AI development by 2030, and they're well on their way to doing that already. You look at the BAT, that's the three big companies that collectively are valued at more than a trillion dollars: Baidu, Alibaba, and Tencent. Those are already enormous corporations that have deep ties to the state government of China and that are working very hard on AI. But beyond that, even though you might think, well, those are maybe three big companies, but how is that that big of a force on its own? They also invest heavily in lots of startup companies, including unicorns. In fact, out of the six hundred billion dollars' worth of unicorns out of China from a couple of years ago, they made up fifty percent of the investment into those companies. Then beyond that, they're investing in companies outside of China and in other parts of the world, including the United States. Tencent, for example, has been investing heavily in video games over on the West Coast, along with lots of other companies out there. So not only are you talking about a country that has an enormous population, and therefore an enormous source of information on its own, it has spread out around the globe, so it's gathering information from everywhere. So it's really this incredibly pervasive system to gather the fuel that is going to power artificial intelligence. And meanwhile, you also have very smart people running very sophisticated laboratories working on the next generation of algorithms and applications of AI.
So you've got, like, the perfect storm over there.

Yeah, absolutely. And I also think, I mean, just in terms of what you're talking about, you know, Tencent being involved in Fortnite, if I'm correct.

Yes. Yeah, I was just doing the floss right over here, actually.

That's right. And also the ownership of a little-known dating app called Grindr, which the US is actually trying to get back just because of the national security threat that is involved in the Chinese owning so much user data from United States citizens, including US military personnel. I think what's alarming about this is that we're sort of entering into this new territory of, you know, war and competition, and war being less about, you know, guns and bombs and more, and these are not my words, I've read this, but more about, you know, bits and bytes, so to speak.

I think you read that from Secretary of State Mike Pompeo.

That's right, I'm quoting Mike Pompeo, happily, I think, which actually he said in regard to Huawei, which is obviously in the news quite a bit recently. You know, I think that there's the technology race, which is, you know, sort of who's going to advance quickly and in the best way. But it's also about sort of who owns data and who has access to what data. I think the Chinese government does not really have a problem spying on its citizens. I don't even think we'd call it spying, necessarily. I think it's sweeping, is what they tend to do. And with that data, they can make some pretty, you know, chilling accusations about people who they think are dissenting against the government. You know, in the United States, we are very free with the data we give away, because I don't think enough people think about how much data they're giving away in any given day.
But the US government, hopefully, is not and will not be as, what's the word, pervasive and aggressive about collecting said data and using it to, you know, imprison its citizens. So, I don't know, I think China can kind of get away with more, and maybe that's why the Chinese government is allowing Chinese businesses to get away with more than the US government is.

I'd say it's worth pointing out that in the US, you know, we have this phenomenon of surveillance capitalism. So there's a bunch of big firms who take all the data we give them, use it to model us, make predictions about us, and sell us more stuff. In China, it's surveillance statecraft, and the data is not siloed between Amazon, Google, Facebook, and others. It tends to be in the hands of the big technology companies in China, who, given that they're not fully state-owned but are much more hand-in-pocket with the government than our technology companies are, share their collected data much more widely, which allows them to make better predictions about what might happen and be determinative about the outcomes of their citizens. I do think it's worth pointing out that a lot of this conversation about the difference between AI in China and the United States that we have on this side of the Pacific is filtered through our liberal, individualist worldview, which states that, you know, free will, the ability to control one's own outcomes, the recognition of oneself as an individual are the utmost goods. And that worldview simply isn't shared throughout most of China. Now, you can say that will change as society opens up. You can say there's an inevitable progress, if you want to call it progress, towards our worldview. But the fact is, that isn't the worldview in China.
And many Chinese citizens are used to living in a one-party state, since Mao Zedong, or whenever it was, in the 1940s, and so there's a sense of being accustomed to the belief that harmony and the furthering of the state's goals are a rising tide which raises all boats. And so, you know, I think we have this desire to see, and of course it has happened in China, there's been Tiananmen Square, there have been organic protest movements, but we do have the desire to impose our absolute belief in the importance of the individual and free will on the rest of the world. And, you know, I'm not sure that if you asked the average Chinese citizen, do you resent being surveilled, when the largest number of people have been lifted out of poverty in the fastest time of any country in history, I think the answer might be, you know, perhaps we'd prefer more freedoms, but this isn't the worst thing in the world. Also, China is a largely ethnically homogeneous society. So for the average Han Chinese, the trade-off for this lifting out of poverty and the national pride, which is absolutely on the rise in China, giving a much stronger sense of national identity, I would argue, than we have here, the trade-off may be worth it. What's very chilling, beyond chilling, is the treatment of the non-Han Chinese in China, the Tibetans, and specifically the Uyghurs, who are really the people who experience the hard end of this surveillance state in China.

Hey, guys, it's Jonathan from the future. I'm just popping in to interrupt here because, as it turned out, we got so into this conversation I totally forgot to put in a break. So let's take a quick break even while Jonathan in the past and Oz and Kara continue their conversation.
We'll get back to that in just a second.

Hey, guys, Jonathan from the future again. We're going to get right back into the episode. Oz was just talking about China and its use of technology and the creation of a surveillance state, and we're going to pick up with my response.

Yeah, and as we mentioned in the last episode, you know, we were talking about bias and how that can be unintentionally inserted into a system and how that can cause harm. But you could also intentionally create a biased system specifically in order to keep tabs on particular populations that are, you know, minorities. And obviously that could lead to truly horrific, inhumane practices, leading all the way up to even genocide.

Yes. I mean, right now what we're seeing is sort of imprisonment in re-education camps. But I think it's important to note, and this is, again, piggybacking off of what Oz was saying, you know, the Chinese Communist Party has often used surveillance as a means for control. The difference is that artificial intelligence enables a kind of surveillance that we haven't seen before, which is, you know, in China, there is basically a very large operation that they call IJOP, the Integrated Joint Operations Platform. And what that does is, it's a database that is sweeping in information from basically every source imaginable: WiFi, visitor management systems, so, you know, for example, in America, what we'd call registering your name when you walk into a building to visit an office, you know, WeChat conversations, when you leave the country, when you come back into the country. All of these things are being swept into a larger, I don't know what the word is for it, it's not a server, I don't know how to say it. It's a system.
It's a system that is then making decisions and predictions about who is basically doing right and doing wrong. And we simply don't have anything like that in the United States right now. I mean, certainly there are companies that can use our information in the United States to make predictions about us, to sell us things, to basically set our insurance premiums. But as far as just an integrated system that is making decisions about its people and then also using it to imprison its people, it's just unprecedented. And, you know, Human Rights Watch is basically calling it a humanitarian crisis, which I think it is, when you think about, you know, your own country basically spying on your conversations, obviously without your consent. People are stopped all the time in China and their phones are taken and read through. That's a sort of normal day. And then that information is used to put you in a re-education camp. I mean, I think if most Americans knew that, which I don't think they do, they would be alarmed.

But let's be clear. You know, we're very far from having re-education camps in the US, but we do, and it's not on a state level or a national level, we do make decisions about people's outcomes with stuff like credit scores. I mean, the credit scores... You know, in China it's explicitly ethnic, right? Well, again, in China they wouldn't say that, but effectively it's explicitly ethnic against the Uyghurs. In the US, we have credit scores, and guess what, most people who grow up with a certain amount of privilege know what their credit score is, understand the principle of the credit score,
get a credit card as early as they can, start building their credit, making monthly payoffs, and then, when it comes time to get a mortgage and buy a house and move to a nice neighborhood, guess what, all the pieces are in place. But for many people who don't grow up with that privilege, the credit score comes as a complete surprise at a certain point in life, that there's even a notion of having bad credit. All of a sudden, you know, you've got bad credit without ever having realized that you had this credit score you were supposed to be working on. And guess what: you can't move out of your neighborhood, you can't get what you want, you can't buy your children the things they need. So we do have this here. It's not state policy, but we've effectively outsourced this predictive technology about what people are going to do in the future to private corporations, who use it to profit from us rather than to control us. But we shouldn't beat China too hard with this stick when we have certain analogous practices here in the US.

Sure. I mean, I'm sure there are plenty of people who do argue that, when you look at the practices of certain actual state-level organizations in the United States, we should be concerned about what sort of systems they might be using. I'm thinking specifically of the NSA, because it wasn't that long ago that we were seeing enormous headlines about the NSA and its practice of trying to have, you know, essentially listening points just outside of major communications channels, whether it was Internet service providers or the telecommunications industry in general. And you start thinking, well, if you start applying these kinds of AI surveillance programs, and the NSA is all about trying to detect communications, in a way that everyone gets lumped in and not just bad actors, then you start getting those concerns.
And, you know, we saw plenty of that, even without the AI element, when the NSA stories were breaking a couple of years ago, even just to the point where we were seeing people in the NSA behaving poorly, like using the information gathered to track down old relationships, you know, old boyfriends and girlfriends. Not an ethical use of your power if you're looking in on communications. So we know that it doesn't have to be an official state-line policy for this to either be misused in an unauthorized way or put to use in a way that maybe isn't immoral, but you could at least argue is amoral, with a lot of the business practices, because morality is not a consideration when you're looking at how do we make more revenue, how do we make a greater profit. And, you know, you're essentially checking off boxes saying, right, here's how we can make this more efficient, have a lower cost to us, a greater payoff in the long run. And so we start to see, exactly as you were saying, Oz, that while it is easy to point to another country and say these policies are clearly harmful to people and are therefore bad, we also need to make sure we're reflecting on the environment that we ourselves are in. When we... I'm sorry, go ahead.

Oh, no, I was just going to make one comment. You know, for example, in the state of Arizona, if you applied for a driver's license, your photo was then put into a database that was being used for facial recognition, right? And that was without drivers' consent. Basically, law enforcement came back and was like, whoa, you know, we think people know that this is going on. You know, that was their best answer. It wasn't like, oh, there was fine print that people didn't read.
It was basically like, well, we thought people knew. And so, you know, we can't exonerate, yeah, sort of our own home turf, because there's certainly, I don't even know if you would call this misuse, it just seems like exploitation for gain, basically for the gains of, I think, police departments that are seeing this technology, recognizing how powerful it is, but also realizing that it's something that needs to be regulated, and not knowing how, or not really caring.

And I think, Jonathan, you used the great example of the NSA, which I think ties very neatly to what we're talking about. Edward Snowden had a very haunting phrase, turnkey tyranny, to describe how, basically, once you build infrastructure for something, anything can happen. And the technological infrastructure we have here in the US for surveillance and social control, I mean, we have fewer cameras, you know, there are more distinctions between companies, but effectively the infrastructure exists to do here what's happening in China. And that's very frightening, because, you know, as we know from Henry Ford, there was wonderful pressure to build roads in the US. Guess what: you build the roads, people are going to drive cars and not take the train. So once this infrastructure exists, and you add to that, you know, a leader who doesn't respect norms, or wartime, all of a sudden the barriers that we think are so solid to protect that infrastructure from being weaponized against us start to erode very, very quickly. And I think that's the moment we're in right now. And that's another reason why we wanted to call this podcast Sleepwalkers, because if we don't insist on those norms and legal protections, bit by bit the infrastructure will have its own logic, and one emergency after another, remember the Patriot Act, will allow this technology to be used against us in ways that we currently find sickening and horrifying and terrifying in China. They could easily come home.

Yeah. Yeah, it's a sobering point.
And on that point, I think we're going to take a quick break so I can suck my thumb in the corner.

Okay. All right, pruney thumb aside. Now, we've talked about some of the differences between, say, China and the United States. One of the things I thought was interesting is the idea that in China you have sort of a very concrete strategy in place, right, a top-down strategy, and pretty much all the companies that are working on this strategy are in alignment with it to some degree or another. Some are very much in step with the state government; others to a lesser degree, perhaps, but still, you know, following along with the strategy. Meanwhile, in the United States, it's much more of this competitive, this classic capitalist idea of competition in the space, where you have all these different pockets that are all trying to own AI themselves, competing against each other. So we're getting lots of interesting innovation, but not nearly at the same speed or scale as we're seeing in China. Is that more or less a correct assumption, or am I way off base here?

No, I think that's completely fair. You know, this year, the government, President Trump, did announce an executive order on AI in February, and that was seen as a kind of related response to what China is doing. And that executive order that the President issued had four major components. Number one, to set AI as a national priority of the United States. Number two, interestingly, to get better at data sharing between the government and private enterprise. Number three, to set ethical guidelines on how AI is used in terms of surveillance and the military. And number four, to make sure that the United States is doing the best it can to educate the next generation of engineers and AI scientists. Now, those are all good things, apart from maybe number two, the data sharing. But guess how much funding that executive order came with?
I'm going to guess that was a big old goose egg.

Zero dollars. Zero dollars. Whereas Shenzhen, which is a province in China, is spending fifteen billion dollars this year. That's one of the many, many provinces in China spending, not even federal money, but state money, on AI. So in that context you can say, oh, you know, we're getting up to speed, we're responding, but, you know, you've got to put your money where your mouth is, and we're not doing that.

Yeah. And on a related note, you know, when we talk about things like the regulations, the laws in place, like, how do we then create policies that ensure we're using AI responsibly and productively and not in ways that are harmful or destructive, at least to our own citizens, and hopefully not to anyone at all? We tend to see that lag behind, just like we do with technology in general. We tend to see technology innovations far outdistancing our ability to incorporate them into our, you know, massive legal infrastructure. Understandably so; obviously that system is going to move much more slowly than technological innovation, but it does create these pain points. Whether it's, you know, autonomous cars, you know, you have different states in the United States that will allow for some degree of autonomous car testing. Meanwhile, you've got companies like Tesla that are rolling out vehicles that have an autopilot feature, which, to the company's credit, they say is not meant to be taken as an autonomous system. But tell that to all the people going down the highway who are leaning back in their cars with their hands off the wheel. You could argue that that falls on the responsibility of the individuals, but if enough individuals are doing it, you've got to start asking, what's the value of actually having the system in place? We're seeing that as being kind of a disparity as well.
Right. We're seeing this gap between what we're capable of and what we should be doing, or at least what, you know, our legal system says we should be doing. And meanwhile, in contrast to that, and Kara, you mentioned the EU a couple of times in our last podcast, over in the European Union, you have committees dedicated to thinking about these sorts of things and starting to propose potential strategies, or even presenting different options for strategies, for dealing with AI. Even beyond these cases that I'm talking about, like, they're going to the point where, I remember reporting on this a couple of years ago for Forward Thinking, a series I used to do, a committee in the EU was proposing the idea of granting personhood to artificially intelligent systems. And on the face of it, that sounds absurd to a lot of people, the idea of granting a non-human the concept of personhood, despite the fact that in the United States we have corporations, which are exactly that.

But yeah, like, hey, hey, wait, corporations can be people too, and can give money, as long as they can give money to politicians.

Absolutely. So why can't robots? But the point that the EU committee was making was not that robots have feelings and we should really be considerate of them, but rather that there needs to be some way to start to establish concepts like accountability. In a case where some form of AI construct or robot causes harm in the form of damages or injury, how do you determine who's at fault? We touched on this a little bit in the last episode. So, to me, it's fascinating that those are discussions that are popping up, like, very serious discussions, in the EU. And Oz, I think when we met briefly in New York a couple of weeks ago, we talked about the fact that this is an area of the world not known for making incredible advances in AI technology.
It's not like you look at Europe and say this is where the hotbed of AI development is, but it is a region that's dedicating a lot of consideration to the implications of AI in day-to-day lives.

Yeah, absolutely. I mean, the EU has been out in front on thinking about technology and AI. Obviously, GDPR was a big act, passed in the EU, on regulating data and informing people about how their data is being used, which triggered a whole series of conversations in the US about data regulation. Those are starting now, and we may see them come into law in the next two or three or four years. So the EU, good old Europe, despite its stifling effects on innovation, I think, and on technological innovation, is an innovator in terms of regulation, which probably sounds like a contradiction in terms. And this is an area that Kara's looked into really closely and has some, I think, very interesting insights about.

Yeah. Yeah, well, I mean, as I was saying to you earlier, actually, in the former, I guess it was, the episode before this one. Yeah, you know, I think it's interesting. The EU's approach, to me, up until very recently, you know, was to basically collect a group of fifty-two experts and, you know, create these seven core requirements for artificial intelligence. Now, I mean, I can name them, you know. One is human agency and oversight. The second is technical robustness. The third is privacy and data governance. The fourth is transparency. The fifth is that AI systems should be sustainable, whatever that means. AI systems should be auditable, as we were talking about with, you know, the black box problem. You know, AI should also be available to all. You know, this is what we talk about in terms of bias, gender bias, racial bias, all of those things. But I have a bit of an issue with it, just in that it doesn't seem to have much action involved in it.
There aren't many action items. I think it is important for governments to set standards, you know, as a sort of first step, and I think it also primes, you know, sort of average citizens to be aware of misuse, right? Because when you set up requirements, it means that these are things that can be misused, and be misused easily.

And I will say, you know, whenever I go home, every time you go onto a website on your phone, you get notifications saying, are you willing to let this website put cookies? Are you willing to share your data? Are you willing to accept targeted ads? And the net effect, socially, of that reminder that your data is valuable and you have a choice, every single day, fifty times a day, inevitably causes a shift in consciousness and a shift in citizenship. So I do think it's easy to say this regulation is toothless, but I also think it can, you know, make people think.

Oh, it's a huge cultural shift. I mean, when you just think, I mean, think about it, think about cigarettes, right? You just don't smoke inside anymore. Legally, you can't. But I'm saying, you know, just the shift in public perception about smoking, the shift in public perspective about sugar, for example, at least in the United States. You know, those were all things that at one point were not really discussed or talked about, and, you know, everyone kind of smoked cigarettes, and that's what people did, and, you know, then they got lung cancer. But, you know, I think, as long as there is a public discourse about privacy, people will begin to care more and more about privacy. So I think, you know, the EU presenting these seven requirements is a step in the right direction.
Absolutely. And if anything, it will just make people think about these seven tenets when they're going about their everyday lives, and I think it will make companies, you know, focus on building these requirements into whatever they're developing. You know, I think it's important for companies that are developing new technologies to think about bias, especially when the people who are developing the technologies might be quite homogeneous.

So I think it's worth saying that, you know, we tend to see the new world order in terms of America versus China. You know, not to wave the flag for Europe, which unfortunately my country may no longer be part of, but it's a huge group of nations with tremendous purchasing power, and it's a genuinely significant market for US technology companies. And so, you know, the effects of this may seem far off and slightly irrelevant, but, you know, these fines that the EU is already slapping Facebook and Google with, they're not necessarily material to the business yet. But, you know, Europe as a voice for regulation is an important one, because, actually, these new technologies, AI technologies, tend to affect the poorest in society and the most vulnerable in society in the most negative ways. So algorithmic discrimination, the replacement of, you know, low-education, shall we say, jobs, like driving or packaging. These are things that are being experienced at the hard edge by people who don't have much of a voice in the political system. And so the fact that the EU is taking up the mantle, even though it comes with hypocrisy, like using facial recognition at the ports of entry, even though, you know, the level of the fines that can be applied to companies like Google and Facebook can never hurt their bottom line, is, I would say, an important and valuable thing that is happening.
And so, you know, I think the EU 576 00:35:14,120 --> 00:35:17,759 Speaker 1: can offer some point of reference for how we may 577 00:35:17,760 --> 00:35:20,560 Speaker 1: think about regulating AI and technology in the US in 578 00:35:20,600 --> 00:35:25,120 Speaker 1: the future. And I think that, you know, you've heard 579 00:35:25,360 --> 00:35:30,200 Speaker 1: futurists say we need to have some very serious conversations 580 00:35:30,800 --> 00:35:34,920 Speaker 1: about AI and the ethics of AI and how we 581 00:35:34,960 --> 00:35:38,440 Speaker 1: can make certain we're being responsible custodians of AI. 582 00:35:39,200 --> 00:35:41,319 Speaker 1: And to me, that was one of those things 583 00:35:41,400 --> 00:35:43,560 Speaker 1: where it was like, we need to talk about talking 584 00:35:43,600 --> 00:35:46,239 Speaker 1: about this, like that was the conversation for a very 585 00:35:46,320 --> 00:35:48,560 Speaker 1: long time. We need to talk about talking about it. 586 00:35:48,880 --> 00:35:50,960 Speaker 1: It's like having a meeting to talk about when you're 587 00:35:50,960 --> 00:35:53,040 Speaker 1: gonna have your meeting. Have you ever worked in an 588 00:35:53,080 --> 00:35:56,319 Speaker 1: office before? Oh, yeah, I worked 589 00:35:56,320 --> 00:35:59,600 Speaker 1: in a college administrative office before. So I've had meetings 590 00:35:59,760 --> 00:36:03,960 Speaker 1: about how we can have fewer meetings, and this is 591 00:36:03,960 --> 00:36:09,360 Speaker 1: not the way. Yeah. So having the EU actually 592 00:36:09,360 --> 00:36:11,840 Speaker 1: take this step, even if you were to argue that 593 00:36:11,880 --> 00:36:14,120 Speaker 1: this is a very early step and maybe there's not 594 00:36:14,160 --> 00:36:17,480 Speaker 1: a lot of teeth to it yet, it is a 595 00:36:17,520 --> 00:36:20,960 Speaker 1: step, as opposed to what I've seen elsewhere, where it's 596 00:36:20,960 --> 00:36:23,280 Speaker 1: been talking about taking a step but not even doing 597 00:36:23,320 --> 00:36:27,560 Speaker 1: that much. So I'm encouraged by it, because it actually 598 00:36:27,560 --> 00:36:30,399 Speaker 1: moves the conversation forward. Instead of us saying we need 599 00:36:30,440 --> 00:36:34,000 Speaker 1: to have this conversation, the conversation has started. I don't 600 00:36:34,000 --> 00:36:36,239 Speaker 1: think it's over yet. I think that this is a 601 00:36:36,280 --> 00:36:41,440 Speaker 1: good way to actually force more parties to get involved 602 00:36:41,440 --> 00:36:46,040 Speaker 1: and think about it, and maybe even start proactively thinking, 603 00:36:46,040 --> 00:36:48,839 Speaker 1: how can I make certain that the thing we are 604 00:36:48,880 --> 00:36:52,120 Speaker 1: building is actually being built in a responsible way, one that 605 00:36:52,239 --> 00:36:55,400 Speaker 1: lets us mitigate as many unintended 606 00:36:55,400 --> 00:36:58,799 Speaker 1: consequences as possible, knowing that it is impossible to do 607 00:36:59,719 --> 00:37:04,319 Speaker 1: entirely, but to really try for it. So to me, 608 00:37:04,480 --> 00:37:06,399 Speaker 1: this is one of those conversations I could 609 00:37:06,440 --> 00:37:09,040 Speaker 1: have all day long.
But I know you guys need 610 00:37:09,080 --> 00:37:10,640 Speaker 1: to get going, because there's going to be someone else 611 00:37:10,640 --> 00:37:12,359 Speaker 1: who's going to have to use the studio you guys 612 00:37:12,360 --> 00:37:14,080 Speaker 1: are in. So I'm not going to 613 00:37:14,200 --> 00:37:16,480 Speaker 1: have us go all day long on this. Plus, we're 614 00:37:16,480 --> 00:37:19,160 Speaker 1: recording this on a holiday weekend, and I know everybody 615 00:37:19,200 --> 00:37:22,200 Speaker 1: wants to get home, so we're gonna wrap it up. 616 00:37:22,239 --> 00:37:25,160 Speaker 1: But I want to thank you guys so much for 617 00:37:25,239 --> 00:37:30,399 Speaker 1: agreeing to come onto my show. It's been a fascinating conversation, 618 00:37:30,600 --> 00:37:34,239 Speaker 1: and you guys obviously have some great perspectives on this. 619 00:37:34,760 --> 00:37:39,200 Speaker 1: And again, to my listeners out there, if you haven't 620 00:37:39,200 --> 00:37:42,800 Speaker 1: subscribed to Sleepwalkers, go check it out. It's a really 621 00:37:42,880 --> 00:37:47,399 Speaker 1: well done show. I've been very impressed as someone who 622 00:37:47,400 --> 00:37:51,080 Speaker 1: has a solo show. Most of the time when I listen 623 00:37:51,080 --> 00:37:54,200 Speaker 1: to it, I just think, wow, that's so awesome, 624 00:37:54,800 --> 00:37:57,400 Speaker 1: that's so great. I wish I were on that show 625 00:37:57,440 --> 00:38:00,400 Speaker 1: every now and then. So hint, hint, if you ever need me. 626 00:38:00,440 --> 00:38:03,239 Speaker 1: We would love it. We would absolutely love it. 627 00:38:03,320 --> 00:38:06,240 Speaker 1: Let's talk about that. We had a great time today. 628 00:38:06,520 --> 00:38:10,719 Speaker 1: Thank you. Thank you guys. And you guys, if you 629 00:38:10,760 --> 00:38:13,200 Speaker 1: want to get in touch with me, drop me a line, 630 00:38:13,280 --> 00:38:16,200 Speaker 1: say, hey, you need to talk about this other topic, 631 00:38:16,360 --> 00:38:18,719 Speaker 1: or, those guys were great, have them back on the 632 00:38:18,760 --> 00:38:21,319 Speaker 1: show as soon as possible. Send me an email. It's 633 00:38:21,400 --> 00:38:24,640 Speaker 1: tech Stuff at how stuff works dot com. Pop on 634 00:38:24,719 --> 00:38:27,279 Speaker 1: over to the website, that's tech Stuff podcast dot com. 635 00:38:27,320 --> 00:38:30,799 Speaker 1: You'll find the archive for all of my episodes. If 636 00:38:30,840 --> 00:38:33,480 Speaker 1: you are really bored, there are more than a thousand of them, 637 00:38:33,560 --> 00:38:36,360 Speaker 1: so have at it. And then you can pop on 638 00:38:36,400 --> 00:38:38,160 Speaker 1: over to the merchandise store and you can finally get 639 00:38:38,160 --> 00:38:41,160 Speaker 1: yourself that tech Stuff mug that you've been wanting all 640 00:38:41,160 --> 00:38:44,479 Speaker 1: this time, and I'll talk to you again really soon. 641 00:38:49,239 --> 00:38:51,440 Speaker 1: Tech Stuff is a production of I Heart Radio's How 642 00:38:51,520 --> 00:38:54,879 Speaker 1: Stuff Works. For more podcasts from I Heart Radio, visit 643 00:38:54,920 --> 00:38:58,040 Speaker 1: the I Heart Radio app, Apple Podcasts, or wherever you 644 00:38:58,080 --> 00:39:01,640 Speaker 1: listen to your favorite shows.