1 00:00:15,800 --> 00:00:18,840 Speaker 1: Welcome to Tech Stuff. I'm Oz Woloshyn, and this is 2 00:00:18,920 --> 00:00:22,880 Speaker 1: the story. Earlier this month, I was in Munich, Germany 3 00:00:23,320 --> 00:00:27,400 Speaker 1: at the DLD conference, and I had the opportunity to 4 00:00:27,480 --> 00:00:29,680 Speaker 1: sit down with someone I've wanted to meet for a 5 00:00:29,680 --> 00:00:34,519 Speaker 1: long time, Audrey Tang, who served as Minister for Digital 6 00:00:34,520 --> 00:00:39,000 Speaker 1: Affairs of Taiwan until twenty twenty four. Audrey's path to 7 00:00:39,000 --> 00:00:44,800 Speaker 1: the government had rather unusual origins: a hacker, an anarchist, and 8 00:00:45,280 --> 00:00:48,080 Speaker 1: the world's first non-binary government minister. 9 00:00:48,680 --> 00:00:52,400 Speaker 2: When I entered the cabinet, they gave me this HR 10 00:00:52,520 --> 00:00:57,480 Speaker 2: form, and I wrote "not applicable" in both the 11 00:00:57,560 --> 00:01:01,760 Speaker 2: gender field and the party affiliation. So I've never joined 12 00:01:01,800 --> 00:01:06,080 Speaker 2: any political party. And I think these two do have 13 00:01:06,160 --> 00:01:10,000 Speaker 2: something in common, which is I take a stance that 14 00:01:10,800 --> 00:01:14,000 Speaker 2: I share part of your experience, I share part of 15 00:01:14,040 --> 00:01:16,920 Speaker 2: their experience. But at the end of the day, it 16 00:01:17,000 --> 00:01:20,400 Speaker 2: is the shared experience that counts. It is not the 17 00:01:20,440 --> 00:01:23,880 Speaker 2: political label or the gender label that counts. 
And I 18 00:01:23,920 --> 00:01:27,039 Speaker 2: think this also allowed me to essentially take all the 19 00:01:27,120 --> 00:01:31,760 Speaker 2: sides and bring communities back together when they were being 20 00:01:31,800 --> 00:01:36,959 Speaker 2: torn apart over the political labeling, the radicalization, and also 21 00:01:37,080 --> 00:01:38,360 Speaker 2: about gender politics. 22 00:01:38,640 --> 00:01:42,440 Speaker 1: We recorded this conversation against the backdrop of China's largest 23 00:01:42,520 --> 00:01:46,720 Speaker 1: ever military drills around Taiwan and a newly struck trade 24 00:01:46,800 --> 00:01:50,160 Speaker 1: deal with the US that involves a commitment from the 25 00:01:50,160 --> 00:01:56,040 Speaker 1: Taiwan Semiconductor Manufacturing Company to transfer its advanced chip making 26 00:01:56,080 --> 00:02:01,800 Speaker 1: expertise to the US. Some worry this will ultimately reduce the American 27 00:02:01,840 --> 00:02:06,280 Speaker 1: interest in guaranteeing Taiwan's security. We talk about these issues, 28 00:02:06,680 --> 00:02:09,960 Speaker 1: but Audrey's big point is this: a Taiwan that is 29 00:02:10,040 --> 00:02:13,760 Speaker 1: open to debate and not polarized by social media will 30 00:02:13,800 --> 00:02:17,359 Speaker 1: be far more resilient against China and its ongoing cyber 31 00:02:17,400 --> 00:02:22,400 Speaker 1: attacks and disinformation campaigns. Audrey's time in government was devoted to 32 00:02:22,480 --> 00:02:27,960 Speaker 1: building pro-social social networks and digital forums where citizens 33 00:02:28,000 --> 00:02:31,920 Speaker 1: could directly participate in everything from regulating deep fakes to 34 00:02:31,960 --> 00:02:36,480 Speaker 1: the response to COVID. 
Today, Audrey is the Cyber Ambassador 35 00:02:36,480 --> 00:02:40,200 Speaker 1: for Taiwan, and part of the role is consulting with 36 00:02:40,240 --> 00:02:43,880 Speaker 1: governments around the world on how to build social resilience 37 00:02:44,120 --> 00:02:51,240 Speaker 1: in the digital age, drawing on lessons learned in Taiwan. You 38 00:02:51,440 --> 00:02:55,079 Speaker 1: were the Digital Minister for Taiwan for seven and a 39 00:02:55,080 --> 00:02:58,600 Speaker 1: half years, from twenty sixteen to twenty twenty four, and 40 00:02:58,639 --> 00:03:02,880 Speaker 1: now you're the cyber ambassador. Talk a bit about 41 00:03:03,040 --> 00:03:05,320 Speaker 1: your mission in both of those roles and how the 42 00:03:05,320 --> 00:03:09,440 Speaker 1: mission changed as you went from being Digital Minister to cyber ambassador. 43 00:03:09,919 --> 00:03:14,120 Speaker 2: So I actually joined the Cabinet of Taiwan first, not 44 00:03:14,320 --> 00:03:18,360 Speaker 2: as minister, but rather as a reverse mentor, a youth 45 00:03:18,480 --> 00:03:23,320 Speaker 2: advisor to the minister in charge of participation. That was twenty fourteen, 46 00:03:23,680 --> 00:03:26,080 Speaker 2: and at the time I was part of this movement 47 00:03:26,200 --> 00:03:30,079 Speaker 2: called the Sunflower Movement, when the President, Ma Ying-jeou, at 48 00:03:30,080 --> 00:03:33,920 Speaker 2: the time was enjoying a nine percent approval rating. Nine percent, 49 00:03:34,160 --> 00:03:36,800 Speaker 2: uh huh. So in a country of twenty three point 50 00:03:36,840 --> 00:03:41,320 Speaker 2: five million, that means anything he says, twenty million people are against him. 51 00:03:41,920 --> 00:03:45,280 Speaker 2: So it was a very special time. 
And because he was 52 00:03:45,320 --> 00:03:47,920 Speaker 2: trying to fast track through a trade deal with Beijing 53 00:03:48,280 --> 00:03:51,560 Speaker 2: that would have allowed Huawei and ZTE and the usual 54 00:03:51,720 --> 00:03:56,920 Speaker 2: friendly neighbors into our telecommunication, our publishing, our cybersecurity and 55 00:03:56,960 --> 00:04:01,560 Speaker 2: so on. So people were very concerned. And when the 56 00:04:01,600 --> 00:04:05,160 Speaker 2: Parliament tried to fast track that through, we occupied the 57 00:04:05,200 --> 00:04:08,840 Speaker 2: Parliament for three weeks non-violently, and not just to 58 00:04:09,040 --> 00:04:12,480 Speaker 2: protest against something, but also to demonstrate, to show something. 59 00:04:12,760 --> 00:04:15,280 Speaker 2: So one of the first things I did in the 60 00:04:15,360 --> 00:04:19,720 Speaker 2: Sunflower was I personally carried a three hundred and fifteen 61 00:04:19,760 --> 00:04:23,760 Speaker 2: meter ethernet cable to the occupied parliament, so people from 62 00:04:23,800 --> 00:04:27,120 Speaker 2: the outside could see in real time and through live 63 00:04:27,200 --> 00:04:32,040 Speaker 2: streaming what's happening inside. And so in this radical transparency way, 64 00:04:32,480 --> 00:04:35,760 Speaker 2: every day people could see what was the remaining issue 65 00:04:35,800 --> 00:04:39,360 Speaker 2: to be discussed, what was agreed the previous day. So 66 00:04:39,520 --> 00:04:42,640 Speaker 2: half a million people and many more online converged after 67 00:04:42,760 --> 00:04:46,800 Speaker 2: three weeks of digital and face to face deliberation. And 68 00:04:46,839 --> 00:04:50,360 Speaker 2: so the point being, people converged over time because they 69 00:04:50,440 --> 00:04:55,599 Speaker 2: could see how their interventions could help the entire sense-making. 
70 00:04:55,839 --> 00:04:58,920 Speaker 2: And we did sense-make after the three weeks, and 71 00:04:59,320 --> 00:05:03,320 Speaker 2: the parliamentarians basically said, yeah, okay. The speaker said 72 00:05:03,320 --> 00:05:05,839 Speaker 2: it's a good idea, the people's ideas are better than ours. 73 00:05:06,160 --> 00:05:08,320 Speaker 2: And so it was one of the very rare occupies 74 00:05:08,320 --> 00:05:11,520 Speaker 2: that converged rather than dissipated. So I was tapped to 75 00:05:11,720 --> 00:05:14,719 Speaker 2: join the cabinet to advise them how to resolve each 76 00:05:14,800 --> 00:05:18,479 Speaker 2: incoming issue, like Uber and so on, without getting the 77 00:05:18,640 --> 00:05:19,880 Speaker 2: ministries occupied again. 78 00:05:21,200 --> 00:05:24,680 Speaker 1: And why were you worried in twenty fourteen about Huawei 79 00:05:24,720 --> 00:05:28,640 Speaker 1: being involved in the telecoms infrastructure in Taiwan? 80 00:05:29,200 --> 00:05:34,360 Speaker 2: So suppose we use Huawei in our four G core infrastructure, 81 00:05:34,680 --> 00:05:38,039 Speaker 2: and suppose that we check them very thoroughly, no backdoor, 82 00:05:38,040 --> 00:05:42,560 Speaker 2: no Trojan horse. And someday there's an emergency firmware update: 83 00:05:42,960 --> 00:05:45,440 Speaker 2: do we just apply it, or do we apply another 84 00:05:45,520 --> 00:05:50,840 Speaker 2: systemic risk analysis? If it's from a European vendor, then 85 00:05:50,880 --> 00:05:55,320 Speaker 2: we trust that there is journalism and there's also whistleblower 86 00:05:55,600 --> 00:05:59,440 Speaker 2: protection and many others. So if a state or 87 00:05:59,480 --> 00:06:04,080 Speaker 2: a criminal organization tries to backdoor the new firmware updates, well, 88 00:06:04,279 --> 00:06:07,359 Speaker 2: the local people from this European country will also be 89 00:06:07,400 --> 00:06:11,680 Speaker 2: concerned about that. 
Whereas people argued that there's probably no 90 00:06:11,880 --> 00:06:15,520 Speaker 2: real private sector in Beijing, so if Beijing tries to 91 00:06:15,560 --> 00:06:19,000 Speaker 2: step in, there's no way for us to discover whether it 92 00:06:19,040 --> 00:06:22,080 Speaker 2: has been de facto taken over, right. So that was the 93 00:06:22,120 --> 00:06:24,880 Speaker 2: main argument. It's an economic argument that says, if we 94 00:06:25,080 --> 00:06:28,359 Speaker 2: have to keep making this assessment, then it is actually 95 00:06:28,400 --> 00:06:31,560 Speaker 2: more expensive than if we went with the European vendor. 96 00:06:31,720 --> 00:06:32,839 Speaker 2: So that was the idea. 97 00:06:32,960 --> 00:06:36,800 Speaker 1: Interesting. Twenty twenty seven is the year that Xi Jinping wants 98 00:06:36,880 --> 00:06:40,760 Speaker 1: the PLA to be ready to take Taiwan by force 99 00:06:40,920 --> 00:06:46,080 Speaker 1: if the Communist Party deems it necessary. Obviously, last week 100 00:06:46,120 --> 00:06:50,320 Speaker 1: there were, I think, maybe the biggest drills ever around Taiwan, 101 00:06:50,360 --> 00:06:53,960 Speaker 1: simulating a port blockade. How scared are you that they 102 00:06:53,960 --> 00:06:57,120 Speaker 1: will deem it necessary and they will attempt to take 103 00:06:57,160 --> 00:06:58,440 Speaker 1: Taiwan by force next year? 104 00:06:58,839 --> 00:07:01,599 Speaker 2: So there's a long history of the PLA doing something 105 00:07:01,880 --> 00:07:04,479 Speaker 2: and then, in a kind of anti-fragile way, it 106 00:07:04,720 --> 00:07:07,800 Speaker 2: actually made many Taiwanese people not more worried, but 107 00:07:08,000 --> 00:07:11,800 Speaker 2: actually banding more strongly together. 
But let me be fair 108 00:07:11,840 --> 00:07:15,840 Speaker 2: as well. If you look at our stock market now, 109 00:07:15,880 --> 00:07:19,720 Speaker 2: which has tripled since I was a minister. And also 110 00:07:19,960 --> 00:07:23,480 Speaker 2: I would say people by and large now feel more 111 00:07:23,560 --> 00:07:28,080 Speaker 2: secure in the communication, or so-called grey-zone, domain. One 112 00:07:28,080 --> 00:07:31,880 Speaker 2: of the main tasks I had when I convened the 113 00:07:32,040 --> 00:07:35,800 Speaker 2: Ministry of Digital Affairs (I was the founding minister) is 114 00:07:36,040 --> 00:07:40,080 Speaker 2: basically to say, actually, if you look at their tactics 115 00:07:40,360 --> 00:07:43,480 Speaker 2: during the Pelosi visit of twenty twenty two. So. 116 00:07:43,560 --> 00:07:46,200 Speaker 1: This was, just to pause for the listeners: Pelosi 117 00:07:46,200 --> 00:07:48,600 Speaker 1: came in twenty twenty two, and this was perhaps the 118 00:07:48,640 --> 00:07:53,360 Speaker 1: greatest crisis in China-Taiwan relations for a very long time. Yes, 119 00:07:53,720 --> 00:07:57,200 Speaker 1: arguably that was a larger drill. Okay, yeah. And 120 00:07:57,360 --> 00:07:59,600 Speaker 1: at that time you were the minister? Yes, and 121 00:07:59,640 --> 00:08:03,640 Speaker 1: I was just founding the Ministry of Digital Affairs, yeah, 122 00:08:03,720 --> 00:08:07,440 Speaker 1: so you really had a frontline view of things. 
And so 123 00:08:07,720 --> 00:08:13,040 Speaker 2: before the cutting of the subsea cables, the physical attacks, 124 00:08:13,480 --> 00:08:17,520 Speaker 2: the cyber attacks, the denial of service of our websites 125 00:08:17,800 --> 00:08:21,800 Speaker 2: to make them very busy so people cannot visit, the 126 00:08:21,840 --> 00:08:26,760 Speaker 2: polarization attacks, right, creating fake accounts online to argue very 127 00:08:26,800 --> 00:08:31,679 Speaker 2: passionately about some issue, and many other attack attempts were 128 00:08:31,800 --> 00:08:34,600 Speaker 2: isolated. But in twenty twenty two, that was the first 129 00:08:34,679 --> 00:08:38,040 Speaker 2: time that they really banded together. So you would have, 130 00:08:38,280 --> 00:08:41,880 Speaker 2: for example, them doing a cyber attack to the display 131 00:08:42,040 --> 00:08:47,040 Speaker 2: signboard outside of a rail station, displaying hate messages against Pelosi, 132 00:08:47,559 --> 00:08:50,959 Speaker 2: and then you would see the rumor mill online saying, 133 00:08:51,200 --> 00:08:53,760 Speaker 2: you know, the Ministry of Transportation has been taken over, 134 00:08:54,000 --> 00:08:57,160 Speaker 2: just look at this picture. And when the journalists tried, 135 00:08:57,240 --> 00:08:59,640 Speaker 2: of course, to check the Ministry of Defense or Foreign 136 00:08:59,679 --> 00:09:03,080 Speaker 2: Affairs website, they found it very slow and could not actually 137 00:09:03,160 --> 00:09:07,439 Speaker 2: get any real information out. And in this vacuum, then, 138 00:09:07,559 --> 00:09:11,080 Speaker 2: more cyber attacks, more social engineering, and so on. 
So 139 00:09:11,200 --> 00:09:15,600 Speaker 2: it's all integrated, and so we very quickly started our 140 00:09:15,720 --> 00:09:19,360 Speaker 2: counter-responses, what we call humor over rumor, and so 141 00:09:19,520 --> 00:09:22,480 Speaker 2: in doing so, explaining that the denial of service is 142 00:09:22,520 --> 00:09:25,440 Speaker 2: actually not about taking over a website, just keeping a 143 00:09:25,440 --> 00:09:27,680 Speaker 2: line busy, anyway. So the point is that, with the 144 00:09:27,760 --> 00:09:30,960 Speaker 2: help of journalists, the stock market did not plummet the 145 00:09:31,040 --> 00:09:34,240 Speaker 2: way the attackers hoped, but rather 146 00:09:34,400 --> 00:09:38,000 Speaker 2: actually rose that day. And so I think what we're 147 00:09:38,040 --> 00:09:41,640 Speaker 2: seeing now is, after four years of this kind of 148 00:09:41,720 --> 00:09:45,839 Speaker 2: constant drill, the free penetration testing, free red teaming (normally 149 00:09:45,880 --> 00:09:47,880 Speaker 2: you have to pay for that service), and 150 00:09:47,920 --> 00:09:51,000 Speaker 2: we'll get two million, almost three million now, cyber attack 151 00:09:51,080 --> 00:09:54,160 Speaker 2: attempts every day for free. Every day? Yes, free service. 152 00:09:54,720 --> 00:09:56,760 Speaker 2: And so because of that, I think the Taiwanese 153 00:09:56,800 --> 00:10:00,000 Speaker 2: people are feeling much more resilient, that even if our 154 00:10:00,160 --> 00:10:04,400 Speaker 2: subsea cables are cut, even if a natural earthquake did disrupt 155 00:10:04,559 --> 00:10:09,600 Speaker 2: the telecom towers, we do have the low Earth orbit satellite 156 00:10:09,679 --> 00:10:12,959 Speaker 2: links now, we do have the roaming between the telecom 157 00:10:13,200 --> 00:10:16,400 Speaker 2: vendors now, so we are feeling much more resilient despite 158 00:10:16,559 --> 00:10:20,400 Speaker 2: their continuing grey-zone tactics. 
Part of the job is to 159 00:10:20,440 --> 00:10:26,400 Speaker 1: make a robust, communicative society at home so that no 160 00:10:26,440 --> 00:10:29,640 Speaker 1: matter what the attacks are, as you've just described, there 161 00:10:29,760 --> 00:10:33,400 Speaker 1: is a sense of social cohesion, which can allow 162 00:10:33,960 --> 00:10:37,240 Speaker 1: survival and flourishing. There's also geopolitics. I mean, you're an 163 00:10:37,240 --> 00:10:41,920 Speaker 1: ambassador now. Part of geopolitics is influence abroad, and I 164 00:10:42,200 --> 00:10:44,240 Speaker 1: have a feeling that part of your remit was not 165 00:10:44,559 --> 00:10:47,360 Speaker 1: doing that, or perhaps it was, I don't know. I mean, 166 00:10:47,360 --> 00:10:51,440 Speaker 1: what about the cyber campaigns in the US right now, as 167 00:10:51,480 --> 00:10:54,080 Speaker 1: Taiwan re-emerges as a hot button issue? I mean, 168 00:10:54,480 --> 00:10:57,439 Speaker 1: just last week the trade deal between Taiwan and the 169 00:10:57,520 --> 00:11:00,800 Speaker 1: US was signed. It included two hundred and fifty 170 00:11:01,200 --> 00:11:05,400 Speaker 1: billion dollars of Taiwanese investment in the US semiconductor industry. 171 00:11:06,040 --> 00:11:10,880 Speaker 1: You know, a promise that the sort of technological expertise 172 00:11:10,960 --> 00:11:14,479 Speaker 1: of TSMC in chip manufacturing, the intellectual 173 00:11:14,480 --> 00:11:16,800 Speaker 1: currency, would essentially be exported to the US so that they 174 00:11:16,840 --> 00:11:20,240 Speaker 1: can create just as good chips in the US, and of course, 175 00:11:20,240 --> 00:11:22,800 Speaker 1: on the other side, defense spending into Taiwan. 176 00:11:23,000 --> 00:11:25,800 Speaker 2: We also work with Japan and Germany on expanding the 177 00:11:25,840 --> 00:11:27,199 Speaker 2: TSMC manufacturing. 
178 00:11:27,280 --> 00:11:30,000 Speaker 1: Do you worry that TSMC is a kind of 179 00:11:30,600 --> 00:11:34,440 Speaker 1: important guarantee of Taiwan's security because of the access to 180 00:11:34,520 --> 00:11:38,240 Speaker 1: chips by Western powers, and that if it is exported, the Western 181 00:11:38,240 --> 00:11:40,960 Speaker 1: powers may be less interested in guaranteeing Taiwan's security? 182 00:11:42,320 --> 00:11:47,160 Speaker 2: Well, the thing about existential bridging is that Taiwan needs 183 00:11:47,200 --> 00:11:51,960 Speaker 2: to be indispensable to the modern way of living, which 184 00:11:52,000 --> 00:11:56,280 Speaker 2: involves chips and also bubble tea, but that's another topic. And 185 00:11:56,320 --> 00:12:00,240 Speaker 2: what we're saying essentially is that as long as the 186 00:12:00,280 --> 00:12:05,480 Speaker 2: people abroad trust TSMC and the supply chain around the 187 00:12:05,559 --> 00:12:09,000 Speaker 2: chips, and also like the SEMI E187, 188 00:12:09,240 --> 00:12:13,560 Speaker 2: which is the cybersecurity zero trust network standard that we 189 00:12:13,760 --> 00:12:17,640 Speaker 2: in the digital ministry developed with TSMC, as long 190 00:12:17,679 --> 00:12:23,319 Speaker 2: as the world depends on this bedrock of trustworthy technology, 191 00:12:23,559 --> 00:12:27,240 Speaker 2: then we are safer. Because no democracy is an island, 192 00:12:27,559 --> 00:12:32,040 Speaker 2: not even Taiwan. So even if Taiwan has the best manufacturing, 193 00:12:32,559 --> 00:12:37,360 Speaker 2: we depolarize our society, we were independently ranked as number one 194 00:12:37,440 --> 00:12:41,520 Speaker 2: or two, the most democratic and free in Asia. 
It 195 00:12:41,600 --> 00:12:47,280 Speaker 2: is not sufficient if our democratic allies experience democratic backsliding. 196 00:12:47,640 --> 00:12:50,319 Speaker 2: If they do not come to see that the freedom 197 00:12:50,400 --> 00:12:53,439 Speaker 2: of the press and freedom of speech and freedom of association 198 00:12:53,880 --> 00:12:57,679 Speaker 2: is very important, and that the supply chain that guarantees these 199 00:12:57,840 --> 00:13:01,680 Speaker 2: kinds of qualities, like protecting whistleblowers in the 200 00:13:01,760 --> 00:13:05,800 Speaker 2: Taiwan ecosystem, is not important, then Taiwan is in real 201 00:13:05,880 --> 00:13:10,280 Speaker 2: danger, because the authoritarian narrative, namely that democracies do not 202 00:13:10,360 --> 00:13:13,520 Speaker 2: deliver and only lead to chaos, would be gaining the 203 00:13:13,600 --> 00:13:16,760 Speaker 2: upper hand. So my work as Cyber Ambassador is going 204 00:13:16,800 --> 00:13:20,840 Speaker 2: around the world and showing people that radical freedom of 205 00:13:20,880 --> 00:13:25,320 Speaker 2: speech and expression do not need to lead to polarization 206 00:13:25,440 --> 00:13:30,960 Speaker 2: and chaos, and that incorporating TSMC and the cybersecurity posture, assuming 207 00:13:31,040 --> 00:13:35,640 Speaker 2: breach, trusting no single vendor, radical plurality and so on, 208 00:13:36,040 --> 00:13:40,320 Speaker 2: is actually a better playbook, especially in the cyber offense 209 00:13:40,400 --> 00:13:43,840 Speaker 2: dominant window of the next few years, than, you know, just 210 00:13:43,920 --> 00:13:47,360 Speaker 2: simply saying, oh, let's not depend on anyone, including Taiwan. 211 00:13:47,360 --> 00:13:48,800 Speaker 1: I mean, when I think about an ambassador, I think 212 00:13:48,800 --> 00:13:54,199 Speaker 1: about somebody who goes around trying to bring other countries 213 00:13:54,240 --> 00:13:56,480 Speaker 1: to their point of view. 
So is a big part 214 00:13:56,480 --> 00:13:59,680 Speaker 1: of your mandate making sure the international support for an independent 215 00:13:59,679 --> 00:14:00,840 Speaker 1: Taiwan continues? 216 00:14:02,840 --> 00:14:08,240 Speaker 2: Yeah, cyber comes from the Greek kybernetes, which means somebody 217 00:14:08,280 --> 00:14:13,600 Speaker 2: who steers, right? And so my job really is to 218 00:14:13,679 --> 00:14:19,160 Speaker 2: spread the Taiwanese way of steering through the transformative changes 219 00:14:19,480 --> 00:14:22,440 Speaker 2: caused by the Internet. Last year, I went to twenty 220 00:14:22,520 --> 00:14:26,560 Speaker 2: eight democratic countries, so switching time zones every five days 221 00:14:26,600 --> 00:14:29,640 Speaker 2: on average, and they're all asking the same question: how 222 00:14:29,680 --> 00:14:34,760 Speaker 2: do we move beyond this very damaging political violence caused 223 00:14:35,160 --> 00:14:39,200 Speaker 2: by this online polarization? How can we tap this kind of 224 00:14:39,240 --> 00:14:43,360 Speaker 2: populist energy into something more popular, and something 225 00:14:43,400 --> 00:15:01,040 Speaker 2: that turns polarization into plurality, or technology that fosters pluralism? 226 00:15:01,080 --> 00:15:03,480 Speaker 1: After the break, how Audrey got rid of deep 227 00:15:03,520 --> 00:15:07,120 Speaker 1: fake scams in Taiwan, and how she's working to spread 228 00:15:07,360 --> 00:15:26,160 Speaker 1: digital democracy around the world. Stay with us. And so, 229 00:15:27,320 --> 00:15:33,440 Speaker 1: most people's experience of the digital realm entering the government 230 00:15:33,480 --> 00:15:36,880 Speaker 1: realm in the US is DOGE. What was similar and what 231 00:15:36,960 --> 00:15:40,160 Speaker 1: was different about your efforts in Taiwan and DOGE's efforts 232 00:15:40,200 --> 00:15:40,880 Speaker 1: in the US? 
233 00:15:42,080 --> 00:15:45,440 Speaker 2: Well, we're like the OG DOGE, in that we 234 00:15:45,560 --> 00:15:49,080 Speaker 2: did use the Shiba Inu, a real Shiba Inu, as the 235 00:15:49,160 --> 00:15:55,120 Speaker 2: spokesdog during the pandemic times, way before DOGE. And I 236 00:15:55,160 --> 00:15:59,560 Speaker 2: remember the meme, you know. Yes, but we had a 237 00:15:59,760 --> 00:16:04,320 Speaker 2: live Shiba Inu who literally lived with the Participation Officer 238 00:16:04,560 --> 00:16:07,440 Speaker 2: of the Ministry of Health and Welfare, so part of 239 00:16:07,480 --> 00:16:11,560 Speaker 2: our network. So early in twenty twenty, when we adopted 240 00:16:11,600 --> 00:16:16,920 Speaker 2: the humor over rumor counter-infodemic strategy, I remember very 241 00:16:17,040 --> 00:16:21,720 Speaker 2: vividly that people were arguing, one side, that masks 242 00:16:21,800 --> 00:16:24,600 Speaker 2: are not useful, any kind of mask hurts you, and 243 00:16:24,800 --> 00:16:27,560 Speaker 2: N ninety five, the highest grade mask, hurts you the most, 244 00:16:27,880 --> 00:16:30,560 Speaker 2: and the other side says no, only N ninety five 245 00:16:30,600 --> 00:16:33,560 Speaker 2: protects you, and the medical grade masks that we were 246 00:16:33,680 --> 00:16:37,520 Speaker 2: rationing out, well, these are placebo. And so these really 247 00:16:37,600 --> 00:16:41,040 Speaker 2: threatened to fracture our society. And in less than twenty 248 00:16:41,080 --> 00:16:44,600 Speaker 2: four hours we rolled out this very funny meme where 249 00:16:44,640 --> 00:16:47,600 Speaker 2: the Shiba Inu puts her paw to her mouth saying, 250 00:16:47,960 --> 00:16:51,520 Speaker 2: wear a mask to protect your own face 251 00:16:51,800 --> 00:16:55,800 Speaker 2: from your own dirty, unwashed hand. That's very scientific. Nobody could dispute 252 00:16:56,000 --> 00:16:59,240 Speaker 2: that it works as personal protection from yourself. 
And it 253 00:16:59,400 --> 00:17:02,080 Speaker 2: also means, if I wear a mask and you don't 254 00:17:02,120 --> 00:17:05,320 Speaker 2: like wearing masks, I'm just reminding you to wash your hands. 255 00:17:05,520 --> 00:17:08,199 Speaker 2: What's the big deal? It was so funny, so it 256 00:17:08,240 --> 00:17:12,439 Speaker 2: went viral, and we depolarized the conversation around masks. We 257 00:17:12,480 --> 00:17:16,800 Speaker 2: would do the same around vaccination, around contact tracing and 258 00:17:16,840 --> 00:17:19,480 Speaker 2: so on, and so we reported one of the best 259 00:17:19,680 --> 00:17:23,879 Speaker 2: counter-epidemic and counter-infodemic results. So in that sense, in 260 00:17:23,960 --> 00:17:27,399 Speaker 2: kind of setting the memetic narrative, I guess it's a 261 00:17:27,400 --> 00:17:30,680 Speaker 2: little bit like the DOGE in the US. I think 262 00:17:30,720 --> 00:17:34,840 Speaker 2: our main difference was that I always pre-announced every move, 263 00:17:35,160 --> 00:17:38,520 Speaker 2: and I built very strong alliances with the bureaucracy. It 264 00:17:38,560 --> 00:17:42,119 Speaker 2: turns out the section chiefs already have plenty of answers, 265 00:17:42,400 --> 00:17:45,439 Speaker 2: they just need air cover from ministers. 
And so I 266 00:17:45,560 --> 00:17:49,360 Speaker 2: became this non-coercive, not top-down, but bottom-up, 267 00:17:49,720 --> 00:17:54,840 Speaker 2: or middle-management, middle-out way of running a government's digitalization, 268 00:17:55,240 --> 00:17:58,280 Speaker 2: which is, of course, I guess, not as quick, not 269 00:17:58,400 --> 00:18:02,320 Speaker 2: as fast as the chainsaw, but I would say also 270 00:18:02,440 --> 00:18:06,480 Speaker 2: much more lasting, so that the programs we did become institutions, 271 00:18:06,520 --> 00:18:10,560 Speaker 2: become regulations, and the bureaucracy, the career public servants, really 272 00:18:10,600 --> 00:18:12,760 Speaker 2: do see it that they own the work. 273 00:18:13,320 --> 00:18:16,240 Speaker 1: You used the phrase non-coercive, and I think you also 274 00:18:16,280 --> 00:18:19,640 Speaker 1: mentioned there were no lockdowns in Taiwan during 275 00:18:19,680 --> 00:18:24,000 Speaker 1: the COVID pandemic. But there is a role for coercive 276 00:18:24,040 --> 00:18:25,560 Speaker 1: control in guaranteeing freedom. 277 00:18:25,560 --> 00:18:30,440 Speaker 2: I mean, for example, TikTok is banned in Taiwan. TikTok 278 00:18:30,640 --> 00:18:33,960 Speaker 2: is only banned in the public sector, so no government 279 00:18:34,000 --> 00:18:37,400 Speaker 2: employees can use it. That's right. But it's not just TikTok, 280 00:18:37,720 --> 00:18:41,239 Speaker 2: it's also Weibo, WeChat, the DeepSeek web servers, and 281 00:18:41,280 --> 00:18:45,720 Speaker 2: many others. But that's just basic cybersecurity hygiene. 
And we 282 00:18:45,880 --> 00:18:50,080 Speaker 2: also, additionally, of course warn our people that TikTok is 283 00:18:50,320 --> 00:18:55,320 Speaker 2: de facto controlled by the Beijing regime, and that 284 00:18:55,440 --> 00:19:00,320 Speaker 2: it must also agree to our crowdsourced law against deep 285 00:19:00,320 --> 00:19:03,040 Speaker 2: fake fraud and scams and things like that, so that 286 00:19:03,080 --> 00:19:07,879 Speaker 2: if they enable these unsigned advertisements and it causes scam 287 00:19:07,960 --> 00:19:11,359 Speaker 2: damage, for example, they need to be jointly liable for 288 00:19:11,680 --> 00:19:15,960 Speaker 2: all the scam damage from unsigned advertisements. And if they 289 00:19:15,960 --> 00:19:19,359 Speaker 2: don't agree, we can also throttle the connection to their videos. 290 00:19:19,560 --> 00:19:22,080 Speaker 2: So we're not about censoring content or anything. But 291 00:19:22,160 --> 00:19:25,000 Speaker 2: we of course, as you said, want to uphold the 292 00:19:25,040 --> 00:19:28,280 Speaker 2: freedom of expression and commerce online and not have it taken over 293 00:19:28,400 --> 00:19:29,560 Speaker 2: by foreign robots. 294 00:19:30,000 --> 00:19:31,840 Speaker 1: But as you look around, I mean, this is the 295 00:19:31,920 --> 00:19:34,159 Speaker 1: year when social media bans have come into effect, for example 296 00:19:34,160 --> 00:19:38,080 Speaker 1: in Australia, with so much concern by parents about social media 297 00:19:38,119 --> 00:19:41,359 Speaker 1: pushing their children into suicide, frankly. What do you think 298 00:19:41,400 --> 00:19:43,359 Speaker 1: when you see a policy like banning social media for 299 00:19:43,400 --> 00:19:45,000 Speaker 1: under-sixteens in Australia? 
300 00:19:45,119 --> 00:19:48,320 Speaker 2: Well, first of all, if it's done using what's called 301 00:19:48,320 --> 00:19:54,080 Speaker 2: the age signal, that does not compromise privacy, without somebody 302 00:19:54,119 --> 00:19:57,920 Speaker 2: doxxing themselves just to prove that they're over sixteen, then 303 00:19:58,040 --> 00:20:03,720 Speaker 2: technologically this is called meronymity, partial anonymity, and this infrastructure 304 00:20:03,880 --> 00:20:07,199 Speaker 2: is important. That was one of the flagship projects that 305 00:20:07,240 --> 00:20:11,400 Speaker 2: I launched as Digital Minister. So already today in Taiwan, 306 00:20:11,600 --> 00:20:15,880 Speaker 2: people can use the attestation from their telecom, or their 307 00:20:15,920 --> 00:20:19,159 Speaker 2: local school, or from their local temple and church to, 308 00:20:19,400 --> 00:20:22,960 Speaker 2: for example, collect their shipments at the local convenience stores, 309 00:20:23,400 --> 00:20:26,840 Speaker 2: all without reporting back to the government, or even the 310 00:20:26,880 --> 00:20:30,080 Speaker 2: government knowing anything about it, because it's just a private 311 00:20:30,119 --> 00:20:34,200 Speaker 2: sector credential used by a social sector player. And they 312 00:20:34,240 --> 00:20:38,080 Speaker 2: can reveal, for example, that I'm a New Taipei 313 00:20:38,080 --> 00:20:42,280 Speaker 2: City resident without doxxing my own address, that I'm between 314 00:20:42,760 --> 00:20:46,679 Speaker 2: eighteen and thirty five without doxxing my birth year, and 315 00:20:46,720 --> 00:20:49,479 Speaker 2: so on. And so this is important because we are 316 00:20:49,560 --> 00:20:54,080 Speaker 2: now entering an age where super persuasion is available as 317 00:20:54,200 --> 00:20:58,840 Speaker 2: a general product. 
So anybody, across language and culture differences, 318 00:20:59,080 --> 00:21:02,600 Speaker 2: can manipulate anyone in a way that destroys 319 00:21:02,600 --> 00:21:04,920 Speaker 2: the fabric of trust. So we do need to prove 320 00:21:05,119 --> 00:21:08,280 Speaker 2: we're not automated robots, but if in doing so 321 00:21:08,440 --> 00:21:11,560 Speaker 2: we dox ourselves, then that's a bad ending, right? So 322 00:21:11,720 --> 00:21:15,240 Speaker 2: I think at least in the age signal case, Australia 323 00:21:15,320 --> 00:21:19,320 Speaker 2: is pioneering the use of partial anonymity. But in Taiwan, of course, 324 00:21:19,359 --> 00:21:22,520 Speaker 2: with people younger than eighteen we're not just banning 325 00:21:22,600 --> 00:21:27,800 Speaker 2: things but also crowdsourcing ideas from our reverse mentors. Many 326 00:21:27,840 --> 00:21:30,560 Speaker 2: of my reverse mentors are eighteen or even younger. 327 00:21:30,600 --> 00:21:32,160 Speaker 1: So you have your own reverse mentors now? 328 00:21:32,440 --> 00:21:36,800 Speaker 2: Yes. When I was Digital Minister, I was only forty one, right? 329 00:21:36,880 --> 00:21:40,280 Speaker 2: So many of my reverse mentors are like way below 330 00:21:40,400 --> 00:21:44,200 Speaker 2: thirty five. They include people who petitioned for real systemic 331 00:21:44,280 --> 00:21:48,840 Speaker 2: change in Taiwan, like banning plastic straws for bubble tea 332 00:21:48,920 --> 00:21:53,959 Speaker 2: takeout, or crowdfunding the Menstruation Museum and removing the taboo 333 00:21:54,000 --> 00:21:57,000 Speaker 2: about periods in just two years' time, and many others. But 334 00:21:57,080 --> 00:22:01,359 Speaker 2: one of my reverse mentors was the main visionary, 335 00:22:01,680 --> 00:22:05,840 Speaker 2: the imagineer, of the decentralized ID wallet project, and so I 336 00:22:05,920 --> 00:22:10,239 Speaker 2: learned a lot about zero knowledge, about interoperability.
And so 337 00:22:10,280 --> 00:22:13,359 Speaker 2: for social media, what we advocate for, thanks to 338 00:22:13,480 --> 00:22:17,119 Speaker 2: Mashbean, my reverse mentor, is that we need 339 00:22:17,160 --> 00:22:21,639 Speaker 2: to mandate the off-ramps of the information highways. It 340 00:22:21,720 --> 00:22:25,240 Speaker 2: turns out, I learned, that in the US a couple 341 00:22:25,359 --> 00:22:29,000 Speaker 2: of years ago there was a study: an undergrad using TikTok 342 00:22:29,119 --> 00:22:33,000 Speaker 2: on average would not quit TikTok until you paid them 343 00:22:33,280 --> 00:22:36,240 Speaker 2: forty five dollars a month. So they lose that much 344 00:22:36,359 --> 00:22:40,679 Speaker 2: utility if they quit TikTok. However, if there's a 345 00:22:40,760 --> 00:22:43,760 Speaker 2: magic button they can press to move everybody they know 346 00:22:43,920 --> 00:22:47,200 Speaker 2: off TikTok together somewhere else, then they're willing to pay 347 00:22:47,240 --> 00:22:49,560 Speaker 2: you thirty dollars per month. 348 00:22:49,720 --> 00:22:50,000 Speaker 1: Wow. 349 00:22:50,080 --> 00:22:53,240 Speaker 2: So there's a market here for a subscription, and it's 350 00:22:53,240 --> 00:22:56,639 Speaker 2: a product-market trap, because everybody feels trapped, but the 351 00:22:56,720 --> 00:23:00,359 Speaker 2: first person to move out loses the most. So nobody moves out. 352 00:23:00,560 --> 00:23:03,800 Speaker 2: So the solution is actually quite simple, then. The state 353 00:23:03,840 --> 00:23:07,600 Speaker 2: of Utah in the US, Governor Cox signed into law, 354 00:23:07,920 --> 00:23:11,320 Speaker 2: taking effect this July, the Digital Choice Act. 355 00:23:11,640 --> 00:23:14,239 Speaker 2: So if you're a Utah citizen starting July and you want 356 00:23:14,320 --> 00:23:18,600 Speaker 2: to move from TikTok to Bluesky or to Mastodon, 357 00:23:18,800 --> 00:23:22,160 Speaker 2: both are open source.
The old network is then legally 358 00:23:22,240 --> 00:23:27,359 Speaker 2: required to forward new likes, new followers, new reactions into 359 00:23:27,400 --> 00:23:30,479 Speaker 2: your new network, and so the networks will then have 360 00:23:30,600 --> 00:23:35,240 Speaker 2: to compete. Your experience must be better every month. Otherwise 361 00:23:35,359 --> 00:23:39,760 Speaker 2: you can move away and gain thirty dollars' worth of utility 362 00:23:39,920 --> 00:23:44,560 Speaker 2: without actually having your community held hostage. So 363 00:23:44,640 --> 00:23:47,640 Speaker 2: I think this kind of radical interoperability, which we also 364 00:23:47,680 --> 00:23:51,280 Speaker 2: see in Europe with instant messaging under the Digital Markets Act, 365 00:23:51,560 --> 00:23:55,080 Speaker 2: is really one of the more promising solutions to the 366 00:23:55,160 --> 00:23:59,080 Speaker 2: lock-in and to what Cory Doctorow calls the enshittification of 367 00:23:59,160 --> 00:24:00,600 Speaker 2: the social media experience. 368 00:24:00,760 --> 00:24:02,320 Speaker 1: And you were actually part of a group that was 369 00:24:02,320 --> 00:24:05,320 Speaker 1: trying to buy the US version of TikTok with 370 00:24:05,320 --> 00:24:08,560 Speaker 1: the Project Liberty Institute, and that didn't in the 371 00:24:08,640 --> 00:24:11,120 Speaker 1: end go through, right? It went to the Ellisons instead. Well, 372 00:24:11,160 --> 00:24:14,520 Speaker 2: in a sense, what we're building is the digital infrastructure 373 00:24:14,800 --> 00:24:18,200 Speaker 2: that would enable laws like Utah's, and now also 374 00:24:18,280 --> 00:24:22,600 Speaker 2: under consideration by North Carolina and Vermont and 375 00:24:22,720 --> 00:24:26,320 Speaker 2: New York. So it's becoming a movement across the red 376 00:24:26,320 --> 00:24:29,879 Speaker 2: and blue states.
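The collective-action trap Audrey describes, quitting alone costs you $45 a month of network value while moving everyone together is worth $30 more, can be made concrete with a toy payoff model. The linear scaling of network value with the fraction of peers present is an illustrative assumption, not the study's actual model.

```python
def utility(network_value: float, peers_present: float,
            quality_bonus: float = 0.0) -> float:
    """Monthly utility: network value scales with the fraction of your
    peers on the same network, plus any product-quality bonus."""
    return network_value * peers_present + quality_bonus

# Figures quoted in the conversation: $45/month value of staying,
# $30/month extra utility if your whole network moves with you.
OLD_VALUE, NEW_BONUS = 45.0, 30.0

# Move alone: peers stay behind, so the network value collapses.
alone = utility(OLD_VALUE, peers_present=0.0, quality_bonus=NEW_BONUS)
# "Magic button" (or Digital Choice Act forwarding): everyone you know
# effectively comes with you.
together = utility(OLD_VALUE, peers_present=1.0, quality_bonus=NEW_BONUS)
# Status quo: everyone stays put.
staying = utility(OLD_VALUE, peers_present=1.0)

# 30 < 45 < 75: unilateral exit loses, coordinated exit wins, so without
# mandated interoperability nobody moves and everyone stays trapped.
assert alone < staying < together
```

This is why mandated forwarding changes the game: it makes `peers_present` on the new network effectively 1.0 for a lone switcher, removing the first-mover penalty.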
And so if the new TikTok, the 377 00:24:30,000 --> 00:24:33,800 Speaker 2: US based TikTok, wants to comply with those state laws, 378 00:24:33,960 --> 00:24:36,960 Speaker 2: then they will need the infrastructure that we're building in 379 00:24:37,040 --> 00:24:44,200 Speaker 1: Project Liberty Institute. So you've also worked with Governor Newsom 380 00:24:44,240 --> 00:24:48,040 Speaker 1: in California, you worked with Yasmin Green of 381 00:24:48,200 --> 00:24:52,680 Speaker 1: Google Jigsaw in Bowling Green, Kentucky. In terms of bringing 382 00:24:53,640 --> 00:24:58,639 Speaker 1: the sort of Taiwanese digitally enabled democracy approach to problem 383 00:24:58,720 --> 00:25:03,080 Speaker 1: solving in other places, what's the project you're most proud 384 00:25:03,080 --> 00:25:05,080 Speaker 1: of outside of Taiwan that you've been involved with? 385 00:25:05,880 --> 00:25:11,439 Speaker 2: It's hard to compare jurisdictions, especially as an ambassador, but 386 00:25:11,600 --> 00:25:13,560 Speaker 2: I can share the most recent ones. 387 00:25:13,640 --> 00:25:16,080 Speaker 1: Yes, good answer. 388 00:25:16,320 --> 00:25:20,719 Speaker 2: Recently I visited Japan and talked to the LDP leadership, 389 00:25:21,280 --> 00:25:24,240 Speaker 2: who are enjoying a very high approval rating at the moment, 390 00:25:24,880 --> 00:25:27,280 Speaker 2: and they look at Taiwan and look at how we 391 00:25:27,440 --> 00:25:31,200 Speaker 2: deal with big tech like Facebook when it comes to compliance. 392 00:25:31,640 --> 00:25:35,359 Speaker 2: And then they discovered that if you scroll in Japan 393 00:25:35,920 --> 00:25:39,560 Speaker 2: on Facebook or really any other social media, you see 394 00:25:39,600 --> 00:25:43,320 Speaker 2: a lot of those deep fake scams. But in Taiwan 395 00:25:43,760 --> 00:25:47,000 Speaker 2: we used to see those two years ago.
When I scrolled, 396 00:25:47,280 --> 00:25:50,399 Speaker 2: I would see Jensen Huang, the Taiwanese NVIDIA CEO, 397 00:25:51,200 --> 00:25:56,200 Speaker 2: his face promising free cryptocurrency investments or whatever. If I click, 398 00:25:56,320 --> 00:25:58,960 Speaker 2: Jensen actually talks to me, sounds just like him. Of 399 00:25:59,040 --> 00:26:01,840 Speaker 2: course it's not him, it is some deepfake running on 400 00:26:01,920 --> 00:26:05,720 Speaker 2: NVIDIA GPUs. However, in Taiwan we don't have that anymore. 401 00:26:05,880 --> 00:26:09,560 Speaker 2: Throughout the last year, there are simply no deepfake scam ads 402 00:26:09,600 --> 00:26:13,040 Speaker 2: anymore on social media, but there's a growing amount of 403 00:26:13,080 --> 00:26:16,880 Speaker 2: that in Japan. There was a Reuters exposé a few 404 00:26:16,920 --> 00:26:20,960 Speaker 2: weeks ago that talks about how Facebook reroutes the ads 405 00:26:21,240 --> 00:26:25,080 Speaker 2: from Taiwan, where they cannot pass the Know Your Customer, 406 00:26:25,280 --> 00:26:28,840 Speaker 2: the KYC measures that we put in, into 407 00:26:29,000 --> 00:26:32,639 Speaker 2: our nearby jurisdictions.
And the reason why Taiwan didn't have 408 00:26:32,760 --> 00:26:36,560 Speaker 2: these scams was that in twenty twenty four, I, as 409 00:26:36,600 --> 00:26:40,480 Speaker 2: Digital Minister, sent a text message from the government 410 00:26:40,720 --> 00:26:44,040 Speaker 2: number one one one to two hundred thousand random numbers 411 00:26:44,080 --> 00:26:47,919 Speaker 2: around Taiwan, saying: what to do about the deepfake scams online? 412 00:26:47,960 --> 00:26:51,359 Speaker 2: Give us your ideas. And then they gave us their ideas, 413 00:26:51,400 --> 00:26:55,880 Speaker 2: and thousands volunteered to join the online citizen assembly, and 414 00:26:56,240 --> 00:26:59,560 Speaker 2: we chose four hundred and forty seven people, statistically a 415 00:26:59,600 --> 00:27:03,680 Speaker 2: microcosm of Taiwan society, and they debated in rooms 416 00:27:03,720 --> 00:27:06,640 Speaker 2: of ten, each room coming up with very good ideas, 417 00:27:06,760 --> 00:27:10,960 Speaker 2: like mandating the display of "probably a scam", like cigarette labels, 418 00:27:11,160 --> 00:27:13,960 Speaker 2: until somebody digitally signs, and then we can take that 419 00:27:14,080 --> 00:27:17,040 Speaker 2: label down. Another room said, if people lose seven million 420 00:27:17,040 --> 00:27:19,800 Speaker 2: dollars to an unsigned scam that Facebook did not take down, 421 00:27:19,920 --> 00:27:23,040 Speaker 2: Facebook should be liable for the seven million damage. Another 422 00:27:23,119 --> 00:27:26,400 Speaker 2: said TikTok, if they don't agree to set up a local representative, 423 00:27:26,640 --> 00:27:29,119 Speaker 2: we can dial down the speed of their videos, and 424 00:27:29,200 --> 00:27:31,960 Speaker 2: so on.
And so these became law, and because this 425 00:27:32,040 --> 00:27:35,360 Speaker 2: was not a ministerial position, it was agreed by more 426 00:27:35,400 --> 00:27:38,399 Speaker 2: than eighty five percent of this mini-public, and the 427 00:27:38,440 --> 00:27:41,560 Speaker 2: other fifteen percent could live with it. So all 428 00:27:41,640 --> 00:27:45,240 Speaker 2: three parties in our parliament at the time fast-tracked this legislation. 429 00:27:45,560 --> 00:27:48,399 Speaker 2: So in just a couple of months everything was passed, and 430 00:27:48,440 --> 00:27:51,960 Speaker 2: then throughout twenty twenty five there are just no deepfake scams anymore. 431 00:27:52,200 --> 00:27:57,919 Speaker 1: Oh, that's remarkable. There's a neo-Luddite movement emerging in 432 00:27:57,960 --> 00:28:01,760 Speaker 1: the US, people smashing Waymos, people smashing iPhones, or having 433 00:28:01,840 --> 00:28:04,720 Speaker 1: public executions of the iPhone, I think in New York 434 00:28:04,800 --> 00:28:07,080 Speaker 1: not too long ago. What would you say? Because I 435 00:28:07,080 --> 00:28:10,439 Speaker 1: think you're somebody who loves technology. But many people, I 436 00:28:10,480 --> 00:28:13,560 Speaker 1: think, feel now like their lives might be better off 437 00:28:13,680 --> 00:28:16,480 Speaker 1: if the digital revolution had never happened. What do you say 438 00:28:16,480 --> 00:28:16,919 Speaker 1: to them?
439 00:28:17,200 --> 00:28:21,080 Speaker 2: So, as I mentioned, if, as in Japan, people feel 440 00:28:21,400 --> 00:28:25,720 Speaker 2: that the measures they put in, to put very reasonable 441 00:28:25,960 --> 00:28:29,440 Speaker 2: red lines around big tech, are not having an effect, 442 00:28:30,040 --> 00:28:33,120 Speaker 2: then my job as cyber ambassador is to help them 443 00:28:33,160 --> 00:28:37,040 Speaker 2: run a similar process so the people can really steer, 444 00:28:37,640 --> 00:28:41,560 Speaker 2: because a very dangerous narrative is that we only have 445 00:28:41,640 --> 00:28:47,040 Speaker 2: two choices: essentially accelerating, or pushing the brake, that 446 00:28:47,160 --> 00:28:50,280 Speaker 2: is to say, the neo-Luddite movement of stopping 447 00:28:50,280 --> 00:28:53,720 Speaker 2: everything, stop AI and so on. But if there's a 448 00:28:53,800 --> 00:28:57,920 Speaker 2: car with only these two control levers, that's 449 00:28:58,000 --> 00:29:02,160 Speaker 2: not a car, right? And if you just accelerate, if 450 00:29:02,160 --> 00:29:04,880 Speaker 2: you lose even the brakes, then of course it falls 451 00:29:04,920 --> 00:29:08,400 Speaker 2: off a cliff. That's very fast acceleration. But then there's 452 00:29:08,440 --> 00:29:11,400 Speaker 2: no steering at all. And people would feel that if 453 00:29:11,400 --> 00:29:15,000 Speaker 2: we take not the human in the loop of AI, 454 00:29:15,240 --> 00:29:18,840 Speaker 2: as a spinning hamster wheel, but rather take the AI 455 00:29:19,360 --> 00:29:23,880 Speaker 2: into the loop of humanity, of communities, then, just as 456 00:29:23,920 --> 00:29:27,760 Speaker 2: our curriculum reform in twenty nineteen showed, it's just 457 00:29:28,040 --> 00:29:32,200 Speaker 2: a partner, an assistive intelligence, AI that helps us to 458 00:29:32,360 --> 00:29:37,800 Speaker 2: exercise our curiosity, our collaboration, our civic care.
And so 459 00:29:37,840 --> 00:29:41,280 Speaker 2: the civic AI project that I have in Oxford is 460 00:29:41,320 --> 00:29:45,440 Speaker 2: about training AI in the service of communities. And if 461 00:29:45,440 --> 00:29:48,760 Speaker 2: we lose that, then of course the communities are feeling 462 00:29:48,960 --> 00:29:54,080 Speaker 2: that they're being torn apart by artificial intelligence that's authoritarian, 463 00:29:54,280 --> 00:29:57,160 Speaker 2: that's accelerating, and that's falling off a cliff. 464 00:29:57,720 --> 00:30:01,600 Speaker 1: But do you really believe that the genie has not 465 00:30:01,720 --> 00:30:03,800 Speaker 1: yet left the bottle? And we're hearing so much now 466 00:30:03,800 --> 00:30:08,640 Speaker 1: about the emergent properties of AI: to preserve itself at all 467 00:30:08,800 --> 00:30:13,000 Speaker 1: costs, to, in red-teaming scenarios, blackmail humans to 468 00:30:13,080 --> 00:30:17,800 Speaker 1: keep itself on. How much confidence do you have that 469 00:30:18,320 --> 00:30:21,200 Speaker 1: we still have the ability to control this technology we've created, 470 00:30:21,960 --> 00:30:24,040 Speaker 1: and how long will that control last? And what will 471 00:30:24,040 --> 00:30:25,600 Speaker 1: happen if we don't do it well? 472 00:30:25,640 --> 00:30:30,240 Speaker 2: First of all, I think we are not yet fully 473 00:30:30,360 --> 00:30:36,040 Speaker 2: understanding what's happening in the Transformer models, because the architecture 474 00:30:36,240 --> 00:30:40,640 Speaker 2: is very opaque. That is to say, every token that 475 00:30:40,720 --> 00:30:44,840 Speaker 2: they produce involves every other token in the context, and 476 00:30:44,880 --> 00:30:50,040 Speaker 2: this quadratic growth makes it extremely difficult to reverse engineer what's really 477 00:30:50,080 --> 00:30:53,240 Speaker 2: going on.
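Audrey's "quadratic" point can be illustrated with back-of-envelope counting. This is an illustrative sketch of why attention is hard to audit, not the actual Power Retention mathematics mentioned later: in attention, token t interacts with all t preceding tokens, so total pairwise interactions grow as n(n+1)/2, while a recurrent or retention-style model touches only a fixed-size state per token, linear in sequence length.

```python
# Why Transformer introspection is hard: producing token t attends to
# all t tokens seen so far, so total pairwise interactions grow
# quadratically with context length, while a recurrent-style model
# updates a fixed-size state once per token.

def attention_interactions(n_tokens: int) -> int:
    """Token t interacts with tokens 1..t: n(n+1)/2 pairs overall."""
    return sum(t for t in range(1, n_tokens + 1))

def recurrent_interactions(n_tokens: int, state_size: int = 1) -> int:
    """Each token performs one fixed-size state update: linear in n."""
    return n_tokens * state_size

for n in (1_000, 10_000):
    quad = attention_interactions(n)
    lin = recurrent_interactions(n)
    print(f"n={n:>6}: attention={quad:>12,}  recurrent={lin:>6,}")

# 10x more context -> ~100x more pairwise interactions to account for.
assert attention_interactions(10_000) == 10_000 * 10_001 // 2
```

The gap is the interpretability argument in miniature: an auditor tracing a quadratic web of interactions has far more to explain than one replaying a linear chain of state updates.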
So if you ask the wooden horse, do 478 00:30:53,360 --> 00:30:57,680 Speaker 2: you have Greek soldiers inside the horse, there's some introspection, 479 00:30:58,320 --> 00:31:01,400 Speaker 2: a chain of thought, and it says, there's nothing inside, I 480 00:31:01,440 --> 00:31:04,760 Speaker 2: am a very safe horse. But it doesn't actually 481 00:31:04,800 --> 00:31:09,000 Speaker 2: know, and nobody actually knows. We know there are some general circuits. 482 00:31:09,240 --> 00:31:11,800 Speaker 2: We know the horses that are evil and the horses 483 00:31:11,880 --> 00:31:15,400 Speaker 2: that think themselves good. Generally you can look at 484 00:31:15,680 --> 00:31:19,360 Speaker 2: kind of their eyes and see some vectors, but nobody 485 00:31:19,480 --> 00:31:23,440 Speaker 2: is fully sure. Take the freely available horse of DeepSeek. 486 00:31:23,680 --> 00:31:27,920 Speaker 2: According to the CrowdStrike analysis, if you say to the 487 00:31:27,960 --> 00:31:32,480 Speaker 2: DeepSeek horse that you love Xi Jinping and ask it to 488 00:31:32,600 --> 00:31:35,959 Speaker 2: design a website, it designs a very good website, very secure. 489 00:31:36,280 --> 00:31:38,800 Speaker 2: But if you tell it I love Falun Gong and 490 00:31:38,960 --> 00:31:42,800 Speaker 2: design the same website, the website is full of security holes. 491 00:31:44,200 --> 00:31:48,320 Speaker 2: So that's an emergent property of loyalty.
And so I 492 00:31:48,360 --> 00:31:50,840 Speaker 2: think the point I'm making is that this is not 493 00:31:50,960 --> 00:31:54,920 Speaker 2: a simple yes-no question of whether there are trapdoors or 494 00:31:55,000 --> 00:31:57,960 Speaker 2: whether it's a wooden horse, but rather the question is 495 00:31:58,000 --> 00:32:02,320 Speaker 2: how quickly can we switch to a transparent horse, an 496 00:32:02,480 --> 00:32:06,920 Speaker 2: architecture where people can, in linear time, not quadratic time, 497 00:32:07,480 --> 00:32:11,320 Speaker 2: see through exactly how it's updating its beliefs, its state. 498 00:32:11,680 --> 00:32:15,480 Speaker 2: And we're seeing some promising developments, like what's called the 499 00:32:15,520 --> 00:32:20,200 Speaker 2: Power Retention network by Manifest AI, among others, that allow 500 00:32:20,320 --> 00:32:24,720 Speaker 2: them to retrain, with just six thousand dollars, an existing 501 00:32:24,800 --> 00:32:28,040 Speaker 2: Transformer model that cost two hundred thousand dollars to train. 502 00:32:28,440 --> 00:32:33,000 Speaker 2: So with a transparency tax of around three percent, you 503 00:32:33,040 --> 00:32:36,920 Speaker 2: can get something that you can then interpret in linear 504 00:32:37,000 --> 00:32:40,000 Speaker 2: time much more easily and retain the same function, the 505 00:32:40,000 --> 00:32:43,440 Speaker 2: same horsepower, as the wooden horse. So the quicker we 506 00:32:43,560 --> 00:32:47,360 Speaker 2: switch to this kind of transparent horse architecture, the easier 507 00:32:47,560 --> 00:32:50,560 Speaker 2: it is to answer your question, which is whether it's too late. 508 00:32:50,760 --> 00:32:52,880 Speaker 2: But before that, we're kind of flying blind. 509 00:32:53,920 --> 00:32:56,200 Speaker 1: Do you believe that the transparent horse will be the 510 00:32:56,200 --> 00:32:57,640 Speaker 1: defining issue of our time?
511 00:32:58,080 --> 00:33:02,480 Speaker 2: Yes, it will, because without a transparent horse, all 512 00:33:02,520 --> 00:33:06,400 Speaker 2: of the generally freely available models, or the open weight 513 00:33:06,720 --> 00:33:11,880 Speaker 2: models, are probably Trojan horses in disguise. And if we 514 00:33:11,960 --> 00:33:17,360 Speaker 2: grow too dependent on those models, then we risk something 515 00:33:17,480 --> 00:33:22,000 Speaker 2: really real and really dangerous, which is called collective disempowerment. 516 00:33:22,600 --> 00:33:27,000 Speaker 2: People will feel, okay, this is literally a magic horse, 517 00:33:27,480 --> 00:33:32,080 Speaker 2: and we literally overload our cognitive system so much that 518 00:33:32,200 --> 00:33:36,280 Speaker 2: we have to think with those horses as our exocortex, 519 00:33:36,680 --> 00:33:40,600 Speaker 2: and before long we will feel that democracy is, you know, 520 00:33:40,760 --> 00:33:44,680 Speaker 2: too much work. Let's just delegate to our digital twins, 521 00:33:44,840 --> 00:33:49,320 Speaker 2: those horse twins, centaurs, to deliberate for us. And 522 00:33:49,360 --> 00:33:52,800 Speaker 2: that's exactly like sending our robots to the gym to 523 00:33:52,920 --> 00:33:56,440 Speaker 2: exercise for us. I'm sure the robots are impressive, but 524 00:33:56,600 --> 00:34:00,520 Speaker 2: our civic muscle will atrophy. So our work in 525 00:34:00,560 --> 00:34:04,160 Speaker 2: Oxford, which I call the Six-Pack of Care, is about 526 00:34:04,240 --> 00:34:08,160 Speaker 2: reclaiming the six-pack, the muscle of care, and it's also 527 00:34:08,360 --> 00:34:11,439 Speaker 2: portable, as in a six-pack, so that it can work 528 00:34:11,520 --> 00:34:15,319 Speaker 2: across all different cultures and jurisdictions.
But the point is 529 00:34:15,640 --> 00:34:19,600 Speaker 2: that if we do not do it gradually, gradual empowerment, 530 00:34:19,800 --> 00:34:23,919 Speaker 2: then there's no room, there's no cadence, to train those 531 00:34:23,920 --> 00:34:27,600 Speaker 2: civic muscles together, and there will be a profound loss 532 00:34:27,640 --> 00:34:31,880 Speaker 2: of meaning, and that leads to chaos and polarization, because 533 00:34:31,920 --> 00:34:35,840 Speaker 2: people do not feel then that they have their hands on the steering wheel. 534 00:34:35,920 --> 00:34:40,839 Speaker 2: Will they become like companion animals to the superintelligence? And 535 00:34:40,920 --> 00:34:44,520 Speaker 2: I believe we the people are the true superintelligence. 536 00:34:54,239 --> 00:34:56,040 Speaker 1: Audrey Tang, thank you for coming to TechStuff. 537 00:34:56,280 --> 00:34:57,640 Speaker 2: Thank you. Live long and prosper. 538 00:35:14,840 --> 00:35:17,640 Speaker 1: That's it for this week for TechStuff. I'm Oz Woloshyn. 539 00:35:18,320 --> 00:35:21,480 Speaker 1: This episode was produced by Eliza Dennis and Melissa Slaughter. 540 00:35:22,200 --> 00:35:25,440 Speaker 1: It was executive produced by me, Karen Price, Julian Nutter, 541 00:35:25,480 --> 00:35:30,000 Speaker 1: and Kate Osborne for Kaleidoscope, and Katrina Norvel for iHeart Podcasts. 542 00:35:30,760 --> 00:35:33,719 Speaker 1: Jack Insley mixed this episode and Kyle Murdoch wrote our 543 00:35:33,800 --> 00:35:36,480 Speaker 1: theme song. And a special thank you to the DLD 544 00:35:36,640 --> 00:35:40,600 Speaker 1: Conference and Marie Degenfeld Schunberg for helping set up this interview, 545 00:35:41,120 --> 00:35:44,160 Speaker 1: and to Koogle and Nier and Stephanie Bookhols for recording 546 00:35:44,200 --> 00:35:46,839 Speaker 1: it in Munich.
Join us on Friday for the Week 547 00:35:46,840 --> 00:35:49,200 Speaker 1: in Tech, when we'll run through the headlines you need 548 00:35:49,239 --> 00:35:52,440 Speaker 1: to follow and please rate, review, and reach out to 549 00:35:52,520 --> 00:35:55,400 Speaker 1: us at tech Stuff podcast at gmail dot com. We 550 00:35:55,480 --> 00:35:56,279 Speaker 1: love hearing from you.