1 00:00:13,640 --> 00:00:17,000 Speaker 1: From Kaleidoscope and iHeart podcasts. This is Tech Stuff. I'm 2 00:0017,000 --> 00:00:18,040 Speaker 1: Oz Woloshyn and. 3 00:00:18,079 --> 00:00:20,200 Speaker 2: I'm Dexter Thomas, filling in for Kara Price. 4 00:00:20,520 --> 00:00:24,200 Speaker 1: Today, we've got two big stories to dive into. First, 5 00:00:24,520 --> 00:00:27,640 Speaker 1: a growing movement in Europe to prepare for a world 6 00:00:27,960 --> 00:00:30,920 Speaker 1: without access to US technology. 7 00:00:30,640 --> 00:00:35,760 Speaker 2: And then, can art lovers really complain AI away? Gamers 8 00:00:35,800 --> 00:00:37,199 Speaker 2: are doing a pretty good job so far. 9 00:00:37,760 --> 00:00:45,160 Speaker 1: All of that on the Week in Tech. It's Friday, January thirtieth. Dexter, 10 00:00:45,240 --> 00:00:47,040 Speaker 1: welcome to Tech Stuff. Excited to have you back. 11 00:00:47,200 --> 00:00:48,879 Speaker 2: Hey, I'm happy to be back. 12 00:00:48,800 --> 00:00:51,519 Speaker 1: And congratulations on the good work over at Kill Switch. 13 00:00:51,880 --> 00:00:53,960 Speaker 1: One of my favorite Tech Stuff episodes of the last 14 00:00:54,040 --> 00:00:56,080 Speaker 1: year was you coming on to talk to us about the 15 00:00:56,120 --> 00:01:00,720 Speaker 1: history of Nintendo and the gangbusters launch of Nintendo's Switch. 16 00:01:00,880 --> 00:01:04,760 Speaker 2: Yeah, I feel like I'm quickly becoming the video game person, 17 00:01:04,800 --> 00:01:07,120 Speaker 2: even though I very rarely have enough time to play 18 00:01:07,200 --> 00:01:10,440 Speaker 2: video games in my personal life. But hey, listen, I'm 19 00:01:10,480 --> 00:01:12,520 Speaker 2: into it. I love talking about this stuff all day, 20 00:01:12,600 --> 00:01:15,399 Speaker 2: so I was very happy to be able to add 21 00:01:15,440 --> 00:01:17,640 Speaker 2: a little bit of Nintendo into the Tech Stuff universe. 22 00:01:17,959 --> 00:01:22,440 Speaker 1: While you've been following the world of video games without playing, 23 00:01:22,640 --> 00:01:29,080 Speaker 1: I've been following the world of Davos without going. So 24 00:01:29,480 --> 00:01:33,959 Speaker 1: President Trump dropped some bombshells, luckily not real ones, not 25 00:01:34,120 --> 00:01:37,560 Speaker 1: yet, on Europe, which we want to talk about. But 26 00:01:37,800 --> 00:01:42,600 Speaker 1: before we get there, actually, there was a thankfully very comedic, 27 00:01:43,000 --> 00:01:45,280 Speaker 1: real life bomb scare at Davos that I want to 28 00:01:45,280 --> 00:01:45,919 Speaker 1: tell you all about. 29 00:01:46,280 --> 00:01:49,200 Speaker 2: Please enlighten me. I've been too busy not playing video games. 30 00:01:49,400 --> 00:01:52,480 Speaker 1: So here's the story. A thirty one year old entrepreneur 31 00:01:52,560 --> 00:01:56,080 Speaker 1: called Sebastian was at Davos for the first time, and 32 00:01:56,120 --> 00:01:59,480 Speaker 1: he was at a hotel party hosted by a conference called DLD. 33 00:02:00,000 --> 00:02:04,760 Speaker 1: Sebastian was there, like many tech entrepreneurs, hoping to pitch investors, 34 00:02:05,280 --> 00:02:07,600 Speaker 1: and he had a prototype with him for his company 35 00:02:07,600 --> 00:02:11,520 Speaker 1: called Verdico, and the product is a verification device to 36 00:02:11,680 --> 00:02:15,480 Speaker 1: fraud-proof money transfers. So he decided to put it down 37 00:02:15,760 --> 00:02:19,360 Speaker 1: and go in search of a salmon roll.
Guess what 38 00:02:19,360 --> 00:02:20,679 Speaker 1: happened next? 39 00:02:20,840 --> 00:02:22,920 Speaker 2: Got to get that free sushi. Okay, listen, I've been there, 40 00:02:22,960 --> 00:02:23,799 Speaker 2: I get it. I get it. 41 00:02:25,480 --> 00:02:29,440 Speaker 1: So Sebastian gets back to pick up the Verdico. The 42 00:02:29,520 --> 00:02:34,120 Speaker 1: device is not there, but a cop is. Because here's 43 00:02:34,120 --> 00:02:37,160 Speaker 1: the thing. The device, as Sebastian describes it, is a 44 00:02:37,200 --> 00:02:40,919 Speaker 1: black cube with hot glue blobs and wires coming out 45 00:02:40,960 --> 00:02:41,520 Speaker 1: of the side. 46 00:02:42,040 --> 00:02:45,119 Speaker 2: Oh yeah, no, okay, I definitely see where this is going. 47 00:02:45,360 --> 00:02:51,560 Speaker 1: Yeah, so Sebastian has made more of a splash than 48 00:02:51,600 --> 00:02:54,320 Speaker 1: he planned. He actually gave a phone interview to Semafor 49 00:02:54,440 --> 00:02:57,320 Speaker 1: about the whole experience, which I think just bears laying 50 00:02:57,400 --> 00:03:00,320 Speaker 1: out because it really is comedy gold. So he has 51 00:03:00,440 --> 00:03:03,919 Speaker 1: a police officer waiting for him who speaks perfect English, 52 00:03:03,960 --> 00:03:06,840 Speaker 1: and the guy apparently brings out a fingerprint scanner and 53 00:03:06,840 --> 00:03:09,120 Speaker 1: says to Sebastian, I need to check if you're an 54 00:03:09,120 --> 00:03:14,560 Speaker 1: international spy. Sebastian gets taken to the local jail in 55 00:03:14,639 --> 00:03:17,639 Speaker 1: Davos and says, look, I understand, it's all a big misunderstanding, 56 00:03:17,639 --> 00:03:20,600 Speaker 1: but I'm an insomniac, I need my sleeping pills, my Lu 57 00:03:20,600 --> 00:03:23,800 Speaker 1: nesta. And the police officer said, well, if you're 58 00:03:23,840 --> 00:03:26,160 Speaker 1: a spy, those pills could be cyanide, so you can't 59 00:03:26,160 --> 00:03:27,640 Speaker 1: have them. 60 00:03:28,040 --> 00:03:31,440 Speaker 2: Wow. Oh, they're taking this seriously. Okay, they're going full 61 00:03:31,880 --> 00:03:33,760 Speaker 2: James Bond here. Okay, I like this. 62 00:03:34,400 --> 00:03:37,680 Speaker 1: And the next morning the Swiss police bring in a 63 00:03:37,720 --> 00:03:41,160 Speaker 1: tech expert to really sort of shake down Sebastian, see 64 00:03:41,520 --> 00:03:44,200 Speaker 1: if he is who he says he is. And the 65 00:03:44,200 --> 00:03:47,560 Speaker 1: tech expert says, look, Sebastian, please explain your tech. So 66 00:03:47,600 --> 00:03:49,560 Speaker 1: Sebastian does his best. As he puts it, I do 67 00:03:49,600 --> 00:03:52,160 Speaker 1: my investment pitch. They brought the machine, and the police 68 00:03:52,160 --> 00:03:54,720 Speaker 1: officer and Sebastian go through the code line by line. 69 00:03:55,000 --> 00:03:56,720 Speaker 1: And this is my favorite part of the whole story. 70 00:03:56,960 --> 00:04:00,000 Speaker 1: Sebastian says, I don't know the language that the code 71 00:04:00,040 --> 00:04:02,320 Speaker 1: was written in, because it was written by AI. 72 00:04:02,680 --> 00:04:04,920 Speaker 1: So Chris actually explained the code to me. 73 00:04:05,600 --> 00:04:07,680 Speaker 2: That's not great. That's not great. I mean, for all 74 00:04:07,680 --> 00:04:12,640 Speaker 2: this dude knows, it could be a bomb.
Right, exactly right, 75 00:04:12,720 --> 00:04:14,840 Speaker 2: he's got this innocent looking little brick, though he 76 00:04:14,880 --> 00:04:17,520 Speaker 2: thinks it is an innocent looking little brick. And how 77 00:04:17,520 --> 00:04:20,279 Speaker 2: do you know that it isn't? Oh my gosh. 78 00:04:19,640 --> 00:04:21,719 Speaker 1: He did say, to be fair to him, I think 79 00:04:21,760 --> 00:04:23,600 Speaker 1: I'm the idiot in this. Listen. 80 00:04:24,360 --> 00:04:25,960 Speaker 2: I didn't say it. I'm not gonna argue with the man. 81 00:04:26,839 --> 00:04:29,159 Speaker 2: I'm not gonna, who am I? Who am I to judge? 82 00:04:29,520 --> 00:04:33,040 Speaker 2: I'll let him judge himself. Okay, so this product, 83 00:04:33,400 --> 00:04:36,600 Speaker 2: and I'm just getting, you know, super tinfoil conspiracy hat 84 00:04:36,640 --> 00:04:39,240 Speaker 2: theory here, I didn't know what this was before you 85 00:04:39,360 --> 00:04:41,920 Speaker 2: just told me about it. Is this a publicity stunt? 86 00:04:42,080 --> 00:04:44,479 Speaker 1: Semafor did actually ask him, is this a publicity stunt? 87 00:04:44,480 --> 00:04:46,040 Speaker 1: And he said, look, I promise it's not. I wouldn't 88 00:04:46,040 --> 00:04:48,680 Speaker 1: spend eighteen hundred euros on my ticket to spend the 89 00:04:48,680 --> 00:04:52,080 Speaker 1: whole weekend in jail, which I kind of believed. 90 00:04:52,360 --> 00:04:55,520 Speaker 2: Okay, yeah, fair, fair, that'd be an expensive and 91 00:04:55,560 --> 00:04:59,360 Speaker 2: kind of risky publicity stunt. Okay, this kind of does 92 00:04:59,480 --> 00:05:02,560 Speaker 2: remind me of something that happened in the past, actually. 93 00:05:02,800 --> 00:05:06,359 Speaker 2: So it's March twenty ten, right, and somebody finds a 94 00:05:06,360 --> 00:05:09,120 Speaker 2: phone at a bar, and I think it's in Redwood City, 95 00:05:09,120 --> 00:05:11,800 Speaker 2: you know, up in the Bay Area. And this person 96 00:05:12,240 --> 00:05:15,479 Speaker 2: can't find the owner of the phone, so I think 97 00:05:15,480 --> 00:05:17,559 Speaker 2: he just takes it home, thinking, okay, well, maybe I'll 98 00:05:17,640 --> 00:05:19,320 Speaker 2: take it back the next day and maybe I'll 99 00:05:19,320 --> 00:05:20,920 Speaker 2: be able to connect with whoever the owner is the 100 00:05:20,960 --> 00:05:23,359 Speaker 2: next day. Fine. And then he looks at the phone 101 00:05:23,800 --> 00:05:27,480 Speaker 2: and it's plasticky, the outer shell is plasticky, 102 00:05:27,760 --> 00:05:30,080 Speaker 2: and, you know, this person starts playing around with it 103 00:05:30,400 --> 00:05:32,280 Speaker 2: and tries to open it up, see, maybe I can 104 00:05:32,279 --> 00:05:34,479 Speaker 2: figure out whose it is, maybe there's info on there. 105 00:05:35,000 --> 00:05:38,240 Speaker 2: But the system keeps crashing every time he tries to 106 00:05:38,240 --> 00:05:41,479 Speaker 2: look at it. And there's these weird barcodes on the 107 00:05:41,520 --> 00:05:44,359 Speaker 2: back of the phone, and so he starts pulling at 108 00:05:44,480 --> 00:05:47,240 Speaker 2: the phone casing and it comes off, and it's this 109 00:05:47,320 --> 00:05:51,120 Speaker 2: plastic shell, and there's a front facing camera on the phone. 110 00:05:51,160 --> 00:05:55,400 Speaker 2: Now this is March twenty ten. This is the iPhone four.
111 00:05:56,520 --> 00:06:00,599 Speaker 2: He's found a prototype for the not yet released iPhone 112 00:06:00,640 --> 00:06:05,640 Speaker 2: four at a bar. Yeah, so Gizmodo, the tech website, 113 00:06:06,240 --> 00:06:08,320 Speaker 2: posts an article about this, and it's a 114 00:06:08,360 --> 00:06:11,599 Speaker 2: full leak about the iPhone four. And so they post 115 00:06:11,800 --> 00:06:15,960 Speaker 2: the specs, they disassemble it, take a look inside, take 116 00:06:16,040 --> 00:06:18,800 Speaker 2: pictures of the inside. And it turns out, yeah, some 117 00:06:19,240 --> 00:06:23,159 Speaker 2: engineer who was testing it forgot the phone at the bar, 118 00:06:23,360 --> 00:06:26,000 Speaker 2: just like left it on the bar stool, went home, 119 00:06:26,680 --> 00:06:31,000 Speaker 2: and some random person found it. And from what I read, 120 00:06:31,480 --> 00:06:36,720 Speaker 2: that unnamed engineer was very embarrassed, but at the end 121 00:06:36,720 --> 00:06:37,880 Speaker 2: of all that still had a job. 122 00:06:38,240 --> 00:06:40,000 Speaker 1: I can imagine that that can't have been the most 123 00:06:40,000 --> 00:06:41,120 Speaker 1: fun conversation. 124 00:06:41,960 --> 00:06:44,720 Speaker 2: No, no, I don't think, I don't think anybody turned up to. 125 00:06:44,640 --> 00:06:47,560 Speaker 1: Work like it was a ceremony, and your boss's boss's boss's boss, like, 126 00:06:48,720 --> 00:06:50,880 Speaker 1: Steve, is freaking out. 127 00:06:51,720 --> 00:06:55,919 Speaker 2: That is... yeah. You know, leaving the prototype on a 128 00:06:55,960 --> 00:06:58,520 Speaker 2: barstool is not how you want the world to find 129 00:06:58,520 --> 00:06:59,760 Speaker 2: out about your next product. 130 00:07:00,360 --> 00:07:03,520 Speaker 1: Well, Sebastian, if you're listening, you know, hopefully for 131 00:07:03,640 --> 00:07:06,720 Speaker 1: you, your product has the same effect as 132 00:07:06,720 --> 00:07:11,120 Speaker 1: the iPhone four had on the world. Good luck. Coming back 133 00:07:11,120 --> 00:07:15,880 Speaker 1: to the subject of Davos. So there's a story in 134 00:07:15,920 --> 00:07:19,240 Speaker 1: the Wall Street Journal which had the headline Europe Prepares 135 00:07:19,480 --> 00:07:23,600 Speaker 1: for a Nightmare Scenario: the US Blocking Access to Tech, 136 00:07:23,720 --> 00:07:25,000 Speaker 1: which really caught my eye. 137 00:07:25,360 --> 00:07:26,520 Speaker 2: Yeah, what's going on here? 138 00:07:26,760 --> 00:07:30,400 Speaker 1: So I would say, since February last year, Vice President 139 00:07:30,520 --> 00:07:33,520 Speaker 1: Vance went to the Munich Security Conference and basically said, 140 00:07:33,960 --> 00:07:36,320 Speaker 1: you guys are on your own. You know, it's a new 141 00:07:36,360 --> 00:07:38,560 Speaker 1: world order and it's going to be very, very different. 142 00:07:39,000 --> 00:07:42,040 Speaker 1: So nine months later, you know, the Greenland threats 143 00:07:42,200 --> 00:07:45,160 Speaker 1: ratcheted up, and if you remember when Trump went to Davos, 144 00:07:45,160 --> 00:07:49,120 Speaker 1: it still wasn't clear if he was considering actually occupying Greenland. 145 00:07:49,280 --> 00:07:53,280 Speaker 1: He walked away from that actually over Davos.
But nonetheless, 146 00:07:53,280 --> 00:07:55,840 Speaker 1: I think the whole of Europe has woken up belatedly 147 00:07:55,920 --> 00:07:58,600 Speaker 1: to the idea that it's a very different relationship with 148 00:07:58,600 --> 00:08:01,720 Speaker 1: our friends across the Atlantic. And so last week the 149 00:08:01,760 --> 00:08:07,840 Speaker 1: European Parliament passed this law about technological sovereignty favoring European 150 00:08:07,880 --> 00:08:11,960 Speaker 1: tech products, and new legislation to promote European cloud providers. 151 00:08:12,760 --> 00:08:18,840 Speaker 1: This sounds super dry and boring, and also futile, because 152 00:08:18,920 --> 00:08:21,680 Speaker 1: it's really, really, really hard to get off the US 153 00:08:21,840 --> 00:08:27,200 Speaker 1: tech infrastructure if you're Europe, obviously Google, Amazon, Meta, etc. 154 00:08:28,480 --> 00:08:31,640 Speaker 1: But I just thought it was pretty interesting because, you know, 155 00:08:32,160 --> 00:08:35,440 Speaker 1: sometimes you see things and it's the beginning of something. 156 00:08:35,679 --> 00:08:41,800 Speaker 1: But if Europe is really serious about weaning itself off 157 00:08:42,320 --> 00:08:47,280 Speaker 1: relying on US tech platforms and products, we could be 158 00:08:47,640 --> 00:08:50,800 Speaker 1: kind of entering into a very different world. 159 00:08:51,280 --> 00:08:53,600 Speaker 2: Yeah, and I think you're totally right. If you remember, 160 00:08:53,760 --> 00:08:56,440 Speaker 2: just recently there was a whole bunch of sites that 161 00:08:56,600 --> 00:08:59,480 Speaker 2: just went down. You remember this, this was the tail end 162 00:08:59,520 --> 00:09:03,120 Speaker 2: of twenty twenty five, and this was the Amazon Web 163 00:09:03,160 --> 00:09:07,280 Speaker 2: Services, right, the service going down, right. And this didn't just 164 00:09:07,440 --> 00:09:10,520 Speaker 2: affect, you know, whether or not you can get, you know, 165 00:09:10,600 --> 00:09:13,160 Speaker 2: your pizza in two hours from, you know, Jeff Bezos 166 00:09:13,200 --> 00:09:15,240 Speaker 2: dot com or whatever. That's not what this is about. 167 00:09:15,360 --> 00:09:19,800 Speaker 2: This is affecting everything. So this affected the trains in 168 00:09:19,840 --> 00:09:23,520 Speaker 2: the Netherlands, and systems went down, trains were running late, 169 00:09:23,679 --> 00:09:26,040 Speaker 2: people couldn't get access to services, you know. And I 170 00:09:26,160 --> 00:09:30,080 Speaker 2: spoke to somebody who was referencing this: imagine if 171 00:09:31,280 --> 00:09:33,960 Speaker 2: you need to go to the hospital because you're about 172 00:09:34,000 --> 00:09:37,000 Speaker 2: to deliver your baby, and something's up with the hospital 173 00:09:37,040 --> 00:09:41,440 Speaker 2: systems because those run on Amazon Web Services and they're 174 00:09:41,440 --> 00:09:46,480 Speaker 2: connected to some server in Pennsylvania. Why are you worried 175 00:09:46,480 --> 00:09:50,080 Speaker 2: about what's happening in Pennsylvania when you're in Europe? But 176 00:09:50,679 --> 00:09:52,280 Speaker 2: this is something that I think a lot of people 177 00:09:52,320 --> 00:09:54,560 Speaker 2: around the world are waking up to, is that they're 178 00:09:54,559 --> 00:09:58,560 Speaker 2: depending on, a, US based companies, but also 179 00:09:59,559 --> 00:10:02,640 Speaker 2: just three companies.
The vast majority of the Internet runs 180 00:10:02,640 --> 00:10:05,120 Speaker 2: on three US based companies. 181 00:10:05,080 --> 00:10:07,480 Speaker 1: And it's one thing, I think, if there's like a 182 00:10:07,520 --> 00:10:10,960 Speaker 1: glitch and Amazon Web Services goes down, and obviously some 183 00:10:11,000 --> 00:10:13,559 Speaker 1: people have horrible real world consequences, but people are, I think, 184 00:10:14,240 --> 00:10:18,400 Speaker 1: willing to accept some kind of, like, random systemic risk 185 00:10:18,559 --> 00:10:21,200 Speaker 1: in the technology they use. But the idea that the 186 00:10:21,200 --> 00:10:23,520 Speaker 1: President of the United States could turn around and say, well, 187 00:10:23,960 --> 00:10:25,560 Speaker 1: if you don't do what we like, we're going to 188 00:10:25,600 --> 00:10:27,960 Speaker 1: turn your Internet off or turn your cloud computing off, 189 00:10:28,280 --> 00:10:29,400 Speaker 1: it just shows 190 00:10:29,240 --> 00:10:32,800 Speaker 1: how vulnerable Europe in particular has become. Obviously a different 191 00:10:32,800 --> 00:10:38,400 Speaker 1: story in China compared to our, you know, assumed benevolent cousins 192 00:10:38,440 --> 00:10:42,120 Speaker 1: across the Atlantic. Actually, did you see the Prime Minister of 193 00:10:42,120 --> 00:10:45,800 Speaker 1: Canada Mark Carney's remarks at Davos, Dexter? You didn't, no? 194 00:10:46,320 --> 00:10:48,480 Speaker 1: So I think maybe we could play the tape. 195 00:10:49,360 --> 00:10:52,560 Speaker 3: We knew the story of the international rules based order 196 00:10:52,720 --> 00:10:57,040 Speaker 3: was partially false, that the strongest would exempt themselves when convenient, 197 00:10:57,480 --> 00:11:00,319 Speaker 3: and we knew that international law applied with varying rigor 198 00:11:00,360 --> 00:11:03,520 Speaker 3: depending on the identity of the accused or the victim. 199 00:11:03,640 --> 00:11:08,160 Speaker 3: This fiction was useful, and American hegemony in particular helped 200 00:11:08,240 --> 00:11:12,880 Speaker 3: provide public goods, open sea lanes, a stable financial system, 201 00:11:13,120 --> 00:11:17,079 Speaker 3: collective security, and support for frameworks for resolving disputes. 202 00:11:18,200 --> 00:11:19,679 Speaker 3: We participated in 203 00:11:19,640 --> 00:11:24,240 Speaker 3: the rituals, and we largely avoided calling out the gaps 204 00:11:24,320 --> 00:11:25,720 Speaker 3: between rhetoric and reality. 205 00:11:27,240 --> 00:11:31,840 Speaker 2: "This bargain no longer works." Shout out to my man 206 00:11:31,960 --> 00:11:36,920 Speaker 2: for using the phrase hegemony. We don't use that word enough. 207 00:11:37,600 --> 00:11:40,000 Speaker 1: I like it. I didn't even know the word hegemony. 208 00:11:40,240 --> 00:11:43,960 Speaker 2: Oh, hegemony, hegemony, hege-money. You spend enough time in grad 209 00:11:43,960 --> 00:11:46,440 Speaker 2: school and you get that word thrown around at parties. Man, 210 00:11:47,880 --> 00:11:51,800 Speaker 2: that's an important word. Yeah. You know, what he's, what 211 00:11:51,840 --> 00:11:57,679 Speaker 2: he's talking about there is, bluntly speaking, convenience. Yeah, right? 212 00:11:58,040 --> 00:12:02,240 Speaker 2: It's, things aren't perfect, I don't like what the United States 213 00:12:02,280 --> 00:12:05,240 Speaker 2: is doing all the time, but things are working well enough.
214 00:12:05,760 --> 00:12:07,559 Speaker 2: At a certain point that breaks down. And I think, 215 00:12:07,880 --> 00:12:09,200 Speaker 2: you know, this is something that I think a lot 216 00:12:09,280 --> 00:12:14,439 Speaker 2: of just regular consumers are also thinking about recently, is, well, 217 00:12:14,480 --> 00:12:18,120 Speaker 2: you know, my devices do what I want, I don't 218 00:12:18,520 --> 00:12:21,680 Speaker 2: love that I'm kind of being forced into upgrading every year, 219 00:12:21,840 --> 00:12:24,679 Speaker 2: but okay. But some people are starting to push back 220 00:12:24,679 --> 00:12:28,320 Speaker 2: against that. The convergence, again, of geopolitical stuff and, I 221 00:12:28,320 --> 00:12:31,959 Speaker 2: think, what everyday consumers are thinking is pretty interesting here. 222 00:12:32,200 --> 00:12:33,880 Speaker 1: I think you make a really good point, the connection 223 00:12:33,920 --> 00:12:37,280 Speaker 1: between those two things. Because this concept of digital serfdom, 224 00:12:37,400 --> 00:12:39,440 Speaker 1: right, where we, like, live on these tech platforms and 225 00:12:39,520 --> 00:12:42,240 Speaker 1: we, like, spend time on them, and then our time 226 00:12:42,360 --> 00:12:45,280 Speaker 1: is monetized and we basically labor for the platforms, and 227 00:12:45,320 --> 00:12:48,400 Speaker 1: there's like a surplus value which is captured by the platforms. 228 00:12:48,720 --> 00:12:51,360 Speaker 1: The President of France kind of adopted this language and 229 00:12:51,440 --> 00:12:55,400 Speaker 1: said this is the refusal of being a vassal. So 230 00:12:55,679 --> 00:12:57,840 Speaker 1: it's interesting, all these academic kind of ideas about how 231 00:12:57,840 --> 00:13:01,960 Speaker 1: technology works actually playing out in global relations, geopolitical relations, 232 00:13:02,040 --> 00:13:04,360 Speaker 1: right now. The issue, of course, is just as it's 233 00:13:04,480 --> 00:13:06,440 Speaker 1: very, very hard, as we all know, not to use 234 00:13:06,559 --> 00:13:10,120 Speaker 1: Gmail or iPhone or Instagram, it's also very hard if 235 00:13:10,160 --> 00:13:13,920 Speaker 1: you're Europe not to use Google Cloud or Amazon Cloud, 236 00:13:14,040 --> 00:13:17,080 Speaker 1: et cetera. There have been some attempts, kind of before 237 00:13:17,120 --> 00:13:19,559 Speaker 1: it became a political crisis in the last couple of years. 238 00:13:19,559 --> 00:13:22,840 Speaker 1: So these US companies like Google and Microsoft and Amazon 239 00:13:23,240 --> 00:13:26,719 Speaker 1: have these kind of European subsidiaries that are managed by 240 00:13:26,760 --> 00:13:32,280 Speaker 1: local European boards, and theoretically the data is privately stored 241 00:13:32,360 --> 00:13:34,960 Speaker 1: in Europe so that the NSA and others can't, you know, 242 00:13:35,000 --> 00:13:39,520 Speaker 1: look at it. But these fictions, to Carney's point, are, 243 00:13:40,880 --> 00:13:44,079 Speaker 1: you know, under scrutiny, appearing more and more like what they are, 244 00:13:44,120 --> 00:13:46,240 Speaker 1: and it makes me think a bit about this.
Just 245 00:13:46,280 --> 00:13:49,640 Speaker 1: the difficulty of disintermediating makes me think about the TikTok story, 246 00:13:49,679 --> 00:13:53,240 Speaker 1: where there's always political pressure to, you know, take TikTok 247 00:13:53,280 --> 00:13:57,079 Speaker 1: away from China to protect, you know, American national security 248 00:13:57,120 --> 00:13:59,240 Speaker 1: and make sure that, you know, the youth isn't being influenced. 249 00:13:59,280 --> 00:14:01,920 Speaker 1: And now the deal's kind of been struck, but 250 00:14:02,120 --> 00:14:04,560 Speaker 1: the Chinese are still involved, and there are kind of, 251 00:14:04,880 --> 00:14:08,480 Speaker 1: you know, emerging rumors that actually the US government is 252 00:14:08,480 --> 00:14:10,920 Speaker 1: now directly controlling the flow of information in a way 253 00:14:10,960 --> 00:14:13,440 Speaker 1: that the Chinese government never did in the US. 254 00:14:14,080 --> 00:14:17,200 Speaker 2: Yeah, this is so interesting, right? Because, you know, 255 00:14:17,280 --> 00:14:24,440 Speaker 2: on paper, the reason for divesting TikTok from China 256 00:14:24,880 --> 00:14:28,760 Speaker 2: was because of security issues. Now, politicians would come 257 00:14:28,840 --> 00:14:31,840 Speaker 2: right out and say that, oh, this is because 258 00:14:31,880 --> 00:14:34,560 Speaker 2: of propaganda, we don't want, you know, the Chinese propagandizing 259 00:14:34,560 --> 00:14:38,640 Speaker 2: our youth. The thing is, propaganda's not illegal. There's actually 260 00:14:38,640 --> 00:14:42,240 Speaker 2: nothing wrong with that, legally speaking. China can say whatever 261 00:14:42,240 --> 00:14:44,600 Speaker 2: they want to the youth or the elderly or the 262 00:14:44,640 --> 00:14:49,560 Speaker 2: whatever of America. There's nothing, actually, politicians shouldn't, in a 263 00:14:49,640 --> 00:14:53,240 Speaker 2: legal sense, really be stepping in there. They can dislike it, 264 00:14:53,600 --> 00:14:56,360 Speaker 2: but you start to run into, you know, First Amendment, 265 00:14:56,400 --> 00:14:58,000 Speaker 2: you start to run into free speech there. Right. 266 00:14:58,600 --> 00:15:04,560 Speaker 2: But now I think we're in a very interesting situation where, 267 00:15:04,840 --> 00:15:07,520 Speaker 2: you know, the kids on TikTok, these kids won't get 268 00:15:07,520 --> 00:15:10,920 Speaker 2: off their phones, right, they're being exposed to the idea 269 00:15:10,960 --> 00:15:13,360 Speaker 2: that, wait a second, hold on, I thought y'all were 270 00:15:13,360 --> 00:15:19,000 Speaker 2: telling me that China was doing all the censorship. Again, 271 00:15:19,240 --> 00:15:21,600 Speaker 2: there's been an outage, but you've got a lot of 272 00:15:21,720 --> 00:15:28,880 Speaker 2: people who were maybe prematurely thinking that something is being censored, 273 00:15:29,000 --> 00:15:31,720 Speaker 2: but maybe not prematurely when it comes 274 00:15:32,240 --> 00:15:35,760 Speaker 2: to ICE videos, right? Correct, right, right. And so I 275 00:15:35,800 --> 00:15:40,520 Speaker 2: think we're in a really interesting situation, because, again, I think 276 00:15:40,560 --> 00:15:46,800 Speaker 2: Americans aren't used to comparing the actions of their government 277 00:15:46,840 --> 00:15:49,240 Speaker 2: to stuff they've read about in, like, history books. 278 00:15:49,560 --> 00:15:52,720 Speaker 1: Well, you referenced grad school conversations.
Obviously up there with 279 00:15:52,760 --> 00:15:55,360 Speaker 1: hegemony was the Monroe Doctrine, or now, as we know it, 280 00:15:55,520 --> 00:15:56,720 Speaker 1: the Donroe Doctrine. 281 00:15:57,160 --> 00:16:01,800 Speaker 2: Right. Yeah, there's some, like, high school sophomore saying, whoa, whoa, 282 00:16:01,800 --> 00:16:05,040 Speaker 2: whoa, hold on, that was on my test last week. 283 00:16:05,040 --> 00:16:07,600 Speaker 2: Hold on, you're telling me this stuff, I need 284 00:16:07,600 --> 00:16:09,080 Speaker 2: this in the real world? Really? Wow. 285 00:16:09,200 --> 00:16:12,840 Speaker 1: Okay. So the Europeans, on the one hand, appear to 286 00:16:12,840 --> 00:16:15,440 Speaker 1: have no leverage. On the other hand, they kind of do. 287 00:16:15,640 --> 00:16:15,800 Speaker 2: Right. 288 00:16:15,880 --> 00:16:21,720 Speaker 1: Europe owns two trillion dollars of US treasuries, and the 289 00:16:21,800 --> 00:16:26,520 Speaker 1: US's main exports are what are called digitally deliverable services, 290 00:16:26,640 --> 00:16:30,640 Speaker 1: which is basically AI, advertising, et cetera, and the US 291 00:16:30,720 --> 00:16:33,960 Speaker 1: exports three hundred and sixty billion dollars a year of 292 00:16:34,040 --> 00:16:38,479 Speaker 1: digitally deliverable services to Europe, which is really pretty significant. 293 00:16:38,520 --> 00:16:41,040 Speaker 1: So could Europe stop buying? What would it take for 294 00:16:41,040 --> 00:16:43,560 Speaker 1: them to stop buying? They would have to develop their 295 00:16:43,560 --> 00:16:46,560 Speaker 1: own cloud computing infrastructure. They would have to develop their 296 00:16:46,600 --> 00:16:49,240 Speaker 1: own AI models. They have Mistral, the French company, which is 297 00:16:49,280 --> 00:16:51,480 Speaker 1: kind of the market leader, but the 298 00:16:51,520 --> 00:16:54,240 Speaker 1: journey from here to there, in terms of 299 00:16:54,280 --> 00:16:56,560 Speaker 1: actually being able to take advantage of this leverage, is 300 00:16:56,680 --> 00:16:58,880 Speaker 1: really, really long. And I think it's the same in 301 00:16:58,960 --> 00:17:01,440 Speaker 1: terms of, you know, as we think about individuals, it's 302 00:17:01,440 --> 00:17:05,040 Speaker 1: like, it's very hard to have enough cohesion as a 303 00:17:05,040 --> 00:17:15,800 Speaker 1: group to actually manifest political power. After the break, will 304 00:17:15,840 --> 00:17:19,840 Speaker 1: we tolerate AI in our houses and in the arts? 305 00:17:20,320 --> 00:17:32,240 Speaker 1: Stay with us. So, Dexter, welcome back. You very kindly 306 00:17:32,280 --> 00:17:39,399 Speaker 1: stuck with me through my European tech sovereignty story. Normally, 307 00:17:39,440 --> 00:17:40,920 Speaker 1: the way Kara and I do this is we kind of 308 00:17:40,920 --> 00:17:43,720 Speaker 1: bring each other one or two stories and kind 309 00:17:43,720 --> 00:17:47,040 Speaker 1: of share why they've grabbed us. So I'm curious what 310 00:17:47,080 --> 00:17:48,640 Speaker 1: you've brought this week and why. 311 00:17:49,040 --> 00:17:53,320 Speaker 2: Yeah.
Well, I don't know about you, but everywhere, in 312 00:17:53,359 --> 00:17:57,520 Speaker 2: real life and online, for me, my social circle is full 313 00:17:57,600 --> 00:18:03,400 Speaker 2: of people fighting about AI and art. And they're sick 314 00:18:03,440 --> 00:18:06,920 Speaker 2: of seeing, they're sick of seeing AI slop on Instagram, they're 315 00:18:07,040 --> 00:18:11,560 Speaker 2: sick of seeing AI slop on TikTok. They're sick of being 316 00:18:11,560 --> 00:18:16,240 Speaker 2: asked to buy refrigerators that talk to them, whatever. Right? 317 00:18:17,040 --> 00:18:20,000 Speaker 2: But a lot of the argument that I'm seeing about 318 00:18:20,040 --> 00:18:23,440 Speaker 2: AI and art is, you know, on streaming services, things 319 00:18:23,440 --> 00:18:25,560 Speaker 2: that you subscribe to, and then AI is kind of 320 00:18:25,640 --> 00:18:28,120 Speaker 2: slipping into it and seeping into it in a way 321 00:18:28,119 --> 00:18:28,480 Speaker 2: that they don't want. 322 00:18:28,359 --> 00:18:30,000 Speaker 1: Spotify, famously. 323 00:18:30,119 --> 00:18:35,320 Speaker 2: Right, right. But let's reverse that. Would you buy, 324 00:18:35,680 --> 00:18:38,920 Speaker 2: on purpose, would you buy AI art and, say, hang it 325 00:18:39,000 --> 00:18:39,600 Speaker 2: up in your house? 326 00:18:39,920 --> 00:18:42,199 Speaker 1: Absolutely not. That's an easy one for me. 327 00:18:42,760 --> 00:18:45,600 Speaker 2: Well, there is a company that is hoping you will 328 00:18:45,680 --> 00:18:49,320 Speaker 2: change your mind pretty soon, because they're selling an AI 329 00:18:49,520 --> 00:18:53,200 Speaker 2: art frame. Basically, it's a standard frame. It's 330 00:18:53,200 --> 00:18:55,040 Speaker 2: like, you know, one of those digital picture frames 331 00:18:55,040 --> 00:18:57,159 Speaker 2: that you can get. It's connected to an app on 332 00:18:57,200 --> 00:19:00,800 Speaker 2: your phone and you can put different images on it, 333 00:19:01,400 --> 00:19:05,200 Speaker 2: or there's a feature where you can have it generate 334 00:19:05,320 --> 00:19:08,200 Speaker 2: an AI generated image for the frame. 335 00:19:08,800 --> 00:19:11,800 Speaker 1: So you can basically say, you know, make me, today 336 00:19:11,800 --> 00:19:15,119 Speaker 1: I want a Monet's water lilies, but with me in 337 00:19:15,200 --> 00:19:17,040 Speaker 1: it, and tomorrow I want whatever 338 00:19:16,720 --> 00:19:21,160 Speaker 2: else. Precisely, exactly that. So I think it uses Google's 339 00:19:21,240 --> 00:19:24,000 Speaker 2: Nano Banana, you know, one of the image generation services, 340 00:19:24,480 --> 00:19:28,119 Speaker 2: and it uses E Ink, basically like a Kindle. 341 00:19:28,560 --> 00:19:31,760 Speaker 2: But here's the thing: it's color E Ink. The technology 342 00:19:31,800 --> 00:19:34,520 Speaker 2: isn't quite there yet, so it's not very sharp, and 343 00:19:34,560 --> 00:19:37,080 Speaker 2: so they even write on their site that it's best 344 00:19:37,160 --> 00:19:43,119 Speaker 2: viewed from a distance, which is, I hope you have a big house. Yeah, 345 00:19:43,160 --> 00:19:45,680 Speaker 2: which is just, I think, a really funny way 346 00:19:45,680 --> 00:19:47,639 Speaker 2: of saying, like, look, man, this actually isn't gonna look 347 00:19:47,760 --> 00:19:51,480 Speaker 2: very good. So, you interested? Are you sold yet? 348 00:19:51,560 --> 00:19:53,560 Speaker 1: But yeah, I'm curious about this.
I mean, I'm not, 349 00:19:53,680 --> 00:19:56,000 Speaker 1: I'm not, I'm not grabbed by it, but I do 350 00:19:56,040 --> 00:19:59,080 Speaker 1: feel like there's so much investment and excitement in what's 351 00:19:59,080 --> 00:20:01,560 Speaker 1: going to be the next generation of AI enabled consumer 352 00:20:01,600 --> 00:20:05,000 Speaker 1: products that, like, maybe I'm a Luddite on this one. 353 00:20:05,320 --> 00:20:08,720 Speaker 2: Well, you know, speaking of investment, I hope you're ready 354 00:20:08,800 --> 00:20:11,240 Speaker 2: to make an investment here. The thirty one inch frame 355 00:20:11,840 --> 00:20:15,600 Speaker 2: is one thousand two hundred ninety nine dollars and ninety nine cents. Wow, 356 00:20:16,119 --> 00:20:18,840 Speaker 2: or you can get a four pack for, are 357 00:20:18,880 --> 00:20:22,240 Speaker 2: you ready, five thousand one hundred and ninety nine dollars 358 00:20:22,240 --> 00:20:23,520 Speaker 2: and ninety six cents. 359 00:20:23,640 --> 00:20:25,800 Speaker 1: Is there a slim discount for buying in bulk? 360 00:20:26,480 --> 00:20:32,159 Speaker 2: None. Actually, there's no discount. I actually had to 361 00:20:32,200 --> 00:20:34,000 Speaker 2: do the math again. I had to type it in. 362 00:20:34,040 --> 00:20:36,119 Speaker 2: It was a whole, wait a second, is there any discount? No, 363 00:20:36,160 --> 00:20:39,120 Speaker 2: it's just four times the price. That's it. 364 00:20:40,400 --> 00:20:42,359 Speaker 1: How do these companies, how do they afford to go 365 00:20:42,400 --> 00:20:44,399 Speaker 1: into production? Is it like a Kickstarter, or is this 366 00:20:44,400 --> 00:20:44,960 Speaker 1: a real thing? 367 00:20:45,280 --> 00:20:48,800 Speaker 2: The company's called SwitchBot, and you might be familiar with 368 00:20:48,840 --> 00:20:53,520 Speaker 2: them because they're mostly known for selling, like, automatic locks 369 00:20:54,000 --> 00:20:56,640 Speaker 2: for your doors, or surveillance cameras. 370 00:20:56,760 --> 00:20:58,960 Speaker 1: Are you intrigued yourself, Dexter, about buying one? 371 00:20:59,440 --> 00:21:03,080 Speaker 2: I am not going to buy this, but, you know, 372 00:21:03,960 --> 00:21:08,080 Speaker 2: there certainly are people who are very excited about this. 373 00:21:08,119 --> 00:21:10,439 Speaker 2: I've seen YouTube videos of people who are 374 00:21:10,480 --> 00:21:13,360 Speaker 2: really excited about it. And, you know, it just makes 375 00:21:13,400 --> 00:21:17,920 Speaker 2: me think a lot about this conversation that we keep 376 00:21:17,960 --> 00:21:21,600 Speaker 2: seeing about AI. And there's people who are very, very 377 00:21:21,680 --> 00:21:26,199 Speaker 2: much against AI in any of the arts, right? But 378 00:21:26,200 --> 00:21:28,359 Speaker 2: then there's always somebody who is going to buy it, 379 00:21:29,000 --> 00:21:29,280 Speaker 2: you know. 380 00:21:29,560 --> 00:21:32,080 Speaker 1: I mean, I think it's like mission creep, right? It's 381 00:21:32,119 --> 00:21:35,679 Speaker 1: like you kind of accept, it's a little bit 382 00:21:35,680 --> 00:21:37,359 Speaker 1: like the frog in the pot, you accept more and 383 00:21:37,440 --> 00:21:41,359 Speaker 1: more AI in your art.
Hopefully there'll come a 384 00:21:41,400 --> 00:21:43,600 Speaker 1: point, like a tipping point, where there's too 385 00:21:43,640 --> 00:21:46,840 Speaker 1: much AI and people go back to respecting and wishing to engage with 386 00:21:47,080 --> 00:21:52,160 Speaker 1: human creativity. I'm somewhat heartened, as a sometime content creator, 387 00:21:52,240 --> 00:21:55,320 Speaker 1: that, like, the average human being still apparently prefers to 388 00:21:55,359 --> 00:21:58,520 Speaker 1: engage with human generated content, but I am stressed that 389 00:21:58,520 --> 00:22:00,639 Speaker 1: that may not last forever. 390 00:22:01,680 --> 00:22:03,480 Speaker 2: This is something I've been thinking about a lot. So 391 00:22:04,400 --> 00:22:07,080 Speaker 2: I listen to a lot of music on YouTube, actually. 392 00:22:07,119 --> 00:22:09,600 Speaker 2: That's mostly where I listen to music, just because, you know, 393 00:22:09,600 --> 00:22:12,399 Speaker 2: there's weird remixes and stuff that just doesn't exist 394 00:22:12,400 --> 00:22:15,320 Speaker 2: on other platforms, right? Things that maybe legally shouldn't be 395 00:22:15,440 --> 00:22:17,520 Speaker 2: uploaded there, but it's there, so, you know, hey, I'm 396 00:22:17,560 --> 00:22:23,080 Speaker 2: not complaining. But one thing, a trend, that 397 00:22:23,119 --> 00:22:25,560 Speaker 2: I've noticed, and actually we did a Kill Switch episode 398 00:22:25,560 --> 00:22:27,840 Speaker 2: on this. Are you familiar with lo-fi hip hop? 399 00:22:28,720 --> 00:22:30,600 Speaker 1: Vaguely, vaguely, but not really. 400 00:22:31,200 --> 00:22:35,240 Speaker 2: Yeah. So basically, just imagine very chill background music, the kind 401 00:22:35,240 --> 00:22:37,440 Speaker 2: of, you know, just instrumental type stuff that will be 402 00:22:37,520 --> 00:22:39,520 Speaker 2: playing in the back of a, you know, kind of 403 00:22:39,560 --> 00:22:44,600 Speaker 2: trendy coffee shop, right? Right. So this was a trend, 404 00:22:44,720 --> 00:22:47,120 Speaker 2: you know, kind of big in the late twenty tens, and 405 00:22:48,040 --> 00:22:52,320 Speaker 2: people started generating it with AI. And in the lo-fi 406 00:22:52,359 --> 00:22:54,320 Speaker 2: hip hop scene, there's a scene of beat 407 00:22:54,320 --> 00:22:56,560 Speaker 2: makers who really love this stuff and think they've 408 00:22:56,640 --> 00:22:59,440 Speaker 2: found a community in it. There's people who've just 409 00:22:59,520 --> 00:23:02,720 Speaker 2: stopped making the music because they feel like a 410 00:23:02,720 --> 00:23:05,640 Speaker 2: lot of the listeners just can't tell the difference between 411 00:23:05,920 --> 00:23:10,040 Speaker 2: AI and the real thing, or don't care. And that's 412 00:23:10,080 --> 00:23:13,200 Speaker 2: what's really interesting to me. I'll just put it this way. 413 00:23:14,400 --> 00:23:17,040 Speaker 2: My thing is music. Like, I really, really, really care 414 00:23:17,040 --> 00:23:22,800 Speaker 2: about music. But, you know, an AI generated image, personally, 415 00:23:22,840 --> 00:23:24,720 Speaker 2: for me, if I'm going to be very honest, it 416 00:23:24,760 --> 00:23:29,640 Speaker 2: doesn't bother me as much as music does.
And what 417 00:23:29,680 --> 00:23:32,720 Speaker 2: I worry about is that everybody's got their own thing 418 00:23:32,760 --> 00:23:35,879 Speaker 2: that they care about, and so how can there be 419 00:23:35,880 --> 00:23:37,880 Speaker 2: a united front against this sort of thing? 420 00:23:38,280 --> 00:23:42,320 Speaker 1: The SwitchBot frame is interesting to you because essentially it's 421 00:23:42,359 --> 00:23:46,760 Speaker 1: a consumer referendum on, whether it's the music business 422 00:23:46,840 --> 00:23:49,439 Speaker 1: or the, you know, visual arts industry or whatever it 423 00:23:49,480 --> 00:23:51,639 Speaker 1: may be, you think this is something of a bellwether 424 00:23:51,720 --> 00:23:54,919 Speaker 1: for, like, where people are in terms of accepting this 425 00:23:54,960 --> 00:23:56,640 Speaker 1: technology in their lives. 426 00:23:56,320 --> 00:24:01,080 Speaker 2: And I just think that this is yet another step 427 00:24:01,359 --> 00:24:06,240 Speaker 2: in more commodification of art, and it's going to really, 428 00:24:06,240 --> 00:24:08,840 Speaker 2: really bother some people. And I think a lot of 429 00:24:08,840 --> 00:24:11,960 Speaker 2: people will listen to this and they'll say, man, my 430 00:24:12,040 --> 00:24:15,159 Speaker 2: only problem with this is that it's too expensive, let 431 00:24:15,240 --> 00:24:18,000 Speaker 2: me know when the price has come down. And 432 00:24:18,119 --> 00:24:20,920 Speaker 2: I can't judge that person. I think that's 433 00:24:20,960 --> 00:24:25,159 Speaker 2: just where we are. This is happening in video games 434 00:24:25,440 --> 00:24:26,840 Speaker 2: as well. I don't know if you've seen this. 435 00:24:27,560 --> 00:24:30,880 Speaker 1: I have been following, kind of with interest, the 436 00:24:30,960 --> 00:24:35,840 Speaker 1: video game AI dance. I'm not a video game person, 437 00:24:36,119 --> 00:24:37,639 Speaker 1: but I've got a friend who loves video games, and 438 00:24:37,680 --> 00:24:40,199 Speaker 1: he was talking about this new game, ah, its name 439 00:24:40,320 --> 00:24:42,879 Speaker 1: escapes me, but basically saying that there's kind of been 440 00:24:42,920 --> 00:24:48,840 Speaker 1: this breakthrough recently where computer characters powered by AI are 441 00:24:48,920 --> 00:24:52,200 Speaker 1: much more responsive and actually learn from your player behavior 442 00:24:52,600 --> 00:24:54,960 Speaker 1: in a way that previous generations of games didn't, and 443 00:24:54,960 --> 00:24:56,399 Speaker 1: how this has kind of blown his mind. 444 00:24:57,160 --> 00:24:58,920 Speaker 2: Yeah. So, I mean, this is one of 445 00:24:58,960 --> 00:25:01,440 Speaker 2: those things that kind of cuts both ways. Because there's 446 00:25:01,440 --> 00:25:04,840 Speaker 2: a game called Clair Obscur Expedition thirty three. It won 447 00:25:05,000 --> 00:25:09,720 Speaker 2: Game of the Year at the Indie Game Awards, really prestigious, right? 448 00:25:10,520 --> 00:25:14,359 Speaker 2: But basically some gamers were looking more closely at the 449 00:25:14,359 --> 00:25:18,000 Speaker 2: game and started posting these images of what really looked 450 00:25:18,040 --> 00:25:22,480 Speaker 2: like AI generated textures for some stuff, like newspapers.
You 451 00:25:22,480 --> 00:25:24,800 Speaker 2: know how it used to be, when you'd ask any 452 00:25:24,800 --> 00:25:27,439 Speaker 2: AI image generator to generate text, it would all be 453 00:25:27,480 --> 00:25:30,720 Speaker 2: garbled, right? Right. And the company copped to it. They said, yeah, 454 00:25:30,800 --> 00:25:33,000 Speaker 2: you know, we used it for placeholders, but I guess 455 00:25:33,080 --> 00:25:37,040 Speaker 2: some stuff slipped through. And then the Indie Game Awards 456 00:25:37,680 --> 00:25:38,560 Speaker 2: took back the award. 457 00:25:39,480 --> 00:25:40,840 Speaker 1: Fascinating. I think that's crazy. 458 00:25:41,080 --> 00:25:42,720 Speaker 2: Well, hold on, why do you think that's crazy? 459 00:25:43,240 --> 00:25:45,800 Speaker 1: I think if you're at the margins using it, like, 460 00:25:45,840 --> 00:25:48,880 Speaker 1: as an indie creator, whatever field you're in, if you're 461 00:25:48,960 --> 00:25:52,880 Speaker 1: at the margins trying to use AI to, like, make 462 00:25:52,960 --> 00:25:56,120 Speaker 1: your creative thing a little cheaper to make 463 00:25:56,200 --> 00:25:58,800 Speaker 1: or a little better, you're not outsourcing your brain, your 464 00:25:58,840 --> 00:26:02,119 Speaker 1: creative mind, to AI by using a generative tool. I mean, 465 00:26:02,119 --> 00:26:04,640 Speaker 1: it's like saying, oh, they used, like, Adobe to make 466 00:26:04,640 --> 00:26:07,000 Speaker 1: the movie. Anyway, I'm like, I think there's 467 00:26:07,119 --> 00:26:09,879 Speaker 1: a clear line to be drawn between optimizing something 468 00:26:09,880 --> 00:26:12,199 Speaker 1: which is, like, amazing, that you've created as a human, 469 00:26:12,240 --> 00:26:15,000 Speaker 1: with AI, and, like, having AI create nonsense. 470 00:26:15,240 --> 00:26:17,760 Speaker 2: Okay, let's push this line a little bit further. Okay? 471 00:26:17,800 --> 00:26:21,000 Speaker 2: So, video game publisher Running with Scissors. I don't know 472 00:26:21,000 --> 00:26:24,560 Speaker 2: if you've played the Postal games. I wouldn't recommend it. Okay, 473 00:26:25,560 --> 00:26:27,879 Speaker 2: it's a very, very particular audience. I'm just gonna leave 474 00:26:27,880 --> 00:26:30,520 Speaker 2: it at that. So a new game is supposed to be coming out, 475 00:26:30,720 --> 00:26:33,520 Speaker 2: a new game in the Postal series. They drop the trailer. 476 00:26:34,200 --> 00:26:37,040 Speaker 2: Pretty instantly, people are looking at this video trailer and 477 00:26:37,040 --> 00:26:40,320 Speaker 2: they're saying, I think this is AI. Running with Scissors 478 00:26:40,359 --> 00:26:45,320 Speaker 2: says they're canceling the game. And then, as people are 479 00:26:45,320 --> 00:26:49,080 Speaker 2: continuing to accuse them of using AI, first they say no. 480 00:26:50,000 --> 00:26:52,320 Speaker 2: Then the co-owner says, look, if anybody has an 481 00:26:52,320 --> 00:26:56,080 Speaker 2: AI accusation, quit the server, like, get out of 482 00:26:56,119 --> 00:27:00,879 Speaker 2: the Discord server. And then, this is a quote, the co 483 00:27:01,000 --> 00:27:04,600 Speaker 2: owner says, I'll go a step further. The anti-AI 484 00:27:04,720 --> 00:27:08,879 Speaker 2: mob has ruined art. Wow. Can I say that for 485 00:27:08,920 --> 00:27:14,600 Speaker 2: you one more time? The anti-AI mob has ruined art. 486 00:27:15,240 --> 00:27:16,440 Speaker 2: I'm still trying to square that circle.
487 00:27:17,080 --> 00:27:20,040 Speaker 1: That's the big inversion. That's pretty nuts. 488 00:27:21,119 --> 00:27:25,160 Speaker 2: But, you know, I think what you said is really 489 00:27:25,200 --> 00:27:28,680 Speaker 2: interesting, because the kinds of conversations we're seeing here, where 490 00:27:28,720 --> 00:27:33,120 Speaker 2: people really are pushing back against AI, this is happening 491 00:27:33,119 --> 00:27:35,760 Speaker 2: at the indie level. These indie developers, they don't have 492 00:27:35,880 --> 00:27:38,439 Speaker 2: the kind of firepower, they don't have the kind of 493 00:27:38,440 --> 00:27:44,000 Speaker 2: resources to design stuff like these big studios. But it 494 00:27:44,160 --> 00:27:47,840 Speaker 2: is the smaller studios who really do have to listen 495 00:27:48,200 --> 00:27:52,919 Speaker 2: to the players, because if they don't, they don't sell games. 496 00:27:52,920 --> 00:27:56,639 Speaker 2: The players will jump ship. You know, the difference between breaking even 497 00:27:57,200 --> 00:28:01,520 Speaker 2: and just completely going into bankruptcy, that margin's pretty thin 498 00:28:01,560 --> 00:28:04,679 Speaker 2: for an indie developer. Meanwhile, you've got the head of 499 00:28:04,720 --> 00:28:07,520 Speaker 2: Epic Games, which is the company that makes Fortnite, 500 00:28:07,560 --> 00:28:12,160 Speaker 2: yeah, right, saying that games shouldn't be labeled 501 00:28:12,160 --> 00:28:14,240 Speaker 2: with AI. They shouldn't have to have a label, this 502 00:28:14,320 --> 00:28:17,320 Speaker 2: was made with AI, because pretty soon everything's gonna be 503 00:28:17,359 --> 00:28:17,880 Speaker 2: made with AI. 504 00:28:18,000 --> 00:28:21,280 Speaker 1: Anyway, the same friend who told me about this game, 505 00:28:21,280 --> 00:28:23,399 Speaker 1: it's like a shooting game where you shoot robots, but 506 00:28:23,480 --> 00:28:25,800 Speaker 1: now the robots, like, appear to be learning from 507 00:28:25,840 --> 00:28:28,439 Speaker 1: your particular, unique playing style as a human. 508 00:28:28,760 --> 00:28:30,640 Speaker 2: Are you talking about, are you talking about Arc Raiders? 509 00:28:30,960 --> 00:28:33,360 Speaker 1: That's right. Basically, these games have become more 510 00:28:33,359 --> 00:28:35,400 Speaker 1: engaging because they're more responsive to me. 511 00:28:35,359 --> 00:28:38,800 Speaker 2: Right. And I think that there's, I think there's 512 00:28:38,800 --> 00:28:42,040 Speaker 2: a line to be drawn even amongst, you know, developers. 513 00:28:42,080 --> 00:28:44,160 Speaker 2: And I've talked to a few developers about this, and 514 00:28:44,160 --> 00:28:47,360 Speaker 2: there are people who really feel like generative AI should 515 00:28:47,400 --> 00:28:50,520 Speaker 2: not be used in a game at all, because functionally, 516 00:28:50,560 --> 00:28:54,400 Speaker 2: what it's doing is, it's theft. And a lot of 517 00:28:54,440 --> 00:28:58,960 Speaker 2: players are also worried about copyright, right? Could something sneak 518 00:28:59,000 --> 00:29:02,560 Speaker 2: into the images that you're using, even as a placeholder, 519 00:29:03,000 --> 00:29:05,160 Speaker 2: that actually ends up being way too similar to something 520 00:29:05,160 --> 00:29:07,719 Speaker 2: that actually exists out there?
You know, if I make 521 00:29:07,760 --> 00:29:11,360 Speaker 2: a game and say, hey, generate a happy Italian plumber 522 00:29:11,360 --> 00:29:15,880 Speaker 2: who jumps a lot for the background, like, Nintendo is 523 00:29:15,920 --> 00:29:19,280 Speaker 2: gonna come and send people to my door. I shouldn't 524 00:29:19,320 --> 00:29:21,640 Speaker 2: do that, right? And so, even if it's not as 525 00:29:21,720 --> 00:29:23,440 Speaker 2: extreme as that, you do kind of need to be 526 00:29:23,440 --> 00:29:27,040 Speaker 2: careful when you're selling a product that has anything that's 527 00:29:27,120 --> 00:29:33,120 Speaker 2: generated by AI. But things like the behavior of an 528 00:29:33,240 --> 00:29:35,720 Speaker 2: enemy character or something like that, that, I think, 529 00:29:35,800 --> 00:29:38,520 Speaker 2: is something that's completely different, that has to do with 530 00:29:38,520 --> 00:29:42,440 Speaker 2: the game mechanics. And I don't think anybody who's a 531 00:29:42,480 --> 00:29:45,400 Speaker 2: game developer is saying AI should never touch a game 532 00:29:45,560 --> 00:29:48,720 Speaker 2: at all. It depends on the type of AI that's 533 00:29:48,760 --> 00:29:51,600 Speaker 2: being used. But then again, even the stuff that some 534 00:29:51,640 --> 00:29:55,160 Speaker 2: people are pushing back against, you know, if the large 535 00:29:55,200 --> 00:29:58,080 Speaker 2: companies are deciding to use it, do you mean to 536 00:29:58,120 --> 00:30:00,200 Speaker 2: tell me that people aren't going to play GTA 537 00:30:00,320 --> 00:30:01,800 Speaker 2: six because there's some AI in it? 538 00:30:01,880 --> 00:30:04,880 Speaker 1: I don't know, man. We're gonna have to see. 539 00:30:05,600 --> 00:30:25,560 Speaker 2: Yeah, when it comes out. If it comes out. Yeah. 540 00:30:26,080 --> 00:30:28,560 Speaker 2: And that's gonna be it for this week for Tech Stuff. 541 00:30:28,760 --> 00:30:30,440 Speaker 2: I'm Dexter Thomas. 542 00:30:29,920 --> 00:30:32,960 Speaker 1: Dexter, thank you. I'm Oz Woloshyn. This episode was produced 543 00:30:32,960 --> 00:30:36,360 Speaker 1: by Eliza Dennis and Melissa Slaughter. It was executive produced 544 00:30:36,360 --> 00:30:39,200 Speaker 1: by me, Kara Price, Julian Nutter, and Kate Osborne for 545 00:30:39,240 --> 00:30:43,440 Speaker 1: Kaleidoscope, and Katrina Norvell for iHeart Podcasts. The engineer is 546 00:30:43,440 --> 00:30:46,960 Speaker 1: Behead Fraser, and Kyle Murdoch mixed this episode and also 547 00:30:47,040 --> 00:30:49,560 Speaker 1: wrote our theme song. Please do rate and review the 548 00:30:49,560 --> 00:30:51,440 Speaker 1: show wherever you listen, and reach out to us at 549 00:30:51,480 --> 00:30:55,520 Speaker 1: tech Stuff podcast at gmail dot com with thoughts, ideas, suggestions, 550 00:30:55,560 --> 00:31:03,080 Speaker 1: et cetera. We love hearing from you.