Speaker 1: Welcome to Tech Stuff, a production of iHeart Podcasts and Kaleidoscope. I'm Oz Woloshyn, and today Cara Price and I will bring you the headlines this week, including the future of search on the web, Ukrainian drones striking deep within Russia, and...
Speaker 2: The very real security risks of vibe coding.
Speaker 1: Then on Tech Support, you'll talk to Dexter Thomas of the Kill Switch podcast, all about Nintendo, its history, and where its new console, the Switch 2, fits into its legacy.
Speaker 3: Gunpei Yokoi is the person who made Nintendo what it is today. Without him, if they still existed, they would still be making playing cards.
Speaker 1: All of that on the weekend take. It's Friday, June sixth. Cara Price.
Speaker 2: It is good to be back.
Speaker 4: I don't even recognize that voice.
Speaker 2: What the hell! It's better now, but my voice... I lost my voice. I had laryngitis. It's like this.
Speaker 4: Which is a nightmare for a podcast host.
Speaker 1: And for your wife. Well, I get even more needy when I can't speak, it turns out, and she has to read my notes.
Speaker 1: In fact, I was on a flight back from London on Monday last week, and...
Speaker 2: My voice was literally... I couldn't speak. It was like this.
Speaker 1: And it was freezing cold on the flight, and I'd already wrapped one blanket around my neck to try and preserve the last hint of a whisper. So I wrote a note on my phone for the air stewards that said, "It's very cold. Do you have any more blankets?" And I handed it to her, and she went from quite stern to totally lit up, and started talking to me in sign language. She was so excited to show off the signs, and I felt so ashamed that I couldn't speak sign language at that moment.
Speaker 4: But that's for another day.
Speaker 1: Actually, I was nervous when we were landing, because obviously you hear all these stories about non-US citizens crossing the border. I was like, oh my god, imagine if this is the day they choose to interrogate me. I literally can't speak. I'm going to be there for a long-ass time.
Speaker 4: It would also be the best excuse.
Speaker 4: Sorry, I can't do it. But you were somehow, last week, given the gift of speech.
Speaker 1: The gift of technology gave me the gift of speech. So I found this app called Whispp.
Speaker 4: Well, I was going to ask: did you just google this?
Speaker 1: I googled "apps that can turn whispers into speech."
Speaker 2: And, go figure, there is one.
Speaker 1: It's called Whispp. And obviously I was at home feeling sorry for myself, not being able to speak, so I was like, how can I get some attention? I made a video of myself in Whispp and it revoiced me, and I put it in our Slack. So I'm going to play it again.
Speaker 5: Guys, as you know, I can literally only talk in a whisper right now. I just got a lot of writers, but maybe something we could lead next week's episode with.
Speaker 1: So I was right, and here we are leading this week's episode.
Speaker 4: You sound sort of like a Dutch survivor being rescued at the end of a very long mission.
Speaker 2: It's a Dutch app.
Speaker 1: Yeah, and I think you can train it on your voice to sound like you, but I didn't have a voice to train it with at all, so I had to use the default, which was indeed Dutch.
Speaker 2: There are two options.
Speaker 1: You can make a selfie video whispering, and it revoices you, or you can call on the phone, and of course...
Speaker 4: Were you speaking in real time?
Speaker 2: I was speaking like this, and it was translating in...
Speaker 4: In real time. That's incredible.
Speaker 2: I called my mum with it. She said, "I hate this. Never do this to me again."
Speaker 4: I mean, I just think it was incredible that you could do that. Because, if you remember, when we did Sleepwalkers, we had to pre-record.
Speaker 2: This was our podcast that we hosted together six years ago.
Speaker 4: Six years ago now. And basically I had to train an AI model on my voice, and then I couldn't do it in real time.
Speaker 2: Yes. Obviously I'm lucky. I normally have a voice.
Speaker 1: Yes, as you know, yes I do. I do. But there are some people who don't.
Speaker 1: And you know, I think Whispp isn't really designed for the laryngitis sufferer who needs some extra attention. It's designed for people who've actually lost their voices, either to cancer or to trauma or to other illnesses. And with that in mind, the modifications required to make this slightly better and more functional feel pretty achievable.
Speaker 2: And what a miracle, frankly.
Speaker 4: Yeah, absolutely. But... there's always a but.
Speaker 2: But my next story is a turn in the other direction.
Speaker 1: Are you familiar with Cory Doctorow and this concept he came up with, enshittification?
Speaker 4: Yes, I read about it in the Financial Times.
Speaker 2: And what did you take away from it?
Speaker 4: That the Internet has become a slop festival.
Speaker 1: "The Internet has become a slop festival." That is better put even than Doctorow himself.
Speaker 4: It's sort of the Internet as a slime, sort of cascading down.
Speaker 1: What he said specifically is that enshittification is, quote, "a theory about what happens when you have power without consequence." And he goes on to describe platforms that have hollowed themselves out, where there's just no value left in them except this kind of awful lock-in. So that makes me think about what happens if you open your Instagram Reels today: there's just tons of stuff which is not even real.
Speaker 4: AI-generated. It's like, you know, it's like Tom Cruise talking to Barack Obama.
Speaker 1: In my case, it's like worms eating dragons and then getting eaten back, reflecting my subconscious. I mean, this concept of enshittification, I think, is one of the great guiding constructs for thinking about what's happening right now. But, concerningly, it's actually happening right now, potentially, to the World Wide Web itself. Of course, I'm talking about AI summaries. We all know about them from ChatGPT and how they're overlaid on Google, but now Internet browsers themselves are building them in. According to Axios: Firefox, Microsoft Edge.
Speaker 1: All you have to do is hover your mouse over a web link and you get an automatic summary, which is kind of crazy. Like, the click-through rate, the great metric of our age...
Speaker 4: Yes, it is going to be gone. It's going to be gone. Yeah.
Speaker 2: What does that mean? People...
Speaker 4: ...are rolling their mouse over a link. Yeah, nobody's going to click through. Wild. To your point about enshittification, the only reason that good content exists on the Internet that is even worth summarizing in the first place is because people had an incentive to create it: it would drive traffic and maybe even ad dollars to their websites. But it's definitely a blow to creatives out there. Like, is there an incentive to publish on the web anymore?
Speaker 2: Yeah?
Speaker 1: Look, to your point, I think a world where there's no reward for posting online will be a poorer world for us, but also, ironically, for AI. Because obviously AI needs new data to train on, and if fewer people are posting news and opinions and arguments on the internet, that will ultimately lead to a poorer AI.
Speaker 2: So this is like the opposite of a virtuous circle.
Speaker 4: Yeah. You know, and AI is already so fast, it feels like every week I see an embarrassing story of a mistake that AI made.
Speaker 1: I mean, the pizza with the glue is obviously an all-time classic. Brilliant. But it doesn't end.
Speaker 2: I mean, that's the thing. Two years ago...
Speaker 4: And it's creating more and more shit. Yeah, that's what keeps happening. And I think one of my recent favorites was the AI-generated summer reading list that made its way to being published in the Chicago Sun-Times. Like, there's a photo of a librarian holding up a fake summer reading...
Speaker 2: ...list of fake books.
Speaker 4: Of fake books. One by Isabel Allende, a very famous author; one by Percival Everett, who just won the Pulitzer Prize.
Speaker 2: And these are fake titles of books they never wrote.
Speaker 4: Fake titles of books. No vetting. But I mean, that's not AI's fault, actually. That's the fault of the Chicago Sun-Times.
Speaker 4: I think what's so kind of upsetting is that it doesn't surprise me, in the age of the automation of everything, that we just sort of accept it to be real.
Speaker 2: Yeah. Yeah.
Speaker 1: And Google, which for as long as I can remember has been the king of search, is doubling down on AI summaries. Essentially, it just released something called AI Mode, and this turns straightforward Google search queries into an AI chat conversation.
Speaker 4: Yeah, you know, it sounds like how a lot of people are already using ChatGPT, but in your browser.
Speaker 1: Well, exactly. It's designed to prevent them from using ChatGPT and instead keep them in their browser. There was this article in the FT a couple of weeks ago with the headline "Can Google still dominate search in the age of AI chatbots?" AI Mode shows you, in real time, how they're trying to answer that question, which could ultimately be existential for them. Of course, to no one's surprise, early reviews of AI Mode have found some pretty unreliable information.
Speaker 1: A reporter at The New York Times actually used AI Mode to plan his daughter's birthday party.
Speaker 4: Parenting, presented by Google.
Speaker 1: Yes. And needless to say, it didn't go very smoothly. The reporter asked Google AI Mode to find parks in his area with picnic tables, and Google provided the reporter with a bulleted list of parks with some helpful information about each. And so he went out to scout two of the suggestions, but of course neither had picnic tables.
Speaker 4: So he was basically given nothing that he needed.
Speaker 1: Nothing of value. And thank goodness, I think, he didn't have his disappointed kids in tow.
Speaker 2: So anyway, because he was doing the scout while writing a story, he soldiered on.
Speaker 1: He went back home and told AI Mode that the parks didn't have tables. The AI apologized and spat out another list, but it included the same parks he'd already visited and knew didn't have picnic tables. So it didn't take on the new information.
Speaker 4: So, getting Google's AI Mode to plan your kid's birthday... essentially, at this point, he should have just let his kid plan the kid's birthday.
Speaker 1: I guess there's less candy in Google AI Mode, but yes. The reporter also asked AI Mode to find an affordable car wash, and it listed one business as having a twenty-five-dollar car wash. When he got there, the real cost was sixty-five dollars. What's interesting, though, is that when the journalist ran the same queries through regular, good old-fashioned Google, about the picnic tables and the car washes, guess what? Go figure: he got the information he wanted, and it was correct.
Speaker 4: You know, I don't like to be a doomer about this, but your reference to enshittification makes sense to me. Like, here's Google, the search engine that we've all been using for twenty-five years, with a mode that's making Google worse.
Speaker 2: Yeah.
Speaker 4: I mean, for all intents and purposes, Google works... well, it works pretty well, but they're trying to keep up with what the Internet is right now, and as such are enshittifying it. Yeah.
Speaker 1: And of course the question I have is, in this new world, how long will normal people bother to post helpful information online if the default becomes an AI mode that is unreliable and that doesn't reward the content creator for their contribution? I was actually talking to a friend of ours the other day, and she mentioned that she no longer uses Google, which I was pretty shocked by.
Speaker 4: That's such a bold claim.
Speaker 2: Yeah.
Speaker 1: Specifically for search, she uses this search engine called Kagi, which is Japanese for "key" (I learned that thirty seconds ago; I googled it), and it touts itself as being a quote "premium search engine." Users can tailor their search results to their own preferences, and they don't get shown ads, because the service monetizes through subscription rather than ads.
Speaker 4: Interesting.
Speaker 4: Well, no offense to the Kagistas out there, but I do think people will generally stick to the devil they know, and the devil they know is Google. You know, at the same time, progress is not linear. Right now, should Google's AI Mode be your preferred method for birthday party planning? Probably not. In a year, maybe it could be. Maybe it will show you where the picnic table is. You know, lest we forget, I used to use Ask Jeeves.
Speaker 2: When was the last time you asked Jeeves something?
Speaker 4: I don't know. But you know, we actually live in a world where I could build my own Ask Jeeves.
Speaker 2: What are you talking about?
Speaker 4: Well, I wouldn't do it myself, but this has a lot to do with my headline of the week, which is... you know, I don't even know if I like vibe coding. I just like the term vibe coding. There's this app called Lovable, and it's a Swedish startup that calls its product "the last piece of software." The last of us. It might be, it might be.
262 00:12:29,760 --> 00:12:32,360 Speaker 4: The way it works is that you basically, Lovable has 263 00:12:32,400 --> 00:12:35,560 Speaker 4: a chatbot and you tell it to make an app 264 00:12:35,559 --> 00:12:38,120 Speaker 4: for you, and you can say, can you build me 265 00:12:38,120 --> 00:12:40,520 Speaker 4: an app where I can track all the hats that 266 00:12:40,559 --> 00:12:42,520 Speaker 4: I own, so I will stop spending money on hats 267 00:12:42,520 --> 00:12:44,880 Speaker 4: when I reminded of how many hats I own. Same 268 00:12:44,880 --> 00:12:47,760 Speaker 4: thing with glasses, and maybe Lovable would generate an app 269 00:12:47,760 --> 00:12:50,400 Speaker 4: where I can log my extensive hat or glasses collection. 270 00:12:50,520 --> 00:12:51,440 Speaker 2: That does sound pretty cool. 271 00:12:51,640 --> 00:12:54,440 Speaker 4: It is. There is a problem with it, though, which 272 00:12:54,440 --> 00:12:59,360 Speaker 4: is that it's having some issues in the security department. Semaphore, 273 00:13:00,040 --> 00:13:02,679 Speaker 4: you know, trust the old Semaphore. Yes, ran a headline 274 00:13:02,840 --> 00:13:06,319 Speaker 4: the hottest new vibe coding startup maybe a sitting duck 275 00:13:06,800 --> 00:13:07,360 Speaker 4: for hackers. 276 00:13:07,400 --> 00:13:08,360 Speaker 2: You know, I saw that headline. 277 00:13:08,400 --> 00:13:10,120 Speaker 1: I didn't actually read the story, but it was fascinating, 278 00:13:10,120 --> 00:13:11,199 Speaker 1: so I'm glad you'll bring it today. 279 00:13:11,240 --> 00:13:13,440 Speaker 4: There's always a whistleblower at a competing company and it's 280 00:13:13,480 --> 00:13:16,880 Speaker 4: like your company is doing wrong. 
Speaker 4: And so an employee at a competing vibe coding company called Replit, not to be confused with Reddit, yes, released a report that shows some serious vulnerabilities for users. And on Lovable's site, the company showcases some of the apps and websites that users have made with the software. There's one called, and you can't make this stuff up, Scam...
Speaker 2: ...Nail. ScamNail. Like hangnail, but...
Speaker 4: Yeah, which is a community platform for users to get expert advice on whether or not they're dealing with a scam. There's another... I mean, these names. I wonder if Lovable comes up with the names. There's another website called Info Rid, which promises to remove your personal information from around seventy data brokers so you receive less spam. That sounds good. I mean, they all sound like great ideas. But this report found that of the over sixteen hundred web apps featured on Lovable's site, one hundred and seventy of them had vulnerabilities.
Speaker 2: And when you say vulnerabilities... I know it sounds...
Speaker 4: They're not emotional vulnerabilities.
Speaker 4: Things that I think people find very important, like their name, their email address, and, most importantly, financial information... anything that's extremely personal.
Speaker 1: So the advice from parents to children that used to be "don't get into a stranger's car" is now "don't use software that you find online."
Speaker 4: You know, there are pitfalls. Even if AI can write flawless code, there can be security flaws. For example, when you tell Lovable that you want to make a website, it's just going to make the website, plain and simple. But for a website to work, it has to be connected to a database that can store things like user accounts and payment information. And that's historically the work of seasoned software developers, and even seasoned software developers can make mistakes.
Speaker 1: Yeah. I mean, I joked about "don't get into strangers' cars," but, you know, it's going to get harder not to use the web.
Speaker 2: You've got AI summaries overlaid everywhere, and...
317 00:15:04,760 --> 00:15:06,240 Speaker 1: Then when you do get to a website or an 318 00:15:06,240 --> 00:15:08,360 Speaker 1: app that you want to go into, you know, you 319 00:15:08,400 --> 00:15:10,000 Speaker 1: have to think about whether or not it's been VIBE 320 00:15:10,000 --> 00:15:12,680 Speaker 1: coded and whether the security on the back end will 321 00:15:12,680 --> 00:15:15,040 Speaker 1: be functional. I mean, it's just like it's kind of 322 00:15:15,040 --> 00:15:16,160 Speaker 1: a little bit of a headspin there. 323 00:15:16,440 --> 00:15:19,440 Speaker 4: Yeah, and you know, Lovable does offer an easy way 324 00:15:19,800 --> 00:15:22,920 Speaker 4: to connect to a properly managed database to store payment 325 00:15:22,960 --> 00:15:26,120 Speaker 4: information and like it's a service called sounds a lot 326 00:15:26,160 --> 00:15:30,840 Speaker 4: like a Nicki Minaj song super Base. But the vulnerability 327 00:15:30,880 --> 00:15:34,320 Speaker 4: report found that on some of these Lovable made apps, 328 00:15:34,640 --> 00:15:38,400 Speaker 4: the super base database was not configured correctly, which led 329 00:15:38,440 --> 00:15:41,880 Speaker 4: to the security flaws. So that might be pointing to 330 00:15:41,920 --> 00:15:44,320 Speaker 4: the issue that you're flagging as that people don't really 331 00:15:44,400 --> 00:15:47,520 Speaker 4: know how to check their work yet and that doing 332 00:15:47,560 --> 00:15:49,840 Speaker 4: that takes some creative thinking and actually a little bit 333 00:15:49,840 --> 00:15:54,360 Speaker 4: of paranoil, like how might my passion project be vulnerable? 334 00:15:54,440 --> 00:15:54,920 Speaker 2: Yeah, I mean, I. 
335 00:15:54,920 --> 00:15:57,160 Speaker 1: Guess a lot of software developers do this red teaming, 336 00:15:57,240 --> 00:15:59,080 Speaker 1: like they try and get into the headspace of an 337 00:15:59,120 --> 00:16:01,240 Speaker 1: attacker and the attack their own product. 338 00:16:01,760 --> 00:16:04,320 Speaker 2: Most vibe coders are not self. 339 00:16:04,680 --> 00:16:07,200 Speaker 4: Not necessarily coders, No exactly. 340 00:16:07,560 --> 00:16:13,200 Speaker 1: So repltu replit raised their hand and said, heyre's the 341 00:16:13,200 --> 00:16:16,240 Speaker 1: problem over there? Yeah sevenphore around the story. Yes, how 342 00:16:16,280 --> 00:16:17,200 Speaker 1: did Lovable respond? 343 00:16:17,480 --> 00:16:21,400 Speaker 4: Lovable responded on x formerly known as Twitter, saying, quote, 344 00:16:21,520 --> 00:16:24,360 Speaker 4: we're not yet where we want to be in terms 345 00:16:24,400 --> 00:16:27,560 Speaker 4: of security, and we're committed to keep improving the security 346 00:16:27,600 --> 00:16:32,720 Speaker 4: posture for all Lovable users. It just reminds me if 347 00:16:32,760 --> 00:16:35,240 Speaker 4: like all of a sudden, someone who didn't work at 348 00:16:35,320 --> 00:16:37,920 Speaker 4: Nike was like, I'm going to be Nike today and 349 00:16:37,920 --> 00:16:39,080 Speaker 4: I'm going to make these sneakers. 350 00:16:39,120 --> 00:16:41,200 Speaker 1: Well, that happens, right, I mean, as a whole, this 351 00:16:41,240 --> 00:16:44,400 Speaker 1: company spend billions of dollars on pattern and protection and 352 00:16:44,840 --> 00:16:47,920 Speaker 1: infringement on their copyright and stuff because they don't want 353 00:16:47,960 --> 00:16:51,040 Speaker 1: to erode trust in their brand by having knockoffs, injured. 354 00:16:50,760 --> 00:16:51,960 Speaker 2: People, a hut and whatever it is. 
355 00:16:52,040 --> 00:16:54,760 Speaker 1: Yes, exactly, but now we're in this weird era on 356 00:16:54,800 --> 00:16:58,120 Speaker 1: the Internet where, you know, we've developed this shared sort 357 00:16:58,120 --> 00:17:01,560 Speaker 1: of status quo understanding about, like, you can trust 358 00:17:02,240 --> 00:17:05,040 Speaker 1: an app that you find in the App Store or 359 00:17:05,040 --> 00:17:09,119 Speaker 1: on Android, or like, most legit seeming websites probably are legit. 360 00:17:09,720 --> 00:17:11,560 Speaker 1: You know, obviously you have to beware of spear phishing and stuff, 361 00:17:11,560 --> 00:17:14,879 Speaker 1: but like, this whole kind of shared basis of, like, 362 00:17:15,040 --> 00:17:17,280 Speaker 1: being able to input your data with some degree of 363 00:17:17,320 --> 00:17:20,040 Speaker 1: confidence online. And of course there's amazing things that you 364 00:17:20,080 --> 00:17:21,960 Speaker 1: can do with vibe coding. It's pretty cool that you 365 00:17:21,960 --> 00:17:24,239 Speaker 1: can make a functional prototype for an idea and get 366 00:17:24,280 --> 00:17:27,000 Speaker 1: people excited within just a few moments or a few days, 367 00:17:27,400 --> 00:17:29,600 Speaker 1: and that really is cool, but at the same time, 368 00:17:29,720 --> 00:17:33,200 Speaker 1: is it at the cost of blowing up this security architecture 369 00:17:33,400 --> 00:17:35,080 Speaker 1: that we built over the last twenty 370 00:17:34,920 --> 00:17:37,640 Speaker 4: years? Right, and security has actually been a key concern 371 00:17:37,720 --> 00:17:40,800 Speaker 4: ever since the beginning of the Internet, and security measures 372 00:17:41,160 --> 00:17:46,320 Speaker 4: and hackers have gotten more sophisticated, so it's kind of 373 00:17:46,400 --> 00:17:50,199 Speaker 4: unreal that the big fad is to make products that 374 00:17:50,240 --> 00:17:51,359 Speaker 4: are less sophisticated.
375 00:17:51,440 --> 00:17:51,760 Speaker 2: Totally. 376 00:17:51,840 --> 00:17:54,639 Speaker 1: I really, really want to get one of the vibe 377 00:17:54,640 --> 00:17:58,080 Speaker 1: coding people on the show, whether it's Cursor or Replit 378 00:17:58,200 --> 00:18:00,000 Speaker 1: or Lovable. It'd be very interesting to have someone from 379 00:18:00,040 --> 00:18:01,400 Speaker 1: those companies. So we will, maybe. 380 00:18:01,240 --> 00:18:03,800 Speaker 4: We should have Replit and Lovable in a Frost/Nixon 381 00:18:03,880 --> 00:18:05,680 Speaker 4: sort of showdown. Wow. 382 00:18:06,320 --> 00:18:08,480 Speaker 2: So we'll work on that. But in the meantime, we've 383 00:18:08,480 --> 00:18:09,600 Speaker 2: got some brief headlines. 384 00:18:10,000 --> 00:18:13,359 Speaker 4: So if you're someone who thinks that Ring doorbells are 385 00:18:13,400 --> 00:18:15,760 Speaker 4: a step too far in the direction of surveillance, I 386 00:18:15,800 --> 00:18:18,880 Speaker 4: would like to present to you the world of New 387 00:18:18,960 --> 00:18:23,439 Speaker 4: York City Facebook mom groups, which I'm very familiar with 388 00:18:23,480 --> 00:18:25,920 Speaker 4: because a lot of my friends are in them. According 389 00:18:25,920 --> 00:18:31,240 Speaker 4: to the Daily Mail, some nannies in New York 390 00:18:31,240 --> 00:18:34,400 Speaker 4: City's Upper East Side are paranoid that they are being 391 00:18:34,440 --> 00:18:36,720 Speaker 4: spied on. And this reminds me, spy cams have been 392 00:18:36,760 --> 00:18:38,439 Speaker 4: a thing with nannies for a long time. Like, this 393 00:18:38,520 --> 00:18:41,320 Speaker 4: is not new, people used to embed them in like stuffed animals.
394 00:18:41,359 --> 00:18:45,479 Speaker 4: But in a Facebook group aptly titled, maybe embarrassingly titled, 395 00:18:45,480 --> 00:18:48,720 Speaker 4: Moms of the Upper East Side, members have posted pictures 396 00:18:48,720 --> 00:18:52,320 Speaker 4: of nannies with captions like, if you recognize this blonde 397 00:18:52,320 --> 00:18:55,840 Speaker 4: girl with pigtails I saw yesterday afternoon around Seventy Eighth 398 00:18:55,880 --> 00:18:58,520 Speaker 4: and Second Avenue, please DM me. I think you will 399 00:18:58,560 --> 00:18:59,879 Speaker 4: want to know what your nanny. 400 00:19:00,200 --> 00:19:01,440 Speaker 2: Oh my god, it's like a horror movie. 401 00:19:01,480 --> 00:19:05,080 Speaker 4: It's a panopticon. There are posts claiming to have 402 00:19:05,160 --> 00:19:09,480 Speaker 4: seen nannies harshly handling children in public, amongst other allegations 403 00:19:09,480 --> 00:19:13,560 Speaker 4: of mistreatment. Some members have responded critically, saying that the 404 00:19:13,600 --> 00:19:18,240 Speaker 4: posts lack context and should be handled cautiously. The big 405 00:19:18,280 --> 00:19:22,520 Speaker 4: takeaway is that Facebook groups for moms are actually a 406 00:19:22,600 --> 00:19:26,240 Speaker 4: really thriving hot zone for Facebook. 407 00:19:26,560 --> 00:19:27,000 Speaker 2: There you go. 408 00:19:27,280 --> 00:19:32,440 Speaker 1: Last story today is a more somber one. For the last 409 00:19:32,480 --> 00:19:38,320 Speaker 1: eighteen months, Ukraine planted drones secretly in mobile homes and sheds, 410 00:19:38,640 --> 00:19:42,880 Speaker 1: staged on flatbed trucks and parked near military runways in Russia, 411 00:19:43,480 --> 00:19:46,320 Speaker 1: and on Sunday the attack drones flew out of their 412 00:19:46,359 --> 00:19:50,520 Speaker 1: enclosures and destroyed at least thirteen Russian aircraft and damaged others.
413 00:19:50,800 --> 00:19:53,280 Speaker 1: That's according to Ukrainian officials, who said that some of 414 00:19:53,320 --> 00:19:56,920 Speaker 1: the destroyed aircraft were actually capable of launching nuclear weapons. 415 00:19:57,480 --> 00:20:02,360 Speaker 1: These attacks took place two thousand eight hundred miles from 416 00:20:02,520 --> 00:20:06,240 Speaker 1: Ukraine's border with Russia. That's more or less the distance 417 00:20:06,240 --> 00:20:09,000 Speaker 1: from here, as in New York City, to London. Given that 418 00:20:09,040 --> 00:20:12,800 Speaker 1: these drones targeted Russia's nuclear capacity, some in Russia have 419 00:20:12,880 --> 00:20:16,320 Speaker 1: said that this triggers a nuclear response under their doctrine. 420 00:20:16,840 --> 00:20:20,240 Speaker 1: The moment is being widely compared to Russia's Pearl Harbor. 421 00:20:20,560 --> 00:20:24,000 Speaker 1: And actually I interviewed Jake Sullivan, the former National Security Advisor, 422 00:20:24,440 --> 00:20:27,119 Speaker 1: this week, and we talked about this exact story and 423 00:20:27,160 --> 00:20:31,600 Speaker 1: also about how the US is preparing for a similar 424 00:20:31,600 --> 00:20:33,679 Speaker 1: type of attack. I mean, could this happen here? The 425 00:20:33,720 --> 00:20:36,920 Speaker 1: answer is yes, if we don't prepare. And we'll publish 426 00:20:36,920 --> 00:20:39,800 Speaker 1: that interview, by the way, this is a tease, next Wednesday. 427 00:20:40,280 --> 00:20:46,760 Speaker 4: Great story. We're going to take a short break. 428 00:20:46,800 --> 00:20:49,879 Speaker 4: But when we're back, the rise, then fall, then rise, 429 00:20:50,000 --> 00:20:53,359 Speaker 4: then fall, then rise again of Nintendo, and how the 430 00:20:53,400 --> 00:20:55,560 Speaker 4: Switch 2 could be another make or break moment. 431 00:20:55,760 --> 00:21:03,000 Speaker 5: Stay with us.
432 00:21:04,119 --> 00:21:05,880 Speaker 4: So for our next segment, we're going to talk about 433 00:21:05,920 --> 00:21:09,200 Speaker 4: a company that's released some pretty iconic games and hardware 434 00:21:09,240 --> 00:21:14,600 Speaker 4: over the years. No, not Hasbro. Nintendo. Oz, when was 435 00:21:14,600 --> 00:21:16,240 Speaker 4: the last time you played a Nintendo game? 436 00:21:16,520 --> 00:21:18,199 Speaker 1: You know, I haven't played a Nintendo game for a 437 00:21:18,320 --> 00:21:22,040 Speaker 1: very long time. I briefly lusted after a Wii. I 438 00:21:22,119 --> 00:21:23,240 Speaker 1: like the idea of playing tennis. 439 00:21:23,640 --> 00:21:25,439 Speaker 4: But to say that you lusted after a Wii. 440 00:21:28,000 --> 00:21:28,720 Speaker 2: Please. 441 00:21:29,400 --> 00:21:31,639 Speaker 1: I did have a Game Boy when I was eleven 442 00:21:31,720 --> 00:21:34,159 Speaker 1: or twelve years old, and I probably spent two or 443 00:21:34,160 --> 00:21:38,159 Speaker 1: three hundred hours playing Pokemon and wanted to beat both 444 00:21:38,280 --> 00:21:42,000 Speaker 1: the Red and Blue versions of Pokemon on my Game Boy. 445 00:21:42,080 --> 00:21:43,600 Speaker 4: And we know what happens when you set your mind 446 00:21:43,600 --> 00:21:44,280 Speaker 4: to something. 447 00:21:44,480 --> 00:21:45,720 Speaker 2: And I did catch them all. 448 00:21:48,200 --> 00:21:50,280 Speaker 1: But you know what, we batted around some ideas this 449 00:21:50,280 --> 00:21:52,560 Speaker 1: week in our production meeting about what to do for 450 00:21:52,640 --> 00:21:56,040 Speaker 1: our weekly Tech Support segment, and I actually wasn't immediately 451 00:21:56,119 --> 00:21:57,480 Speaker 1: sold on covering Nintendo. 452 00:21:57,640 --> 00:21:59,399 Speaker 4: No, you were not. And then we had to be like, 453 00:22:01,680 --> 00:22:07,240 Speaker 4: Nintendo has unbelievable cultural clout in the IP space.
You 454 00:22:07,359 --> 00:22:11,880 Speaker 4: just mentioned Pokemon, Zelda, Donkey Kong, Super Mario. The most 455 00:22:11,920 --> 00:22:16,119 Speaker 4: successful movies right now are coming from IP that is 456 00:22:16,160 --> 00:22:17,000 Speaker 4: based on games. 457 00:22:17,200 --> 00:22:19,359 Speaker 1: You know, when you're like not thinking about something, and 458 00:22:19,359 --> 00:22:21,480 Speaker 1: then you start thinking about something, and then you see 459 00:22:21,480 --> 00:22:22,120 Speaker 1: it everywhere. 460 00:22:22,160 --> 00:22:24,200 Speaker 4: It's called synchronicity, is what it's called. 461 00:22:24,280 --> 00:22:26,600 Speaker 1: Yes, well, it happened to me because I was watching 462 00:22:26,600 --> 00:22:30,399 Speaker 1: the Formula One this weekend and Max Verstappen bashed into 463 00:22:30,760 --> 00:22:33,840 Speaker 1: George Russell, and the drivers were hanging out afterwards in 464 00:22:33,880 --> 00:22:36,920 Speaker 1: the, like, chill out room, and Lando Norris from McLaren 465 00:22:37,040 --> 00:22:39,879 Speaker 1: F1 said, I've done that before, but only in 466 00:22:39,880 --> 00:22:42,600 Speaker 1: Mario Kart, okay. And then I was watching the French 467 00:22:42,640 --> 00:22:44,840 Speaker 1: Open and Jannik Sinner, the number one tennis player in 468 00:22:44,840 --> 00:22:47,320 Speaker 1: the world, was wearing this cute blue and green outfit, 469 00:22:48,040 --> 00:22:51,040 Speaker 1: and everyone was chanting Luigi in the crowd, and then 470 00:22:51,080 --> 00:22:54,000 Speaker 1: he posted a picture of himself saying, Luigi plays again. 471 00:22:54,440 --> 00:22:55,960 Speaker 1: So this is like, you can't avoid it, and I 472 00:22:56,000 --> 00:22:57,840 Speaker 1: think it's totally fundamental to our culture. So I'm very, 473 00:22:57,880 --> 00:23:00,760 Speaker 1: very glad that you and Eliza and Tori alerted me 474 00:23:00,840 --> 00:23:03,640 Speaker 1: to the error of my ways.
There's also an interesting 475 00:23:03,640 --> 00:23:06,200 Speaker 1: business story here. I remember back in twenty seventeen when 476 00:23:06,320 --> 00:23:11,040 Speaker 1: Nintendo released the Switch, all of this coverage about how, if 477 00:23:11,160 --> 00:23:14,000 Speaker 1: Nintendo didn't succeed with this console, this one hundred year 478 00:23:14,000 --> 00:23:18,720 Speaker 1: old Japanese business might fail. Not only did they succeed, 479 00:23:18,840 --> 00:23:22,160 Speaker 1: the Switch became the third best selling console of all time, 480 00:23:22,480 --> 00:23:25,200 Speaker 1: and now, eight years later, at last, there's a new 481 00:23:25,359 --> 00:23:30,480 Speaker 1: Nintendo console, the Switch 2. It's a super interesting moment because, weirdly, 482 00:23:30,800 --> 00:23:34,200 Speaker 1: despite all the obsession with Nintendo IP, there hasn't been, 483 00:23:34,240 --> 00:23:36,240 Speaker 1: at least in my corner of the Internet, a huge 484 00:23:36,280 --> 00:23:39,159 Speaker 1: amount of buzz about this release. So today we have 485 00:23:39,240 --> 00:23:41,840 Speaker 1: somebody with us who is perfectly placed to help us 486 00:23:42,000 --> 00:23:46,119 Speaker 1: understand the history of Nintendo, its place in our culture, and 487 00:23:46,160 --> 00:23:48,560 Speaker 1: what the Switch 2 could mean for the company. It's 488 00:23:48,600 --> 00:23:51,960 Speaker 1: our friend Dexter Thomas. He took over the Sleepwalkers 489 00:23:52,000 --> 00:23:54,600 Speaker 1: podcast from us and renamed it Kill Switch. It's a 490 00:23:54,640 --> 00:23:56,560 Speaker 1: great podcast rename. 491 00:23:56,680 --> 00:23:57,640 Speaker 2: It airs on. 492 00:23:57,600 --> 00:23:59,920 Speaker 1: Kaleidoscope and the iHeart network, and it's all about 493 00:24:00,280 --> 00:24:04,520 Speaker 1: our technology-charged lives.
Dexter has spent the last couple 494 00:24:04,560 --> 00:24:07,399 Speaker 1: of weeks learning all about Nintendo and how it became 495 00:24:07,640 --> 00:24:11,200 Speaker 1: the company it is today for an upcoming episode. Dexter, welcome 496 00:24:11,240 --> 00:24:11,840 Speaker 1: to Tech Stuff. 497 00:24:12,040 --> 00:24:12,640 Speaker 3: What's going on? 498 00:24:12,880 --> 00:24:14,560 Speaker 1: Nice to be here. So I want to get into 499 00:24:14,720 --> 00:24:17,520 Speaker 1: kind of what draws you to the Nintendo story. But 500 00:24:17,840 --> 00:24:21,520 Speaker 1: what is the role of Nintendo in our culture, and 501 00:24:21,920 --> 00:24:25,119 Speaker 1: how has it shaped the way we interact with technology? 502 00:24:25,760 --> 00:24:29,119 Speaker 3: You know, Nintendo is, I mean, for people of a 503 00:24:29,160 --> 00:24:32,880 Speaker 3: certain age and older, you'll remember when any video game 504 00:24:32,920 --> 00:24:35,120 Speaker 3: you had in the house, it was Nintendo. You could 505 00:24:35,119 --> 00:24:37,600 Speaker 3: have a Sega Genesis, you know, your mom would say, 506 00:24:37,680 --> 00:24:41,240 Speaker 3: turn off that Nintendo. Mom, it's not a Nintendo, it's a Sega, right. 507 00:24:42,000 --> 00:24:44,560 Speaker 3: I think that if Nintendo were to announce that they 508 00:24:44,560 --> 00:24:46,680 Speaker 3: were going out of business, I mean, just think about 509 00:24:46,680 --> 00:24:50,439 Speaker 3: that for a second, you'd actually feel something, which is 510 00:24:50,600 --> 00:24:54,480 Speaker 3: not something that you can say about almost any company. 511 00:24:54,640 --> 00:24:58,200 Speaker 3: It's an unusual company, you know. I compare it honestly 512 00:24:58,760 --> 00:25:03,159 Speaker 3: to Disney, just in how much work they've done to 513 00:25:03,280 --> 00:25:06,320 Speaker 3: build up a kind of mystique that ties itself to 514 00:25:06,359 --> 00:25:08,959 Speaker 3: your childhood.
But they're a very interesting company. 515 00:25:09,800 --> 00:25:11,639 Speaker 1: Can you tell us a bit, though, about the background 516 00:25:11,640 --> 00:25:14,320 Speaker 1: of Nintendo, because I think it's at least one hundred 517 00:25:14,400 --> 00:25:16,320 Speaker 1: years old, right, and it didn't even begin as a 518 00:25:16,440 --> 00:25:17,720 Speaker 1: games company. 519 00:25:17,760 --> 00:25:22,119 Speaker 3: That's right. Yeah. So Nintendo started out in eighteen eighty nine. 520 00:25:22,440 --> 00:25:26,359 Speaker 3: It is a very long running company with some, shall 521 00:25:26,400 --> 00:25:32,320 Speaker 3: we say, murky origins. But yeah, Nintendo was, for most 522 00:25:32,720 --> 00:25:35,960 Speaker 3: of Nintendo's lifespan, I guess you could say, it was 523 00:25:36,040 --> 00:25:41,320 Speaker 3: a not very successful company which mostly made playing cards. 524 00:25:41,880 --> 00:25:44,439 Speaker 3: So they started making the kind of playing cards that, 525 00:25:44,880 --> 00:25:46,800 Speaker 3: you know, most of our listeners might be familiar with, 526 00:25:46,840 --> 00:25:48,920 Speaker 3: you know, the Jack, King, Queen, Ace kind 527 00:25:48,920 --> 00:25:52,560 Speaker 3: of thing. They also made a game called Hanafuda, which 528 00:25:52,800 --> 00:25:56,560 Speaker 3: is kind of like a traditional Japanese card game. The 529 00:25:56,600 --> 00:25:59,240 Speaker 3: cards are a little bit smaller, they're pretty intricately designed, 530 00:25:59,800 --> 00:26:03,800 Speaker 3: and the game's actually still played today, although it's more 531 00:26:03,840 --> 00:26:05,360 Speaker 3: well known in Japan than anywhere else. 532 00:26:05,359 --> 00:26:07,639 Speaker 4: I would say, yeah, so I do want to know 533 00:26:07,720 --> 00:26:10,359 Speaker 4: about the murky origins, because, as you said, it is 534 00:26:10,880 --> 00:26:14,399 Speaker 4: a lot like Disney.
You think Nintendo, you think family friendly, 535 00:26:14,520 --> 00:26:18,119 Speaker 4: you think of your childhood. So I'm curious, what was 536 00:26:18,160 --> 00:26:18,800 Speaker 4: so murky? 537 00:26:19,119 --> 00:26:22,960 Speaker 3: Sure, yeah. So the president of Nintendo was basically willing 538 00:26:23,000 --> 00:26:25,159 Speaker 3: to do anything to make money. For a while they 539 00:26:25,160 --> 00:26:30,760 Speaker 3: were selling instant ramen, they ran taxis, love hotels, which, 540 00:26:30,800 --> 00:26:33,760 Speaker 3: if you're not familiar with love hotels, Japanese apartments are 541 00:26:33,800 --> 00:26:36,520 Speaker 3: pretty small. If you would like to go on a 542 00:26:36,600 --> 00:26:38,199 Speaker 3: date and the date goes very well, and you and 543 00:26:38,200 --> 00:26:39,840 Speaker 3: your partner would like to spend some time in the 544 00:26:39,920 --> 00:26:43,080 Speaker 3: evening together, you go to a love hotel. Yeah. So 545 00:26:43,200 --> 00:26:46,119 Speaker 3: they were involved in basically anything that could turn a profit. 546 00:26:46,359 --> 00:26:48,359 Speaker 3: Some of these things worked, some of these didn't. But 547 00:26:48,400 --> 00:26:52,320 Speaker 3: their mainstay was cards. So gambling in Japan, or most 548 00:26:52,400 --> 00:26:55,760 Speaker 3: gambling, is illegal, and gambling definitely was illegal then. But 549 00:26:56,560 --> 00:27:01,160 Speaker 3: Nintendo's hanafuda cards were known for being 550 00:27:02,280 --> 00:27:05,439 Speaker 3: pretty decent and they were very widely available, and so 551 00:27:06,520 --> 00:27:11,960 Speaker 3: one of Nintendo's actually big customer bases was illegal gambling houses. 552 00:27:12,200 --> 00:27:15,440 Speaker 3: So you would find their cards at, you know, these 553 00:27:15,800 --> 00:27:18,600 Speaker 3: kind of dodgy places where people would go and try 554 00:27:18,600 --> 00:27:19,200 Speaker 3: to make money.
555 00:27:19,760 --> 00:27:26,000 Speaker 1: But how did Nintendo go from gambling to video games? 556 00:27:26,320 --> 00:27:28,800 Speaker 3: Yeah, so this is where things get interesting. There's a 557 00:27:28,920 --> 00:27:33,359 Speaker 3: few stories about how they got started, but basically it 558 00:27:33,400 --> 00:27:38,720 Speaker 3: all revolves around this kind of not very serious college 559 00:27:38,760 --> 00:27:44,200 Speaker 3: student named Gunpei Yokoi. So this electrical engineering student, he's 560 00:27:44,280 --> 00:27:47,040 Speaker 3: just graduated and he gets a job at, again, this 561 00:27:47,320 --> 00:27:51,000 Speaker 3: card company called Nintendo, while all his friends were much 562 00:27:51,040 --> 00:27:54,040 Speaker 3: more successful students. He's kind of at this second, third 563 00:27:54,080 --> 00:27:59,120 Speaker 3: rate company. Again, Nintendo is making gambling cards. They're making cards 564 00:27:59,160 --> 00:28:01,879 Speaker 3: that are used in gambling, and anybody who's been to Vegas, you 565 00:28:02,000 --> 00:28:06,560 Speaker 3: know that gamblers take their cards very seriously. If there's 566 00:28:06,680 --> 00:28:09,480 Speaker 3: any kind of manufacturing defect, if there's a divot in 567 00:28:09,520 --> 00:28:11,720 Speaker 3: the side or anything like that, you can tell what 568 00:28:11,800 --> 00:28:14,000 Speaker 3: the other side of the card is. This is a 569 00:28:14,040 --> 00:28:18,760 Speaker 3: bad thing. And so gamblers, or especially these gambling houses, 570 00:28:19,280 --> 00:28:23,480 Speaker 3: needed the cards to be made perfectly. Well, most cards 571 00:28:23,520 --> 00:28:26,280 Speaker 3: at this point, this is the sixties, a lot of these 572 00:28:26,280 --> 00:28:29,480 Speaker 3: traditional cards are still made by hand. Nintendo gets this 573 00:28:29,520 --> 00:28:31,840 Speaker 3: bright idea, well, let's do this stuff with a machine.
574 00:28:32,280 --> 00:28:35,919 Speaker 3: But the machine they have isn't working perfectly. So what 575 00:28:35,960 --> 00:28:37,760 Speaker 3: do you do if your machine doesn't work? You get 576 00:28:37,760 --> 00:28:42,280 Speaker 3: an engineer. So they hire this guy who, not a 577 00:28:42,360 --> 00:28:45,640 Speaker 3: very serious student, dare I say a college dropout. But 578 00:28:45,680 --> 00:28:48,640 Speaker 3: they hire this guy for their card company to fix 579 00:28:48,680 --> 00:28:51,240 Speaker 3: their glue machine. He goes in, takes a look at it, 580 00:28:51,320 --> 00:28:54,000 Speaker 3: and pretty quickly he figures out what the problem is, 581 00:28:55,400 --> 00:28:58,040 Speaker 3: and he just says, okay, yeah, we'll fix the paddles 582 00:28:58,040 --> 00:29:00,760 Speaker 3: on the glue machine and you're good. And he goes 583 00:29:00,800 --> 00:29:02,480 Speaker 3: in the back room and just starts messing around and 584 00:29:02,520 --> 00:29:06,120 Speaker 3: doing his own thing. And the way that he tells 585 00:29:06,120 --> 00:29:09,240 Speaker 3: it is, after he fixes this glue machine, he kind 586 00:29:09,240 --> 00:29:12,280 Speaker 3: of goes in the back room and he starts making 587 00:29:12,320 --> 00:29:14,440 Speaker 3: toys just for fun, as a hobby. This is something 588 00:29:14,480 --> 00:29:16,320 Speaker 3: he did as a kid. And one of the first 589 00:29:16,360 --> 00:29:20,840 Speaker 3: toys he makes is, it's this, if you imagine old 590 00:29:20,880 --> 00:29:23,680 Speaker 3: Warner Brothers cartoons, you know, there would be the kind 591 00:29:23,720 --> 00:29:27,320 Speaker 3: of like a punching glove on a spring. Basically, yeah, exactly.
592 00:29:27,400 --> 00:29:30,080 Speaker 3: So he makes something kind of like that, except instead 593 00:29:30,120 --> 00:29:32,640 Speaker 3: of a punching glove on a spring, he makes a 594 00:29:32,720 --> 00:29:36,560 Speaker 3: grabbing hand on a spring, where it expands, and if 595 00:29:36,600 --> 00:29:39,520 Speaker 3: you push one end of it together, it reaches out 596 00:29:39,520 --> 00:29:42,640 Speaker 3: and grabs something. And he calls it the Ultra Hand. 597 00:29:43,200 --> 00:29:47,080 Speaker 3: And as the president of Nintendo is walking by, looks 598 00:29:47,080 --> 00:29:48,840 Speaker 3: in the room, says, hey, dude, what are you doing? 599 00:29:49,360 --> 00:29:53,600 Speaker 3: And he says, uh, making a toy. And the president, 600 00:29:53,680 --> 00:29:56,800 Speaker 3: according to legend, looks at it and says, huh, you 601 00:29:56,840 --> 00:30:01,080 Speaker 3: know what, let's sell that, put it in production. He puts 602 00:30:01,080 --> 00:30:05,400 Speaker 3: it in production and it sells like crazy, and that 603 00:30:05,600 --> 00:30:08,960 Speaker 3: toy does very well. It's one of those big fad toys, 604 00:30:09,040 --> 00:30:11,400 Speaker 3: kind of like the Slinky here in the US, right. 605 00:30:12,000 --> 00:30:14,960 Speaker 3: And the president goes back to Yokoi and he says, okay, 606 00:30:15,040 --> 00:30:18,560 Speaker 3: that was good. Do it again. And he does it again, 607 00:30:19,360 --> 00:30:21,479 Speaker 3: and he does it again, and he just keeps doing it. 608 00:30:21,520 --> 00:30:24,920 Speaker 3: This guy just keeps cranking out hit after hit. It 609 00:30:25,000 --> 00:30:29,479 Speaker 3: becomes clear that toys are a good line of income 610 00:30:29,520 --> 00:30:32,360 Speaker 3: for Nintendo.
They basically make him the head of the 611 00:30:32,400 --> 00:30:37,160 Speaker 3: research and development department, put some engineers under him and say, man, 612 00:30:37,280 --> 00:30:39,640 Speaker 3: just keep doing your thing, keep doing your thing. And 613 00:30:39,800 --> 00:30:40,200 Speaker 3: he does. 614 00:30:40,760 --> 00:30:40,920 Speaker 4: So. 615 00:30:40,960 --> 00:30:44,280 Speaker 3: The important thing is, Gunpei Yokoi really thought of 616 00:30:44,320 --> 00:30:47,040 Speaker 3: himself as a toy maker, and I think that's something 617 00:30:47,240 --> 00:30:52,520 Speaker 3: important to understand about Nintendo. Nintendo functionally is a toy company. 618 00:30:52,600 --> 00:30:55,440 Speaker 3: They're not really a video game company. They're a toy company. 619 00:30:55,480 --> 00:30:59,000 Speaker 3: It just so happens that their best selling toys are 620 00:30:59,080 --> 00:30:59,720 Speaker 3: video games. 621 00:31:00,200 --> 00:31:03,960 Speaker 4: When does the pivot happen, though, and video games start 622 00:31:04,000 --> 00:31:04,719 Speaker 4: to be their business? 623 00:31:05,120 --> 00:31:10,040 Speaker 3: Yeah, so Gunpei Yokoi had, again, continued to make games, 624 00:31:10,080 --> 00:31:14,200 Speaker 3: and some of these are electronics. He's an electrical engineer, 625 00:31:14,240 --> 00:31:19,080 Speaker 3: so he's interested in electronics. And at some point he 626 00:31:19,200 --> 00:31:24,560 Speaker 3: gets the idea, arcades are doing very well, can 627 00:31:24,600 --> 00:31:27,800 Speaker 3: we make this more portable? And they come up with 628 00:31:27,840 --> 00:31:30,240 Speaker 3: something called the Game & Watch. This thing comes out 629 00:31:30,240 --> 00:31:34,440 Speaker 3: in nineteen eighty, and the real innovation here is that 630 00:31:35,200 --> 00:31:39,560 Speaker 3: this is not the most technologically advanced machine in the world.
631 00:31:39,560 --> 00:31:42,920 Speaker 3: It's actually very, very simple. It's about identical to a 632 00:31:42,960 --> 00:31:47,120 Speaker 3: literal calculator. But these had very, very simple games on them, 633 00:31:47,360 --> 00:31:50,760 Speaker 3: and one of them was, it's just called Ball, and 634 00:31:50,840 --> 00:31:54,160 Speaker 3: all you're doing is, it's literally just called Ball, and 635 00:31:54,240 --> 00:31:57,160 Speaker 3: you are juggling a ball back and forth and trying 636 00:31:57,160 --> 00:32:00,200 Speaker 3: to keep it from dropping. And the ball moves 637 00:32:00,240 --> 00:32:01,680 Speaker 3: back and forth and it kind of beeps, and it 638 00:32:01,720 --> 00:32:02,280 Speaker 3: kind of goes, 639 00:32:02,160 --> 00:32:04,000 Speaker 2: beep, beep, beep, beep, beep. 640 00:32:04,040 --> 00:32:08,520 Speaker 3: So it's very simple, it's very slow, but this was 641 00:32:09,480 --> 00:32:12,880 Speaker 3: really a cheap way to bring the concept of the 642 00:32:12,920 --> 00:32:14,800 Speaker 3: game you play in the arcade, it's not quite that, 643 00:32:15,320 --> 00:32:17,680 Speaker 3: but it's in your pocket, and it was cheap, which 644 00:32:17,720 --> 00:32:20,480 Speaker 3: was pretty amazing for the time. And these things, again, 645 00:32:21,000 --> 00:32:21,960 Speaker 3: sold like hotcakes. 646 00:32:22,440 --> 00:32:26,880 Speaker 2: What is this idea of lateral thinking with withered technology? 647 00:32:27,520 --> 00:32:34,160 Speaker 3: Yeah, so this is truly, I think, what Gunpei Yokoi 648 00:32:34,880 --> 00:32:38,480 Speaker 3: did that sets them apart from everyone else. So that 649 00:32:38,640 --> 00:32:43,920 Speaker 3: is the literal translation. Kareta gijutsu no suihei shiko literally translates to 650 00:32:44,600 --> 00:32:49,000 Speaker 3: lateral thinking of withered technology. I might translate it as 651 00:32:49,040 --> 00:32:53,720 Speaker 3: something more like sideways thinking with mature technology.
652 00:32:54,080 --> 00:32:58,600 Speaker 2: So he was a good American capitalist after all. Well, 653 00:32:58,400 --> 00:33:01,280 Speaker 3: this is the thing. I mean, Gunpei Yokoi was, you know, 654 00:33:01,320 --> 00:33:03,320 Speaker 3: he was an innovator, he was a toy maker, he 655 00:33:03,400 --> 00:33:04,920 Speaker 3: was a guy who just liked messing around. But he 656 00:33:04,960 --> 00:33:08,960 Speaker 3: was also a very smart businessman, right? And so he 657 00:33:09,080 --> 00:33:13,000 Speaker 3: would look at what was available, and he would not 658 00:33:13,400 --> 00:33:17,800 Speaker 3: try to use the most technologically advanced device that was 659 00:33:17,840 --> 00:33:22,160 Speaker 3: out there. He would purposefully pick old technology that was proven, 660 00:33:22,480 --> 00:33:25,840 Speaker 3: sometimes decade old technology, decade old chips, five year 661 00:33:25,840 --> 00:33:28,560 Speaker 3: old chips, and say, okay, what can I do with this? 662 00:33:29,320 --> 00:33:35,080 Speaker 3: A great example of this is the Lefty RX, a classic Gunpei Yokoi product. 663 00:33:35,400 --> 00:33:38,440 Speaker 3: So this is a remote control car. The RX stands for 664 00:33:38,520 --> 00:33:42,920 Speaker 3: remote control, that only turns left, hence the word Lefty. 665 00:33:43,080 --> 00:33:45,400 Speaker 3: So they just lean into it, yes, which sounds like 666 00:33:45,440 --> 00:33:48,600 Speaker 3: a terrible idea. But if you think about it, you know, 667 00:33:48,720 --> 00:33:52,640 Speaker 3: sixties going into the seventies, Japan's not a rich country 668 00:33:52,640 --> 00:33:55,800 Speaker 3: at this point, a lot of poor kids, and not 669 00:33:55,920 --> 00:34:00,320 Speaker 3: everybody can afford a remote control car, radio control car, right? 670 00:34:00,360 --> 00:34:02,360 Speaker 3: And so what he did was he said, okay, how 671 00:34:02,400 --> 00:34:05,760 Speaker 3: can we strip this down?
Well, if you watch NASCAR, 672 00:34:05,840 --> 00:34:08,920 Speaker 3: if you watch Formula One, I hate to say it, 673 00:34:08,960 --> 00:34:12,439 Speaker 3: all they do is go left. No offense, it's a skilled 674 00:34:12,680 --> 00:34:16,040 Speaker 3: sport, and again, my deepest respect to all the F 675 00:34:16,080 --> 00:34:19,080 Speaker 3: One fans out there. It is a sport which involves 676 00:34:19,200 --> 00:34:24,520 Speaker 3: primarily going straight and then occasionally left, and that is it. 677 00:34:24,640 --> 00:34:27,319 Speaker 3: And he said, okay, well, let's do it. And so 678 00:34:27,960 --> 00:34:30,120 Speaker 3: the car basically has two buttons. You turn it on, 679 00:34:30,200 --> 00:34:32,680 Speaker 3: it goes straight. You push another button and it goes left. 680 00:34:32,840 --> 00:34:36,480 Speaker 3: And this toy was a quarter of the price of something 681 00:34:36,560 --> 00:34:40,760 Speaker 3: like a normal remote control car. And so now kids 682 00:34:40,800 --> 00:34:43,719 Speaker 3: can have this car. Again, it's not an act of 683 00:34:43,880 --> 00:34:47,080 Speaker 3: charity to the children of Japan, this is a money making, 684 00:34:47,160 --> 00:34:50,840 Speaker 3: you know, operation after all. But he was able to 685 00:34:50,880 --> 00:34:55,040 Speaker 3: strip things down to their essence and, again, take mature 686 00:34:55,080 --> 00:34:57,799 Speaker 3: technology, technology that worked, that was out there, and think 687 00:34:57,840 --> 00:34:58,800 Speaker 3: a little bit sideways. 688 00:34:59,440 --> 00:35:02,120 Speaker 4: So just to pivot a little bit and talk a 689 00:35:02,120 --> 00:35:05,920 Speaker 4: little bit more about the actual console. Yokoi passed away 690 00:35:06,120 --> 00:35:09,400 Speaker 4: in nineteen ninety seven. They obviously continued to make games 691 00:35:09,400 --> 00:35:13,319 Speaker 4: and hardware. They have some ups and downs.
I thought 692 00:35:13,320 --> 00:35:16,400 Speaker 4: GameCube was, you know, the second coming because of Crazy 693 00:35:16,440 --> 00:35:18,960 Speaker 4: Taxi and those little discs. I mean, it was 694 00:35:18,960 --> 00:35:22,840 Speaker 4: everything to me. Yeah, the market apparently disagreed. Other consoles, 695 00:35:22,880 --> 00:35:25,960 Speaker 4: like the Wii, which also ruined my tennis game, the 696 00:35:26,000 --> 00:35:31,120 Speaker 4: Switch, have done very well. What do you think sets 697 00:35:32,000 --> 00:35:34,640 Speaker 4: some consoles apart from other consoles? 698 00:35:35,040 --> 00:35:40,040 Speaker 3: Yeah, you know, Nintendo's got an interesting up and down 699 00:35:40,440 --> 00:35:44,440 Speaker 3: pattern which we could probably connect some dots on. Basically, 700 00:35:44,520 --> 00:35:48,080 Speaker 3: the pattern is, since the early two thousands, when Nintendo 701 00:35:48,320 --> 00:35:54,280 Speaker 3: tries to make a sophisticated machine, a technologically advanced machine, 702 00:35:54,280 --> 00:35:58,640 Speaker 3: it usually flops. When they ease back and just make 703 00:35:58,760 --> 00:36:02,320 Speaker 3: something that is just interesting, again, I would argue, leaning 704 00:36:02,360 --> 00:36:07,600 Speaker 3: back on Gunpei Yokoi's philosophy of sideways thinking with mature 705 00:36:07,640 --> 00:36:12,160 Speaker 3: technology, they do better. So if you look at the GameCube, 706 00:36:12,200 --> 00:36:15,320 Speaker 3: they actually were trying to compete with the more powerful systems. 707 00:36:15,400 --> 00:36:19,840 Speaker 3: It didn't do so well. Yeah, the Wii, which functionally 708 00:36:19,920 --> 00:36:24,680 Speaker 3: uses the same insides as the GameCube, really, released years later.
709 00:36:26,560 --> 00:36:29,480 Speaker 3: It's not an advanced machine, it's not competing against the 710 00:36:29,520 --> 00:36:34,680 Speaker 3: PlayStation 3, but somehow it starts outselling everything else. 711 00:36:35,400 --> 00:36:37,920 Speaker 3: Then they come out with the Wii U, which is the 712 00:36:37,960 --> 00:36:40,360 Speaker 3: successor to the Wii, and they are trying to play 713 00:36:40,360 --> 00:36:42,920 Speaker 3: this game of all right, we got the good graphics, 714 00:36:42,920 --> 00:36:47,680 Speaker 3: we got everything everybody else has, and it flops, flops 715 00:36:47,760 --> 00:36:51,560 Speaker 3: really badly, actually. They're in trouble. The Switch comes out. 716 00:36:51,800 --> 00:36:55,000 Speaker 3: The Switch is not an advanced machine. The Switch is. 717 00:36:56,080 --> 00:36:59,120 Speaker 3: It's like a cell phone, man. And this is a 718 00:36:59,160 --> 00:37:01,400 Speaker 3: time when the PlayStation 4 and the Xbox One are on 719 00:37:01,440 --> 00:37:04,480 Speaker 3: the market, and it works for them because they're not 720 00:37:04,560 --> 00:37:09,160 Speaker 3: playing this game of chasing graphics, of chasing processing power. 721 00:37:09,560 --> 00:37:14,160 Speaker 3: They're really in their own lane, which is: is this fun? 722 00:37:14,800 --> 00:37:17,440 Speaker 3: Does it work? Is it cheap enough for us to 723 00:37:17,440 --> 00:37:19,280 Speaker 3: make a bunch of these and still make it fun? 724 00:37:19,840 --> 00:37:22,719 Speaker 1: Why do you think the Switch did so well, and 725 00:37:23,320 --> 00:37:27,560 Speaker 1: eight years later, what can we expect from its successor? 726 00:37:28,160 --> 00:37:30,400 Speaker 3: Yeah, you know, I mean I think there's a combination 727 00:37:30,440 --> 00:37:34,200 Speaker 3: of things. I think they actually went back to their 728 00:37:34,320 --> 00:37:37,760 Speaker 3: roots, Nintendo.
You know what the funny thing about 729 00:37:37,840 --> 00:37:40,640 Speaker 3: the Switch is? They're thinking of it as a toy 730 00:37:40,719 --> 00:37:43,240 Speaker 3: you can take to a bar and play with your friends. 731 00:37:44,320 --> 00:37:48,560 Speaker 3: Also, another kind of black swan thing. Can't predict quarantine, 732 00:37:48,719 --> 00:37:53,640 Speaker 3: people are locked down. Animal Crossing comes out. Everybody's playing 733 00:37:53,680 --> 00:37:57,080 Speaker 3: Animal Crossing. It is a not very technologically advanced game, 734 00:37:57,080 --> 00:37:59,320 Speaker 3: but it lets you play with your friends. 735 00:37:59,480 --> 00:38:01,759 Speaker 1: I have to say, it's really clarifying for me what you said, 736 00:38:01,800 --> 00:38:04,920 Speaker 1: because I didn't really understand the distinction between a toy 737 00:38:05,120 --> 00:38:08,279 Speaker 1: and a console. But this is like tech that sort 738 00:38:08,320 --> 00:38:11,440 Speaker 1: of mobilizes you in the world rather than sucks you 739 00:38:11,520 --> 00:38:13,560 Speaker 1: into a private world exactly. 740 00:38:13,640 --> 00:38:16,800 Speaker 3: And you know, let's not forget that the Game Boy, 741 00:38:17,160 --> 00:38:21,279 Speaker 3: one of its major features was the fact that you 742 00:38:21,280 --> 00:38:23,960 Speaker 3: could link two of them up, or four of them 743 00:38:24,040 --> 00:38:27,239 Speaker 3: up actually, so Tetris you could play against somebody. So 744 00:38:27,360 --> 00:38:30,520 Speaker 3: really the idea of it is you're supposed to go 745 00:38:30,640 --> 00:38:34,840 Speaker 3: outside and play with your friends. That's actually what Gunpei 746 00:38:34,960 --> 00:38:38,799 Speaker 3: Yokoi wanted. I think he wanted you to play with 747 00:38:38,880 --> 00:38:42,800 Speaker 3: the toy with your friends. The Switch is another version 748 00:38:42,800 --> 00:38:46,600 Speaker 3: of that.
And I was asking somebody who actually 749 00:38:46,640 --> 00:38:49,000 Speaker 3: got the opportunity to play it before most people did, 750 00:38:49,000 --> 00:38:50,440 Speaker 3: you know, got to play it as a member of 751 00:38:50,440 --> 00:38:52,680 Speaker 3: the press. And he said, well, do you have a Switch? 752 00:38:52,719 --> 00:38:55,480 Speaker 3: And I said yeah. And he said, you've played your Switch, right? 753 00:38:55,560 --> 00:38:57,880 Speaker 3: I said yeah. And he said, well, you've basically played 754 00:38:57,880 --> 00:39:01,279 Speaker 3: the Switch 2. It's more of the same. Is there a 755 00:39:01,400 --> 00:39:05,680 Speaker 3: slight bump in graphics capability? Sure. Is the fit and 756 00:39:05,719 --> 00:39:09,760 Speaker 3: finish of the machine a little bit better? Absolutely. But honestly, 757 00:39:10,840 --> 00:39:13,480 Speaker 3: it's another Switch. And if you want to play the 758 00:39:13,520 --> 00:39:15,799 Speaker 3: new games, eventually you're going to have to get the 759 00:39:15,840 --> 00:39:18,239 Speaker 3: new Switch because it's a little bit more advanced. But 760 00:39:18,320 --> 00:39:22,359 Speaker 3: this is just them realizing, I think, that truly, if 761 00:39:22,360 --> 00:39:26,360 Speaker 3: it ain't broke, don't fix it. And again, they're using 762 00:39:26,600 --> 00:39:32,239 Speaker 3: slightly older hardware, simple to program for, they're not chasing graphics, 763 00:39:32,400 --> 00:39:34,719 Speaker 3: and it's more of the same. For better or for worse, 764 00:39:34,760 --> 00:39:35,520 Speaker 3: it's more of the same. 765 00:39:35,920 --> 00:39:37,600 Speaker 4: I mean, in that way, you have to expect that 766 00:39:37,640 --> 00:39:39,640 Speaker 4: it will do well, right? That it won't go the 767 00:39:39,680 --> 00:39:42,880 Speaker 4: way of something newer that is not a safe bet.
768 00:39:42,920 --> 00:39:44,279 Speaker 3: Well, so this is where it gets a little bit 769 00:39:44,280 --> 00:39:47,440 Speaker 3: complicated, because the Switch 2 is kind of expensive. The 770 00:39:47,480 --> 00:39:51,600 Speaker 3: games are also quite expensive. Games are eighty bucks. Oh yeah, 771 00:39:52,120 --> 00:39:57,080 Speaker 3: so okay. But I think, price things aside, them sticking 772 00:39:57,200 --> 00:40:00,840 Speaker 3: to more of the same is honestly very Nintendo, 773 00:40:01,920 --> 00:40:06,400 Speaker 3: and for them, it's hard to argue against that 774 00:40:06,440 --> 00:40:09,080 Speaker 3: being the best choice. I think where they really 775 00:40:09,080 --> 00:40:12,440 Speaker 3: do best is just stick with what works, make 776 00:40:12,520 --> 00:40:13,480 Speaker 3: more fun toys. 777 00:40:14,040 --> 00:40:16,000 Speaker 1: Just to close, you know, we started off the conversation 778 00:40:16,080 --> 00:40:19,680 Speaker 1: talking about valuable IP that Nintendo owns, I mean, Mario 779 00:40:19,800 --> 00:40:22,640 Speaker 1: and Zelda and others. How much of an inducement will 780 00:40:22,680 --> 00:40:27,279 Speaker 1: that be for people to buy the Switch 2, and 781 00:40:27,320 --> 00:40:30,759 Speaker 1: how much of that IP is living elsewhere in the 782 00:40:30,760 --> 00:40:34,480 Speaker 1: world now, whether it's movies and theme parks? Like, how 783 00:40:34,560 --> 00:40:40,000 Speaker 1: much is Nintendo relying on the console business versus being Disney? 784 00:40:40,360 --> 00:40:43,080 Speaker 3: I think they're at a bit of a crossroads for that, 785 00:40:43,160 --> 00:40:46,640 Speaker 3: because for the longest time they were very, very protective 786 00:40:47,000 --> 00:40:50,920 Speaker 3: of their intellectual property.
I think that what Nintendo has 787 00:40:50,960 --> 00:40:55,040 Speaker 3: going for them is that you cannot interact with any 788 00:40:55,040 --> 00:40:58,480 Speaker 3: of their intellectual property unless it's in an environment that 789 00:40:58,640 --> 00:41:01,719 Speaker 3: they've sanctioned, you know. I mean, they've made the 790 00:41:01,719 --> 00:41:05,000 Speaker 3: theme park, they're very judicious and careful about that. They've 791 00:41:05,040 --> 00:41:07,239 Speaker 3: made the movie, they're very judicious and careful about that. 792 00:41:07,640 --> 00:41:11,400 Speaker 3: You will not be playing Mario on your computer. You 793 00:41:11,480 --> 00:41:14,719 Speaker 3: have to buy the consoles. So they have such a 794 00:41:14,840 --> 00:41:18,399 Speaker 3: draw to their intellectual property, Zelda, Mario, that they really 795 00:41:18,400 --> 00:41:21,520 Speaker 3: are banking on the fact that people still want Mario 796 00:41:22,440 --> 00:41:25,480 Speaker 3: and they will pay whatever price it is that they 797 00:41:25,560 --> 00:41:28,759 Speaker 3: set for access to that. So, you know, I think 798 00:41:28,760 --> 00:41:32,840 Speaker 3: they are diversifying in some ways, but they are probably 799 00:41:32,880 --> 00:41:36,960 Speaker 3: still betting that having Mario out there in a movie, 800 00:41:37,520 --> 00:41:40,520 Speaker 3: and Zelda potentially out there in a movie, is going 801 00:41:40,560 --> 00:41:43,000 Speaker 3: to bring people back to play the game. And if 802 00:41:43,040 --> 00:41:44,160 Speaker 3: you want to play the game. 803 00:41:45,440 --> 00:41:50,280 Speaker 4: You got to buy the hardware. Yep, buy the hardware, exactly, exactly, exactly. 804 00:41:50,160 --> 00:41:52,040 Speaker 2: Dexter, thank you so much. This was really fascinating. 805 00:41:52,080 --> 00:41:54,400 Speaker 3: I loved it. Absolutely, absolutely, thanks so much for having me. 806 00:41:54,480 --> 00:42:12,280 Speaker 4: Thank you.
That's it for this week for Tech Stuff. 807 00:42:12,320 --> 00:42:13,560 Speaker 4: I'm Cara Price and. 808 00:42:13,520 --> 00:42:14,360 Speaker 2: I'm Oz Woloshyn. 809 00:42:14,800 --> 00:42:18,279 Speaker 1: This episode was produced by Eliza Dennis and Victoria Dominguez. 810 00:42:18,719 --> 00:42:21,480 Speaker 1: It was executive produced by me, Cara Price, and Kate 811 00:42:21,520 --> 00:42:26,120 Speaker 1: Osborne for Kaleidoscope and Katrina Norvell for iHeart Podcasts. The 812 00:42:26,200 --> 00:42:29,520 Speaker 1: engineer is Beheth Fraser, and Jack Insley mixed this episode. 813 00:42:29,880 --> 00:42:31,279 Speaker 2: Kyle Murdoch wrote our theme song. 814 00:42:31,760 --> 00:42:34,520 Speaker 4: Join us next Wednesday for Tech Stuff: The Story, when we 815 00:42:34,560 --> 00:42:37,480 Speaker 4: will share an in-depth conversation with Jake Sullivan, National 816 00:42:37,560 --> 00:42:40,960 Speaker 4: Security Advisor under Biden. We'll talk all things competition with 817 00:42:41,040 --> 00:42:42,560 Speaker 4: China and autonomous weapons. 818 00:42:42,760 --> 00:42:45,520 Speaker 1: Please rate, review, and reach out to us at Tech 819 00:42:45,560 --> 00:42:47,800 Speaker 1: Stuff Podcast at gmail dot com. 820 00:42:48,040 --> 00:42:49,680 Speaker 2: It really helps us to know what you're thinking.