Speaker 1: From Kaleidoscope and iHeart Podcasts, this is Tech Stuff. I'm Oz Woloshyn.
Speaker 2: And I'm Karah Preiss.
Speaker 1: Today we get into spy cockroaches and why Europe is getting armed with a bunch of sci-fi weapons, and a revealing conversation with the human team behind a viral AI artist. Then, on Chat and Me:
Speaker 3: And then I put in, what formation should you play to beat NWSL teams?
Speaker 2: All of that on The Week in Tech. It's Friday, November seventh. Oz, I want to tell you about this bird that I saw on the way to work today.
Speaker 1: Birds can be very symbolic.
Speaker 2: It was a brown and white pigeon.
Speaker 1: Or feral, as us ornithologists like to call it.
Speaker 2: A feral dove. So, do you know why I'm telling you about a bird?
Speaker 1: I have no idea.
Speaker 2: No idea? You passed.
Speaker 1: What's the test?
Speaker 2: You just experienced something called bird theory. It's this relationship test that I saw on TikTok, and a lot of TikTokers are using it to test their partners by secretly filming their partners' reactions when they point out a bird. If their partner reacts enthusiastically, like I did, good sign. If not, some say, watch out.
Speaker 1: What's behind this?
Speaker 2: It's actually from this theory developed by the Gottmans, who are prominent couples therapists, and this test is used to evaluate how a partner responds to what the Gottmans call a bid for connection. So just now, had we been together and you were like, "Babe..."
Speaker 1: I mean, in the studio, I'm by definition forced to pay attention to what you're telling me. In real life, if you said "there's a bird over there" and I had just been on my phone, I wouldn't even have heard you. Locked into my phone.
Speaker 2: That's very true, and so it is also contextual, I think. But what is really interesting to me about this is that there is a thing happening on TikTok that I think is really just an extension of the BuzzFeed or Cosmo quiz. Remember BuzzFeed quizzes?
Speaker 1: Like, are you astrologically compatible with your partner? Or do you like the same things? That kind of thing.
Speaker 2: Well, that's exactly what I mean. But I think there's this interesting thing happening on TikTok, which is that people are gamifying their relationships very publicly and inviting viewers to judge their partners' responses as well, and so it's kind of the first time people are opening up their relationships to a peanut gallery.
Speaker 1: That's the weird part, right? The peanut gallery idea. That's different from the old days of Cosmo and BuzzFeed quizzes.
Speaker 2: What's amazing is how seriously people take this stuff. People are really in the comments being like: if your partner doesn't pay attention to you talking about the bird, dump him.
Speaker 1: How about this for a transition, please: Is it a bird? Is it a plane? No, it's a drone.
Speaker 4: That's good.
Speaker 1: Okay, that's good. So the general headline that I want to share with you today is that Europe is investing heavily in defense tech, with mixed results.
Speaker 2: What are the mixed results?
Speaker 1: Some of it's working and some of it's absolutely not. There was a story in the Financial Times with the headline "Drone startup backed by Peter Thiel crashed and burned in armed forces trials." So of course I couldn't not read it.
Speaker 2: Exactly.
Speaker 1: But this all started with Vice President JD Vance earlier this year. JD Vance, in his first major public speech, was in Munich at something called the Munich Security Conference, which is basically when all of the bigwigs of European defense get together, and, you know, prominent Americans from government come, and normally it's a kind of backslapping, handshaking thing. This year was very different. Vance shocked Europe, and I had some friends who were at that conference.
Speaker 2: You had friends that were shocked?
Speaker 1: Shocked. He said: you guys, you're on your own. You've been eating out of our hand for too long.
You've got to boost your own defense spending, because we're not coming to the rescue anymore. So now Europe is stepping up. Per Reuters, venture capital funding of European defense tech hit one billion dollars this year, up from around four hundred million in twenty twenty-two. More interesting than that, there was this stat: this year, for the first time ever, combined European spending on defense procurement (and by the way, this includes Turkey and Ukraine) is expected to be more than what the US is spending.
Speaker 2: More, just because JD Vance barked?
Speaker 1: Yeah, I mean, I think it's that. And obviously it's carrot and stick. Well, it's two sticks: an American stick and a Russian stick. There's very little carrot. One is saying you've got to do this, and the other is all but saying you've got to do this. A lot of this investment is happening in Germany, which also obviously has quite a troubled history when it comes to armament production, and has really not done very much since the Second World War until now, when there is demand from the US, pressure from Russia, but also a whole bunch of talented engineers who have worked for decades in the auto industry, which is now on its knees.
Speaker 2: Interesting.
Speaker 1: But I do want to talk a little bit more about this Financial Times story.
Speaker 2: Yes, that's what I want to hear more about: the crashing and burning.
Speaker 1: This is about a company called Stark, which rhymes with "lark." It's a Berlin-based drone company backed by Peter Thiel, Sequoia Capital, and the NATO Innovation Fund. Stark was founded fifteen months ago, and they've built weaponized drones that look kind of reminiscent of X-wing fighters from Star Wars: the base of the drone is shaped like an X.
Speaker 2: And this has all been developed in the last fifteen months?
Speaker 1: It has, and it had two trials, which were covered by the Financial Times, which called them a disaster.
The drones, quote, "failed to hit a single target during four attempts at two separate exercises." You know, failure, especially when it's the failure of investments by some of these Silicon Valley overlords, is always delicious. Extremely delicious. But the new idea in defense tech is: build it and they will come. Previously, the government would commission a new fighter jet, there'd be a twenty-year procurement cycle, and it would cost, like, you know, five hundred billion dollars for the government to get the new technology it wanted. Now all these startups are saying: wow, war's here, or war's coming, why don't we raise money from the private market and make weapons that governments don't even think they need yet? And so that's a huge shift, and it's being powered, of course, by AI, automation, and robotics. There's a story in the New York Times about the same phenomenon which had a great, great quote: the new business model "reflects a sea change in warfare that may be as profound as the shift from horse cavalries to armored tanks and airplanes in World War One."
Speaker 2: That's quite a distinction. I mean, I guess my question is, why do things on the cheap?
Speaker 1: Well, I think because the point of these venture-backed companies is you're trying to do a minimum viable product as cheaply as you can. If it works, you raise a bunch more money and go into full production. So most Series A companies fail. But what's interesting is that private investors are willing to take risk to fund weapons, most of which won't work. But the ones that do work, or that do convince governments they work, will presumably print fortunes for their founders and backers.
Speaker 2: So this is quite literally the old Silicon Valley model of move fast and break things.
Speaker 1: It's exactly that, exactly that, but for defense tech.
Speaker 2: Which is a little scarier when it comes to defense tech.
You know, I'm sure some of this tech is incredibly sci-fi-inspired, but this is stuff that's being made for war.
Speaker 1: Yeah, I mean, look, one of the most successful companies in the space seems to be this company called Helsing. They're featured in the same FT story. They hit their targets seventeen times and successfully completed five trial runs. Their valuation is now over fourteen billion dollars.
Speaker 2: That's crazy.
Speaker 1: And it's not just drones. There are other companies trying to create the weapons of the future in parallel. There's a company called Euroatlas, who have, quote, "created a division to develop autonomous underwater vehicles" to monitor vital cables on the ocean floor. And they are vital cables: without them, no Internet. They've created this underwater vehicle called Greyshark that actually looks more like a dolphin. And to me, bionic cockroaches are the most fascinating part of the story. I'm serious. So let me tell you about a little company called SWARM Biotactics.
Speaker 2: There you go.
Speaker 1: SWARM Biotactics have taken cockroaches and made these little backpacks for them. Imagine a little baby cockroach backpack. The backpack interacts with the cockroach's nervous system such that a human operator can use the cockroach like a remote-control car.
Speaker 2: So these are real cockroaches?
Speaker 1: These are real cockroaches.
Speaker 2: That have implants in their brain?
Speaker 1: Implants in their brain, and tiny cameras, and they can go in swarms into areas where humans or other, bigger robots can't. This is from the SWARM Biotactics website and has been referenced in a bunch of mainstream press, so I haven't seen for myself the swarm, or the cockroach being remote-controlled. But what a world we're living in.
Speaker 2: So why is this happening in Europe specifically?
Speaker 1: Yeah, well, I mentioned the Vance thing. And one of the co-founders of Helsing, the more successful drone company, told the New York Times, quote: "Before, no European VC was interested in defense. Now everyone wants to invest." And I think you can't really understand this story without understanding how scary it is to be next to Russia if you live in Europe. You know, in New York a few months ago, everyone was looking up at the sky and seeing all these drones. It's kind of scary and chilling. Well, it's a lot more scary and a lot more chilling next to Russia when you look up at the sky and you see stuff.
Speaker 4: Right.
Speaker 1: The market is fear-driven, of course. So drones, presumably Russian drones, have been spotted flying over European bases hosting US tactical nuclear weapons that are part of Europe's nuclear deterrence strategy. The US is believed to have around one hundred tactical nuclear bombs in Europe, spread across five countries, which is half of the US supply. Not something you want to see.
Speaker 2: No, you definitely don't. And I know this because I just saw the new Kathryn Bigelow movie called A House of Dynamite, which is basically about the hours leading up to a nuclear missile heading to the United States. I'm just thinking about it because of this. It sort of shows the ways in which the United States isn't exactly prepared. I actually want to transition to something that is a little bit more Karah, which is music, culture, AI: the synthetic creation of new musical artists.
Speaker 1: Well, I have to tell you that one of the main investors in Helsing, in fact, I think the lead investor in the last round, was none other than Daniel Ek, who also founded Spotify.
Speaker 2: There's always a connection. But it's also gotten him in trouble.
Speaker 1: That's right.
Speaker 2: Many artists have left the platform because of this, and it's also why Ek is no longer the CEO of Spotify.
But yes, I'm here to talk about the state of music. And you know, AI music is nothing really new; we talked about it on Sleepwalkers. There was a company called Endel that was, like, the first algorithm to have a record deal. And actually, in September, Spotify revealed that it removed seventy-five million spam tracks over the last year. This has become a really big business for people, because it's cheap to make, and if a song is played for more than thirty seconds on a streaming platform, the person uploading the music gets a royalty. So I'm curious, first and foremost: what do you think about AI music?
Speaker 1: Very good question. I don't know what I think about AI music. But what I do think is that when people think they are interacting with AI-created media, they slightly switch off. I'm wondering if that continues.
Speaker 2: They're turned off by the idea. Yeah, sorry, I was trying to translate your British. Well, I want to play you a song that is number thirty on Billboard's Adult R&B Airplay chart, and this artist has over one million monthly listeners on Spotify.
Speaker 3: (singing) He didn't want me down no stairs, didn't want me about them boys who wouldn't care. Didn't tell me what a real man sounds like. So I fell for that, right? He never called much.
Speaker 2: Tell me what you think about it.
Speaker 1: I would have no idea this wasn't real, right? I did like it, though. Especially at the beginning, it sounded a bit like "All I Want for Christmas."
Speaker 2: Well, let me tell you something: not a lick of it is real. However, there is some reality.
Speaker 1: There's a human in the loop.
Speaker 2: There is a human in the loop. The person that you just heard is Xania Monet.
Speaker 1: Okay, she's real? If she's not real, she...
Speaker 2: ...is as fake as they come.
So I'm a person who likes to see when AI is creating an inflection point in the culture, where it's like, you know, like we talked about with Sora: all of a sudden, we can't trust video anymore, ever. And I think because this artist Xania is on the Billboard charts...
Speaker 1: That makes it feel different, right? This is not just some garbage that's on Spotify and from time to time gets weeded out by a mod or an automatic mod. This is a charting song by a named artist who doesn't exist.
Speaker 2: That's right. And on top of that, she has a manager and a multimillion-dollar record deal with Hallwood Media, which is an independent record label founded by a former Interscope executive. So when I was reading about Xania, it was actually unclear to me what about her was AI and what wasn't. Like, was it the beats? Was it her voice? Was it the lyrics? So I actually called up Xania's manager, Romel Murphy, and he told me that this whole project is actually the brainchild of his lifelong friend, a poet, a thirty-one-year-old named Telisha "Nikki" Jones. Here's Romel.
Speaker 6: So I would say Xania is an extension of Telisha Jones. That's why it feels so real, because she put her raw emotions and real-life experiences into the songs. AI is just a tool to enhance and give a different experience. Not a better, not a worse, a different experience. And we're going to be completely transparent, as we've been from the jump: she's not the vocal beast that Xania is.
Speaker 2: Romel and Telisha don't see Telisha's lack of musical training as any issue. She's basically using an R&B singer-songwriter avatar to elevate the real-life stories and emotions she's putting into her lyrics. And Telisha the poet has an undeniably larger reach as Xania the R&B singer. So what was interesting about what Romel said is that this is art. Telisha's music is real music, even if Telisha herself isn't musically talented in her own right.
So Telisha has been using an AI music generator called Suno to make these songs, because, as Romel put it, she is not a singer and she's not a musician; she's just a poet. And actually, Telisha appeared on CBS Mornings and showed co-host Gayle King how she uses Suno to turn her poems into music.
Speaker 3: And I just take those lyrics, just go in and paste a simple style: slow-tempo R&B.
Speaker 1: So you're saying, "I want slow-tempo R&B."
Speaker 5: She adds a few more prompts into AI music generator Suno, like female soulful vocals, light guitar, and heavy drums, and here's what it created.
Speaker 2: I watched some videos of her on Facebook.
Speaker 1: Telisha?
Speaker 2: Yeah, Telisha. She teaches how to leverage AI tools into financial rewards, and, I don't know, I feel like she's kind of taking this software by the scruff of its neck and saying: look, I'm not going to be a superstar, but I can build a superstar. And isn't it better for me to build a superstar who I then don't have to answer for? She's a producer.
Speaker 1: It's an interesting combination, being a good producer and a good marketer, right? Telisha's ridden the hype cycle of AI to become a producer and an artist, and, you know, good on her, I think.
Speaker 2: What people have a problem with is: is this music? So there's a very well-known R&B artist named Kehlani, and she was quoted in a piece in Billboard magazine saying, quote, "nothing and no one on earth will ever be able to justify AI to me." And I do understand this. There are many R&B artists who have worked for years and years, tirelessly, to hone their craft. And Telisha, a poet, is using Suno and pushing out songs pretty easily using digital tools. So Kehlani makes a point, in the sense of: can we really call Xania an artist in the same way we call Kehlani an artist?
Speaker 1: Right. I mean, look, if all music was this, I would be super depressed.
But I think because I'm attracted to Telisha's story, I feel like I give this a pass. But it's kind of an age-old debate, right: is a musical act more the product of the musician or of the producer in today's world?
Speaker 2: That is the question. And actually, Suno has been sued for copyright infringement, and a lot of artists are very anti-Suno in that sense.
Speaker 1: Yes. Basically stealing people's voices and musical ideas and melodies to make this possible.
Speaker 2: Well, they're training Suno's algorithms with pre-existing material, and a lot of record companies are part of this larger lawsuit, where they're saying: that is not okay. You can't basically facilitate someone's entire musical career on the voices and sounds of, you know, thousands of artists' music.
Speaker 1: After the break: a pitch to build data centers in space, and professional soccer coaches experiment with ChatGPT. Stay with us.
Speaker 1: And we're back. Karah, what if I told you that Google wanted to harness the power of the sun to power data centers?
Speaker 2: I would quote the oft-quotable Mel Robbins and say, "let them."
Speaker 1: Let them. I would say, let them. What if I told you they would also be harnessing the power of the sun for these data centers in space?
Speaker 2: I would say that nothing surprises me anymore. Are they doing this?
Speaker 1: They've just announced plans to send trial equipment into orbit by early twenty twenty-seven. It's called Project Suncatcher.
Speaker 2: Brilliant name.
Speaker 1: Basically, the idea is that tightly packed constellations of solar-powered satellites orbiting four hundred miles above Earth's surface, with AI processors, could help meet rising demand for AI without putting additional strain on our energy resources here on Earth.
Speaker 2: And this is a good idea why?
Speaker 1: The article in the Guardian that I read talked about how the bet Google is making here is that it will be so cheap to send stuff into space by twenty twenty-seven that it will actually be cheaper to run data centers in space than on Earth. Google researchers predict that the cost of sending stuff to space could drop to two hundred dollars per kilogram within this decade, which is a multiple lower than it is today.
Speaker 2: So we'd basically be sending these to space because it saves money?
Speaker 1: It's cheaper than using Earth energy. It's better for the environment, because you don't need fossil fuels, and you don't need water to cool them down, because it's cold up there.
Speaker 2: Because it's cold up there.
Speaker 1: There are, of course, some downsides as well. Launching rockets emits loads and loads of CO2, so you have to believe that the price in CO2 of the rocket launches will be less than the price in CO2 of the fossil fuels burnt to power data centers on Earth. One astronomer was quoted saying that so many satellites in low orbit will be, quote, "like bugs on a windshield" when trying to observe outer space.
Speaker 2: Really.
Speaker 1: And, you know, we talked a couple of weeks ago about all this space junk raining down from the sky; this would send loads more stuff up. And you also have to step back and ask the question: do we really, really need so much AI processing power that we have to send a bunch of stuff into space?
Speaker 2: Shouldn't we just do less?
Speaker 1: Maybe. But that's unlikely. Elon Musk is also working on data centers in space, and Nvidia AI chips are shortly to be launched into space this month, in partnership with a startup called Starcloud, to see if they actually work in space. I mean, this is insane.
Speaker 2: They're sending chips?
They're sending Nvidia chips to space to see if Nvidia chips will work in space?
Speaker 1: As far as I understand, yes. I do want to see what happens.
Speaker 2: I'm not interested in finding out.
Speaker 1: Don't Look Up, it is.
Speaker 2: Don't look up. I have a story that is much more down on Earth. You know, I'm obsessed with school phone bans. And I don't have a child, but I think that no child should use a phone in school, ever.
Speaker 1: Really? Ever?
Speaker 2: Ever, until they're eighteen. Kentucky had a school phone ban, and listen to what happened: more people are taking out books from the library. The Jefferson County Public School District in Kentucky reported that kids are checking out more books from the library this year than last year. One school in the district saw sixty-seven percent more checkouts year over year. And it's only, what is it, November? They're only a few months into the school year. It has made a huge difference, not just for library usage: administrators are hopeful that this trend will do wonders for reading scores, and it's already improved at least one student's mental health. That student told a local outlet, quote: "At first, I was really dramatic about the policy. I thought it was going to end my whole life. I just realized this is a good chance for me to put my phone down and start focusing back on school. It helps people socialize, because this year I have talked to people more than in all the twelve years of me being in school."
Speaker 4: Wow.
Speaker 1: You know what is funny? You have a book club.
Speaker 2: Yes.
Speaker 1: You're a social media power user.
Speaker 2: That's true.
Speaker 1: If I'm on vacation, I can read quite easily and happily. If I'm at home and I want to read in the evening, I have to shut my phone in the other room.
Speaker 2: You have to.
Speaker 1: If your phone is anywhere near you, that book doesn't stand a chance.
Speaker 2: And we're in our thirties. We're talking about kids who are twelve years old, fifteen years old, who have been in school with phones. It goes back to the thing: if you give someone a phone and you give someone a book, which one was designed using the mechanisms that they use to create slot machines? Not a book.
Speaker 1: Not a book, not a book. I have something which is the definition of a not-boring story: the Louvre heist.
Speaker 2: I have loved following this. It's the best.
Speaker 1: So the Crown Jewels of France were essentially purloined recently.
Speaker 2: In the age of us sending data centers to space, there are still just mom-and-pop thieves.
Speaker 1: Well, that's the thing. The original thought was: oh my god, this is the most sophisticated criminal gang in the world. It turned out to be, according to the French prosecutor, a bunch of local Joe Schmoes, which meant that they left their DNA all over the place. This is why it's a Tech Stuff story. So essentially, in France there is a database, like there is in the US, of the DNA of people who have been arrested and convicted of crimes. And they were able to get DNA from various places, including the Empress Eugénie's crown, which was dropped on the way out, and cross-reference it with the database of people who had committed a crime in France, and they got four hits almost immediately.
Speaker 2: So easy for them.
Speaker 1: Yeah. What's really interesting here is that the reason the DNA samples could catch the thieves, allegedly, was because these people had already committed crimes and been booked into the system. There are four point four million people in France who are in this DNA database that they can cross-reference. That's the same as how it works in the US with something called CODIS. You know about CODIS?
Speaker 2: I do.
Speaker 1: The Combined DNA Index System. How do you know about it?
Speaker 2: Because I watch CSI, right? I watch a lot of crime shows.
Speaker 1: So CODIS has a lot more hits in it, I think an order of magnitude more. And we at Kaleidoscope actually have another podcast called America's Crime Lab, which I'm going to plug hard here. It is about a startup called Othram, who use forensic DNA techniques, so, in other words, they can extrapolate from DNA databases who you may be, based on your relatives and stuff, even if you haven't already been put in the database. And so they were actually very heavily involved in Golden State... well, not Golden State actually, but in the Idaho case. Though Golden State was exactly one of these cases as well. It's controversial. In France, consumer DNA businesses are actually banned. We were talking at the beginning of the episode about Germany's history and the armaments industry; in France, the reason consumer genetic companies are banned is because of the Holocaust. You know, inferring people's genealogy and Jewishness was a way that many, many people were sent to their deaths in France. So it's always two-sided, right? I mean, you want to use the DNA technology to, you know, help figure out who took the jewels.
Speaker 2: But not who is the Jew. I can say it. I can say it.
Speaker 4: Now, for Chat and Me.
Speaker 1: I like reading about soccer, as you might call it, football, as I call it, on the Guardian, pretty compulsively. I don't know why, but I go to guardian.co.uk slash football probably forty times a day. I stumbled upon an article with a surprising revelation about how professional soccer might be getting help from gen AI.
Speaker 2: I saw this too. Yes. The Seattle Reign head coach Laura Harvey confessed on the Soccerish podcast to using ChatGPT to help coach her team. It was in the off-season.
Speaker 3: I was, like, writing things into ChatGPT of, like, what is Seattle Reign's identity? And it would, like, spit it out, and I was like, no, I don't know if that's right or not. And then I would put in, like, what do you need to do to be successful in the NWSL?
Speaker 1: Really broad questions. So far, so generic.
Speaker 2: Yeah, totally. I mean, it's what many people do with chatbots. They start small, but then pretty soon you realize you're asking a chatbot whether or not you should stay in your relationship, even though your partner didn't show enough enthusiasm when you started talking about a fake bird you saw.
Speaker 1: So what happened next with the Seattle Reign?
Speaker 3: And then I put in, what formation should you play to beat NWSL teams? And it spit out every team in the league and what formation you should play. And for two teams, I'm not going to say who they are, they'll know the two teams, it went: you should play a back five. So I did, you know?
Speaker 1: It's interesting, because if all the soccer teams playing against each other are using AI tactics, it's like...
Speaker 2: It's like saying all coaches have the same knowledge base.
Speaker 1: But on acid, exactly. Except Laura Harvey and the rest of the coaching staff, like Telisha, did a deep dive into whether or not the defensive tactic recommended by ChatGPT would actually work. So again: good to get the output, and then good to use it for your purposes, smartly.
Speaker 2: So was this a winning strategy?
Speaker 1: Well, Laura didn't say which teams they played against using ChatGPT's tactics, but Seattle Reign have played with a back five at multiple points this season, and it seems to be paying off. In twenty twenty-four, they ranked second from bottom in the league, and this year they're number five and heading into the twenty twenty-five quarterfinals.
Speaker 2: A win for ChatGPT. I'm going to be looking out for that back five. I'm also going to be looking out for submissions for Chat and Me: we want to hear how you're using chatbots to win championships. Email us at techstuffpodcast@gmail.com. That's it for this week for Tech Stuff. I'm Karah Preiss.
Speaker 1: And I'm Oz Woloshyn. This episode was produced by Eliza Dennis, Tyler Hill, and Melissa Slaughter. It was executive produced by me, Karah Preiss, Julian Nutter, and Kate Osborne for Kaleidoscope, and Katrina Norvell for iHeart Podcasts. The engineer is Beheth Fraser, and Jack Insley mixed this episode. Kyle Murdoch wrote our theme.
Speaker 2: Join us next Wednesday for a deep dive into Nvidia, the first publicly traded company to top five trillion dollars in market value.
Speaker 1: And please do rate, review, and reach out to us at techstuffpodcast@gmail.com. We love hearing from you.