1 00:00:07,960 --> 00:00:16,000 Speaker 1: Glasco from Kaleidoscope and iHeart Podcasts. 2 00:00:16,079 --> 00:00:18,920 Speaker 2: This is Tech Stuff. I'm Oz Voloshin and I'm Kara Price. 3 00:00:19,120 --> 00:00:22,960 Speaker 1: Today we've got two big stories to break down for you. First, 4 00:00:23,200 --> 00:00:26,920 Speaker 1: content creators, brands, and Gen Z alike are all turning 5 00:00:27,040 --> 00:00:33,240 Speaker 1: towards the latest luxury: unplugging. Then, tech companies blasting particles 6 00:00:33,320 --> 00:00:36,720 Speaker 1: into the atmosphere to dim the sun in response to 7 00:00:36,760 --> 00:00:37,680 Speaker 1: climate change. 8 00:00:37,920 --> 00:00:39,800 Speaker 3: Then we'll tell you about a few other stories that 9 00:00:39,880 --> 00:00:43,240 Speaker 3: caught our eye this week, like how Polymarket bets led 10 00:00:43,240 --> 00:00:47,640 Speaker 3: to disinformation about the Ukraine-Russia war, a human washing 11 00:00:47,720 --> 00:00:51,840 Speaker 3: machine that promises to wash both your body and your soul, 12 00:00:52,560 --> 00:00:54,480 Speaker 3: and then we'll dive into how twenty three and Me 13 00:00:54,520 --> 00:00:57,920 Speaker 3: is giving some users a piece of their newfound families' inheritances. 14 00:00:58,360 --> 00:01:02,280 Speaker 1: Then on Chat and Me, ChatGPT celebrates its third birthday 15 00:01:02,800 --> 00:01:06,560 Speaker 1: while Sam Altman declares code red at OpenAI, with 16 00:01:06,640 --> 00:01:10,119 Speaker 1: Google's Gemini making rapid progress. Joining us to talk about 17 00:01:10,120 --> 00:01:13,319 Speaker 1: this is Megan Morrone from Axios, who's just published a 18 00:01:13,360 --> 00:01:16,640 Speaker 1: piece with the headline The Three Things Keeping Sam Altman 19 00:01:16,920 --> 00:01:19,520 Speaker 1: Up at Night. All of that on The Week in Tech. 20 00:01:19,920 --> 00:01:23,680 Speaker 1: It's Friday, December fifth. Hello, 21 00:01:23,480 --> 00:01:30,200 Speaker 4: Cara. Hello, Oz. I was reflecting this week about this 22 00:01:30,319 --> 00:01:33,200 Speaker 4: thing that we used to say when I think we 23 00:01:33,240 --> 00:01:35,040 Speaker 4: first did podcasting, which is that you and I had 24 00:01:35,040 --> 00:01:35,840 Speaker 4: a face for radio. 25 00:01:36,280 --> 00:01:39,160 Speaker 1: Yes. I mean, funnily enough, I've made that joke for 26 00:01:39,200 --> 00:01:41,600 Speaker 1: all these years. But now podcasting is pivoting to video, 27 00:01:41,800 --> 00:01:42,240 Speaker 1: like, what are we? 28 00:01:42,959 --> 00:01:46,480 Speaker 3: Nobody is immune? Nobody is immune anymore. And there was 29 00:01:46,520 --> 00:01:50,360 Speaker 3: an article that I recently read in Business Insider called 30 00:01:50,480 --> 00:01:52,960 Speaker 3: Being Hot Is a New Job Requirement. 31 00:01:55,280 --> 00:01:56,760 Speaker 2: Dare I ask why this caught your eye? 32 00:01:57,040 --> 00:01:59,560 Speaker 3: You know, it's funny. My mother always used to say 33 00:01:59,560 --> 00:02:03,000 Speaker 3: I'm in an image based business, because she's in public relations, right, 34 00:02:03,360 --> 00:02:07,000 Speaker 3: and so who you see is what you get, and 35 00:02:07,040 --> 00:02:09,440 Speaker 3: it's very important. And I kind of always used to 36 00:02:09,560 --> 00:02:11,960 Speaker 3: roll my eyes at that idea.
And then this article 37 00:02:12,680 --> 00:02:15,400 Speaker 3: kind of confirmed everything she's been saying to me for 38 00:02:15,440 --> 00:02:17,880 Speaker 3: thirty years, which is, and I'm thirty six, but since 39 00:02:17,880 --> 00:02:23,040 Speaker 3: I was six, yeah, exactly. Yeah, she always was kind 40 00:02:23,040 --> 00:02:27,600 Speaker 3: of like, people who are good looking are more successful. 41 00:02:27,639 --> 00:02:29,720 Speaker 1: Well, I mean, that's kind of a story as old 42 00:02:29,800 --> 00:02:32,000 Speaker 1: as time, right? So I'm curious where the "new" in 43 00:02:32,080 --> 00:02:33,720 Speaker 1: the new job requirement comes from. 44 00:02:34,000 --> 00:02:37,840 Speaker 3: So tech is both driving the issue and it's solving 45 00:02:37,880 --> 00:02:41,560 Speaker 3: the issue. Meaning that in some ways we have become 46 00:02:41,600 --> 00:02:44,000 Speaker 3: more and more critical of how we look. Like, post pandemic, 47 00:02:44,040 --> 00:02:46,600 Speaker 3: we talk about how Zoom meetings are forcing people to 48 00:02:46,639 --> 00:02:48,880 Speaker 3: stare at their faces more and more. You and I, 49 00:02:48,880 --> 00:02:50,680 Speaker 3: I mean, I just looked at myself in the camera 50 00:02:50,760 --> 00:02:52,959 Speaker 3: right now, and I'm like, Kara, you could. 51 00:02:52,720 --> 00:02:56,240 Speaker 2: Have especially with these lights, with these lights. 52 00:02:56,240 --> 00:02:58,799 Speaker 3: But you know, it's like in every field, I think 53 00:02:58,840 --> 00:03:03,360 Speaker 3: there is a demand for people to be looking at themselves. 54 00:03:02,840 --> 00:03:03,320 Speaker 5: More and more. 55 00:03:03,440 --> 00:03:05,480 Speaker 1: It's a kind of double headed trend, right, because on 56 00:03:05,520 --> 00:03:07,760 Speaker 1: the one hand, the image of our 57 00:03:07,880 --> 00:03:11,760 Speaker 1: face talking into the ether has never been more important 58 00:03:12,320 --> 00:03:14,120 Speaker 1: nor easier to fabricate. 59 00:03:13,760 --> 00:03:16,120 Speaker 3: That's right. And this is the way that tech is facilitating, 60 00:03:16,240 --> 00:03:18,320 Speaker 3: not the problem, but this idea that you kind of 61 00:03:18,360 --> 00:03:21,040 Speaker 3: need to be hot in order to work in the 62 00:03:21,160 --> 00:03:24,760 Speaker 3: job market. Very normal people who do not know how 63 00:03:24,760 --> 00:03:28,240 Speaker 3: to take headshots are using very simple AI tools to 64 00:03:28,600 --> 00:03:33,079 Speaker 3: say I'm not just Emily, I'm yassified Emily, and I'm 65 00:03:33,080 --> 00:03:35,160 Speaker 3: going to take this picture of me and my dog 66 00:03:35,280 --> 00:03:37,120 Speaker 3: and I'm going to turn it into my headshot. 67 00:03:37,320 --> 00:03:40,800 Speaker 3: You know everyone talks about Instagram face. Yeah, Instagram face 68 00:03:40,920 --> 00:03:47,280 Speaker 3: is like lip augmentation, Botox, GLP-1s being on the rise, 69 00:03:47,320 --> 00:03:50,280 Speaker 3: plastic surgery, especially amongst men, which we talked about 70 00:03:50,320 --> 00:03:54,160 Speaker 3: on the show. I think there's now something that we 71 00:03:54,200 --> 00:03:58,360 Speaker 3: can call LinkedIn face, where we're actually changing the way 72 00:03:58,400 --> 00:03:59,800 Speaker 3: we look for professional reasons.
73 00:04:00,000 --> 00:04:02,400 Speaker 1: I heard that some people are actually getting plastic surgery, 74 00:04:02,520 --> 00:04:04,839 Speaker 1: men in particular, to look more alert. 75 00:04:05,600 --> 00:04:09,480 Speaker 2: Isn't that bizarre? The eye surgery, like, 76 00:04:09,520 --> 00:04:11,280 Speaker 2: constantly looking like, listen. 77 00:04:11,120 --> 00:04:13,480 Speaker 3: I'm gonna be awake during these forty million hour weeks 78 00:04:13,520 --> 00:04:14,040 Speaker 3: that I'm working. 79 00:04:14,160 --> 00:04:16,440 Speaker 2: Yeah, totally. Zoom face, Zoom face. 80 00:04:17,000 --> 00:04:19,600 Speaker 1: Actually, I think about this particularly in my other sort 81 00:04:19,640 --> 00:04:22,800 Speaker 1: of founder CEO role. I do think, like, I'm losing 82 00:04:22,800 --> 00:04:25,680 Speaker 1: my hair a bit, am I too fat, am I less 83 00:04:25,680 --> 00:04:29,159 Speaker 1: credible because I don't sort of match this particular aesthetic? 84 00:04:29,279 --> 00:04:32,080 Speaker 1: So I think it's probably being amplified by tech, but 85 00:04:32,120 --> 00:04:33,360 Speaker 1: it is not like a new thing. 86 00:04:33,640 --> 00:04:37,400 Speaker 3: I think the takeaway for me is that as the 87 00:04:37,520 --> 00:04:41,800 Speaker 3: job market gets tighter and tighter, looksmaxxing in general 88 00:04:42,279 --> 00:04:44,920 Speaker 3: is something that is playing into the way people get jobs. 89 00:04:45,360 --> 00:04:47,159 Speaker 3: And I think that when you see the sort of 90 00:04:47,720 --> 00:04:52,479 Speaker 3: most successful billionaires looksmaxxing, if you're someone who even 91 00:04:52,520 --> 00:04:53,360 Speaker 3: aspires to be. 92 00:04:53,400 --> 00:04:54,719 Speaker 2: A millionaire. Bezos. 93 00:04:54,800 --> 00:04:57,080 Speaker 3: You're like, all right, I better get my... 94 00:04:57,240 --> 00:04:59,480 Speaker 3: I think it's actually really interesting. I do think that 95 00:04:59,520 --> 00:05:03,960 Speaker 3: our chronic onlineness has led us down this path of 96 00:05:04,040 --> 00:05:08,400 Speaker 3: temptation to sort of change our face and what we 97 00:05:08,440 --> 00:05:10,880 Speaker 3: look like and how we interact digitally. And I think 98 00:05:10,920 --> 00:05:15,599 Speaker 3: that there is now a pushback, and it ties into 99 00:05:15,680 --> 00:05:17,359 Speaker 3: the story that I want to tell you, which is 100 00:05:17,400 --> 00:05:21,680 Speaker 3: about how unplugging became luxury's most valuable currency. Huh, I 101 00:05:21,720 --> 00:05:22,960 Speaker 3: actually said this to you, I think, when we were 102 00:05:23,000 --> 00:05:28,880 Speaker 3: reporting on Sleepwalkers, that to me, exclusivity has always been 103 00:05:28,880 --> 00:05:29,400 Speaker 3: a luxury. 104 00:05:29,520 --> 00:05:31,760 Speaker 2: Scarcity is luxury. 105 00:05:32,279 --> 00:05:37,200 Speaker 3: What is scarce in the digital age? Being offline, 106 00:05:37,240 --> 00:05:40,200 Speaker 3: being analog. And so the story that I want to 107 00:05:40,200 --> 00:05:43,200 Speaker 3: tell you comes from Vogue Business, and it's about 108 00:05:43,279 --> 00:05:45,839 Speaker 3: this woman who goes by the name of Cat GPT. 109 00:05:46,800 --> 00:05:49,960 Speaker 3: She's a creator. She recently decided to connect her cell 110 00:05:50,000 --> 00:05:52,839 Speaker 3: phone to an analog phone in an attempt to cut 111 00:05:52,880 --> 00:05:56,440 Speaker 3: down her screen time.
Now the irony is her post 112 00:05:56,480 --> 00:05:59,560 Speaker 3: about her no-phone morning went viral, because of course 113 00:05:59,560 --> 00:06:00,159 Speaker 3: she had to post. 114 00:06:00,440 --> 00:06:03,360 Speaker 1: You can't, you can't, and you can only be offline performatively, 115 00:06:03,760 --> 00:06:07,320 Speaker 1: right? If you're just offline, nobody knows the thing. 116 00:06:07,320 --> 00:06:12,000 Speaker 2: It's like the tree falling in the forest exactly, if 117 00:06:12,000 --> 00:06:12,880 Speaker 2: you, if. 118 00:06:12,720 --> 00:06:15,760 Speaker 1: You're offline without telling people you're offline on your social media, 119 00:06:15,800 --> 00:06:16,440 Speaker 1: are you offline? 120 00:06:16,480 --> 00:06:18,680 Speaker 3: It reminds me of this great Onion headline once that 121 00:06:18,800 --> 00:06:23,599 Speaker 3: was like, woman runs marathon, tells nobody why. But you know, 122 00:06:23,760 --> 00:06:27,520 Speaker 3: actually from that viral piece, she created this company called 123 00:06:27,560 --> 00:06:32,760 Speaker 3: Physical Phones, which sells Bluetooth connected analog phones, and Cat 124 00:06:32,839 --> 00:06:36,680 Speaker 3: GPT told Vogue, quote, people are really turned off by 125 00:06:36,720 --> 00:06:39,960 Speaker 3: technology right now. They're turned off by AI. And by 126 00:06:39,960 --> 00:06:42,680 Speaker 3: the way, we tend to conflate AI with social media 127 00:06:42,920 --> 00:06:43,760 Speaker 3: and our phones. 128 00:06:44,279 --> 00:06:44,920 Speaker 2: I think that's right. 129 00:06:44,960 --> 00:06:46,240 Speaker 1: I mean, I think we've talked about this 130 00:06:46,279 --> 00:06:48,800 Speaker 1: a few times with the new Luddites, and honestly with 131 00:06:49,400 --> 00:06:52,599 Speaker 1: your journey this year in terms of doing less social 132 00:06:52,640 --> 00:06:55,760 Speaker 1: media and being offline. I every morning try and make 133 00:06:55,800 --> 00:06:57,280 Speaker 1: sure I don't look at my phone for the first 134 00:06:57,279 --> 00:07:00,440 Speaker 1: thirteen minutes I'm awake. So I think it's... 135 00:07:00,440 --> 00:07:02,640 Speaker 1: But what's interesting, and this is I guess a Vogue 136 00:07:02,640 --> 00:07:07,400 Speaker 1: Business story, is that this, like, trend and cultural desire 137 00:07:07,880 --> 00:07:08,520 Speaker 1: is being. 138 00:07:10,600 --> 00:07:14,560 Speaker 3: Exactly, exactly. And it's similar to, and I know 139 00:07:14,640 --> 00:07:17,040 Speaker 3: this from running a book club where we were working 140 00:07:17,040 --> 00:07:20,720 Speaker 3: with fashion brands, it's similar to what happened with the 141 00:07:20,800 --> 00:07:23,600 Speaker 3: literary space, which was all of a sudden everyone was like, 142 00:07:24,240 --> 00:07:25,040 Speaker 3: it's time to read. 143 00:07:25,120 --> 00:07:27,880 Speaker 1: I know, I watched hundreds of hours of you flicking 144 00:07:27,880 --> 00:07:31,600 Speaker 1: through books on Instagram. But how do you concentrate on 145 00:07:31,640 --> 00:07:33,840 Speaker 1: the book when you're also making social content about yourself? 146 00:07:33,920 --> 00:07:34,360 Speaker 2: It's interesting. 147 00:07:34,440 --> 00:07:36,800 Speaker 3: Well, that's why people ask if it's performative reading, and to that 148 00:07:36,840 --> 00:07:38,680 Speaker 3: I say, no, it's not. 149 00:07:39,280 --> 00:07:42,320 Speaker 2: There was a rumor that you were a celebrity.
150 00:07:42,200 --> 00:07:43,280 Speaker 3: Book stylist. 151 00:07:43,440 --> 00:07:46,440 Speaker 1: Yes, that you gave celebrities books that they could 152 00:07:46,480 --> 00:07:47,080 Speaker 1: be papped with. 153 00:07:47,240 --> 00:07:48,160 Speaker 2: That's right. True or not? 154 00:07:48,480 --> 00:07:50,280 Speaker 3: I can't say. I still can't. 155 00:07:51,360 --> 00:07:52,120 Speaker 2: I still want to know. 156 00:07:52,480 --> 00:07:55,200 Speaker 3: But going back to why this is a Vogue Business piece. 157 00:07:55,280 --> 00:08:00,040 Speaker 3: You know, this idea that brands are now investing in 158 00:08:00,600 --> 00:08:05,600 Speaker 3: Luddite behavior, like Burberry sponsored an outdoor walk with a 159 00:08:05,640 --> 00:08:08,640 Speaker 3: group of women who were just trying to connect to nature. 160 00:08:08,760 --> 00:08:10,840 Speaker 3: You know what I mean? It's just, there's something so 161 00:08:11,040 --> 00:08:17,120 Speaker 3: interesting to me about the fact that not using technology can be 162 00:08:17,280 --> 00:08:18,680 Speaker 3: something that is in the zeitgeist. 163 00:08:19,040 --> 00:08:22,679 Speaker 1: We have this, I guess, tremendous nostalgia for a lost time. 164 00:08:23,080 --> 00:08:25,520 Speaker 3: We do, and I mean, I was 165 00:08:25,600 --> 00:08:28,040 Speaker 3: kind of laughing. I was looking at the Physical Phones 166 00:08:28,080 --> 00:08:32,080 Speaker 3: website this morning, and I mean, apparently people are buying 167 00:08:32,120 --> 00:08:34,480 Speaker 3: these physical phones. I think the irony that they're connected 168 00:08:34,559 --> 00:08:37,160 Speaker 3: via Bluetooth is not lost on me. But in a 169 00:08:37,200 --> 00:08:42,719 Speaker 3: moment of actually great irony, AI companies are actually up 170 00:08:42,760 --> 00:08:43,560 Speaker 3: on this trend. 171 00:08:43,960 --> 00:08:44,480 Speaker 2: I saw this. 172 00:08:44,480 --> 00:08:46,800 Speaker 1: There was a piece in The Times about how 173 00:08:46,960 --> 00:08:49,800 Speaker 1: all of the marketing campaigns of the major AI companies 174 00:08:50,240 --> 00:08:53,600 Speaker 1: don't include AI. And didn't Anthropic open a pop 175 00:08:53,679 --> 00:08:54,520 Speaker 1: up in New York? 176 00:08:54,679 --> 00:08:59,240 Speaker 3: The Zero Slop Zone. So you would go inside, it 177 00:08:59,280 --> 00:09:01,240 Speaker 3: was this, you know, pop up in the West Village, 178 00:09:01,280 --> 00:09:04,360 Speaker 3: where all pop ups are, and inside you were asked 179 00:09:04,360 --> 00:09:08,440 Speaker 3: to unplug and interact with other humans. So the company 180 00:09:08,480 --> 00:09:12,480 Speaker 3: actually passed out baseball caps, really some really hard hitting 181 00:09:12,520 --> 00:09:16,320 Speaker 3: stuff, that read "thinking." And they were also handing out 182 00:09:16,320 --> 00:09:19,360 Speaker 3: hard copies of a fifteen thousand word essay written by 183 00:09:19,360 --> 00:09:23,240 Speaker 3: Anthropic's CEO Dario Amodei. The essay is called 184 00:09:23,760 --> 00:09:27,360 Speaker 3: Machines of Loving Grace: How AI Could Transform the World 185 00:09:27,400 --> 00:09:27,959 Speaker 3: for the Better. 186 00:09:28,440 --> 00:09:32,920 Speaker 1: I mean, having a super premium printed essay handed out 187 00:09:32,920 --> 00:09:35,079 Speaker 1: by an AI company in a pop up in the 188 00:09:35,080 --> 00:09:35,600 Speaker 1: West Village.
189 00:09:35,600 --> 00:09:38,439 Speaker 2: I mean, there is a kind of mind melting quality 190 00:09:38,520 --> 00:09:38,720 Speaker 2: to this. 191 00:09:39,360 --> 00:09:42,600 Speaker 1: I was thinking, actually, I saw Elon was on X 192 00:09:42,679 --> 00:09:45,160 Speaker 1: this week talking about something that was actually connected, and 193 00:09:45,160 --> 00:09:46,360 Speaker 1: I want to play you a clip, please. 194 00:09:46,720 --> 00:09:49,840 Speaker 6: When digital media is ubiquitous and you can just have 195 00:09:49,920 --> 00:09:53,920 Speaker 6: anything digitally essentially for free or very close to for free, 196 00:09:54,960 --> 00:09:58,760 Speaker 6: then the scarce commodity will be live events. 197 00:09:58,960 --> 00:10:01,000 Speaker 3: There we go, the scarce commodity will be live events. 198 00:10:01,520 --> 00:10:03,360 Speaker 3: Just also, one more thing that I thought was very funny. 199 00:10:03,480 --> 00:10:07,080 Speaker 3: Part of being able to be let into this, you know, 200 00:10:07,240 --> 00:10:09,600 Speaker 3: Zero Slop Zone was that you had to have Claude 201 00:10:09,640 --> 00:10:11,720 Speaker 3: on your phone. Of course, so they weren't like letting 202 00:10:11,760 --> 00:10:14,199 Speaker 3: people in willy nilly to like read a fifteen thousand 203 00:10:14,240 --> 00:10:15,960 Speaker 3: word essay; you had to have Claude on your phone. 204 00:10:16,280 --> 00:10:17,240 Speaker 3: Which is funny. 205 00:10:17,200 --> 00:10:19,480 Speaker 1: It's like having an Amex, which entitles you to buy water 206 00:10:19,720 --> 00:10:20,640 Speaker 1: at full price at the US Open. 207 00:10:20,760 --> 00:10:21,920 Speaker 2: That's exactly right. 208 00:10:22,000 --> 00:10:24,360 Speaker 3: We're going to give you a free little ear radio 209 00:10:24,480 --> 00:10:26,240 Speaker 3: if you spend eight hundred dollars a year on the 210 00:10:26,400 --> 00:10:28,720 Speaker 3: American Express. The other thing that I thought was really 211 00:10:28,720 --> 00:10:31,640 Speaker 3: interesting is that OpenAI actually shot their latest ads 212 00:10:31,679 --> 00:10:35,120 Speaker 3: on thirty five millimeter film and insisted that no AI 213 00:10:35,320 --> 00:10:38,720 Speaker 3: was used in the making of the ad. That's Oppenheimer-esque. 214 00:10:39,160 --> 00:10:41,640 Speaker 7: Living in the end times, it's just like, either be 215 00:10:41,800 --> 00:10:43,720 Speaker 7: what you are or don't be what you are, but 216 00:10:43,840 --> 00:10:46,120 Speaker 7: don't like try to make a gimmick about how you're 217 00:10:46,240 --> 00:10:50,840 Speaker 7: using analog film to make a commercial about the product 218 00:10:50,840 --> 00:10:54,040 Speaker 7: that's probably going to change the world most drastically in 219 00:10:54,080 --> 00:10:55,080 Speaker 7: the next ten years. 220 00:10:55,600 --> 00:10:57,680 Speaker 2: So Cara, I'm going to switch gears a little bit now. 221 00:10:58,240 --> 00:11:00,640 Speaker 1: You brought up hotness earlier. Do you know what the 222 00:11:00,679 --> 00:11:07,280 Speaker 1: hottest thing of all time is? What? The sun? Okay, 223 00:11:07,360 --> 00:11:09,240 Speaker 1: so I want to talk to you about the sun. 224 00:11:10,280 --> 00:11:14,480 Speaker 1: Imagine if the solution to global warming was as simple 225 00:11:14,600 --> 00:11:16,360 Speaker 1: as blocking sunlight. 226 00:11:16,960 --> 00:11:17,880 Speaker 2: This is interesting.
227 00:11:18,000 --> 00:11:20,520 Speaker 1: Yeah. So I actually read a piece in Bloomberg about 228 00:11:20,520 --> 00:11:22,839 Speaker 1: how there's tens of millions of dollars being invested right 229 00:11:22,880 --> 00:11:29,120 Speaker 1: now into an area of technology called solar geoengineering. Okay. Basically, 230 00:11:29,640 --> 00:11:32,160 Speaker 1: what they're trying to do is something called 231 00:11:32,559 --> 00:11:38,160 Speaker 1: stratospheric aerosol injection. Basically, the premise is, if you can 232 00:11:38,520 --> 00:11:43,000 Speaker 1: sort of shoot reflective particles into the sky, they can 233 00:11:43,200 --> 00:11:46,199 Speaker 1: reflect the sunlight back towards the sun. And it actually 234 00:11:46,240 --> 00:11:47,120 Speaker 1: happens naturally. 235 00:11:47,160 --> 00:11:48,720 Speaker 3: So we're fighting the sun with the sun. 236 00:11:49,080 --> 00:11:52,280 Speaker 2: We're sort of turning the sun on itself. Yeah, exactly. 237 00:11:52,440 --> 00:11:56,200 Speaker 1: This actually happens after a volcano erupts. It happens anyway. 238 00:11:56,480 --> 00:11:58,760 Speaker 1: With all of the stuff, the sulfur dioxide, that goes into 239 00:11:58,760 --> 00:12:04,120 Speaker 1: the environment after a volcano, a period of cooling often follows. 240 00:12:04,480 --> 00:12:07,720 Speaker 1: So some people are experimenting with pumping sulfur dioxide into 241 00:12:07,720 --> 00:12:11,360 Speaker 1: the environment, which can cause acid rain, damage to the ozone, 242 00:12:11,360 --> 00:12:12,280 Speaker 1: and asthma attacks. 243 00:12:12,920 --> 00:12:13,199 Speaker 2: Great. 244 00:12:14,440 --> 00:12:17,400 Speaker 1: I mean, there are others, though, who are using other 245 00:12:18,000 --> 00:12:21,560 Speaker 1: theoretically non harmful materials. And there's kind of two tracks 246 00:12:21,559 --> 00:12:24,079 Speaker 1: to this. On the one hand, there's a government track. 247 00:12:24,200 --> 00:12:28,800 Speaker 1: The UK Advanced Research and Invention Agency has invested seventy 248 00:12:28,800 --> 00:12:30,720 Speaker 1: five million dollars researching this. 249 00:12:30,280 --> 00:12:32,120 Speaker 3: Not insignificant. 250 00:12:32,480 --> 00:12:37,280 Speaker 1: But a private company called Stardust, of course, which 251 00:12:37,320 --> 00:12:40,400 Speaker 1: is a sexier name than the Advanced Research and Invention Agency, 252 00:12:40,880 --> 00:12:44,400 Speaker 1: has raised sixty million dollars, and they are looking to 253 00:12:44,440 --> 00:12:48,480 Speaker 1: patent a chemical that has fewer drawbacks than sulfur dioxide. 254 00:12:49,200 --> 00:12:53,040 Speaker 1: There's also a company called Make Sunsets, which 255 00:12:53,080 --> 00:12:57,000 Speaker 1: is selling cooling credits for one dollar a pop to anyone. 256 00:12:57,320 --> 00:12:59,079 Speaker 1: You can buy one. You remember when you were a 257 00:12:59,120 --> 00:13:00,600 Speaker 1: kid that you could buy an acre on the moon? 258 00:13:00,720 --> 00:13:02,720 Speaker 1: Of course, of course. This is the acre on the 259 00:13:02,720 --> 00:13:04,319 Speaker 1: Moon of the. 260 00:13:04,240 --> 00:13:06,239 Speaker 3: Green Bay Packers. Yeah, it's so interesting. 261 00:13:06,800 --> 00:13:09,560 Speaker 1: So for one dollar you can have the good feeling 262 00:13:09,800 --> 00:13:14,080 Speaker 1: of paying for a sulfur dioxide balloon that explodes into 263 00:13:14,080 --> 00:13:14,560 Speaker 1: the environment.
264 00:13:14,880 --> 00:13:17,120 Speaker 3: I should have bought one. I should have bought one. 265 00:13:17,280 --> 00:13:20,240 Speaker 1: The EPA has suggested it's maybe not the best idea, 266 00:13:20,760 --> 00:13:24,160 Speaker 1: and the company has two founders, neither of whom is a scientist: 267 00:13:24,160 --> 00:13:27,160 Speaker 1: one's a technologist and one's a marketer. They claim on 268 00:13:27,160 --> 00:13:30,400 Speaker 1: their website that one gram of particles in the stratosphere 269 00:13:30,840 --> 00:13:34,360 Speaker 1: prevents the warming caused by a ton of carbon dioxide, 270 00:13:34,840 --> 00:13:36,760 Speaker 1: which is about as credible as when you are offered 271 00:13:36,760 --> 00:13:39,280 Speaker 1: the opportunity to spend an extra dollar on your flight to offset your. 272 00:13:41,160 --> 00:13:43,920 Speaker 3: Carbon, exactly. I just think it's, I mean, it's an 273 00:13:43,920 --> 00:13:47,360 Speaker 3: interesting tech story, because from what you've read or what 274 00:13:47,400 --> 00:13:51,280 Speaker 3: you've learned in reporting this, like, has any significant 275 00:13:51,360 --> 00:13:52,480 Speaker 3: change been made? 276 00:13:53,000 --> 00:13:53,480 Speaker 2: I don't. 277 00:13:53,520 --> 00:13:56,600 Speaker 1: I don't think so yet. But Elon actually, and I'm 278 00:13:56,640 --> 00:13:59,280 Speaker 1: sorry to bring it up again, has talked about sending 279 00:13:59,320 --> 00:14:02,640 Speaker 1: a fleet of solar powered satellites into the atmosphere in 280 00:14:02,720 --> 00:14:05,360 Speaker 1: order to reflect sunlight. So this is like a kind 281 00:14:05,360 --> 00:14:08,320 Speaker 1: of tech bro thing. I mean, there's a kind of 282 00:14:08,800 --> 00:14:11,520 Speaker 1: Icarus-esque quality to this story. 283 00:14:11,720 --> 00:14:13,800 Speaker 3: Don't you think it has something to do with controlling 284 00:14:13,880 --> 00:14:15,360 Speaker 3: such an unruly element? 285 00:14:15,440 --> 00:14:19,640 Speaker 1: Can you imagine controlling the sun? There's nothing more potent. 286 00:14:19,200 --> 00:14:21,880 Speaker 3: If you can't take over the Earth, take over the sun. 287 00:14:21,920 --> 00:14:23,600 Speaker 1: But it has that little bit of a quality of, like, oh, 288 00:14:23,880 --> 00:14:26,320 Speaker 1: you know, there's global warming? Let's think about colonizing Mars. 289 00:14:26,720 --> 00:14:27,240 Speaker 1: Global warming? 290 00:14:27,320 --> 00:14:29,800 Speaker 2: Let's pump random chemicals in and. 291 00:14:29,760 --> 00:14:32,000 Speaker 3: See what happens. It just seems like a fool's errand 292 00:14:32,040 --> 00:14:32,320 Speaker 3: to me. 293 00:14:32,560 --> 00:14:37,200 Speaker 1: Yeah. Well, interestingly, there's kind of an analogous technology 294 00:14:37,400 --> 00:14:39,600 Speaker 1: which is being used, and has been used for 295 00:14:39,840 --> 00:14:42,000 Speaker 1: quite a long time, called cloud seeding, which I've heard of. 296 00:14:42,240 --> 00:14:44,640 Speaker 1: And cloud seeding has also been in the headlines recently 297 00:14:44,680 --> 00:14:49,720 Speaker 1: because there was a huge cataclysmic flooding storm in Dubai 298 00:14:50,080 --> 00:14:53,800 Speaker 1: last year and some people think it's because of cloud seeding. 299 00:14:54,400 --> 00:14:55,960 Speaker 3: And what is cloud seeding exactly?
300 00:14:56,320 --> 00:15:00,480 Speaker 1: Cloud seeding is basically, if there are clouds in the sky, yeah, 301 00:15:00,720 --> 00:15:04,520 Speaker 1: you can get them to rain by giving them particulates 302 00:15:04,920 --> 00:15:05,920 Speaker 1: to coalesce around. 303 00:15:06,080 --> 00:15:08,720 Speaker 3: It's like oxytocin. If you take oxytocin, sometimes it helps 304 00:15:08,720 --> 00:15:09,920 Speaker 3: you cry. Is that right? 305 00:15:10,040 --> 00:15:10,240 Speaker 5: Yeah? 306 00:15:10,320 --> 00:15:13,680 Speaker 3: Or you know, actors very famously use menthol blowers to 307 00:15:13,720 --> 00:15:17,520 Speaker 3: make themselves cry, because it like blows mint essentially into 308 00:15:17,560 --> 00:15:19,160 Speaker 3: your eyes and it makes you tear up. 309 00:15:19,880 --> 00:15:22,960 Speaker 1: That's a great analogy. This is a quote from the 310 00:15:23,000 --> 00:15:26,640 Speaker 1: Bloomberg piece about cloud seeding: there are natural seeding agents, 311 00:15:27,080 --> 00:15:30,640 Speaker 1: dust that's blown into the troposphere, or the miasmic stench 312 00:15:30,720 --> 00:15:34,440 Speaker 1: of ammonia gas wafting up from penguin poo in Antarctic 313 00:15:34,520 --> 00:15:39,600 Speaker 1: colonies, above which researchers have observed extra cloud cover. There 314 00:15:39,600 --> 00:15:44,000 Speaker 1: are unnatural agents too, such as stratocumulus trails above shipping 315 00:15:44,040 --> 00:15:48,120 Speaker 1: lanes caused by engine pollution. Plain old salt seems to 316 00:15:48,200 --> 00:15:52,800 Speaker 1: work; in early experiments, pilots tossed it out of plane windows. 317 00:15:53,400 --> 00:15:55,600 Speaker 3: I would love to see pilots, like, throwing salt. 318 00:15:55,680 --> 00:15:59,480 Speaker 1: Over their shoulders. So now the innovation is 319 00:15:59,520 --> 00:16:03,160 Speaker 1: to coat salt in titanium, which I'm not quite sure why. 320 00:16:03,360 --> 00:16:05,480 Speaker 1: But in Dubai right now, there's a fleet of planes 321 00:16:05,480 --> 00:16:07,880 Speaker 1: that, whenever a cloud, you can't create a cloud, but 322 00:16:07,880 --> 00:16:09,560 Speaker 1: as soon as a cloud comes into the sky, they're 323 00:16:09,560 --> 00:16:15,720 Speaker 1: basically, like, showering as much titanium coated salt into 324 00:16:15,720 --> 00:16:16,760 Speaker 1: the cloud as possible. 325 00:16:17,080 --> 00:16:19,200 Speaker 2: That's crazy. It's pretty bizarre. 326 00:16:19,240 --> 00:16:23,160 Speaker 1: And so the original verdict was that cloud seeding had 327 00:16:23,160 --> 00:16:25,520 Speaker 1: nothing to do with this cataclysmic flooding in Dubai. It 328 00:16:25,520 --> 00:16:28,600 Speaker 1: was a conspiracy theory, blah blah blah. However, this Bloomberg 329 00:16:28,640 --> 00:16:30,920 Speaker 1: piece that came out this month says that actually there may 330 00:16:30,960 --> 00:16:32,920 Speaker 1: have been a substantial amount of cloud 331 00:16:32,720 --> 00:16:35,880 Speaker 2: seeding in the days around this huge flood. So interesting. 332 00:16:36,160 --> 00:16:37,240 Speaker 2: Be careful what you wish for. 333 00:16:37,760 --> 00:16:39,560 Speaker 1: The thing this makes me think about here is, like, 334 00:16:39,640 --> 00:16:44,520 Speaker 1: we are now acting as gods, including in human gene editing.
335 00:16:44,640 --> 00:16:47,400 Speaker 1: Like, there's that crazy scientist in China who edited 336 00:16:48,120 --> 00:16:52,360 Speaker 1: heritable human DNA, so that's irreversible over generations if 337 00:16:52,360 --> 00:16:56,160 Speaker 1: that person procreates. Similarly, like, you know, the Bloomberg piece 338 00:16:56,160 --> 00:16:58,400 Speaker 1: made a good point that the average lifespan of a 339 00:16:59,320 --> 00:17:02,160 Speaker 1: major publicly listed company is now 340 00:17:02,280 --> 00:17:06,560 Speaker 1: fifteen years. So if your horizons are the next financial quarter, 341 00:17:07,119 --> 00:17:08,679 Speaker 1: you may think, you know, have at it. 342 00:17:08,920 --> 00:17:09,800 Speaker 3: Why not? Let's try. 343 00:17:09,960 --> 00:17:13,920 Speaker 1: But if your horizons are, let's say, even your children, 344 00:17:14,119 --> 00:17:17,399 Speaker 1: right, or your children's children, I have a feeling that 345 00:17:17,440 --> 00:17:20,720 Speaker 1: we may look back on this time of cloud seeding 346 00:17:20,960 --> 00:17:27,040 Speaker 1: and Stardust and germline editing, and think, how 347 00:17:27,119 --> 00:17:27,520 Speaker 1: was it... 348 00:17:28,920 --> 00:17:29,720 Speaker 2: That's all we could do? 349 00:17:30,320 --> 00:17:32,960 Speaker 3: That was all we could do. It does beg the 350 00:17:33,040 --> 00:17:35,160 Speaker 3: question, like, how effective are these things? 351 00:17:35,440 --> 00:17:35,840 Speaker 2: Unknown. 352 00:17:35,880 --> 00:17:37,960 Speaker 1: And the Stardust people make the point we're already 353 00:17:38,160 --> 00:17:41,359 Speaker 1: geoengineering, because we're pumping so much carbon dioxide and 354 00:17:41,400 --> 00:17:44,880 Speaker 1: other crap into the atmosphere that it would be unreasonable 355 00:17:44,960 --> 00:17:47,639 Speaker 1: not to use technology to try and counteract the negative 356 00:17:47,640 --> 00:17:51,040 Speaker 1: consequences we're already generating through technology. 357 00:17:51,280 --> 00:17:53,199 Speaker 3: I just think it's really interesting that these are private 358 00:17:53,240 --> 00:17:57,199 Speaker 3: companies that are trying to do this, because obviously we 359 00:17:57,280 --> 00:17:59,320 Speaker 3: know that anything the government tries to get done is 360 00:17:59,320 --> 00:18:02,639 Speaker 3: going to take forever, and private companies are, like, taking it 361 00:18:02,760 --> 00:18:08,600 Speaker 3: upon themselves. It's a sort of oligarchical mentality of, 362 00:18:08,720 --> 00:18:12,280 Speaker 3: like, we're going to take responsibility for how it rains. 363 00:18:12,320 --> 00:18:12,879 Speaker 3: I mean, these are... 364 00:18:12,920 --> 00:18:16,440 Speaker 1: I don't know. Imagine if the Manhattan Project had been private, like, 365 00:18:16,960 --> 00:18:21,439 Speaker 1: tons of nuclear explosions. 366 00:18:19,680 --> 00:18:22,040 Speaker 3: There'd be no US. It's very true, it's very true. 367 00:18:22,040 --> 00:18:24,600 Speaker 3: But now this seems to be so popular. You know, 368 00:18:24,680 --> 00:18:28,119 Speaker 3: private companies move a lot faster than governments, and the 369 00:18:28,160 --> 00:18:31,000 Speaker 3: speed at which climate change is occurring means that there's 370 00:18:31,040 --> 00:18:36,400 Speaker 3: not much time for governments to catch up. 371 00:18:37,800 --> 00:18:41,280 Speaker 1: After the break, Polymarket continues to bet.
372 00:18:41,240 --> 00:18:42,960 Speaker 2: on the Ukraine-Russia war. 373 00:18:43,680 --> 00:18:48,639 Speaker 1: Japan's latest invention claims to wash your soul, and twenty 374 00:18:48,680 --> 00:18:52,000 Speaker 1: three and Me users are suing their newfound relatives for 375 00:18:52,040 --> 00:18:53,439 Speaker 1: a slice of the inheritance. 376 00:18:54,000 --> 00:18:57,000 Speaker 3: Then on Chat and Me: happy third birthday, ChatGPT, 377 00:18:57,240 --> 00:19:12,760 Speaker 3: watch out, Google might outpace you, and. 378 00:19:12,800 --> 00:19:13,520 Speaker 2: We're back, Cara. 379 00:19:13,880 --> 00:19:17,479 Speaker 1: Remember Polymarket, the online gambling site where you can 380 00:19:17,520 --> 00:19:18,959 Speaker 1: basically bet on anything? 381 00:19:19,000 --> 00:19:20,480 Speaker 3: It's my favorite thing in the world. It's how we 382 00:19:20,520 --> 00:19:21,640 Speaker 3: knew Mamdani was going to win. 383 00:19:21,960 --> 00:19:25,640 Speaker 1: They had basically called the election based on betting volume 384 00:19:25,440 --> 00:19:26,679 Speaker 2: before the election happened. 385 00:19:27,080 --> 00:19:28,600 Speaker 1: We talked about it last time on the show, in 386 00:19:28,640 --> 00:19:31,640 Speaker 1: relation to the betting volume on whether or not President 387 00:19:31,640 --> 00:19:35,159 Speaker 1: Zelensky would wear a suit. Unfortunately, and distastefully, 388 00:19:35,720 --> 00:19:39,080 Speaker 1: people seem unable to get enough of 389 00:19:39,080 --> 00:19:44,040 Speaker 1: betting on Ukraine-Russia war stuff. One such bet is 390 00:19:44,080 --> 00:19:49,280 Speaker 1: around whether Russia will capture a town called Myrnohrad by 391 00:19:49,680 --> 00:19:53,680 Speaker 1: November fifteenth. It would be called Mirnograd in Russian. 392 00:19:53,359 --> 00:19:56,560 Speaker 2: But in Ukrainian, Myrnohrad. I know because I have a Ukrainian grandfather. 393 00:19:56,720 --> 00:19:57,159 Speaker 3: Yes, you do. 394 00:19:57,400 --> 00:19:59,600 Speaker 1: The battle around this city has dragged on for weeks, 395 00:20:00,080 --> 00:20:02,679 Speaker 1: and it's actually generated more than a million dollars in 396 00:20:02,760 --> 00:20:05,720 Speaker 1: trading volume on Polymarket about whether or not Russia will 397 00:20:05,760 --> 00:20:09,960 Speaker 1: capture this town. Polymarket, in order to determine whether 398 00:20:10,040 --> 00:20:13,240 Speaker 1: or not Russia has captured the town, uses maps of 399 00:20:13,280 --> 00:20:16,159 Speaker 1: the front lines generated by something called the Institute for 400 00:20:16,200 --> 00:20:18,560 Speaker 1: the Study of War, which is a DC based think tank. 401 00:20:19,040 --> 00:20:21,359 Speaker 1: Their maps are considered the gold standard for where the 402 00:20:21,359 --> 00:20:24,720 Speaker 1: front line is on any given day. Just before the 403 00:20:24,760 --> 00:20:28,639 Speaker 1: resolution of this Myrnohrad bet, the map was 404 00:20:28,760 --> 00:20:31,840 Speaker 1: changed to show that Russia had indeed taken the town. 405 00:20:32,359 --> 00:20:36,280 Speaker 1: Polymarket paid out, and the next day the map reverted. 406 00:20:36,560 --> 00:20:37,320 Speaker 3: You're kidding.
407 00:20:37,760 --> 00:20:40,480 Speaker 1: What 404 Media, who broke this story, hypothesized 408 00:20:40,520 --> 00:20:44,359 Speaker 1: is that someone at the Institute for the Study of 409 00:20:44,400 --> 00:20:50,440 Speaker 1: War was gotten to, essentially was bribed, to temporarily change 410 00:20:50,840 --> 00:20:53,960 Speaker 1: the border of the conflict so that whoever was betting 411 00:20:53,960 --> 00:20:56,200 Speaker 1: that Russia would take it was paid out. 412 00:20:56,440 --> 00:20:58,919 Speaker 3: What do you think about gambling on a war with 413 00:20:59,000 --> 00:21:00,400 Speaker 3: real human lives? 414 00:21:00,480 --> 00:21:03,040 Speaker 2: Well, I think it's extremely distasteful, I should say. 415 00:21:03,160 --> 00:21:06,880 Speaker 1: ISW has said that there were unauthorized and unapproved edits, 416 00:21:06,880 --> 00:21:09,040 Speaker 1: so they've acknowledged that somebody did go into the mainframe 417 00:21:09,040 --> 00:21:11,400 Speaker 1: and change the front lines temporarily. They haven't gone 418 00:21:11,440 --> 00:21:13,439 Speaker 1: any further than that. But yeah, I mean, 419 00:21:13,800 --> 00:21:18,000 Speaker 1: gambling on a war with real human lives at stake. 420 00:21:18,000 --> 00:21:19,760 Speaker 3: It's one thing to gamble on people who are boxing, 421 00:21:19,800 --> 00:21:21,679 Speaker 3: which I still feel like is barbaric. But I mean, 422 00:21:21,680 --> 00:21:24,480 Speaker 3: when it comes to war and people's lives, like, and 423 00:21:24,520 --> 00:21:29,159 Speaker 3: that this is now something that, like, has become, you know, 424 00:21:29,280 --> 00:21:31,159 Speaker 3: almost like a fun thing to do on the internet. 425 00:21:31,160 --> 00:21:32,480 Speaker 1: On the other hand, you might just 426 00:21:32,480 --> 00:21:35,760 Speaker 1: say this is democratizing. Different governments basically bet one way or another. 427 00:21:35,800 --> 00:21:37,919 Speaker 1: I mean, they're, you know, yes, let's look at, you know, 428 00:21:38,119 --> 00:21:40,639 Speaker 1: Syria, for example. All these conflicts have governments who are 429 00:21:40,680 --> 00:21:42,720 Speaker 1: betting on either side essentially. And now you can do 430 00:21:42,760 --> 00:21:43,000 Speaker 1: it too. 431 00:21:43,040 --> 00:21:44,280 Speaker 3: You can be like, you can do it too, but 432 00:21:44,320 --> 00:21:46,760 Speaker 3: you're just making money. You're not like changing... Well, I 433 00:21:46,760 --> 00:21:48,520 Speaker 3: guess you can kind of change the course of things. 434 00:21:48,520 --> 00:21:50,159 Speaker 2: Well, okay, so that brings us back to the 435 00:21:50,160 --> 00:21:50,840 Speaker 2: politics point. 436 00:21:50,960 --> 00:21:51,200 Speaker 4: Yeah. 437 00:21:51,240 --> 00:21:53,280 Speaker 1: I read a piece in the Financial Times a couple 438 00:21:53,280 --> 00:21:56,119 Speaker 1: of weeks ago with the headline A New Specter Looms 439 00:21:56,160 --> 00:22:00,359 Speaker 1: Over Democracy: Prediction Markets, and basically the columnist Jemima 440 00:22:00,480 --> 00:22:05,159 Speaker 1: Kelly writes about how these prediction markets can manipulate the 441 00:22:05,200 --> 00:22:09,280 Speaker 1: perception of the outcome of political events. So once you 442 00:22:09,400 --> 00:22:11,679 Speaker 1: see, like, Mamdani ninety four percent according to 443 00:22:11,680 --> 00:22:13,560 Speaker 3: the bettors, you're like, hey, Mamdani's going to win.
444 00:22:13,640 --> 00:22:16,000 Speaker 2: If you're a Cuomo voter, you might think no point 445 00:22:16,040 --> 00:22:19,120 Speaker 2: turning up for the polls. Yeah, you would have been right. Yeah. 446 00:22:19,240 --> 00:22:22,040 Speaker 1: But she makes the point that in closer run things, 447 00:22:22,480 --> 00:22:26,080 Speaker 1: this could be really, really bad. For example, last year, 448 00:22:26,400 --> 00:22:29,119 Speaker 1: more than three point six billion dollars was staked on 449 00:22:29,160 --> 00:22:32,320 Speaker 1: the outcome of the presidential election, the US presidential. 450 00:22:31,960 --> 00:22:33,600 Speaker 3: Three point six billion US? 451 00:22:33,359 --> 00:22:36,520 Speaker 1: Billion, despite the fact US users were not officially allowed 452 00:22:36,520 --> 00:22:39,280 Speaker 1: to use the platform. So, Kelly points out in the 453 00:22:39,320 --> 00:22:42,280 Speaker 1: FT, they may have used VPNs to circumvent that. But 454 00:22:42,440 --> 00:22:46,720 Speaker 1: according to the FT, four non American accounts from outside 455 00:22:46,760 --> 00:22:50,199 Speaker 1: the US together placed more than thirty million dollars in 456 00:22:50,280 --> 00:22:54,240 Speaker 1: wagers on Trump winning, which created a decisive swing in 457 00:22:54,280 --> 00:22:56,960 Speaker 1: his favor on the platform in the weeks leading up 458 00:22:57,000 --> 00:22:57,720 Speaker 1: to the election. 459 00:22:57,840 --> 00:23:00,000 Speaker 3: And the platform does affect how people vote. 460 00:23:00,080 --> 00:23:02,000 Speaker 1: For thirty million dollars, you could tip that. That's nothing 461 00:23:02,000 --> 00:23:03,720 Speaker 1: in the grand scheme of things compared to the amount 462 00:23:03,760 --> 00:23:06,640 Speaker 1: of money flowing through PACs and super PACs and whatever else, 463 00:23:06,680 --> 00:23:08,320 Speaker 1: like, it's a drop in the ocean. But to be 464 00:23:08,320 --> 00:23:12,000 Speaker 1: able to affect the perception of the whole, you know, 465 00:23:12,119 --> 00:23:14,480 Speaker 1: United States public, who look at these betting markets as 466 00:23:14,520 --> 00:23:15,960 Speaker 1: a source of truth about what's going to happen. 467 00:23:16,280 --> 00:23:18,720 Speaker 3: And they make it so you can do it for sports so effortlessly. 468 00:23:18,880 --> 00:23:21,119 Speaker 1: You know, this, like, this whole "the election was 469 00:23:21,160 --> 00:23:23,320 Speaker 1: stolen" narrative, et cetera, et cetera. If you've got, like, 470 00:23:23,359 --> 00:23:26,760 Speaker 1: a Polymarket which has been externally manipulated to show, 471 00:23:26,880 --> 00:23:29,399 Speaker 1: like, a much higher chance of, let's say, Trump winning, 472 00:23:29,440 --> 00:23:33,159 Speaker 1: then, I mean, the negative consequences of this 473 00:23:33,200 --> 00:23:36,800 Speaker 3: are quite high. Quite high. Yeah, no, absolutely. I would 474 00:23:36,840 --> 00:23:40,640 Speaker 3: actually call Polymarket weird technology. And I have another 475 00:23:41,160 --> 00:23:44,199 Speaker 3: piece of weird tech that I think is going to 476 00:23:44,240 --> 00:23:45,240 Speaker 3: excite you. 477 00:23:45,280 --> 00:23:46,120 Speaker 2: Go ahead. 478 00:23:47,119 --> 00:23:50,000 Speaker 3: From the country that brought you the singing toilet. 479 00:23:49,680 --> 00:23:52,200 Speaker 2: Oh, I love to sing. This is Japan, right? 480 00:23:52,040 --> 00:23:55,680 Speaker 3: Japan is now making a human washing machine.
What? Yes, 481 00:23:56,000 --> 00:23:56,360 Speaker 3: what do you 482 00:23:56,280 --> 00:23:59,520 Speaker 2: think this looks like? I mean, like a washing machine, 483 00:23:59,560 --> 00:24:00,000 Speaker 2: but a little bit. 484 00:24:00,600 --> 00:24:04,600 Speaker 3: No, it looks like when, you know, in a, you know, 485 00:24:04,720 --> 00:24:08,159 Speaker 3: Ridley Scott movie, someone comes out of a machine and 486 00:24:08,200 --> 00:24:12,639 Speaker 3: there's this, like, primordial ooze, like a cryogenic... Yes, exactly, 487 00:24:12,840 --> 00:24:16,760 Speaker 3: a cryogenic pod. It kind of looks like that. It's 488 00:24:16,800 --> 00:24:21,360 Speaker 3: a capsule with a sort of lounger that's built inside 489 00:24:21,400 --> 00:24:24,639 Speaker 3: the capsule, and so you sit on that lounger and 490 00:24:24,760 --> 00:24:27,439 Speaker 3: the top comes down around you, and it sort of 491 00:24:27,440 --> 00:24:31,000 Speaker 3: looks like this whale head. And the whole cycle itself takes 492 00:24:31,040 --> 00:24:31,720 Speaker 3: fifteen minutes. 493 00:24:31,840 --> 00:24:32,439 Speaker 2: It's a womb. 494 00:24:33,000 --> 00:24:38,080 Speaker 3: It is a womb, and it's surprisingly not designed by Toto. 495 00:24:38,520 --> 00:24:42,280 Speaker 3: It's created by a Japanese company. And I love this. 496 00:24:42,280 --> 00:24:43,800 Speaker 3: This is when the Japanese are like, we do things 497 00:24:43,840 --> 00:24:45,640 Speaker 3: so well that we're going to call our company Science. 498 00:24:46,320 --> 00:24:48,760 Speaker 3: The company is called Science, and I wanted to just 499 00:24:48,800 --> 00:24:52,600 Speaker 3: read you the instructions for how it works. So basically, 500 00:24:52,640 --> 00:24:57,240 Speaker 3: you enter the capsule, the automatic wash begins. The machine 501 00:24:57,320 --> 00:25:01,640 Speaker 3: uses microbubbles and a fine mist to gently clean your 502 00:25:01,840 --> 00:25:06,080 Speaker 3: entire body. There are built-in sensors that track the user's 503 00:25:06,160 --> 00:25:10,720 Speaker 3: vital signs during the wash to ensure safety, and while 504 00:25:10,720 --> 00:25:14,400 Speaker 3: the wash is in progress, calming visuals and soothing music are 505 00:25:14,440 --> 00:25:20,000 Speaker 3: played inside the capsule. Then, after washing, the machine dries 506 00:25:20,040 --> 00:25:24,320 Speaker 3: the user automatically. The user steps out fully clean, relaxed, 507 00:25:24,680 --> 00:25:28,320 Speaker 3: and monitored. No towels or manual effort are needed. 508 00:25:29,000 --> 00:25:31,760 Speaker 2: And it sort of makes you want to cry. It kind of makes 509 00:25:31,840 --> 00:25:32,480 Speaker 2: you want to cry. 510 00:25:33,000 --> 00:25:37,040 Speaker 3: I kind of need this. So where this actually was 511 00:25:37,160 --> 00:25:39,840 Speaker 3: demoed was at the World Expo in Osaka, and, you 512 00:25:39,840 --> 00:25:44,440 Speaker 3: know, Science, the leanly named company, was compelled to commercialize 513 00:25:44,440 --> 00:25:47,479 Speaker 3: the prototype. So actually it will soon be in 514 00:25:47,600 --> 00:25:51,080 Speaker 3: use at a hotel in that area that purchased one 515 00:25:51,119 --> 00:25:54,320 Speaker 3: of the fifty units the company plans to produce. Guess 516 00:25:54,359 --> 00:25:55,560 Speaker 3: the price of this unit. 517 00:25:56,240 --> 00:25:57,520 Speaker 2: Twenty five thousand dollars?
518 00:25:58,000 --> 00:26:01,040 Speaker 3: No, no, no, my friend. The human car wash 519 00:26:01,480 --> 00:26:03,359 Speaker 3: costs three hundred and eighty five thousand dollars. 520 00:26:03,440 --> 00:26:04,600 Speaker 2: Well, yeah, well. 521 00:26:04,440 --> 00:26:06,760 Speaker 1: So it probably won't be coming to a 522 00:26:06,840 --> 00:26:08,359 Speaker 1: screen near us anytime soon. 523 00:26:08,520 --> 00:26:09,119 Speaker 3: No, it will not. 524 00:26:09,760 --> 00:26:12,240 Speaker 1: So Cara, you know that I have this weird obsession 525 00:26:12,320 --> 00:26:14,160 Speaker 1: with why it's a bad idea to do twenty three 526 00:26:14,160 --> 00:26:14,440 Speaker 1: and Me. 527 00:26:14,800 --> 00:26:17,479 Speaker 3: Yeah, of course. We know it's a terrible idea. We learned. 528 00:26:18,080 --> 00:26:20,159 Speaker 1: So there's a story in the Wall Street Journal this 529 00:26:20,200 --> 00:26:25,800 Speaker 1: week about how the irony of unintended consequences can be 530 00:26:25,920 --> 00:26:27,560 Speaker 2: truly mind blowing. 531 00:26:28,119 --> 00:26:31,760 Speaker 1: In short, people who find new family members through DNA 532 00:26:31,840 --> 00:26:34,480 Speaker 1: sites like twenty three and Me have started suing 533 00:26:34,200 --> 00:26:37,439 Speaker 2: them for the inheritance. You're kidding? I promise you, it's true. 534 00:26:37,560 --> 00:26:39,000 Speaker 3: Well, that's where I'm going to get three hundred and 535 00:26:39,000 --> 00:26:41,960 Speaker 3: eighty five thousand dollars to get my human washing machine: 536 00:26:42,000 --> 00:26:43,880 Speaker 3: I have to find a rich relative on twenty three 537 00:26:43,880 --> 00:26:44,040 Speaker 3: and Me. 538 00:26:44,359 --> 00:26:47,439 Speaker 1: So I'm going to tell you a story about Carmen Thomas. Okay. 539 00:26:47,880 --> 00:26:50,840 Speaker 1: Carmen Thomas used twenty three and Me's DNA tests to 540 00:26:50,960 --> 00:26:55,080 Speaker 1: track down her absent father, Joe Brown. Okay, and 541 00:26:55,080 --> 00:26:57,879 Speaker 1: I'm going to read from the story now. It turned 542 00:26:57,920 --> 00:27:00,240 Speaker 1: out the man she believed to be her father had 543 00:27:00,280 --> 00:27:03,879 Speaker 1: died five years earlier, but she connected with two likely 544 00:27:04,000 --> 00:27:07,639 Speaker 1: half sisters. They went out for boba tea and a 545 00:27:07,720 --> 00:27:11,359 Speaker 1: sleepover at their grandmother's. She looked through family albums and 546 00:27:11,440 --> 00:27:14,840 Speaker 1: held a pillow with his photo printed on it. A 547 00:27:14,920 --> 00:27:18,280 Speaker 1: year later, she was suing the Brown sisters, and then, 548 00:27:18,720 --> 00:27:21,639 Speaker 1: remember that you made me pay. Thomas wanted a share 549 00:27:21,640 --> 00:27:24,840 Speaker 1: of a multimillion dollar medical malpractice award they had won 550 00:27:25,240 --> 00:27:29,240 Speaker 1: after Joe Brown had died of an undiagnosed aortic aneurysm. 551 00:27:29,960 --> 00:27:32,480 Speaker 2: Oh my god. American history. 552 00:27:32,600 --> 00:27:35,280 Speaker 3: So American. Unbelievable. 553 00:27:34,760 --> 00:27:37,640 Speaker 2: I laugh, but it's pretty... I mean, imagine. 554 00:27:38,040 --> 00:27:41,040 Speaker 3: So this is basically people going on twenty three and Me. 555 00:27:42,720 --> 00:27:44,960 Speaker 2: I think she was in earnest, and then she was like, eh, 556 00:27:45,000 --> 00:27:45,400 Speaker 2: she's like.
557 00:27:45,359 --> 00:27:46,600 Speaker 3: Wait a minute, there's some money here. 558 00:27:46,680 --> 00:27:47,360 Speaker 2: Yeah, exactly. 559 00:27:47,640 --> 00:27:50,439 Speaker 1: That case actually settled in favor of the family, because 560 00:27:50,520 --> 00:27:54,000 Speaker 1: Thomas's claim came too late. But another example went the 561 00:27:54,000 --> 00:27:56,960 Speaker 1: other way. A brother and a sister in Utah were 562 00:27:57,000 --> 00:28:01,400 Speaker 1: fighting over their late father's estate, and for reasons best 563 00:28:01,480 --> 00:28:04,679 Speaker 1: known to himself, the brother reached out to someone on 564 00:28:04,760 --> 00:28:07,040 Speaker 1: twenty three and Me who might be 565 00:28:06,880 --> 00:28:07,600 Speaker 2: a half brother. 566 00:28:08,400 --> 00:28:11,720 Speaker 1: The deceased biological father didn't have a will, and the 567 00:28:11,720 --> 00:28:15,719 Speaker 1: potential half brother sued the original brother and sister and 568 00:28:15,840 --> 00:28:19,760 Speaker 1: won a third of the disputed estate. The reason for 569 00:28:19,800 --> 00:28:23,840 Speaker 1: the victory was that the father had been sending a 570 00:28:23,880 --> 00:28:25,200 Speaker 1: card and one hundred 571 00:28:24,920 --> 00:28:27,960 Speaker 2: dollars each year on his birthday. So the courts ruled 572 00:28:27,960 --> 00:28:30,400 Speaker 2: that was fatherhood. 573 00:28:30,480 --> 00:28:31,040 Speaker 3: Fatherhood. 574 00:28:31,480 --> 00:28:33,639 Speaker 1: Basically, the sort of effect of this is 575 00:28:33,680 --> 00:28:37,880 Speaker 1: that people are being encouraged to frame their wills much 576 00:28:37,920 --> 00:28:41,920 Speaker 1: more consciously. "To my family" or "to my dear children" 577 00:28:42,480 --> 00:28:45,920 Speaker 1: is a minefield. 578 00:28:47,240 --> 00:28:49,239 Speaker 3: So again, the takeaway from this is do not use 579 00:28:49,280 --> 00:28:51,920 Speaker 3: twenty three and Me, I guess, if you have money that 580 00:28:51,960 --> 00:28:52,680 Speaker 3: you want to protect. 581 00:28:52,760 --> 00:28:53,400 Speaker 2: Yeah, exactly. 582 00:29:04,440 --> 00:29:09,760 Speaker 8: Happy birthday to ChatGPT, happy birthday to you, ChatGPT, 583 00:29:10,640 --> 00:29:14,320 Speaker 8: happy third, gotta hit it, birthday to ChatGPT, 584 00:29:14,480 --> 00:29:15,520 Speaker 8: happy third birthday. 585 00:29:15,640 --> 00:29:16,480 Speaker 3: This is Chat and Me. 586 00:29:17,800 --> 00:29:19,280 Speaker 2: I couldn't have hit it. You're right, Cara. 587 00:29:19,400 --> 00:29:21,840 Speaker 1: It's almost three years to the day since the release 588 00:29:21,880 --> 00:29:25,280 Speaker 1: of ChatGPT, which seemingly changed the way we all 589 00:29:25,280 --> 00:29:27,400 Speaker 1: interact with technology forever. 590 00:29:27,720 --> 00:29:31,640 Speaker 3: And to celebrate, or rather memorialize, our dear friend Chat, 591 00:29:31,640 --> 00:29:35,320 Speaker 3: we're here with Megan Morrone, the editor of technology at Axios. 592 00:29:35,320 --> 00:29:36,920 Speaker 3: Thank you so much for joining us. 593 00:29:37,280 --> 00:29:38,240 Speaker 9: Thank you for having me. 594 00:29:38,280 --> 00:29:39,800 Speaker 2: I love to talk about 595 00:29:39,480 --> 00:29:43,320 Speaker 1: the chat. Cara's saying happy birthday, but I was also 596 00:29:43,360 --> 00:29:45,680 Speaker 1: thinking of the Drake song, it's my birthday, I can cry 597 00:29:45,720 --> 00:29:48,040 Speaker 1: if I want to.
How much celebration is there 598 00:29:48,040 --> 00:29:49,200 Speaker 1: in the halls this week? 599 00:29:50,160 --> 00:29:53,040 Speaker 9: Yeah, not a lot at OpenAI. It's been a 600 00:29:53,560 --> 00:29:57,280 Speaker 9: rough week. A lot of competition, specifically from Google. 601 00:29:57,720 --> 00:30:00,760 Speaker 1: Yeah, and Sam Altman has talked about a code red. 602 00:30:01,320 --> 00:30:02,240 Speaker 1: What does that look like? 603 00:30:02,480 --> 00:30:05,880 Speaker 9: There's three things. I wrote a piece yesterday for Axios 604 00:30:05,880 --> 00:30:08,760 Speaker 9: on, like, the three things that are keeping Sam Altman 605 00:30:08,880 --> 00:30:10,320 Speaker 9: up at night, and they're big. 606 00:30:10,520 --> 00:30:11,880 Speaker 3: The first one is just money. 607 00:30:12,520 --> 00:30:15,440 Speaker 9: I mean, everyone is talking about the AI bubble and 608 00:30:15,480 --> 00:30:19,080 Speaker 9: how much AI costs, to create the models and then, 609 00:30:19,280 --> 00:30:22,320 Speaker 9: you know, for everyone to be using them, and it's 610 00:30:22,320 --> 00:30:25,520 Speaker 9: scary, and there's a lot of circular investments. So a 611 00:30:25,520 --> 00:30:27,200 Speaker 9: lot of people are just saying the bubble is about 612 00:30:27,240 --> 00:30:31,800 Speaker 9: to pop. And the second thing is Gemini, which is Google's 613 00:30:31,840 --> 00:30:37,720 Speaker 9: newest model, and it has great reviews. The image model 614 00:30:37,760 --> 00:30:38,120 Speaker 9: is great. 615 00:30:38,280 --> 00:30:41,120 Speaker 2: This is the nano banana model. 616 00:30:40,920 --> 00:30:45,320 Speaker 9: Yes, nano banana. And the third thing is really the 617 00:30:46,240 --> 00:30:49,960 Speaker 9: safety issues, because people are using it as a therapist 618 00:30:50,200 --> 00:30:53,000 Speaker 9: or, you know, as a confidant, and there's a lot 619 00:30:53,040 --> 00:30:54,000 Speaker 9: of issues coming up with that. 620 00:30:54,560 --> 00:30:57,280 Speaker 1: Cara, I think, is most interested in the 621 00:30:57,280 --> 00:30:59,720 Speaker 1: therapist angle. We've talked about it a bunch on this show. Yes, 622 00:30:59,840 --> 00:31:01,760 Speaker 1: I'm very interested in the Google angle, 623 00:31:01,880 --> 00:31:02,160 Speaker 2: though. 624 00:31:02,520 --> 00:31:05,960 Speaker 1: Since the release of Gemini three, Google's added more than 625 00:31:06,000 --> 00:31:09,280 Speaker 1: three hundred billion dollars to its market cap. Why was 626 00:31:09,280 --> 00:31:12,240 Speaker 1: the release of Gemini three so consequential? 627 00:31:13,000 --> 00:31:15,480 Speaker 9: So you have to go back, obviously, three years, when 628 00:31:15,880 --> 00:31:20,000 Speaker 9: ChatGPT was first released and Google was caught on 629 00:31:20,160 --> 00:31:22,600 Speaker 9: the back foot. They, you know, started a lot of 630 00:31:22,720 --> 00:31:27,360 Speaker 9: this generative AI technology, and with DeepMind, they have 631 00:31:27,480 --> 00:31:30,080 Speaker 9: some of the biggest models. And they were surprised when 632 00:31:30,120 --> 00:31:33,120 Speaker 9: ChatGPT came out. They didn't have a similar tool, 633 00:31:33,240 --> 00:31:36,640 Speaker 9: and they make their money in search, and they were 634 00:31:36,720 --> 00:31:37,880 Speaker 9: threatened by this. 635 00:31:38,640 --> 00:31:40,280 Speaker 2: And so, you know, it's taken them
636 00:31:40,240 --> 00:31:44,000 Speaker 9: a few years to figure out how to not cannibalize 637 00:31:44,040 --> 00:31:49,200 Speaker 9: their business and then also to really catch up, 638 00:31:49,760 --> 00:31:54,120 Speaker 9: and they have with Gemini, which is built into Google Docs. 639 00:31:54,160 --> 00:31:55,800 Speaker 9: You know, it's just part of the tools that 640 00:31:55,840 --> 00:31:56,600 Speaker 9: everybody uses. 641 00:31:56,680 --> 00:31:59,120 Speaker 3: Do you think we could look back on this as 642 00:31:59,160 --> 00:32:02,239 Speaker 3: the week that OpenAI's dominance started to fade? Like, 643 00:32:02,600 --> 00:32:04,120 Speaker 3: is that what this week is going to signal? 644 00:32:04,640 --> 00:32:07,320 Speaker 9: I wouldn't say the beginning of the end, but it changed. 645 00:32:07,360 --> 00:32:11,160 Speaker 9: Like, I'm not discounting ChatGPT, for sure. It could 646 00:32:11,160 --> 00:32:14,120 Speaker 9: be just that this renewed focus gets them back on 647 00:32:14,160 --> 00:32:17,320 Speaker 9: the right track. But I think this is 648 00:32:17,360 --> 00:32:21,480 Speaker 9: the first real threat that they've had. And of course, 649 00:32:21,520 --> 00:32:25,960 Speaker 9: you know, everyone who's been following tech for decades knows 650 00:32:26,120 --> 00:32:29,640 Speaker 9: there's the Myspaces, you know, that everyone thought were gonna 651 00:32:29,800 --> 00:32:32,120 Speaker 9: be the next big thing, and, you know, who remembers it, 652 00:32:32,200 --> 00:32:35,320 Speaker 9: or the Betamax, where no one uses that anymore, 653 00:32:35,400 --> 00:32:36,480 Speaker 9: so, very, you know. 654 00:32:36,680 --> 00:32:39,320 Speaker 1: It could go that way too. Megan, thank you for 655 00:32:39,400 --> 00:32:43,719 Speaker 1: joining us again. And if you're thinking, listeners, about switching 656 00:32:43,720 --> 00:32:47,520 Speaker 1: to Gemini, writing your wedding vows with ChatGPT, 657 00:32:47,760 --> 00:32:51,080 Speaker 1: using Anthropic to make a company of one, please write 658 00:32:51,120 --> 00:32:53,640 Speaker 1: in to tech Stuff podcast at gmail dot 659 00:32:53,440 --> 00:32:54,400 Speaker 2: com with your stories. 660 00:32:54,400 --> 00:32:55,680 Speaker 5: We'd love to hear from you and we'd love to 661 00:32:55,680 --> 00:33:08,040 Speaker 5: feature them. 662 00:33:08,080 --> 00:33:10,160 Speaker 3: That's it for this week for Tech Stuff. I'm Kara 663 00:33:10,240 --> 00:33:10,920 Speaker 3: Price and. 664 00:33:10,880 --> 00:33:11,720 Speaker 2: I'm Oz Voloshin. 665 00:33:11,800 --> 00:33:14,760 Speaker 1: This episode was produced by Eliza Dennis, Tyler Hill and 666 00:33:14,840 --> 00:33:18,680 Speaker 1: Melissa Slaughter. It was executive produced by me, Kara Price, 667 00:33:18,800 --> 00:33:22,280 Speaker 1: Julian Nutta, and Kate Osborne for Kaleidoscope and Katrina Norvell 668 00:33:22,320 --> 00:33:26,040 Speaker 1: for iHeart Podcasts. The engineer is Beheth Fraser, and Jack 669 00:33:26,160 --> 00:33:27,520 Speaker 1: Insley mixed this episode. 670 00:33:27,880 --> 00:33:29,360 Speaker 2: Kyle Murdock wrote our theme song. 671 00:33:30,480 --> 00:33:33,080 Speaker 1: Please rate, review, and reach out to us at tech 672 00:33:33,120 --> 00:33:34,880 Speaker 1: Stuff podcast at gmail dot com. 673 00:33:34,920 --> 00:33:35,720 Speaker 2: We want to hear from you.