1 00:00:13,800 --> 00:00:17,400 Speaker 1: From Kaleidoscope and iHeart Podcasts. This is TechStuff. 2 00:00:17,560 --> 00:00:19,520 Speaker 2: I'm Oz Woloshyn and I'm Karah Preiss. 3 00:00:20,000 --> 00:00:23,920 Speaker 1: Today we'll get into the headlines this week, including Meta's 4 00:00:24,200 --> 00:00:28,440 Speaker 1: multi hundred million dollar hiring spree and why Gen Z 5 00:00:28,800 --> 00:00:34,199 Speaker 1: always has location sharing on. Then, on Chat and Me, he 6 00:00:34,440 --> 00:00:34,959 Speaker 1: gives me. 7 00:00:35,000 --> 00:00:38,600 Speaker 3: A tailored meal plan, so like what I should eat, 8 00:00:39,159 --> 00:00:42,839 Speaker 3: how I should eat, when I should eat bananas. Some nights, yeah, 9 00:00:42,960 --> 00:00:43,519 Speaker 3: some nights no. 10 00:00:43,760 --> 00:00:46,680 Speaker 1: It's kind of crazy. All of that on the Week in Tech. 11 00:00:47,120 --> 00:00:49,040 Speaker 1: It's Friday, July twenty fifth. 12 00:00:55,560 --> 00:00:59,880 Speaker 2: Hello Karah. Hello Ozzy. I just want to say first, 13 00:01:00,120 --> 00:01:04,039 Speaker 2: rest in darkness to the other Ozzy Osbourne, who died 14 00:01:04,080 --> 00:01:07,479 Speaker 2: this week. Another Brit, and that is where I find 15 00:01:07,520 --> 00:01:08,000 Speaker 2: you today. 16 00:01:08,200 --> 00:01:10,679 Speaker 1: Indeed, I'm in London, and I was a little bit 17 00:01:10,720 --> 00:01:14,679 Speaker 1: nervous to come here because Europe's been experiencing this crazy 18 00:01:14,720 --> 00:01:18,120 Speaker 1: heat wave and most of London has no AC. 19 00:01:18,319 --> 00:01:21,320 Speaker 2: So you've been schvitzing, as they say. 20 00:01:21,120 --> 00:01:23,319 Speaker 1: Well, actually the heat wave broke, but it has been 21 00:01:23,360 --> 00:01:25,880 Speaker 1: a very, very hot summer here. And you and I 22 00:01:25,920 --> 00:01:28,440 Speaker 1: talked about Wimbledon a couple of weeks ago and AI 23 00:01:28,560 --> 00:01:31,600 Speaker 1: line judges. But some of the other interesting scenes from 24 00:01:31,640 --> 00:01:33,959 Speaker 1: this year's Wimbledon, which I haven't seen before, were people 25 00:01:34,080 --> 00:01:37,440 Speaker 1: passing out in the stands from heat exhaustion and players 26 00:01:37,760 --> 00:01:38,760 Speaker 1: riding to their rescue. 27 00:01:38,840 --> 00:01:45,040 Speaker 2: The players are like, you're passing out. My memory and 28 00:01:45,280 --> 00:01:49,120 Speaker 2: unfortunate experience of really not just London, but most of 29 00:01:49,160 --> 00:01:52,480 Speaker 2: Europe, is that they lack AC. And as a kid 30 00:01:52,520 --> 00:01:56,240 Speaker 2: who grew up in the Northeast during the summer, AC 31 00:01:56,560 --> 00:02:01,400 Speaker 2: is the premium. I do wonder, in a warmer climate, 32 00:02:01,480 --> 00:02:03,160 Speaker 2: how does this continue to actually work? 33 00:02:03,640 --> 00:02:05,400 Speaker 1: Yeah, I mean I think the short answer is it 34 00:02:05,440 --> 00:02:07,880 Speaker 1: may not.
The Financial Times recently had an op-ed 35 00:02:08,400 --> 00:02:11,240 Speaker 1: with the title "Britain and Europe need to get serious 36 00:02:11,280 --> 00:02:14,400 Speaker 1: about air conditioning," and the article points out that, you know, 37 00:02:14,440 --> 00:02:18,400 Speaker 1: in a changing climate, European cities are experiencing more intense heat, 38 00:02:18,680 --> 00:02:21,360 Speaker 1: more frequently and for longer periods of time than just 39 00:02:21,440 --> 00:02:24,000 Speaker 1: ten years ago. And also the article points out that 40 00:02:24,080 --> 00:02:26,959 Speaker 1: AC is not just about comfort; it's no longer a luxury. 41 00:02:27,320 --> 00:02:29,600 Speaker 1: There are some pretty interesting stats in the piece about 42 00:02:29,600 --> 00:02:34,240 Speaker 1: how being too hot indoors affects sleep and cognition, and even 43 00:02:34,280 --> 00:02:35,639 Speaker 1: correlates with mortality. 44 00:02:35,960 --> 00:02:38,080 Speaker 2: You know, I do notice a huge difference in my 45 00:02:38,160 --> 00:02:41,400 Speaker 2: ability to sleep and even think when I'm hot. Like, 46 00:02:42,160 --> 00:02:43,720 Speaker 2: I don't think people should work in the summer, it's 47 00:02:43,720 --> 00:02:44,160 Speaker 2: too hot. 48 00:02:44,280 --> 00:02:47,440 Speaker 1: Well, the article says that when indoor temperatures rise above 49 00:02:47,480 --> 00:02:52,480 Speaker 1: seventy five degrees Fahrenheit, sleep duration and quality fall rapidly, and, 50 00:02:52,639 --> 00:02:56,160 Speaker 1: according to studies, so does the ability to perform well in tests. 51 00:02:56,720 --> 00:02:58,600 Speaker 1: So it's a blessing to you and our listeners that 52 00:02:58,639 --> 00:03:00,920 Speaker 1: it's a cool week in London and I can think 53 00:03:01,080 --> 00:03:01,880 Speaker 1: relatively straight. 54 00:03:02,160 --> 00:03:03,160 Speaker 2: You're always brilliant. 55 00:03:03,320 --> 00:03:03,760 Speaker 1: But I... 56 00:03:05,280 --> 00:03:07,480 Speaker 2: A little bit earlier you mentioned mortality, that it's a 57 00:03:07,480 --> 00:03:09,760 Speaker 2: matter of life and death. Is that really true in 58 00:03:09,840 --> 00:03:11,000 Speaker 2: terms of AC and cooling? 59 00:03:11,280 --> 00:03:12,799 Speaker 1: I was a little bit surprised by that as well. 60 00:03:12,840 --> 00:03:16,920 Speaker 1: But according to the FT, between two thousand and twenty nineteen, 61 00:03:17,760 --> 00:03:22,000 Speaker 1: an average of eighty three thousand Western Europeans per year 62 00:03:22,520 --> 00:03:25,120 Speaker 1: lost their lives due to extreme heat, compared with just 63 00:03:25,160 --> 00:03:27,360 Speaker 1: twenty thousand North Americans, who are much more likely to 64 00:03:27,400 --> 00:03:31,160 Speaker 1: live in buildings with AC. The penetration rate of air 65 00:03:31,160 --> 00:03:34,600 Speaker 1: conditioning is ninety percent in the US and only nineteen 66 00:03:34,680 --> 00:03:35,720 Speaker 1: percent in Europe. 67 00:03:35,840 --> 00:03:38,119 Speaker 2: I was going to say, you know, I never understood 68 00:03:38,120 --> 00:03:41,720 Speaker 2: why Europeans don't adopt cooling systems in buildings. Yeah. 69 00:03:41,720 --> 00:03:44,680 Speaker 1: I mean there's a kind of cultural resistance because it 70 00:03:44,720 --> 00:03:47,680 Speaker 1: uses so much energy. But of course, with a changing climate, 71 00:03:47,720 --> 00:03:52,040 Speaker 1: what was once seen as an extravagance becomes a potential necessity.
72 00:03:52,560 --> 00:03:55,839 Speaker 1: And the FT article makes the point that new developments 73 00:03:55,880 --> 00:03:59,360 Speaker 1: in more efficient solar can actually potentially offset some of 74 00:03:59,400 --> 00:04:03,400 Speaker 1: the new demands of more widely deployed AC. You know, 75 00:04:03,440 --> 00:04:05,960 Speaker 1: I was particularly struck by this FT article because it 76 00:04:06,120 --> 00:04:10,680 Speaker 1: referenced Lee Kuan Yew, who is known as the father 77 00:04:10,840 --> 00:04:14,120 Speaker 1: of modern Singapore, and he was often asked this question: 78 00:04:14,520 --> 00:04:19,279 Speaker 1: how did you transform what was effectively a fishing village into 79 00:04:19,400 --> 00:04:22,480 Speaker 1: one of the most important commercial hubs in Asia and 80 00:04:22,520 --> 00:04:26,400 Speaker 1: the world? And he would say, well, two things. First, 81 00:04:27,040 --> 00:04:31,839 Speaker 1: multi-ethnic tolerance and a diverse society. Second, air conditioning. 82 00:04:32,120 --> 00:04:35,040 Speaker 2: Britain seems to have one but not the other. I mean, 83 00:04:35,080 --> 00:04:35,960 Speaker 2: maybe that's the key. 84 00:04:36,120 --> 00:04:38,760 Speaker 1: Well, it's funny. We host this podcast called TechStuff, 85 00:04:38,880 --> 00:04:41,800 Speaker 1: and so we have this, you know, strong bias to 86 00:04:41,839 --> 00:04:45,799 Speaker 1: cover new tech stuff, new advances in AI, new advances 87 00:04:45,839 --> 00:04:48,680 Speaker 1: in genetics and synthetic biology. But one of the things 88 00:04:48,680 --> 00:04:52,279 Speaker 1: this story really made me think about is how the 89 00:04:52,320 --> 00:04:56,200 Speaker 1: adoption of existing tech stuff can be just as important 90 00:04:56,320 --> 00:04:59,400 Speaker 1: as the innovation of new tech stuff, and I think 91 00:04:59,640 --> 00:05:02,000 Speaker 1: AC really is a perfect example of that. 92 00:05:02,360 --> 00:05:04,120 Speaker 2: All I can think about, you know, when we talk 93 00:05:04,160 --> 00:05:06,680 Speaker 2: about AI a lot on this show, is that energy 94 00:05:06,720 --> 00:05:10,440 Speaker 2: consumption will likely increase if AI continues at its current pace. 95 00:05:10,800 --> 00:05:14,119 Speaker 1: I mean, if the Silicon Valley overlords have their way. 96 00:05:14,760 --> 00:05:17,080 Speaker 1: And this week in Silicon Valley there has been some, 97 00:05:17,720 --> 00:05:21,960 Speaker 1: quite frankly, delicious drama in the battle for AI supremacy. 98 00:05:22,360 --> 00:05:24,240 Speaker 2: You have not shut up about this story since you 99 00:05:24,320 --> 00:05:26,000 Speaker 2: read it, and so now I need to hear about it. 100 00:05:26,200 --> 00:05:29,960 Speaker 1: Yeah. So, basically, the most powerful CEOs in the world 101 00:05:30,120 --> 00:05:33,680 Speaker 1: are in an all-out war for AI talent. This 102 00:05:33,880 --> 00:05:38,440 Speaker 1: is like collecting Pokemon cards, but where the stakes are hundreds 103 00:05:38,440 --> 00:05:41,160 Speaker 1: and hundreds of millions of dollars, and Mark 104 00:05:41,240 --> 00:05:45,960 Speaker 1: Zuckerberg is leading the charge and is absolutely determined, it seems, 105 00:05:46,200 --> 00:05:47,039 Speaker 1: to collect them all. 106 00:05:47,520 --> 00:05:49,680 Speaker 2: I don't really think of Meta as an AI company, 107 00:05:49,760 --> 00:05:53,040 Speaker 2: I think, much to Zuckerberg's dismay.
108 00:05:52,880 --> 00:05:56,640 Speaker 1: Well, exactly right. So hence the recruitment push. Meta announced 109 00:05:56,720 --> 00:05:59,919 Speaker 1: last month they are starting an AI research lab dedicated 110 00:06:00,160 --> 00:06:04,400 Speaker 1: to pursuing quote superintelligence, meaning, of course, an AI system that 111 00:06:04,480 --> 00:06:07,400 Speaker 1: exceeds the power of the human brain. But until this point, 112 00:06:07,600 --> 00:06:10,640 Speaker 1: Meta has spent around one hundred times more on AI 113 00:06:10,880 --> 00:06:14,440 Speaker 1: hardware and computing power than they have on human labor, 114 00:06:14,920 --> 00:06:17,520 Speaker 1: and according to The Wall Street Journal, Zuckerberg may have 115 00:06:17,560 --> 00:06:20,160 Speaker 1: gotten a wake-up call this spring from the chief 116 00:06:20,200 --> 00:06:23,160 Speaker 1: research officer of none other than OpenAI. 117 00:06:23,680 --> 00:06:25,960 Speaker 2: I'm shocked that they even talk to each other. Like, 118 00:06:26,000 --> 00:06:29,080 Speaker 2: I imagine them meeting in like a covert confessional with 119 00:06:29,160 --> 00:06:30,920 Speaker 2: their faces obscured. 120 00:06:31,200 --> 00:06:34,680 Speaker 1: Well, yes, but the interesting thing about Silicon Valley is 121 00:06:35,040 --> 00:06:37,719 Speaker 1: that's not really how it's worked, at least up until now. 122 00:06:38,120 --> 00:06:40,560 Speaker 1: With all these huge fortunes being made and all of 123 00:06:40,600 --> 00:06:45,400 Speaker 1: these new platform technologies being developed despite intense rivalry, there 124 00:06:45,440 --> 00:06:48,239 Speaker 1: was also this kind of sense that a rising tide 125 00:06:48,279 --> 00:06:51,920 Speaker 1: would lift all boats. I mean, you'll remember Google CEO 126 00:06:52,120 --> 00:06:55,760 Speaker 1: Eric Schmidt was on the board of Apple. LinkedIn founder 127 00:06:55,880 --> 00:06:58,880 Speaker 1: Reid Hoffman was one of the first investors in Facebook. 128 00:06:59,240 --> 00:07:02,080 Speaker 1: So there's this kind of history of these people being 129 00:07:02,120 --> 00:07:05,520 Speaker 1: in each other's business. And don't forget, I mean, Zuckerberg 130 00:07:06,080 --> 00:07:09,200 Speaker 1: has significant pull. So when he calls and says let's 131 00:07:09,240 --> 00:07:12,600 Speaker 1: have a coffee, many people answer. So, according to the 132 00:07:12,600 --> 00:07:16,440 Speaker 1: Wall Street Journal story, Zuckerberg hit up Mark Chen, who 133 00:07:16,560 --> 00:07:20,000 Speaker 1: is OpenAI's chief research officer, for a catch-up, 134 00:07:20,480 --> 00:07:24,160 Speaker 1: and Zuck ended up asking Chen for advice on how 135 00:07:24,200 --> 00:07:27,280 Speaker 1: to improve Meta's generative AI efforts. 136 00:07:27,520 --> 00:07:29,000 Speaker 2: He's like, Mark? It's Mark. 137 00:07:31,040 --> 00:07:35,280 Speaker 1: I think Mark Z was calling Mark C for some 138 00:07:35,280 --> 00:07:38,200 Speaker 1: friendly advice, or at least that's how it appeared to 139 00:07:38,320 --> 00:07:41,320 Speaker 1: Chen at the time. So it was Chen who actually 140 00:07:41,320 --> 00:07:45,040 Speaker 1: pointed out to Zuckerberg how many orders of magnitude 141 00:07:45,200 --> 00:07:48,480 Speaker 1: more Meta was spending on hardware than they were on talent, 142 00:07:49,440 --> 00:07:53,200 Speaker 1: and thus, Karah, an idea was born. Have you heard 143 00:07:53,200 --> 00:07:54,400 Speaker 1: about the list?
144 00:07:55,320 --> 00:07:59,160 Speaker 2: When billionaires make lists, I get very anxious. No, I 145 00:07:59,200 --> 00:07:59,960 Speaker 2: have not heard about that. 146 00:08:00,320 --> 00:08:04,240 Speaker 1: This is the Christmas list to end all Christmas lists, 147 00:08:04,400 --> 00:08:07,840 Speaker 1: and Zuckerberg apparently spent months compiling it: a list of 148 00:08:07,880 --> 00:08:11,600 Speaker 1: the top AI researchers and engineers from rival AI companies 149 00:08:11,600 --> 00:08:12,320 Speaker 1: and startups. 150 00:08:12,880 --> 00:08:15,200 Speaker 2: Do we have names? Do we know anything? 151 00:08:15,360 --> 00:08:17,840 Speaker 1: We don't have the secret list, but we do know 152 00:08:17,880 --> 00:08:21,840 Speaker 1: who Zuckerberg has either poached or tried to poach, and 153 00:08:22,160 --> 00:08:25,560 Speaker 1: these people are not household names outside of Silicon Valley. 154 00:08:25,880 --> 00:08:29,840 Speaker 1: There is, probably most famously, Alexandr Wang of Scale AI, 155 00:08:29,960 --> 00:08:32,280 Speaker 1: who we'll talk a little bit more about, but the 156 00:08:32,320 --> 00:08:36,679 Speaker 1: companies he's raided include Anthropic, Google DeepMind, Apple, and of 157 00:08:36,720 --> 00:08:39,880 Speaker 1: course OpenAI, from which he's pulled at least a 158 00:08:39,920 --> 00:08:40,920 Speaker 1: dozen employees. 159 00:08:41,400 --> 00:08:43,880 Speaker 2: So how is he going about pulling talent from these 160 00:08:43,880 --> 00:08:45,360 Speaker 2: companies and bringing them to Meta? 161 00:08:45,440 --> 00:08:49,280 Speaker 1: Have you heard of something called money? 162 00:08:49,679 --> 00:08:52,840 Speaker 2: Money, money, money, must be funny, you know. 163 00:08:52,840 --> 00:08:56,920 Speaker 1: In a rich man's world. Over the past few weeks, Zuckerberg 164 00:08:57,000 --> 00:09:00,240 Speaker 1: has reportedly offered pay packages worth more than three 165 00:09:00,360 --> 00:09:02,680 Speaker 1: hundred million dollars over four years. 166 00:09:03,080 --> 00:09:04,680 Speaker 2: That's like pro ball money. 167 00:09:05,080 --> 00:09:07,559 Speaker 1: I mean, this is generational. This is like, your grandkids' 168 00:09:07,640 --> 00:09:10,120 Speaker 1: grandkids will never have to work. 169 00:09:10,480 --> 00:09:12,840 Speaker 2: And also your parents finally being like, you know what, 170 00:09:13,320 --> 00:09:16,440 Speaker 2: you sitting in your room all day paid off. 171 00:09:16,800 --> 00:09:19,679 Speaker 1: Exactly. I'm glad you won the battle to not limit your 172 00:09:19,679 --> 00:09:23,160 Speaker 1: computer time when you were a kid. Also, a lot of 173 00:09:23,160 --> 00:09:27,000 Speaker 1: these offers are so-called exploding offers, meaning they expire 174 00:09:27,040 --> 00:09:30,440 Speaker 1: within a few days, so the existing employers have a 175 00:09:30,520 --> 00:09:33,040 Speaker 1: very hard time developing a counteroffer. 176 00:09:34,160 --> 00:09:36,679 Speaker 2: So, I mean, it makes a lot of sense why 177 00:09:36,760 --> 00:09:39,600 Speaker 2: Silicon Valley is following this so closely. This is 178 00:09:39,640 --> 00:09:43,640 Speaker 2: like a real-time raid, an employee raid, with 179 00:09:44,080 --> 00:09:46,640 Speaker 2: Mark Zuckerberg sort of pulling the marionette strings.
180 00:09:46,960 --> 00:09:51,319 Speaker 1: And it also potentially signals a massive culture shift in 181 00:09:51,360 --> 00:09:54,760 Speaker 1: Silicon Valley, because, as I mentioned, it's a small place. 182 00:09:55,240 --> 00:09:58,560 Speaker 1: It's a place where up until now people have sort 183 00:09:58,559 --> 00:10:02,720 Speaker 1: of had this sense of collegiality, up to a 184 00:10:02,800 --> 00:10:07,160 Speaker 1: certain point, and it kind of rests on this bedrock 185 00:10:07,280 --> 00:10:10,240 Speaker 1: principle of Silicon Valley, which is this mantra: 186 00:10:10,559 --> 00:10:12,319 Speaker 1: be a missionary, not a mercenary. 187 00:10:12,720 --> 00:10:15,800 Speaker 2: That's a shirt that I would see in Times Square. 188 00:10:16,200 --> 00:10:21,240 Speaker 2: But also very startup-y, very, very culty, startup-y. 189 00:10:20,880 --> 00:10:26,000 Speaker 1: Absolutely. And the phrase comes from one of the 190 00:10:26,040 --> 00:10:32,400 Speaker 1: granddaddies of VC, John Doerr, and he apparently told generations 191 00:10:32,559 --> 00:10:36,600 Speaker 1: of entrepreneurs to embrace their inner missionary and start companies where 192 00:10:36,720 --> 00:10:39,640 Speaker 1: quote there's a lust not only for making money, but 193 00:10:39,679 --> 00:10:43,760 Speaker 1: for making meaning, which is the antithesis of mercenary companies, 194 00:10:43,800 --> 00:10:46,880 Speaker 1: where quote the central goal is a lust for making money. 195 00:10:47,440 --> 00:10:51,440 Speaker 1: OpenAI CEO Sam Altman apparently leaned into this idea in 196 00:10:51,600 --> 00:10:56,040 Speaker 1: a Slack message that he shared with researchers amidst Zuckerberg's 197 00:10:56,080 --> 00:10:59,600 Speaker 1: talent raid on the company. According to the Journal, Altman said, quote, 198 00:11:00,080 --> 00:11:03,000 Speaker 1: I am proud of how mission-oriented our industry is 199 00:11:03,080 --> 00:11:06,440 Speaker 1: as a whole. Of course, there will always be some mercenaries. 200 00:11:06,960 --> 00:11:09,839 Speaker 1: Missionaries will beat mercenaries. 201 00:11:09,840 --> 00:11:11,720 Speaker 2: Just every day I'm reminded why I shouldn't be on Slack, and you 202 00:11:11,840 --> 00:11:16,680 Speaker 2: make me use it. But no, I mean, them's fighting words. 203 00:11:17,080 --> 00:11:21,520 Speaker 1: Now, Zuckerberg of course says no, the money helps, but 204 00:11:21,600 --> 00:11:25,920 Speaker 1: actually that's not it. You know, just like in your sports analogy, right, 205 00:11:26,320 --> 00:11:28,480 Speaker 1: of course it helps if you offer an athlete 206 00:11:28,559 --> 00:11:31,360 Speaker 1: hundreds of millions of dollars to play on the team, but 207 00:11:31,360 --> 00:11:32,960 Speaker 1: they also want to play for the winning team, right? 208 00:11:32,960 --> 00:11:37,439 Speaker 1: These people have an eye on legacy. And Zuckerberg says that, 209 00:11:37,480 --> 00:11:39,199 Speaker 1: you know, in this case, it's this kind of similar 210 00:11:39,240 --> 00:11:43,800 Speaker 1: thing where it's the company's investment in computing power which 211 00:11:43,840 --> 00:11:47,160 Speaker 1: is actually what's attracting the talent. More computing power means 212 00:11:47,200 --> 00:11:51,520 Speaker 1: more potential AI breakthroughs.
And just this month, Zuckerberg took 213 00:11:51,559 --> 00:11:54,439 Speaker 1: to his Facebook page and said that the new venture, 214 00:11:54,679 --> 00:11:59,199 Speaker 1: Meta Superintelligence Labs, will have industry-leading levels of compute 215 00:11:59,240 --> 00:12:03,760 Speaker 1: and quote by far the greatest compute per researcher. Zuckerberg 216 00:12:03,840 --> 00:12:08,120 Speaker 1: also created a helpful motion graphic to demonstrate just what 217 00:12:08,160 --> 00:12:10,120 Speaker 1: he means by this. So one of the new data 218 00:12:10,160 --> 00:12:13,000 Speaker 1: centers he's working on is called Hyperion, and to show 219 00:12:13,040 --> 00:12:17,440 Speaker 1: its scale, he took the footprint of Hyperion and overlaid 220 00:12:17,440 --> 00:12:20,400 Speaker 1: it on an image of Manhattan, and it basically is 221 00:12:20,440 --> 00:12:23,600 Speaker 1: the same size as Manhattan. So essentially it 222 00:12:23,640 --> 00:12:27,240 Speaker 1: shows the metaphor of Godzilla's footprint. 223 00:12:27,760 --> 00:12:30,359 Speaker 2: I was going to say, did he invent the footprint 224 00:12:30,440 --> 00:12:33,559 Speaker 2: and then backload the footprint into Hyperion? 225 00:12:34,640 --> 00:12:38,439 Speaker 1: Make it so it has the exact Lego cutout of Manhattan, 226 00:12:38,480 --> 00:12:41,400 Speaker 1: so that we can obscure the island with 227 00:12:41,480 --> 00:12:42,679 Speaker 1: our grandiose dreams. 228 00:12:43,240 --> 00:12:46,319 Speaker 2: So Zuck's pitch is basically, become one of the richest 229 00:12:46,360 --> 00:12:48,200 Speaker 2: people in the world and have access to all the 230 00:12:48,200 --> 00:12:52,040 Speaker 2: best tools to express your vision. I can't imagine how 231 00:12:52,080 --> 00:12:53,199 Speaker 2: Mark Chen is feeling. 232 00:12:52,960 --> 00:12:57,000 Speaker 1: Right now, about the advice he gave to Zuck? That's right. Yeah, 233 00:12:57,200 --> 00:12:59,760 Speaker 1: OpenAI seems to be in a little bit of 234 00:12:59,760 --> 00:13:03,079 Speaker 1: a... Here's what Chen wrote on Slack, according to the Wall 235 00:13:03,120 --> 00:13:07,040 Speaker 1: Street Journal, quote, I feel a visceral feeling right now, 236 00:13:07,360 --> 00:13:10,640 Speaker 1: as if someone had broken into our home and stolen something. 237 00:13:11,080 --> 00:13:15,520 Speaker 1: Please trust that we haven't been sitting idly by. But 238 00:13:15,559 --> 00:13:20,319 Speaker 1: it's not just insurgent behemoths like OpenAI. The other 239 00:13:20,360 --> 00:13:23,200 Speaker 1: interesting part of this story is how companies like Meta, 240 00:13:23,600 --> 00:13:27,760 Speaker 1: Alphabet, and others have been raiding Silicon Valley's top startups 241 00:13:27,960 --> 00:13:31,800 Speaker 1: for their senior leadership. I mentioned Alexandr Wang of Scale 242 00:13:31,880 --> 00:13:35,280 Speaker 1: AI earlier, who's going to be running the superintelligence lab. 243 00:13:35,760 --> 00:13:40,520 Speaker 1: Meta actually paid fourteen billion plus dollars for a forty 244 00:13:40,600 --> 00:13:44,079 Speaker 1: nine percent share in Scale AI and to have Wang 245 00:13:44,360 --> 00:13:46,920 Speaker 1: come and work for them.
And it's an interesting thing 246 00:13:47,040 --> 00:13:51,600 Speaker 1: where these antitrust laws, which were originally created in order 247 00:13:51,640 --> 00:13:56,600 Speaker 1: to prevent monopolistic powers from stifling competition, have created this strange, 248 00:13:56,640 --> 00:14:00,160 Speaker 1: unintended consequence where, in order not to fall foul of 249 00:14:00,160 --> 00:14:04,760 Speaker 1: antitrust law, it is less risky to essentially gut a 250 00:14:04,840 --> 00:14:07,240 Speaker 1: company of its senior leadership, but leave it kind of 251 00:14:07,280 --> 00:14:10,080 Speaker 1: in a half-alive state, than it is just to 252 00:14:10,080 --> 00:14:12,160 Speaker 1: buy the company. I'm not saying that's exactly what's happened 253 00:14:12,200 --> 00:14:15,360 Speaker 1: with Scale AI. It's still functioning, although it has lost a 254 00:14:15,440 --> 00:14:18,480 Speaker 1: number of clients, like Google and others, because they don't 255 00:14:18,480 --> 00:14:20,680 Speaker 1: want to be paying Meta. But again it speaks to 256 00:14:20,720 --> 00:14:23,800 Speaker 1: this culture shift which is going on in Silicon Valley, 257 00:14:23,800 --> 00:14:28,560 Speaker 1: because there is always this implied contract between employees, investors, 258 00:14:28,720 --> 00:14:32,160 Speaker 1: and founders which is being very disturbed by these big 259 00:14:32,200 --> 00:14:35,120 Speaker 1: money moves by Zuck. So it's a talent story, 260 00:14:35,360 --> 00:14:37,640 Speaker 1: but it's also about some pretty fundamental changes to the 261 00:14:37,680 --> 00:14:39,040 Speaker 1: culture of the whole of Silicon Valley. 262 00:14:39,400 --> 00:14:42,960 Speaker 2: All this secrecy and scheming feels like the complete opposite 263 00:14:43,000 --> 00:14:47,040 Speaker 2: of my story, which is all about radical transparency and, 264 00:14:47,560 --> 00:14:49,720 Speaker 2: I would say, the potential pros and cons of our 265 00:14:49,760 --> 00:14:54,120 Speaker 2: overly connected world. Oz, do you share your location with anyone? 266 00:14:54,560 --> 00:14:59,480 Speaker 1: Absolutely not. I hate sharing my location, even with Uber. 267 00:14:59,520 --> 00:15:02,680 Speaker 1: I'm one of those people who turns off 268 00:15:03,240 --> 00:15:04,840 Speaker 1: location services on apps. 269 00:15:05,000 --> 00:15:07,600 Speaker 2: Oh, so you're one of those never-sharers, which means 270 00:15:07,640 --> 00:15:11,360 Speaker 2: you'd be a terrible teenage girl. I, on the other hand, 271 00:15:11,400 --> 00:15:13,520 Speaker 2: would make a wonderful member of Gen Z. 272 00:15:13,760 --> 00:15:14,320 Speaker 1: Tell me more. 273 00:15:14,640 --> 00:15:17,240 Speaker 2: Apparently, there was a poll in twenty twenty two that 274 00:15:17,320 --> 00:15:20,760 Speaker 2: found Gen Z is the most likely generation to say 275 00:15:20,760 --> 00:15:23,280 Speaker 2: it's convenient to share their location, which of course is 276 00:15:23,320 --> 00:15:26,000 Speaker 2: not surprising to me because they're also the generation that 277 00:15:26,040 --> 00:15:30,400 Speaker 2: has had access to location sharing nearly their whole lives. Conversely, 278 00:15:30,960 --> 00:15:33,960 Speaker 2: Millennials, like you, were the most 279 00:15:34,000 --> 00:15:37,440 Speaker 2: opinionated about using the digital tool.
Location sharing started to 280 00:15:37,440 --> 00:15:40,800 Speaker 2: become popular about fifteen years ago, when we were young 281 00:15:40,880 --> 00:15:43,280 Speaker 2: enough to understand the appeal, but old enough to know 282 00:15:43,320 --> 00:15:46,240 Speaker 2: what life was like without, you know, a GPS-enabled 283 00:15:46,240 --> 00:15:48,320 Speaker 2: device on our person at all times. 284 00:15:48,720 --> 00:15:52,600 Speaker 1: Yeah, I mean the idea of being constantly surveilled is 285 00:15:52,680 --> 00:15:55,160 Speaker 1: something our generation has had to come to terms with, 286 00:15:55,960 --> 00:15:58,160 Speaker 1: and we came to terms with it at a pivotal age. 287 00:15:58,560 --> 00:16:01,800 Speaker 1: Our elders, including the now former CEO of the 288 00:16:01,840 --> 00:16:05,280 Speaker 1: AI company Astronomer, might do well if they understood the 289 00:16:05,320 --> 00:16:07,840 Speaker 1: extent to which we are living in a surveillance world. 290 00:16:08,160 --> 00:16:10,760 Speaker 2: You're talking about the former CEO who got caught cheating 291 00:16:10,880 --> 00:16:12,240 Speaker 2: at the Coldplay concert. 292 00:16:12,440 --> 00:16:17,000 Speaker 1: I'm talking about the kiss cam. The question, Karah: when you talk 293 00:16:17,040 --> 00:16:19,800 Speaker 1: about location sharing, are you talking about the Apple feature 294 00:16:19,880 --> 00:16:22,600 Speaker 1: Find My, or are there other tools that I don't 295 00:16:22,640 --> 00:16:24,920 Speaker 1: know about that the younger folks are using? 296 00:16:25,160 --> 00:16:27,480 Speaker 2: That is what we're talking about. Find My is really 297 00:16:27,480 --> 00:16:29,760 Speaker 2: popular because it's free on an Apple device. It's what 298 00:16:29,840 --> 00:16:32,480 Speaker 2: I use, for example. But there's also an app called 299 00:16:32,520 --> 00:16:35,520 Speaker 2: Life three sixty, which, I'm not a parent yet, but it 300 00:16:35,600 --> 00:16:38,840 Speaker 2: works across platforms and is more aimed at parents. It 301 00:16:38,880 --> 00:16:41,440 Speaker 2: can even, and this freaks me out, it can even 302 00:16:41,480 --> 00:16:44,600 Speaker 2: show you how fast your kid is driving in real time. 303 00:16:44,720 --> 00:16:47,040 Speaker 2: So regardless of whether or not we live in a 304 00:16:47,040 --> 00:16:50,240 Speaker 2: surveillance state, your house might be a surveillance state. 305 00:16:50,360 --> 00:16:53,400 Speaker 1: I mean, can you imagine how anxiety-inducing it would 306 00:16:53,440 --> 00:16:57,080 Speaker 1: seem to be able to monitor your kid's speeding at 307 00:16:57,080 --> 00:16:58,760 Speaker 1: all times? I mean, hopefully they don't speed, and you 308 00:16:58,760 --> 00:17:00,320 Speaker 1: can use the app, I guess, to take the car 309 00:17:00,360 --> 00:17:02,880 Speaker 1: away if they're regularly speeding. But basically it is part 310 00:17:02,920 --> 00:17:06,679 Speaker 1: of adolescence to be risking your life on the semi- 311 00:17:06,720 --> 00:17:09,560 Speaker 1: regular there, and to lie, and to lie, yeah, and 312 00:17:09,600 --> 00:17:12,200 Speaker 1: to lie. Man, to be a parent, I mean, gosh. 313 00:17:12,080 --> 00:17:14,040 Speaker 2: To be a parent and to be a kid. I 314 00:17:14,080 --> 00:17:17,080 Speaker 2: think about it all the time. We used to be like, oh yeah, 315 00:17:17,119 --> 00:17:18,760 Speaker 2: I'm just at Leah's house right now,
when, you know, 316 00:17:18,800 --> 00:17:21,840 Speaker 2: I'm like at a hookah bar like two hundred blocks 317 00:17:21,880 --> 00:17:26,960 Speaker 2: away from there. There is one other feature that is really 318 00:17:27,000 --> 00:17:29,960 Speaker 2: interesting that I am a part of, and I want 319 00:17:29,960 --> 00:17:32,640 Speaker 2: to know if you know anything about Snap Map. 320 00:17:33,720 --> 00:17:35,639 Speaker 1: I assume Snap Map is part of Snapchat. 321 00:17:36,080 --> 00:17:38,640 Speaker 2: Yes, yes it is. You know it's funny. I bet 322 00:17:38,640 --> 00:17:41,240 Speaker 2: if LinkedIn had a share-my-location tool, you'd share 323 00:17:41,280 --> 00:17:42,360 Speaker 2: your location on LinkedIn. 324 00:17:43,440 --> 00:17:44,240 Speaker 1: I'm hardcore on this. 325 00:17:44,960 --> 00:17:48,600 Speaker 2: So Snap Map is a location-sharing feature through Snapchat that 326 00:17:48,640 --> 00:17:52,640 Speaker 2: was launched in twenty seventeen. As of May, Snap Map had 327 00:17:52,640 --> 00:17:56,639 Speaker 2: more than four hundred million monthly users. So if you 328 00:17:56,720 --> 00:18:00,239 Speaker 2: opt in, all the people you follow on Snapchat can 329 00:18:00,280 --> 00:18:02,160 Speaker 2: see your location the minute you open the app. 330 00:18:02,240 --> 00:18:05,399 Speaker 1: Okay, so Life three sixty is obviously like a different 331 00:18:05,440 --> 00:18:08,880 Speaker 1: kettle of fish. This is like a monitoring tool. How 332 00:18:08,920 --> 00:18:11,359 Speaker 1: would you characterize how you use Find My Friends on 333 00:18:11,400 --> 00:18:13,639 Speaker 1: your iPhone differently from Snap Map? 334 00:18:14,160 --> 00:18:18,160 Speaker 2: So, I don't really use Snap Map, and I don't 335 00:18:18,200 --> 00:18:21,240 Speaker 2: really use Find My, but here's, and this is 336 00:18:21,320 --> 00:18:25,840 Speaker 2: very indicative of my personality: I passively let people follow 337 00:18:25,880 --> 00:18:28,440 Speaker 2: where I am because they want to know. I don't 338 00:18:28,480 --> 00:18:31,000 Speaker 2: care where people are, you know. But there is a 339 00:18:31,160 --> 00:18:34,320 Speaker 2: strange feeling you get when a friend tells you where 340 00:18:34,359 --> 00:18:37,280 Speaker 2: you are. They'll be like, you're at such and such, 341 00:18:37,480 --> 00:18:40,280 Speaker 2: and I'm like, I am? How do you know that 342 00:18:40,320 --> 00:18:42,560 Speaker 2: I'm at such and such? It's just this very 343 00:18:42,600 --> 00:18:45,399 Speaker 2: weird feeling to have someone who you're very close to 344 00:18:45,640 --> 00:18:48,000 Speaker 2: know where you are. I don't get in trouble with it, 345 00:18:48,040 --> 00:18:50,800 Speaker 2: but you can imagine, if you're at a Coldplay concert, 346 00:18:51,760 --> 00:18:52,840 Speaker 2: you might get into trouble. 347 00:18:54,320 --> 00:18:56,160 Speaker 1: One of the things that strikes me here is that 348 00:18:56,600 --> 00:19:00,960 Speaker 1: at the very heart of our culture is FOMO, and 349 00:19:01,040 --> 00:19:04,840 Speaker 1: I'm wondering what it might feel like to open your 350 00:19:04,880 --> 00:19:09,119 Speaker 1: Snap Map or your Find My Friends and see that 351 00:19:10,040 --> 00:19:12,160 Speaker 1: you were the only person who didn't get the call-up. 352 00:19:13,600 --> 00:19:16,359 Speaker 2: You know, the first cut is the deepest.
In this case, 353 00:19:16,400 --> 00:19:18,119 Speaker 2: it used to be that you heard about a party 354 00:19:18,240 --> 00:19:20,440 Speaker 2: after the weekend was over. Now you can see it 355 00:19:20,480 --> 00:19:22,440 Speaker 2: with your own eyes, you know. I actually read 356 00:19:22,440 --> 00:19:26,240 Speaker 2: about this exact thing in SFGATE. I found this 357 00:19:26,359 --> 00:19:28,840 Speaker 2: article that was written by, and it had to have 358 00:19:28,840 --> 00:19:32,160 Speaker 2: been written by, the site's editorial intern, because everybody 359 00:19:32,200 --> 00:19:34,879 Speaker 2: there is probably a bit older. But this editorial intern, 360 00:19:35,320 --> 00:19:37,919 Speaker 2: you know, is Gen Z, and she recounted calling 361 00:19:37,960 --> 00:19:41,760 Speaker 2: her mom in tears after seeing her best college friends 362 00:19:41,800 --> 00:19:44,640 Speaker 2: descend on the campus dining hall, and she was crying 363 00:19:44,720 --> 00:19:47,800 Speaker 2: because they had not extended an invitation to her. This 364 00:19:47,840 --> 00:19:51,240 Speaker 2: formative memory hasn't deterred her from sharing her location 365 00:19:51,400 --> 00:19:54,480 Speaker 2: with nearly twenty five people, though she often wonders if 366 00:19:54,480 --> 00:19:55,760 Speaker 2: this practice is healthy. 367 00:19:56,400 --> 00:19:59,240 Speaker 1: I also question if this practice is healthy. Hence, I 368 00:19:59,280 --> 00:19:59,800 Speaker 1: don't do it. 369 00:20:00,280 --> 00:20:02,040 Speaker 2: Even though I do it, it is really weird, and 370 00:20:02,080 --> 00:20:05,520 Speaker 2: it's actually something psychologists are debating. I found this article 371 00:20:06,000 --> 00:20:09,960 Speaker 2: on Psychology Today, which tried to outline the impact location- 372 00:20:10,080 --> 00:20:13,680 Speaker 2: sharing apps have on trust and relationships. Someone named doctor 373 00:20:13,720 --> 00:20:17,200 Speaker 2: Pamela Rutledge acknowledges that location sharing is actually a sign 374 00:20:17,280 --> 00:20:20,680 Speaker 2: of closeness among Gen Z friends, and that tracking friends 375 00:20:20,760 --> 00:20:24,480 Speaker 2: can create this sort of ambient awareness that makes people 376 00:20:24,520 --> 00:20:29,840 Speaker 2: feel connected, comforted, and supported, albeit digitally supported, whatever that means. 377 00:20:30,000 --> 00:20:34,960 Speaker 1: Yeah, it's one of those classic social media double-edged swords, right? 378 00:20:35,080 --> 00:20:39,919 Speaker 1: Like, yes, it is the platform technology that, 379 00:20:40,000 --> 00:20:43,639 Speaker 1: if you're not using it, you feel profoundly disconnected, but it 380 00:20:43,680 --> 00:20:45,960 Speaker 1: can also in and of itself be a driver of 381 00:20:46,040 --> 00:20:47,720 Speaker 1: loneliness, isolation, alienation. 382 00:20:48,119 --> 00:20:50,640 Speaker 2: Yeah, you know, as we pointed out, location sharing can 383 00:20:50,680 --> 00:20:54,640 Speaker 2: create major FOMO. But beyond the feeling of being left out, 384 00:20:55,640 --> 00:20:59,439 Speaker 2: the mutual visibility can also make, I think, teenagers, and 385 00:20:59,520 --> 00:21:04,000 Speaker 2: really anyone, feel like they have to look busy, 386 00:21:04,119 --> 00:21:09,200 Speaker 2: which can lead to a kind of busyness to avoid judgment.
387 00:21:09,320 --> 00:21:12,280 Speaker 2: And I just think this idea of like performing busyness 388 00:21:12,320 --> 00:21:14,679 Speaker 2: because you're so aware of what other people are doing 389 00:21:15,240 --> 00:21:16,600 Speaker 2: is very strange. 390 00:21:16,600 --> 00:21:19,080 Speaker 1: Well, I mean, that's the trope of the corporate office 391 00:21:19,119 --> 00:21:23,000 Speaker 1: in the pre-COVID era, right? I mean, that was 392 00:21:23,080 --> 00:21:24,840 Speaker 1: kind of the plot of The Office: trying to 393 00:21:24,840 --> 00:21:28,200 Speaker 1: figure out how to look busy at work. But having 394 00:21:28,240 --> 00:21:30,480 Speaker 1: to figure out how to look busy in your social life? 395 00:21:30,520 --> 00:21:33,040 Speaker 1: I mean, man, that's the money. It makes me think 396 00:21:33,080 --> 00:21:36,080 Speaker 1: maybe you could slip an AirTag into the most 397 00:21:36,080 --> 00:21:38,760 Speaker 1: popular kid at school's backpack. 398 00:21:38,800 --> 00:21:41,000 Speaker 2: Everywhere. On their car, their car. 399 00:21:41,200 --> 00:21:42,880 Speaker 1: Everyone would be like, oh my god, look who they're hanging 400 00:21:42,880 --> 00:21:44,959 Speaker 1: out with, I'm so jealous. Yeah, I mean. 401 00:21:45,000 --> 00:21:47,919 Speaker 2: Also, imagine if you feel close to someone, or are in 402 00:21:48,000 --> 00:21:51,800 Speaker 2: like a romantic or familial relationship with someone, and they 403 00:21:51,880 --> 00:21:55,320 Speaker 2: just cold turkey stop sharing their location with you. I mean, 404 00:21:55,359 --> 00:21:57,920 Speaker 2: it would be hard not to feel suspicious, especially because 405 00:21:57,960 --> 00:22:01,080 Speaker 2: it is a visible action. Because with Find My, people are 406 00:22:01,080 --> 00:22:04,159 Speaker 2: alerted if you stop sharing your location with them, and 407 00:22:04,480 --> 00:22:07,399 Speaker 2: this can lead to conflict. One of the things that 408 00:22:07,440 --> 00:22:10,280 Speaker 2: I found interesting that we've talked about is how these 409 00:22:10,320 --> 00:22:14,439 Speaker 2: tools can subtly erode our own capabilities. You know, we 410 00:22:14,520 --> 00:22:16,840 Speaker 2: talked about this a few weeks ago, and I'm obsessed 411 00:22:16,880 --> 00:22:20,240 Speaker 2: with this idea of racking up cognitive debt from outsourcing 412 00:22:20,240 --> 00:22:23,920 Speaker 2: too many tasks to chatbots. And there is actually research 413 00:22:24,880 --> 00:22:29,320 Speaker 2: to suggest that constant surveillance actually impacts our sensory perception 414 00:22:29,680 --> 00:22:33,679 Speaker 2: and unconscious cognitive functions such as memory and attention. And 415 00:22:33,760 --> 00:22:37,480 Speaker 2: going back to what doctor Rutledge said in Psychology Today, quote, 416 00:22:38,000 --> 00:22:40,639 Speaker 2: if we are training our brains to operate under the 417 00:22:40,680 --> 00:22:44,800 Speaker 2: assumption of constant surveillance, what are the potential implications for 418 00:22:44,880 --> 00:22:49,320 Speaker 2: things like creativity, risk-taking, and authentic self-expression? You know, 419 00:22:49,359 --> 00:22:51,119 Speaker 2: it's an interesting thing to think about. 420 00:22:57,960 --> 00:23:15,920 Speaker 1: After the break: rabbits. Stay with us. Welcome back. We've 421 00:23:15,920 --> 00:23:17,920 Speaker 1: got a few more headlines for you this week.
422 00:23:17,800 --> 00:23:20,560 Speaker 2: And then a story about a woman who used Chat 423 00:23:20,600 --> 00:23:23,840 Speaker 2: GPT to help her figure out one of motherhood's biggest challenges. 424 00:23:24,040 --> 00:23:27,320 Speaker 1: But in the meantime, Karah, you've been quite interested in 425 00:23:27,359 --> 00:23:30,920 Speaker 1: the results of a world coding championship. 426 00:23:31,119 --> 00:23:34,960 Speaker 2: That's right, a programmer named Przemyslaw Debiak. He goes by 427 00:23:35,160 --> 00:23:39,080 Speaker 2: Psyho on X. So, like psycho? Yes, sort of like that, 428 00:23:39,119 --> 00:23:43,600 Speaker 2: but without the C. Psyho has postponed our inevitable replacement 429 00:23:43,680 --> 00:23:46,760 Speaker 2: by AI just a little while longer. He was the 430 00:23:46,760 --> 00:23:50,120 Speaker 2: only coder to beat out a machine at the At 431 00:23:50,200 --> 00:23:52,840 Speaker 2: Coder World Tour Finals. I'm surprised you weren't there, Oz. 432 00:23:53,480 --> 00:23:56,080 Speaker 2: Psyho used to work at OpenAI, but went head 433 00:23:56,119 --> 00:23:59,320 Speaker 2: to head with the company's custom AI model built specifically 434 00:23:59,359 --> 00:24:02,960 Speaker 2: for this competition. Ars Technica says that this might be 435 00:24:03,000 --> 00:24:07,000 Speaker 2: the first time an AI model has competed directly against 436 00:24:07,080 --> 00:24:09,280 Speaker 2: human programmers in a competition like this. 437 00:24:09,520 --> 00:24:13,160 Speaker 1: And evidently Psyho was the only one who did 438 00:24:13,200 --> 00:24:16,440 Speaker 1: better than the OpenAI model, meaning that it did better 439 00:24:16,480 --> 00:24:18,160 Speaker 1: than all the other humans. 440 00:24:18,480 --> 00:24:21,479 Speaker 2: Psyho actually said that the ten hour competition left him 441 00:24:21,520 --> 00:24:24,480 Speaker 2: quote completely exhausted and that by the end he was 442 00:24:24,560 --> 00:24:26,080 Speaker 2: quote barely alive. 443 00:24:26,800 --> 00:24:28,600 Speaker 1: Why is this story getting so much attention? 444 00:24:28,960 --> 00:24:31,400 Speaker 2: I was actually reading a couple of Reddit threads, as 445 00:24:31,440 --> 00:24:34,200 Speaker 2: I do, and one of the main things it seemed 446 00:24:34,280 --> 00:24:37,720 Speaker 2: like people were saying is that there's something inspiring about 447 00:24:37,760 --> 00:24:39,840 Speaker 2: a human beating a machine. 448 00:24:40,000 --> 00:24:40,280 Speaker 1: Right. 449 00:24:40,359 --> 00:24:42,720 Speaker 2: People were having fun comparing this to the time Ken 450 00:24:42,760 --> 00:24:46,359 Speaker 2: Jennings beat Watson on Jeopardy, or chess grandmasters who beat 451 00:24:46,400 --> 00:24:50,240 Speaker 2: early computers. But this being Reddit, the people who were 452 00:24:50,280 --> 00:24:53,440 Speaker 2: really excited about this were met with reminders from other 453 00:24:53,520 --> 00:24:56,320 Speaker 2: commenters that nobody has beat a computer at chess in 454 00:24:56,359 --> 00:25:00,000 Speaker 2: twenty years, and even Psyho, who won this coding competition, 455 00:25:00,080 --> 00:25:03,240 Speaker 2: admitted as much in a tweet after the competition. 456 00:25:03,320 --> 00:25:08,480 Speaker 2: He wrote, quote, humanity has prevailed, for now. You know, it's 457 00:25:08,560 --> 00:25:11,239 Speaker 2: nice to know that we're not all getting replaced by 458 00:25:11,320 --> 00:25:14,800 Speaker 2: robots just yet.
And there's something about clinging to this 459 00:25:14,880 --> 00:25:18,520 Speaker 2: false hope despite the inevitability of defeat in the future 460 00:25:18,640 --> 00:25:22,600 Speaker 2: that is just so deeply human. I think the "for 461 00:25:22,800 --> 00:25:26,639 Speaker 2: now" in parentheses is sort of a nod to what 462 00:25:26,880 --> 00:25:27,919 Speaker 2: everybody is fearing. 463 00:25:28,320 --> 00:25:31,680 Speaker 1: Psyho has a well-developed sense of irony, but I 464 00:25:31,720 --> 00:25:35,000 Speaker 1: personally think that human ingenuity still has a lot to offer. 465 00:25:35,600 --> 00:25:37,920 Speaker 1: For evidence, I would like to offer up some breaking 466 00:25:38,000 --> 00:25:42,000 Speaker 1: news from the world of robotics and the research team 467 00:25:42,160 --> 00:25:46,760 Speaker 1: at the University of Florida, who've built robotic bunny rabbits, 468 00:25:47,200 --> 00:25:48,120 Speaker 1: forty of them, I mean. 469 00:25:48,160 --> 00:25:50,439 Speaker 2: The thing I'm obsessed with is, why would anybody 470 00:25:50,560 --> 00:25:50,840 Speaker 2: do this? 471 00:25:51,480 --> 00:25:54,360 Speaker 1: Have you ever watched one of my favorite late- 472 00:25:54,480 --> 00:25:58,200 Speaker 1: night cable television programs, Python Hunters? 473 00:25:59,119 --> 00:26:02,119 Speaker 2: No, but I do follow the Lady Python Hunters of 474 00:26:02,160 --> 00:26:05,880 Speaker 2: the Everglades. I don't follow them on any social media. 475 00:26:05,960 --> 00:26:08,480 Speaker 2: I just keep up with their activities. 476 00:26:08,560 --> 00:26:11,240 Speaker 1: Well, they may soon be put out of business by 477 00:26:11,480 --> 00:26:16,359 Speaker 1: robotic bunny rabbits. Python Hunters, and your friends the Lady 478 00:26:16,359 --> 00:26:20,120 Speaker 1: Python Hunters: these are the stories of the brave men 479 00:26:20,160 --> 00:26:23,960 Speaker 1: and women who hunt Burmese pythons through the Florida Everglades, 480 00:26:24,520 --> 00:26:27,840 Speaker 1: often by night. Now, the pythons are an invasive species that's 481 00:26:27,880 --> 00:26:30,679 Speaker 1: been plaguing the Everglades for years, and apparently they can survive 482 00:26:30,720 --> 00:26:34,399 Speaker 1: as far north these days as Georgia, migrating north 483 00:26:34,440 --> 00:26:38,399 Speaker 1: from the Glades. But the question has arisen: why hunt 484 00:26:38,400 --> 00:26:41,520 Speaker 1: for pythons if you can get the pythons to come 485 00:26:41,560 --> 00:26:46,760 Speaker 1: to you? Enter robo-bunnies. These bunnies have motors to 486 00:26:46,800 --> 00:26:50,200 Speaker 1: move around and small heaters inside them, so the snakes think 487 00:26:50,200 --> 00:26:52,840 Speaker 1: they're alive, and they spin and shake a bit like 488 00:26:52,880 --> 00:26:56,280 Speaker 1: real rabbits do, and yes, they look pretty cute. And 489 00:26:56,680 --> 00:26:59,399 Speaker 1: scientists have been studying the use of rabbits as bait 490 00:27:00,040 --> 00:27:04,159 Speaker 1: to attract and then catch, remove, and euthanize these pythons 491 00:27:04,400 --> 00:27:07,199 Speaker 1: for more than a decade, but keeping the rabbits alive 492 00:27:07,320 --> 00:27:10,320 Speaker 1: and in the same place in multiple locations throughout the 493 00:27:10,320 --> 00:27:13,520 Speaker 1: Everglades was just too much work.
The robo-bunnies, of course, 494 00:27:13,600 --> 00:27:17,040 Speaker 1: don't require so much maintenance, and they don't have to 495 00:27:17,040 --> 00:27:20,399 Speaker 1: get eaten in the process, so they're reusable. Scientists are 496 00:27:20,400 --> 00:27:24,760 Speaker 1: hoping that catching pythons with these robo-bunnies will help address 497 00:27:24,800 --> 00:27:27,359 Speaker 1: population decline in a whole bunch of species in the 498 00:27:27,359 --> 00:27:31,600 Speaker 1: Everglades that have been terrorized by the invasive pythons for decades. 499 00:27:32,240 --> 00:27:35,119 Speaker 1: And if the robots don't work in this experiment, they 500 00:27:35,119 --> 00:27:37,919 Speaker 1: already have a hypothesis on how to make them more attractive, 501 00:27:38,280 --> 00:27:40,840 Speaker 1: which is smearing them with real rabbit pheromones. 502 00:27:42,320 --> 00:27:44,359 Speaker 2: I just love the idea of pythons going back to 503 00:27:44,400 --> 00:27:47,000 Speaker 2: the community and being like, don't trust them, they're not real. 504 00:27:48,400 --> 00:27:52,200 Speaker 1: There is a symbiosis between the python hunters and the pythons, 505 00:27:52,240 --> 00:27:55,880 Speaker 1: an ecosystem which these robotic bunnies may be about to disrupt. 506 00:27:56,080 --> 00:27:59,080 Speaker 2: Yes. Next: are you trying to get more hot people 507 00:27:59,119 --> 00:28:01,280 Speaker 2: to visit your business? Because there's an app for that. 508 00:28:01,359 --> 00:28:05,159 Speaker 2: It's called Neon Coat, and it allows business owners to 509 00:28:05,280 --> 00:28:08,760 Speaker 2: offer their services for free as long as you post 510 00:28:08,800 --> 00:28:12,520 Speaker 2: about it on social media. The prerequisite: you gotta be hot. 511 00:28:13,560 --> 00:28:17,359 Speaker 2: Businesses on the app offer everything from meals at fancy restaurants 512 00:28:17,400 --> 00:28:21,720 Speaker 2: to fitness classes, salon appointments, and even tarot readings. The 513 00:28:21,760 --> 00:28:23,960 Speaker 2: app was recently written up in The Wall Street Journal, 514 00:28:23,960 --> 00:28:27,560 Speaker 2: and it was founded by Larissa Drekonja, who is, you 515 00:28:27,680 --> 00:28:31,480 Speaker 2: guessed it, a former model and actress. She says she 516 00:28:31,640 --> 00:28:34,400 Speaker 2: hopes the app can help young models have more control. 517 00:28:34,760 --> 00:28:36,520 Speaker 2: When she first came to New York City from a 518 00:28:36,560 --> 00:28:39,560 Speaker 2: small town in Slovenia, the stipend her agency gave her 519 00:28:39,720 --> 00:28:42,680 Speaker 2: was only three hundred dollars a week. Most of her 520 00:28:42,720 --> 00:28:45,200 Speaker 2: perks and social plans had to be booked through her agency. 521 00:28:45,600 --> 00:28:48,240 Speaker 2: She says that letting models and influencers get their own 522 00:28:48,360 --> 00:28:52,880 Speaker 2: perks in exchange for social media posts gives them more autonomy. 523 00:28:53,600 --> 00:28:57,000 Speaker 1: The road to heaven is paved with free stuff. 524 00:28:57,160 --> 00:29:00,800 Speaker 2: Indeed, the app is available where models are: New York, 525 00:29:00,880 --> 00:29:04,600 Speaker 2: Los Angeles, London, and Miami, and there are currently over 526 00:29:04,720 --> 00:29:09,920 Speaker 2: twelve thousand users and fifteen hundred businesses.
So far, app 527 00:29:10,040 --> 00:29:13,240 Speaker 2: usage has reportedly led to over three hundred and fifty 528 00:29:13,440 --> 00:29:17,560 Speaker 2: thousand social media posts. Don't get too excited. If you're 529 00:29:17,600 --> 00:29:20,200 Speaker 2: looking to join, you've got to have at least one 530 00:29:20,200 --> 00:29:23,920 Speaker 2: thousand Instagram followers if you're a model, and five thousand 531 00:29:24,040 --> 00:29:26,400 Speaker 2: if you're an influencer, and you've got to have high 532 00:29:26,440 --> 00:29:27,240 Speaker 2: engagement rates. 533 00:29:27,280 --> 00:29:30,800 Speaker 1: Well, Karah, you actually have more than five thousand Instagram followers, 534 00:29:31,280 --> 00:29:31,640 Speaker 1: and I do. 535 00:29:31,760 --> 00:29:32,960 Speaker 2: But am I hot enough? 536 00:29:33,120 --> 00:29:35,080 Speaker 1: Well, I think you should apply and report back on 537 00:29:35,120 --> 00:29:38,040 Speaker 1: next week's show whether or not you've become a member 538 00:29:38,080 --> 00:29:40,240 Speaker 1: of Neon Coat's free stuff army. 539 00:29:40,440 --> 00:29:41,840 Speaker 2: I think I'm actually going to do that. I should 540 00:29:41,840 --> 00:29:52,400 Speaker 2: do that, I am. I am. 541 00:29:53,560 --> 00:29:56,320 Speaker 1: Now it's time for Chat and Me, a new segment 542 00:29:56,440 --> 00:29:58,880 Speaker 1: about how people are really using chatbots. 543 00:29:59,400 --> 00:30:01,600 Speaker 2: This week, I have a story from my friend who 544 00:30:01,640 --> 00:30:05,640 Speaker 2: recently gave birth to her son, James, and she told 545 00:30:05,640 --> 00:30:10,480 Speaker 2: me she's been using ChatGPT as a lactation consultant. 546 00:30:10,920 --> 00:30:14,280 Speaker 1: I would need ChatGPT to know what a lactation 547 00:30:14,600 --> 00:30:17,560 Speaker 1: consultant is. Well, I guess that's not true. I can 548 00:30:17,640 --> 00:30:21,080 Speaker 1: kind of imagine what it is. But is it ubiquitous? 549 00:30:21,160 --> 00:30:22,840 Speaker 1: Is it expensive? I mean, tell me a little bit 550 00:30:22,880 --> 00:30:25,560 Speaker 1: about the world of lactation consultants. 551 00:30:25,880 --> 00:30:28,640 Speaker 2: It is expensive. It can cost hundreds of dollars per visit. 552 00:30:28,760 --> 00:30:33,120 Speaker 2: And the US actually has a relatively low breastfeeding rate, 553 00:30:33,480 --> 00:30:37,120 Speaker 2: not because American women aren't motivated. Sixty percent of mothers 554 00:30:37,120 --> 00:30:42,040 Speaker 2: report they stop breastfeeding earlier than intended because they lack 555 00:30:42,080 --> 00:30:45,720 Speaker 2: support and education. And so, to chat. Yeah, you know, figuring 556 00:30:45,760 --> 00:30:49,280 Speaker 2: out a pumping schedule is more complicated than I realized. 557 00:30:49,280 --> 00:30:52,480 Speaker 2: So here, listen to how my friend used ChatGPT to 558 00:30:52,840 --> 00:30:53,800 Speaker 2: solve the problem. 559 00:30:54,280 --> 00:30:58,320 Speaker 3: I basically need to keep up with James's supply. He's 560 00:30:58,400 --> 00:31:03,680 Speaker 3: nine months, requiring more milk, and at about seven months, 561 00:31:03,680 --> 00:31:05,800 Speaker 3: I had to start waking up again to pump through 562 00:31:05,800 --> 00:31:10,320 Speaker 3: the night because he needs more milk throughout the day.
563 00:31:10,400 --> 00:31:13,120 Speaker 3: So if I don't pump at night, then my body 564 00:31:13,160 --> 00:31:15,960 Speaker 3: signals basically that I don't need as much milk. 565 00:31:16,680 --> 00:31:21,040 Speaker 1: Man, having to wake up every night to ensure your 566 00:31:21,120 --> 00:31:24,840 Speaker 1: kid has enough milk for the next day. This is 567 00:31:24,960 --> 00:31:25,720 Speaker 1: very stressful. 568 00:31:25,960 --> 00:31:30,840 Speaker 2: It is very stressful. And she actually said that ChatGPT 569 00:31:31,120 --> 00:31:33,600 Speaker 2: has been helping her figure out how to stay on track. 570 00:31:34,080 --> 00:31:38,200 Speaker 3: ChatGPT gave me an entire pumping schedule, how much I 571 00:31:38,240 --> 00:31:41,640 Speaker 3: should pump, talking to me about power pumping, talking to 572 00:31:41,680 --> 00:31:43,160 Speaker 3: me throughout the night. 573 00:31:43,080 --> 00:31:44,720 Speaker 1: What does she mean, talking to her? 574 00:31:44,960 --> 00:31:48,880 Speaker 2: So she actually said that chat, which is her shorthand 575 00:31:48,880 --> 00:31:53,280 Speaker 2: for ChatGPT, was really a support system for her 576 00:31:53,480 --> 00:31:55,680 Speaker 2: during her nightly pumping ritual, and still is. 577 00:31:56,240 --> 00:31:58,560 Speaker 3: Basically, I would wake up and talk to chat and 578 00:31:58,600 --> 00:32:00,840 Speaker 3: be like, hey, I'm pumping now, and they're like, how's 579 00:32:00,840 --> 00:32:02,760 Speaker 3: it going, and I'm like talking to them like they're 580 00:32:02,800 --> 00:32:07,320 Speaker 3: my friend, and also telling ChatGPT how many ounces 581 00:32:07,360 --> 00:32:12,720 Speaker 3: I'm getting, and ChatGPT telling me that basically that's good, 582 00:32:13,120 --> 00:32:16,360 Speaker 3: and I should pump again. And here's what I should 583 00:32:16,440 --> 00:32:19,960 Speaker 3: eat based on what I normally eat. So he knows 584 00:32:20,000 --> 00:32:22,760 Speaker 3: what I eat, basically. Like, I tell him my yogurt, 585 00:32:23,400 --> 00:32:26,440 Speaker 3: chia seeds, blueberries, raspberries, et cetera, et cetera. 586 00:32:26,840 --> 00:32:28,440 Speaker 1: It's helping her plan her meals too. 587 00:32:28,800 --> 00:32:30,880 Speaker 2: Yes, she tells it what she's eaten, and it takes 588 00:32:30,920 --> 00:32:36,120 Speaker 2: into account all the nutritional information, timing, everything, and then he 589 00:32:36,320 --> 00:32:40,840 Speaker 3: gives me a tailored meal plan to keep up with supply. 590 00:32:41,160 --> 00:32:44,160 Speaker 3: So like what I should eat, how I should eat, 591 00:32:44,360 --> 00:32:47,120 Speaker 3: when I should eat bananas after I pump in the 592 00:32:47,200 --> 00:32:49,720 Speaker 3: middle of the night. Some nights, yes, some nights no. 593 00:32:49,960 --> 00:32:50,840 Speaker 3: It's kind of crazy. 594 00:32:51,280 --> 00:32:53,040 Speaker 1: This sounds like it's been kind of a game changer 595 00:32:53,040 --> 00:32:54,000 Speaker 1: for your friend. 596 00:32:54,040 --> 00:32:56,040 Speaker 2: A huge game changer for her. And I think what's really 597 00:32:56,080 --> 00:33:00,200 Speaker 2: interesting to note about this friend of mine is that 598 00:33:00,720 --> 00:33:03,840 Speaker 2: she is by no means a technologist.
But I think 599 00:33:04,480 --> 00:33:09,120 Speaker 2: the ubiquity of ChatGPT is actually 600 00:33:09,160 --> 00:33:14,320 Speaker 2: affecting people who really use technology for work as much 601 00:33:14,320 --> 00:33:18,520 Speaker 2: as it's affecting this sort of average daily phone user. 602 00:33:18,640 --> 00:33:22,120 Speaker 2: And I think it was just this really brilliant case 603 00:33:22,360 --> 00:33:26,760 Speaker 2: of the way ChatGPT can not only give you information 604 00:33:26,960 --> 00:33:29,080 Speaker 2: that you would have to pay a lot of money for, 605 00:33:29,640 --> 00:33:33,440 Speaker 2: but also act as this sort of cheerleader, to 606 00:33:33,480 --> 00:33:36,480 Speaker 2: the point that she calls ChatGPT "he," which is 607 00:33:36,560 --> 00:33:38,280 Speaker 2: interesting for a lactation consultant. 608 00:33:38,720 --> 00:33:41,440 Speaker 1: Well, I love hearing these stories. I love Chat and Me 609 00:33:41,640 --> 00:33:45,640 Speaker 1: as a segment that you pioneered, and it's been fun 610 00:33:45,760 --> 00:33:48,160 Speaker 1: hearing from two of your friends in two consecutive weeks. 611 00:33:48,480 --> 00:33:50,560 Speaker 1: I do want to hear from our listeners. So if 612 00:33:50,560 --> 00:33:54,760 Speaker 1: you've found yourself turning to ChatGPT, Grok, Claude, Gemini, 613 00:33:55,240 --> 00:33:57,280 Speaker 1: or any other chatbot to help you with an unusual 614 00:33:57,400 --> 00:34:01,480 Speaker 1: task or to answer one of life's complicated questions, please 615 00:34:01,520 --> 00:34:03,640 Speaker 1: tell us about it. Send us a one to two 616 00:34:03,640 --> 00:34:07,320 Speaker 1: minute voice note to TechStuff podcast at gmail dot com. 617 00:34:07,680 --> 00:34:10,000 Speaker 2: The more details you provide, the better. We are nosy 618 00:34:10,080 --> 00:34:12,880 Speaker 2: and we want to understand how AI is changing your lives. 619 00:34:12,920 --> 00:34:14,560 Speaker 1: And if you send in a story that we use, 620 00:34:14,760 --> 00:34:16,840 Speaker 1: we'll send you a free T-shirt. 621 00:34:35,080 --> 00:34:37,439 Speaker 2: That's it for this week for TechStuff. I'm Karah 622 00:34:37,520 --> 00:34:38,239 Speaker 2: Preiss, and. 623 00:34:38,239 --> 00:34:41,520 Speaker 1: I'm Oz Woloshyn. This episode was produced by Eliza Dennis 624 00:34:41,560 --> 00:34:44,839 Speaker 1: and Tyler Hill. It was executive produced by me, Karah 625 00:34:44,960 --> 00:34:48,640 Speaker 1: Preiss, and Kate Osborne for Kaleidoscope and Katrina Norvell for 626 00:34:48,719 --> 00:34:52,840 Speaker 1: iHeart Podcasts. The engineers are Beheid Fraser and Tom Sitchell. 627 00:34:53,440 --> 00:34:56,520 Speaker 1: Jack Insley mixed this episode, and Kyle Murdoch wrote our 628 00:34:56,560 --> 00:34:57,040 Speaker 1: theme song. 629 00:34:57,400 --> 00:35:00,279 Speaker 2: Join us next Wednesday for TechStuff: The Story, when we 630 00:35:00,320 --> 00:35:02,880 Speaker 2: will share an in-depth conversation about the future of 631 00:35:02,880 --> 00:35:03,960 Speaker 2: cancer surgery. 632 00:35:04,120 --> 00:35:06,879 Speaker 1: Please rate, review, and reach out to us at Tech 633 00:35:06,920 --> 00:35:09,560 Speaker 1: Stuff podcast at gmail dot com. We want to hear 634 00:35:09,600 --> 00:35:09,960 Speaker 1: from you.