1 00:00:07,880 --> 00:00:18,040 Speaker 1: Guys, from Kaleidoscope and iHeart Podcasts, this is Tech Stuff. 2 00:00:18,200 --> 00:00:21,160 Speaker 1: I'm Oz Woloshyn and I'm Cara Price. Today we've got 3 00:00:21,200 --> 00:00:24,680 Speaker 1: two big stories to break down for you. First, could 4 00:00:24,800 --> 00:00:28,000 Speaker 1: China unseat the US in more than just AI and 5 00:00:28,080 --> 00:00:33,199 Speaker 1: catch up in pharmaceuticals too? Then, inmates' phone calls with 6 00:00:33,280 --> 00:00:36,600 Speaker 1: loved ones are being used to train AI that then 7 00:00:36,680 --> 00:00:38,080 Speaker 1: monitors their behavior. 8 00:00:38,520 --> 00:00:40,360 Speaker 2: Then we'll tell you about a few other stories that 9 00:00:40,440 --> 00:00:43,080 Speaker 2: caught our eye this week, like how the UK police 10 00:00:43,120 --> 00:00:46,919 Speaker 2: want to cross reference CCTV footage with government databases, and 11 00:00:47,280 --> 00:00:51,920 Speaker 2: how the youngest female self made billionaire made all her money. Finally, 12 00:00:52,000 --> 00:00:55,160 Speaker 2: we discuss giant blueberries and why the company selling 13 00:00:55,160 --> 00:00:57,600 Speaker 2: them is valued at a billion dollars. 14 00:00:58,160 --> 00:01:01,400 Speaker 3: Then on ChatGP-Me, there was concern, you know, 15 00:01:01,440 --> 00:01:04,559 Speaker 3: when two nuclear armed states are on the brink that way, 16 00:01:04,640 --> 00:01:08,080 Speaker 3: that this could escalate and explode not just regionally but internationally. 17 00:01:08,640 --> 00:01:12,240 Speaker 1: All of that on The Week in Tech. It's Friday, December twelfth. 18 00:01:14,319 --> 00:01:18,480 Speaker 2: Hello Cara. Hi Oz. Do you follow Art Basel? 19 00:01:18,800 --> 00:01:22,720 Speaker 1: I've never been to Miami, I know, but I'm fascinated from afar. 20 00:01:22,880 --> 00:01:23,959 Speaker 1: It's like the most hype. 21 00:01:24,280 --> 00:01:27,760 Speaker 2: It is the hype beast event of any time in 22 00:01:28,200 --> 00:01:28,880 Speaker 2: human history. 23 00:01:29,040 --> 00:01:31,320 Speaker 1: Last time it broke the internet was with the, like, 24 00:01:31,360 --> 00:01:34,399 Speaker 1: twenty million dollar banana. That was last year or two 25 00:01:34,440 --> 00:01:34,839 Speaker 1: years ago. 26 00:01:35,160 --> 00:01:38,560 Speaker 2: Probably was. For you, offensive? It was very offensive. So 27 00:01:39,080 --> 00:01:40,399 Speaker 2: let me ask you a question. Have you seen the 28 00:01:40,480 --> 00:01:41,679 Speaker 2: robot dogs? I have. 29 00:01:42,120 --> 00:01:43,640 Speaker 1: Okay, robot dogs? 30 00:01:43,680 --> 00:01:45,240 Speaker 2: Would you like to describe them a little bit? What 31 00:01:45,319 --> 00:01:45,600 Speaker 2: you saw? 32 00:01:45,640 --> 00:01:48,320 Speaker 1: Okay. So you remember the Boston Dynamics dogs? Yes. They 33 00:01:48,360 --> 00:01:52,400 Speaker 1: were these sort of uncannily dog-like robots that kind 34 00:01:52,400 --> 00:01:55,760 Speaker 1: of pranced a little bit, like, not like the accidents, 35 00:01:55,840 --> 00:01:58,720 Speaker 1: like, what are those, greyhounds? Like little greyhounds, great guns? 36 00:01:59,320 --> 00:02:05,280 Speaker 1: So imagine robot baby greyhounds with very realistic masks that 37 00:02:05,360 --> 00:02:06,680 Speaker 1: depict tech billionaires. 38 00:02:07,240 --> 00:02:09,200 Speaker 2: You described it perfectly. 39 00:02:08,840 --> 00:02:10,359 Speaker 1: All together in a kind of pen.
40 00:02:10,639 --> 00:02:14,960 Speaker 2: That's exactly right. So it's an odd group of people 41 00:02:15,200 --> 00:02:18,480 Speaker 2: dogs in this pen. It was Jeff Bezos, Mark Zuckerberg, 42 00:02:18,560 --> 00:02:22,640 Speaker 2: Elon Musk, Pablo Picasso, Andy Warhol, and also a dog 43 00:02:22,680 --> 00:02:25,919 Speaker 2: of the artist Beeple, who created the dogs. The masks 44 00:02:25,960 --> 00:02:27,799 Speaker 2: that you were talking about are made by a special 45 00:02:27,800 --> 00:02:30,480 Speaker 2: effects artist named Landon Meyer, and they are these hyper 46 00:02:30,560 --> 00:02:34,519 Speaker 2: realistic faces, exactly like human faces. 47 00:02:34,639 --> 00:02:36,880 Speaker 1: So I saw the images, but I didn't really know 48 00:02:36,919 --> 00:02:38,399 Speaker 1: what this is all actually about. 49 00:02:38,680 --> 00:02:42,520 Speaker 2: So basically Beeple wanted to make the point that it 50 00:02:42,720 --> 00:02:44,760 Speaker 2: used to be that we saw the world interpreted through 51 00:02:44,760 --> 00:02:47,960 Speaker 2: the eyes of artists, but now Mark Zuckerberg and Elon 52 00:02:48,240 --> 00:02:51,359 Speaker 2: in particular control a huge amount of how we see 53 00:02:51,360 --> 00:02:55,160 Speaker 2: the world. So he wanted to show that basically they 54 00:02:55,200 --> 00:02:57,200 Speaker 2: are the cultural conduits of our time. 55 00:02:57,840 --> 00:03:00,240 Speaker 1: And what do they do apart from run around the pen? 56 00:03:00,440 --> 00:03:03,120 Speaker 2: So this is what I love. So they run around 57 00:03:03,160 --> 00:03:05,200 Speaker 2: the pen, but when you watch them, you can kind 58 00:03:05,240 --> 00:03:07,880 Speaker 2: of see them, what's the word that I'm looking for, 59 00:03:08,120 --> 00:03:13,200 Speaker 2: shitting out pictures that are in the style of the 60 00:03:13,320 --> 00:03:16,520 Speaker 2: artists that they are depicting, or the tech bro. So, 61 00:03:16,880 --> 00:03:19,840 Speaker 2: for example, the Andy Warhol robot dog will produce an 62 00:03:19,840 --> 00:03:23,000 Speaker 2: image that looks like a silk screen, and Picasso's is Cubism, 63 00:03:23,280 --> 00:03:26,720 Speaker 2: Elon's is black and white, and Zuck's, to quote Beeple, 64 00:03:27,000 --> 00:03:28,639 Speaker 2: looks like the metaverse. 65 00:03:28,760 --> 00:03:31,000 Speaker 1: Wow. So are they, are the dogs, are they taking 66 00:03:31,000 --> 00:03:33,480 Speaker 1: photos of stuff they're seeing in the audience and then 67 00:03:33,560 --> 00:03:35,320 Speaker 1: turning them into works 68 00:03:35,080 --> 00:03:37,760 Speaker 2: of art? Works of art? That's correct, but the actual 69 00:03:37,840 --> 00:03:41,000 Speaker 2: dogs themselves are selling for one hundred thousand dollars each. 70 00:03:41,280 --> 00:03:44,400 Speaker 2: Imagine just having someone in your house and there being 71 00:03:44,480 --> 00:03:46,680 Speaker 2: a little Elon Musk dog running 72 00:03:46,360 --> 00:03:48,800 Speaker 1: around taking photos of your guests and then pooping them 73 00:03:48,800 --> 00:03:52,640 Speaker 1: out. Crazy. I gather though, that the Bezos dog is 74 00:03:52,680 --> 00:03:56,440 Speaker 1: not for sale and doesn't poop. That's because he's king 75 00:03:56,440 --> 00:03:57,000 Speaker 1: of the tech bros. 76 00:03:57,000 --> 00:03:58,960 Speaker 2: That's because he's king of the tech bros.
I just 77 00:03:59,000 --> 00:04:01,320 Speaker 2: thought it was an interesting story to start with because 78 00:04:01,320 --> 00:04:03,960 Speaker 2: I think that Beeple is making an interesting point. 79 00:04:04,040 --> 00:04:07,240 Speaker 1: It's a tech bro's world. We just live in it. 80 00:04:07,920 --> 00:04:09,080 Speaker 2: This is me high-fiving them. 81 00:04:09,240 --> 00:04:09,440 Speaker 3: Yeah. 82 00:04:09,480 --> 00:04:11,840 Speaker 1: And also, I mean, look, so Beeple really shot to 83 00:04:11,880 --> 00:04:14,400 Speaker 1: fame during the height of the NFT craze. That's right. 84 00:04:14,440 --> 00:04:17,040 Speaker 1: He sold a single NFT for I think sixty 85 00:04:17,160 --> 00:04:19,839 Speaker 1: nine million dollars. 86 00:04:19,560 --> 00:04:22,479 Speaker 2: Which, speaking of something that Bezos could shit out, that 87 00:04:22,560 --> 00:04:23,240 Speaker 2: amount of money. 88 00:04:23,760 --> 00:04:25,520 Speaker 1: And I believe some of the images that the dogs 89 00:04:25,600 --> 00:04:29,279 Speaker 1: poop out are NFTs that you can also buy. And Beeple, 90 00:04:29,520 --> 00:04:31,640 Speaker 1: as you mentioned earlier, is also one of the dogs. 91 00:04:31,640 --> 00:04:34,240 Speaker 1: So I think he's poking fun at himself and his 92 00:04:34,600 --> 00:04:38,320 Speaker 1: oeuvre a little bit. Absolutely, right here too. And I 93 00:04:38,320 --> 00:04:41,960 Speaker 1: mean you could say this is absolutely grotesque and 94 00:04:42,120 --> 00:04:43,400 Speaker 2: people are paying money for it. 95 00:04:43,480 --> 00:04:46,359 Speaker 1: On the other hand, the guy is a master at capturing 96 00:04:46,440 --> 00:04:49,840 Speaker 1: people's attention, getting the hype cycle going, and forcing people to talk. 97 00:04:50,120 --> 00:04:51,640 Speaker 1: When was the last time you saw an article in 98 00:04:51,720 --> 00:04:55,760 Speaker 1: Page Six questioning how the tech oligarchs made us see 99 00:04:55,760 --> 00:04:56,719 Speaker 1: the world through their filter? 100 00:04:57,320 --> 00:04:58,600 Speaker 2: He did it, He did it, He did it. 101 00:04:58,640 --> 00:05:00,800 Speaker 1: Good on you, Beeple. Yeah, I'd like to move on 102 00:05:00,960 --> 00:05:05,440 Speaker 1: now from Big Tech to Big Pharma, and specifically tech 103 00:05:05,480 --> 00:05:08,920 Speaker 1: investments in and around pharmaceuticals. A while back, I read 104 00:05:08,920 --> 00:05:11,719 Speaker 1: this piece in the FT which asked the question why 105 00:05:11,839 --> 00:05:14,359 Speaker 1: is AI struggling to discover new drugs? 106 00:05:14,520 --> 00:05:17,400 Speaker 2: Because why is AI struggling to find new drugs? 107 00:05:17,480 --> 00:05:20,440 Speaker 1: Well, that's a good question, actually. I'll tell you, I didn't really 108 00:05:20,480 --> 00:05:23,280 Speaker 1: realize this until I read the FT piece. Obviously, I'm fully 109 00:05:23,279 --> 00:05:25,839 Speaker 1: aware of all the investment in AI drug discovery now, 110 00:05:26,279 --> 00:05:28,320 Speaker 1: but it turned out there was kind of a first 111 00:05:28,360 --> 00:05:32,080 Speaker 1: wave in the, like, twenty thirteen to twenty seventeen period 112 00:05:32,160 --> 00:05:36,120 Speaker 1: where hundreds of millions of dollars were invested. And the 113 00:05:36,200 --> 00:05:38,880 Speaker 1: kind of period when you would normally expect a drug 114 00:05:38,920 --> 00:05:41,040 Speaker 1: to come to market is about ten years.
115 00:05:41,600 --> 00:05:44,120 Speaker 2: So if my math is right, we should be seeing 116 00:05:44,400 --> 00:05:46,440 Speaker 2: like a whole bunch of new drugs come onto the 117 00:05:46,440 --> 00:05:47,200 Speaker 2: market now. 118 00:05:47,120 --> 00:05:49,640 Speaker 1: That's right. But we're not, and according to the FT, 119 00:05:50,000 --> 00:05:53,200 Speaker 1: there's a few reasons for that. One is money. Essentially, 120 00:05:53,560 --> 00:05:56,520 Speaker 1: these companies couldn't bring their product to market quick enough 121 00:05:56,800 --> 00:05:59,840 Speaker 1: to keep raising money to keep their companies alive, and 122 00:06:00,040 --> 00:06:02,719 Speaker 1: investors lost patience. I mean, to be fair to these companies, 123 00:06:02,760 --> 00:06:05,640 Speaker 1: drug discovery and the trials to actually bring drugs to 124 00:06:05,720 --> 00:06:09,400 Speaker 1: market are incredibly expensive, which is part of the reason 125 00:06:09,440 --> 00:06:12,880 Speaker 1: why these big pharma companies have such a lock on healthcare, 126 00:06:12,880 --> 00:06:15,840 Speaker 1: because they're the only people who can afford these cycles. 127 00:06:15,360 --> 00:06:18,320 Speaker 2: Right, right, right. So it was more of a funding issue, ultimately. 128 00:06:18,480 --> 00:06:18,960 Speaker 2: I think it 129 00:06:18,880 --> 00:06:21,320 Speaker 1: was a funding issue and also a technology issue. I mean, 130 00:06:21,520 --> 00:06:25,120 Speaker 1: don't forget, back in the day, the AI tools were 131 00:06:25,200 --> 00:06:27,840 Speaker 1: nowhere near where they are today, so you basically had 132 00:06:27,920 --> 00:06:31,560 Speaker 1: to choose one problem and develop a specific AI to 133 00:06:31,640 --> 00:06:34,240 Speaker 1: tackle it, whereas now you have these broad AIs that 134 00:06:34,279 --> 00:06:38,600 Speaker 1: can tackle multiple problems simultaneously. So basically, according to the FT, 135 00:06:38,920 --> 00:06:41,599 Speaker 1: the decade long clock kind of got reset in the 136 00:06:41,640 --> 00:06:45,719 Speaker 1: aftermath of the ChatGPT moment, to twenty twenty three, twenty 137 00:06:45,760 --> 00:06:49,120 Speaker 1: twenty four. So what happens next? Well, we'll have to 138 00:06:49,160 --> 00:06:52,560 Speaker 1: wait and see whether this great promise of AI to 139 00:06:52,600 --> 00:06:55,279 Speaker 1: deliver new drugs takes place in the next five to 140 00:06:55,360 --> 00:06:58,560 Speaker 1: seven years. One veteran chemist interviewed in the piece said 141 00:06:58,560 --> 00:07:02,640 Speaker 1: that drug discovery is quote probably the hardest thing mankind 142 00:07:02,880 --> 00:07:06,559 Speaker 1: tries to do. Another person, who's actually a founder 143 00:07:06,600 --> 00:07:09,359 Speaker 1: of one of these new generation drug companies, said, quote, I 144 00:07:09,480 --> 00:07:11,760 Speaker 1: used to say we were the industry with the highest 145 00:07:11,760 --> 00:07:15,640 Speaker 1: failure rate of anything but space exploration. And then 146 00:07:15,720 --> 00:07:18,680 Speaker 1: she goes on to say space exploration started to work. 147 00:07:19,240 --> 00:07:21,840 Speaker 2: Wow, I see. Yeah, that's very interesting. 148 00:07:21,960 --> 00:07:24,880 Speaker 1: I read another piece in the FT, also about drug discovery. I 149 00:07:25,320 --> 00:07:27,280 Speaker 1: love the FT and I guess I love drugs too. Sorry.
150 00:07:27,440 --> 00:07:30,480 Speaker 1: The question this piece asked is will the next blockbuster 151 00:07:30,560 --> 00:07:31,640 Speaker 1: drug come from China? 152 00:07:31,680 --> 00:07:34,040 Speaker 2: Well, that's really interesting. So there's almost like a parallel 153 00:07:34,160 --> 00:07:37,000 Speaker 2: arms race in pharma and AI with China. 154 00:07:37,000 --> 00:07:39,760 Speaker 1: And the intersection. And you can kind of see American 155 00:07:39,840 --> 00:07:43,600 Speaker 1: pharma companies begin to make very substantial investments in China. 156 00:07:43,640 --> 00:07:47,000 Speaker 1: For example, Novo Nordisk, the company who developed Ozempic, 157 00:07:47,520 --> 00:07:50,480 Speaker 1: paid a Chinese company up to two billion dollars to 158 00:07:50,560 --> 00:07:54,360 Speaker 1: license a next gen weight loss drug. So yeah, it's 159 00:07:54,440 --> 00:07:57,480 Speaker 1: kind of, it's that kind of parallel thing where China 160 00:07:57,680 --> 00:08:00,440 Speaker 1: was the factory for US innovation for so long and 161 00:08:00,480 --> 00:08:03,560 Speaker 1: now is starting to really develop and refine and make 162 00:08:03,640 --> 00:08:07,480 Speaker 1: better products, or at least equally good products, faster and cheaper. 163 00:08:07,720 --> 00:08:08,640 Speaker 2: So how are they doing this? 164 00:08:09,240 --> 00:08:11,760 Speaker 1: Well, I mean, it's kind of that thing where the 165 00:08:11,800 --> 00:08:15,040 Speaker 1: government says this is a priority, and also we'll just 166 00:08:15,080 --> 00:08:17,200 Speaker 1: cut through all of the red tape required, we'll make 167 00:08:17,360 --> 00:08:20,760 Speaker 1: trials easier, we'll make recruiting for trials easier, we'll approve 168 00:08:20,800 --> 00:08:23,280 Speaker 1: the construction of new facilities in days rather than months 169 00:08:23,360 --> 00:08:26,160 Speaker 1: or years. They basically said, this is a government priority 170 00:08:26,160 --> 00:08:28,640 Speaker 1: to become a leader in pharmaceuticals, and therefore we will 171 00:08:28,680 --> 00:08:32,960 Speaker 1: remove absolutely every obstacle, whether it's financial, whether it's regulatory, 172 00:08:33,200 --> 00:08:35,679 Speaker 1: to bringing drugs to market fast. And it looks like it's 173 00:08:35,679 --> 00:08:36,280 Speaker 1: starting to work. 174 00:08:36,520 --> 00:08:38,000 Speaker 2: So what does this mean for the US? 175 00:08:38,120 --> 00:08:40,439 Speaker 1: Exactly. Well, these Chinese drug makers are now developing drugs 176 00:08:40,480 --> 00:08:43,959 Speaker 1: two or three times faster than most other countries. They're 177 00:08:44,000 --> 00:08:47,400 Speaker 1: still a long long way off. Like the twenty largest 178 00:08:47,480 --> 00:08:50,720 Speaker 1: pharma companies in the world don't include a single Chinese one. 179 00:08:51,000 --> 00:08:53,680 Speaker 1: The top ten are very much dominated by the US. 180 00:08:54,280 --> 00:08:58,439 Speaker 1: But you know, it's another area where strategic competition is 181 00:08:58,480 --> 00:09:01,960 Speaker 1: starting to make itself known. And crucially, one in four 182 00:09:02,080 --> 00:09:06,200 Speaker 1: generic drugs consumed in the US, like Tylenol, has 183 00:09:06,559 --> 00:09:09,160 Speaker 1: ingredients that come from China or are manufactured in China.
So 184 00:09:09,440 --> 00:09:12,520 Speaker 1: you can see this real point of leverage brewing both 185 00:09:12,559 --> 00:09:16,120 Speaker 1: with access to generic drugs, but also if China can overtake the 186 00:09:16,200 --> 00:09:19,439 Speaker 1: US as the drug innovator of record. It's kind of another 187 00:09:20,040 --> 00:09:23,439 Speaker 1: area where the US's ability to create the most valuable 188 00:09:23,480 --> 00:09:25,600 Speaker 1: IP in the world is under threat. 189 00:09:26,000 --> 00:09:27,760 Speaker 2: So Oz, I want to bring you a story that 190 00:09:27,880 --> 00:09:31,920 Speaker 2: really pissed me off, which is not something that I 191 00:09:31,960 --> 00:09:32,600 Speaker 2: share all the 192 00:09:32,559 --> 00:09:34,960 Speaker 1: time. That's the first time I've heard you say something like that. Yeah, sure, 193 00:09:34,960 --> 00:09:35,360 Speaker 1: well you 194 00:09:35,280 --> 00:09:38,400 Speaker 2: know, my therapist says I'm not quick to express anger. 195 00:09:38,559 --> 00:09:40,160 Speaker 2: But this was actually a story I read 196 00:09:40,160 --> 00:09:43,959 Speaker 2: and I felt, like, viscerally angry about it. So it's 197 00:09:44,000 --> 00:09:47,240 Speaker 2: from the MIT Technology Review and it's about how private 198 00:09:47,320 --> 00:09:51,000 Speaker 2: prisons are now training AI on prisoner phone calls. 199 00:09:51,360 --> 00:09:52,040 Speaker 1: What does that mean? 200 00:09:52,200 --> 00:09:54,720 Speaker 2: You know, when you make a phone call from prison, 201 00:09:55,000 --> 00:09:57,560 Speaker 2: there are two companies that essentially allow you to do that. 202 00:09:57,440 --> 00:10:00,240 Speaker 1: And they charge you, like, a lot, right? 203 00:10:00,000 --> 00:10:01,439 Speaker 2: So it's a huge amount of money. 204 00:10:01,520 --> 00:10:03,439 Speaker 1: That's right, for the privilege, or some would even say the 205 00:10:03,520 --> 00:10:05,600 Speaker 1: human right, of being able to talk to your family. 206 00:10:05,679 --> 00:10:09,400 Speaker 2: That's exactly right. Do you know how big this business is? 207 00:10:10,320 --> 00:10:13,320 Speaker 1: I know there's a despicable number of people 208 00:10:13,400 --> 00:10:14,160 Speaker 1: incarcerated in the US. 209 00:10:14,240 --> 00:10:17,280 Speaker 2: The inmate calling system is a one point two billion 210 00:10:17,320 --> 00:10:18,360 Speaker 2: dollar business. 211 00:10:18,080 --> 00:10:19,480 Speaker 1: Just the calling system? That's correct. 212 00:10:20,000 --> 00:10:23,480 Speaker 2: The story focuses on a company called Securus Technologies. They're 213 00:10:23,520 --> 00:10:26,480 Speaker 2: one of two companies that specialize in the prison phone system. 214 00:10:26,679 --> 00:10:29,800 Speaker 2: The thing that really caught my attention in this article 215 00:10:30,000 --> 00:10:32,960 Speaker 2: is that Securus has been investing in voice recognition products 216 00:10:33,000 --> 00:10:36,880 Speaker 2: since twenty nineteen and AI products since twenty twenty three, 217 00:10:37,440 --> 00:10:42,280 Speaker 2: which means they've been training their own LLMs on these 218 00:10:42,320 --> 00:10:43,320 Speaker 2: phone calls. 219 00:10:43,080 --> 00:10:45,920 Speaker 1: So they own basically the whole batch of recordings 220 00:10:45,960 --> 00:10:48,400 Speaker 1: and can use it for whatever experiments they want to run.
221 00:10:48,440 --> 00:10:51,160 Speaker 2: I mean, the experiment that they apparently are running is 222 00:10:51,600 --> 00:10:54,480 Speaker 2: building large language models that are going to help law 223 00:10:54,559 --> 00:11:00,920 Speaker 2: enforcement track and basically predict if crimes are going to happen 224 00:11:00,920 --> 00:11:02,800 Speaker 2: on the basis of what people are talking about in 225 00:11:02,840 --> 00:11:07,640 Speaker 2: their prison phone calls. It's surveillance on a level like, 226 00:11:07,960 --> 00:11:11,800 Speaker 2: you know, you think about panoptic power, and this is 227 00:11:12,000 --> 00:11:17,320 Speaker 2: like the panopticon having another reason to be a panopticon. Yeah, 228 00:11:17,559 --> 00:11:20,760 Speaker 2: so we know that LLMs are being piloted in certain markets, 229 00:11:20,760 --> 00:11:24,280 Speaker 2: but we don't know which markets. And actually, the Securus 230 00:11:24,280 --> 00:11:28,199 Speaker 2: president Kevin Elder told the MIT Technology Review that Securus's 231 00:11:28,400 --> 00:11:32,079 Speaker 2: monitoring efforts have helped disrupt human trafficking and gang activities 232 00:11:32,200 --> 00:11:35,480 Speaker 2: organized from within prisons, among other crimes, and said its 233 00:11:35,520 --> 00:11:39,240 Speaker 2: tools are also used to identify prison staff who are 234 00:11:39,240 --> 00:11:41,040 Speaker 2: bringing in contraband. What 235 00:11:41,160 --> 00:11:44,960 Speaker 1: do prison advocates think? I mean, was there pushback? They 236 00:11:45,000 --> 00:11:45,360 Speaker 1: hate it. 237 00:11:46,080 --> 00:11:48,599 Speaker 2: They hate it. A woman named Bianca Tylek from the 238 00:11:48,679 --> 00:11:51,600 Speaker 2: organization Worth Rises, I think, said it really the best, 239 00:11:51,640 --> 00:11:54,600 Speaker 2: which is she said, there's literally no other way you 240 00:11:54,600 --> 00:11:57,280 Speaker 2: can communicate with your family. Not only are you not 241 00:11:57,480 --> 00:11:59,920 Speaker 2: compensating them for the use of their data, but you're 242 00:12:00,120 --> 00:12:02,520 Speaker 2: actually charging them to collect their data. 243 00:12:02,679 --> 00:12:06,439 Speaker 1: Yeah, I mean it does kind of stick in your 244 00:12:06,559 --> 00:12:08,640 Speaker 1: craw a bit, doesn't it? 245 00:12:08,280 --> 00:12:11,160 Speaker 2: It sticks in my craw, you could say. Yeah, I 246 00:12:11,160 --> 00:12:13,640 Speaker 2: mean what infuriates me is that if inmates want to 247 00:12:13,679 --> 00:12:16,720 Speaker 2: communicate with the outside world, they have to play within 248 00:12:16,760 --> 00:12:18,400 Speaker 2: this system. Like they don't have a choice of how 249 00:12:18,440 --> 00:12:21,760 Speaker 2: they're making phone calls, and so while they're informed that 250 00:12:21,800 --> 00:12:25,080 Speaker 2: their messages are being recorded, they aren't informed about how 251 00:12:25,120 --> 00:12:26,720 Speaker 2: this data is used at all. 252 00:12:27,080 --> 00:12:31,679 Speaker 1: Now, do you think the reason this story strikes us 253 00:12:31,720 --> 00:12:36,640 Speaker 1: both is because we are such sort of empathetic people, 254 00:12:37,320 --> 00:12:41,559 Speaker 1: or is it because in some sense it's a grotesquely 255 00:12:41,640 --> 00:12:46,160 Speaker 1: magnified pastiche of the experience of being a citizen in 256 00:12:46,200 --> 00:12:47,240 Speaker 1: the twenty first century?
257 00:12:47,400 --> 00:12:51,360 Speaker 2: I do think that yes, this is a magnified version 258 00:12:51,640 --> 00:12:55,640 Speaker 2: of a surveillance capitalist state that we all live in, 259 00:12:55,960 --> 00:12:58,760 Speaker 2: that we experience on a daily basis, giving away data 260 00:12:59,280 --> 00:13:00,160 Speaker 2: for free. 261 00:13:00,520 --> 00:13:03,800 Speaker 1: I guess the difference is that we technically, theoretically, can 262 00:13:03,920 --> 00:13:06,400 Speaker 1: use DuckDuckGo instead of Google, or not. Right? 263 00:13:06,280 --> 00:13:09,280 Speaker 2: We have tools that help us. DuckDuckGo, VPN. 264 00:13:09,720 --> 00:13:11,839 Speaker 1: We have a choice which most of us don't exercise. 265 00:13:12,400 --> 00:13:14,720 Speaker 2: A choice within our non-choice, exactly. 266 00:13:14,480 --> 00:13:17,240 Speaker 1: Whereas this is like, you know, well, your choice is 267 00:13:17,280 --> 00:13:18,640 Speaker 1: not to communicate, I guess. Yeah. 268 00:13:18,640 --> 00:13:24,120 Speaker 2: And it's just, I understand, I suppose I understand the 269 00:13:24,160 --> 00:13:28,719 Speaker 2: need to stop human trafficking, for example, or I have 270 00:13:28,840 --> 00:13:33,119 Speaker 2: the understanding that it's important to stop, you know, drug trafficking, 271 00:13:33,320 --> 00:13:35,480 Speaker 2: or, you know, illegal contraband coming into a prison. 272 00:13:35,960 --> 00:13:42,360 Speaker 2: This just seems like the easiest, most crass way to 273 00:13:42,840 --> 00:13:44,360 Speaker 2: possibly surveil people. 274 00:13:44,480 --> 00:13:46,800 Speaker 1: I also think, you know, it's such an easy thing 275 00:13:46,840 --> 00:13:49,920 Speaker 1: to productize and say, like, put these super predators in 276 00:13:49,960 --> 00:13:53,200 Speaker 1: prison, listen in on their calls, and then this 277 00:13:53,360 --> 00:13:57,320 Speaker 1: infallible technology, this AI tool, will tell you who's the baddest 278 00:13:57,320 --> 00:13:59,160 Speaker 1: of the bad. And it's just like, really? It's 279 00:13:59,000 --> 00:14:00,840 Speaker 2: also like, is this where we need to be applying 280 00:14:00,880 --> 00:14:04,120 Speaker 2: AI tools? Like, that's... I just feel like, money? 281 00:14:04,120 --> 00:14:05,040 Speaker 1: What about rehabilitation? 282 00:14:05,120 --> 00:14:09,880 Speaker 2: Rehabilitation programs? Precisely. So, while I don't think twice about 283 00:14:09,960 --> 00:14:12,400 Speaker 2: accepting cookies or giving away my data, I do at 284 00:14:12,480 --> 00:14:15,760 Speaker 2: least have the choice to care without completely cutting myself 285 00:14:15,760 --> 00:14:16,840 Speaker 2: off from the outside world. 286 00:14:17,200 --> 00:14:18,720 Speaker 1: Yeah, I think you make a good point, which 287 00:14:18,760 --> 00:14:23,160 Speaker 1: is like you can support the goals of law enforcement, 288 00:14:23,200 --> 00:14:25,320 Speaker 1: like we don't want human trafficking or all these things you've mentioned, 289 00:14:25,640 --> 00:14:28,960 Speaker 1: and still have questions about this particular technique.
I 290 00:14:29,000 --> 00:14:30,960 Speaker 1: mean to me, the kind of framing question, which we're 291 00:14:30,960 --> 00:14:32,440 Speaker 1: going to come back to after the break in the 292 00:14:32,480 --> 00:14:36,920 Speaker 1: story I'm going to tell you, is, philosophically, would you 293 00:14:37,040 --> 00:14:40,360 Speaker 1: rather live in a world with no privacy or in 294 00:14:40,360 --> 00:14:41,280 Speaker 1: a world with no crime? 295 00:14:41,640 --> 00:14:42,480 Speaker 2: That's a good question. 296 00:14:42,640 --> 00:14:45,400 Speaker 1: We're going to come back to it, and we're also 297 00:14:45,440 --> 00:14:49,120 Speaker 1: going to talk about the world's youngest female self made 298 00:14:49,120 --> 00:14:52,760 Speaker 1: billionaire and why people are getting addicted to giant fruit. 299 00:15:00,160 --> 00:15:02,680 Speaker 1: And we're back. Do you know the phrase one nation 300 00:15:02,800 --> 00:15:03,600 Speaker 1: under CCTV? 301 00:15:03,760 --> 00:15:05,560 Speaker 2: I know the phrase one nation under God. 302 00:15:05,760 --> 00:15:09,320 Speaker 1: That's... One nation under CCTV was by another contemporary artist, 303 00:15:09,400 --> 00:15:11,480 Speaker 1: who was the Beeple of the 304 00:15:11,360 --> 00:15:14,760 Speaker 2: pre-NFT era. Banksy? Banksy, yes. 305 00:15:14,960 --> 00:15:19,360 Speaker 1: This phrase was graffitied by Banksy outside a London post office 306 00:15:19,400 --> 00:15:23,560 Speaker 1: in two thousand and eight. According to a conservative newspaper in Britain, 307 00:15:23,600 --> 00:15:27,240 Speaker 1: the Telegraph, the UK could be on a path now 308 00:15:27,520 --> 00:15:32,520 Speaker 1: to becoming one nation under AI enabled CCTV thanks to 309 00:15:32,560 --> 00:15:34,720 Speaker 1: a new proposal from the Labour government. 310 00:15:34,920 --> 00:15:35,720 Speaker 2: Say more about this. 311 00:15:36,080 --> 00:15:38,520 Speaker 1: So, according to the Telegraph, which is not a fan of 312 00:15:38,560 --> 00:15:40,720 Speaker 1: the Labour government, let's just put our cards 313 00:15:40,760 --> 00:15:44,160 Speaker 1: on the table. Yeah, Labour is proposing that police be 314 00:15:44,200 --> 00:15:48,800 Speaker 1: allowed to compare photos of crime suspects from CCTV, doorbells, 315 00:15:48,880 --> 00:15:54,080 Speaker 1: and dash cams against facial images on government databases, including 316 00:15:54,120 --> 00:15:57,400 Speaker 1: the passports of forty five million Britons. 317 00:15:58,120 --> 00:16:01,800 Speaker 2: What? Yeah. So why are they proposing this? 318 00:16:02,040 --> 00:16:04,120 Speaker 1: Well, it comes back to the story we were talking 319 00:16:04,120 --> 00:16:07,320 Speaker 1: about before the break, right? Like, they're proposing this because 320 00:16:07,840 --> 00:16:12,640 Speaker 1: they say that having CCTV cameras everywhere in Britain which 321 00:16:12,720 --> 00:16:16,320 Speaker 1: are hooked into the mainframe of the passport photo database 322 00:16:16,640 --> 00:16:20,280 Speaker 1: will allow it to be possible to stop crime in 323 00:16:20,320 --> 00:16:23,520 Speaker 1: real time, to take sexual predators off the streets before 324 00:16:23,560 --> 00:16:27,280 Speaker 1: they offend again, etcetera, etcetera, etcetera.
But some are saying, 325 00:16:27,600 --> 00:16:30,440 Speaker 1: I think, frankly, including me, hold on a second, when 326 00:16:30,480 --> 00:16:33,600 Speaker 1: I wanted to get my passport, I didn't consent to 327 00:16:33,680 --> 00:16:38,280 Speaker 1: having my face be available for a live panoptic prison system. 328 00:16:38,360 --> 00:16:40,800 Speaker 2: Well, very similarly to what we were saying is that 329 00:16:40,840 --> 00:16:43,680 Speaker 2: when prisoners consent to make phone calls that are being recorded, 330 00:16:43,960 --> 00:16:46,240 Speaker 2: I can't imagine that they are thinking these recordings will 331 00:16:46,240 --> 00:16:47,720 Speaker 2: be used to train LLMs. 332 00:16:48,000 --> 00:16:50,600 Speaker 1: And there's all these problems of bias in these facial 333 00:16:50,640 --> 00:16:53,480 Speaker 1: recognition systems, which we've started talking about, which we've talked 334 00:16:53,480 --> 00:16:55,560 Speaker 1: about till we're blue in the face, but which haven't 335 00:16:55,560 --> 00:16:58,080 Speaker 1: really been addressed or properly addressed, which is, if you're 336 00:16:58,080 --> 00:17:01,360 Speaker 1: a person of color, you're much more likely to be misidentified. 337 00:17:01,600 --> 00:17:04,720 Speaker 1: But again there's this like technological logic where because it's 338 00:17:04,760 --> 00:17:08,560 Speaker 1: tech enabled, people believe it's true. And it brings me 339 00:17:08,600 --> 00:17:12,280 Speaker 1: back to that question about, like, yes, in theory, like, 340 00:17:12,640 --> 00:17:15,399 Speaker 1: I would like people who are dangerous criminals to be 341 00:17:15,680 --> 00:17:18,840 Speaker 1: very quickly taken off the streets before they offend. But 342 00:17:18,920 --> 00:17:22,399 Speaker 1: do I want the police to have unfettered access to 343 00:17:22,640 --> 00:17:26,320 Speaker 1: the whole passport database? I mean, in a just society 344 00:17:27,000 --> 00:17:31,680 Speaker 1: where you have confidence in the government, a democratic government 345 00:17:31,720 --> 00:17:37,680 Speaker 1: whose values you share, maybe this is okay. Maybe. But 346 00:17:37,880 --> 00:17:40,120 Speaker 1: what happens when this falls into the wrong hands? 347 00:17:39,960 --> 00:17:41,680 Speaker 2: Right, exactly, exactly. 348 00:17:41,720 --> 00:17:44,000 Speaker 1: You know, there's just, there's so many ways to turn 349 00:17:44,080 --> 00:17:47,000 Speaker 1: this into a system for extreme evil. 350 00:17:47,040 --> 00:17:50,479 Speaker 2: I think, and we saw it happen with the Uyghurs, with 351 00:17:50,480 --> 00:17:53,119 Speaker 2: the Uyghur minority in China. Like, it can happen very easily. 352 00:17:53,200 --> 00:17:55,480 Speaker 2: It's not like it hasn't happened already. There was an 353 00:17:55,640 --> 00:17:59,880 Speaker 2: entire system that the Chinese government had called IJOP, which 354 00:18:00,119 --> 00:18:04,840 Speaker 2: essentially collects data from tons of sources that 355 00:18:04,920 --> 00:18:07,800 Speaker 2: are being used by average Chinese citizens. So it's 356 00:18:07,800 --> 00:18:09,520 Speaker 2: not crazy to imagine. 357 00:18:09,160 --> 00:18:11,560 Speaker 1: No, I mean, I think it's, you know, the irony 358 00:18:11,600 --> 00:18:13,920 Speaker 1: of this whole story is ten years ago we were saying, 359 00:18:13,920 --> 00:18:16,720 Speaker 1: oh my god, China has built this surveillance state and 360 00:18:16,720 --> 00:18:19,600 Speaker 1: facial recognition of all of life. Isn't that awful?
It's just 361 00:18:19,600 --> 00:18:22,840 Speaker 1: an ironic moment where our political systems are under so 362 00:18:22,960 --> 00:18:26,399 Speaker 1: much strain that we have this aspiration to things that 363 00:18:26,800 --> 00:18:28,720 Speaker 1: filled us with horror until recently. 364 00:18:28,960 --> 00:18:32,920 Speaker 2: Yeah. So I have a question for you. How many 365 00:18:33,000 --> 00:18:34,480 Speaker 2: female billionaires can you name? 366 00:18:35,080 --> 00:18:41,119 Speaker 1: Is Nancy Pelosi a billionaire? You know there's a fund you 367 00:18:41,119 --> 00:18:44,120 Speaker 1: can use to track Nancy Pelosi's trades and make them yourself. 368 00:18:44,200 --> 00:18:46,040 Speaker 2: Really? Oh, I have never seen that on the train. 369 00:18:47,840 --> 00:18:49,119 Speaker 1: Okay, but, but Nancy Pelosi did not. 370 00:18:49,400 --> 00:18:53,560 Speaker 2: Nancy is definitely not one. You're trolling. So this week 371 00:18:54,000 --> 00:18:55,399 Speaker 2: I was trying, like I was trying to think, like, 372 00:18:55,440 --> 00:18:58,240 Speaker 2: who's a female billionaire? Who's a female billionaire? And I 373 00:18:58,320 --> 00:19:01,560 Speaker 2: was thinking about it because the world has a new 374 00:19:01,600 --> 00:19:04,479 Speaker 2: billionaire and she just happens to be the youngest female 375 00:19:04,560 --> 00:19:06,160 Speaker 2: self made billionaire to date. 376 00:19:06,320 --> 00:19:06,720 Speaker 1: Wow. 377 00:19:07,040 --> 00:19:09,960 Speaker 2: And for the sake of our show, she works in technology. 378 00:19:10,040 --> 00:19:11,159 Speaker 1: Wow. I'd love to have her on. Who is she? 379 00:19:11,200 --> 00:19:12,680 Speaker 2: Totally, we should, we should have her on. 380 00:19:12,760 --> 00:19:15,119 Speaker 2: Actually she's really interesting. So last week, you know, we 381 00:19:15,119 --> 00:19:18,280 Speaker 2: talked about Polymarket, which is the online prediction market that 382 00:19:18,440 --> 00:19:22,680 Speaker 2: continues to kind of disgust me. But the woman who 383 00:19:22,720 --> 00:19:25,919 Speaker 2: is now a billionaire co-founded Polymarket's competitor, which, 384 00:19:26,000 --> 00:19:27,960 Speaker 2: as you know, is Kalshi. That's 385 00:19:27,880 --> 00:19:30,119 Speaker 1: correct, which sounds like a breakfast cereal I used to 386 00:19:30,160 --> 00:19:30,840 Speaker 1: eat called Kashi. 387 00:19:30,920 --> 00:19:33,480 Speaker 2: But I remember Kashi made me so ill. You would 388 00:19:33,480 --> 00:19:37,760 Speaker 2: eat Kashi? I love that cereal. So Kalshi is similar 389 00:19:37,760 --> 00:19:39,960 Speaker 2: to Polymarket in that people can make bets on 390 00:19:40,000 --> 00:19:43,080 Speaker 2: everything from, you know, sports to elections to the weather. 391 00:19:43,920 --> 00:19:47,879 Speaker 2: But unlike Polymarket, which was crypto based, Kalshi is 392 00:19:47,920 --> 00:19:50,760 Speaker 2: based in the US dollar and received approval from the 393 00:19:50,800 --> 00:19:55,480 Speaker 2: CFTC, the Commodity Futures Trading Commission, in twenty twenty. Polymarket 394 00:19:55,640 --> 00:19:58,840 Speaker 2: received the approval in September twenty twenty five, but 395 00:19:59,040 --> 00:20:01,560 Speaker 2: only after being fined, I think, one point four million 396 00:20:01,640 --> 00:20:04,200 Speaker 2: dollars for operating in unregistered markets. 397 00:20:04,520 --> 00:20:11,000 Speaker 1: You sounded like Dr. Evil in Austin Powers. One point four million dollars. Interesting.
398 00:20:11,080 --> 00:20:13,800 Speaker 1: So Kalshi recently caught my eye because they announced 399 00:20:13,800 --> 00:20:16,600 Speaker 1: a partnership with CNN. That's right, where CNN is basically 400 00:20:16,640 --> 00:20:20,600 Speaker 1: gonna broadcast Kalshi live predictions on things which are 401 00:20:20,640 --> 00:20:24,600 Speaker 1: going to happen, which some people within CNN were pretty 402 00:20:24,880 --> 00:20:28,720 Speaker 1: upset about, because, you know, the idea of 403 00:20:28,840 --> 00:20:31,720 Speaker 1: journalists becoming sort of jockeys and, like, you know, 404 00:20:31,760 --> 00:20:32,760 Speaker 1: oddsmakers. 405 00:20:32,560 --> 00:20:35,520 Speaker 2: What is that called? Insider trading? That's what it's potentially called. 406 00:20:35,960 --> 00:20:37,320 Speaker 1: It sounds like it could be. I want to know 407 00:20:37,400 --> 00:20:39,640 Speaker 1: more about this female founder. Who is she? What's her name? 408 00:20:39,960 --> 00:20:42,520 Speaker 2: Her name is Luana Lopes Lara and her co-founder 409 00:20:42,560 --> 00:20:47,240 Speaker 2: is named Tarek Mansour. She's twenty nine, and Luana's backstory 410 00:20:47,400 --> 00:20:49,840 Speaker 2: is like something out of a Russian spy novel. She 411 00:20:50,000 --> 00:20:54,320 Speaker 2: was trained as a professional ballerina in Brazil, extreme discipline, 412 00:20:54,440 --> 00:20:58,040 Speaker 2: and even turned pro in Austria for nine months, but 413 00:20:58,080 --> 00:20:59,400 Speaker 2: then she was like, I'm out of here. I don't 414 00:20:59,400 --> 00:21:01,199 Speaker 2: want to be here anymore, and went to study computer 415 00:21:01,240 --> 00:21:05,120 Speaker 2: science at MIT, and that's where she met Tarek. Wow. 416 00:21:05,480 --> 00:21:07,840 Speaker 1: And they founded this company, Kalshi, together. 417 00:21:08,000 --> 00:21:11,000 Speaker 2: They did. I liked this from a piece that I 418 00:21:11,040 --> 00:21:13,040 Speaker 2: read about it, that he basically noticed that she was 419 00:21:13,080 --> 00:21:15,480 Speaker 2: sitting up front in class and was like, that's the 420 00:21:15,480 --> 00:21:18,240 Speaker 2: girl who's going to be a billionaire. Yeah, exactly. But 421 00:21:18,280 --> 00:21:21,879 Speaker 2: according to their website, both Luana and Tarek worked at 422 00:21:21,880 --> 00:21:25,560 Speaker 2: financial institutions like Goldman Sachs and observed that many financial 423 00:21:25,560 --> 00:21:28,920 Speaker 2: decisions were driven by predictions about future events. And then 424 00:21:28,960 --> 00:21:31,960 Speaker 2: they thought it was odd that there wasn't a straightforward 425 00:21:31,960 --> 00:21:35,600 Speaker 2: way to trade directly on event outcomes, so they set 426 00:21:35,640 --> 00:21:39,119 Speaker 2: out to create a place for this type of direct exchange. 427 00:21:39,359 --> 00:21:42,480 Speaker 1: That's interesting. You mentioned, I think half joking, that Polymarket 428 00:21:42,680 --> 00:21:44,679 Speaker 1: disgusted you. But we talked about this last week with 429 00:21:44,720 --> 00:21:47,240 Speaker 1: respect to people betting on, like, you know, the outcome 430 00:21:47,280 --> 00:21:49,800 Speaker 1: of Russia Ukraine battles over individual 431 00:21:49,320 --> 00:21:50,720 Speaker 2: towns, and Zelensky's suit. 432 00:21:51,000 --> 00:21:53,680 Speaker 1: But I said to you then, which I will repeat now, like, yes, 433 00:21:53,720 --> 00:21:55,719 Speaker 1: that is in very poor taste.
But to what 434 00:21:55,760 --> 00:21:59,080 Speaker 1: you just told me, it democratizes the types of bets 435 00:21:59,119 --> 00:22:03,159 Speaker 1: that government and banks in a sense already make and 436 00:22:03,240 --> 00:22:05,160 Speaker 1: just makes them very visible and consumer facing. 437 00:22:05,320 --> 00:22:07,000 Speaker 2: That's right. There is a little bit of a, like, 438 00:22:07,320 --> 00:22:10,600 Speaker 2: let's take on the man energy to this whole thing. 439 00:22:10,840 --> 00:22:14,040 Speaker 2: I just think the story is really interesting because when 440 00:22:14,040 --> 00:22:15,679 Speaker 2: you think of tech billionaire, what do you think, like, 441 00:22:15,680 --> 00:22:20,080 Speaker 2: what's the model? Right? Man, and sometimes boy for me, yeah, 442 00:22:20,119 --> 00:22:22,560 Speaker 2: but you don't think of woman or girl, not to 443 00:22:22,560 --> 00:22:25,120 Speaker 2: be so binary. But no, I think I agree with you, 444 00:22:25,119 --> 00:22:27,119 Speaker 2: you know. And so I think this idea of, like, 445 00:22:27,280 --> 00:22:30,720 Speaker 2: socialized betting on the basis of, like, maybe what a 446 00:22:30,760 --> 00:22:33,679 Speaker 2: government is going to do, even though it's still mostly 447 00:22:33,720 --> 00:22:37,680 Speaker 2: sports focused, is really important for us to keep talking 448 00:22:37,680 --> 00:22:37,960 Speaker 2: about. 449 00:22:38,119 --> 00:22:40,399 Speaker 1: And also to your point about Kalshi and Polymarket, I mean, 450 00:22:40,440 --> 00:22:43,760 Speaker 1: these are two of the most valuable private companies, or 451 00:22:43,840 --> 00:22:46,560 Speaker 1: very, extremely valuable private companies, that are not AI companies. 452 00:22:46,560 --> 00:22:48,880 Speaker 1: They're both valued, I think, at over ten billion dollars. Yes. 453 00:22:49,160 --> 00:22:52,480 Speaker 1: You want to know another unicorn that's been prancing around 454 00:22:52,600 --> 00:22:57,560 Speaker 1: my subconscious? It's a company called Fruitist, who make 455 00:22:58,080 --> 00:22:58,720 Speaker 1: giant fruit. 456 00:22:59,320 --> 00:23:01,080 Speaker 2: It's so funny. There was a, I don't know if 457 00:23:01,119 --> 00:23:03,159 Speaker 2: you ever saw this, but there's these kind of 458 00:23:03,240 --> 00:23:07,800 Speaker 2: bougie bodegas in New York City and there was a strawberry. 459 00:23:07,880 --> 00:23:12,200 Speaker 2: Oh yeah, I was astonished at the cost of this strawberry. 460 00:23:12,200 --> 00:23:15,480 Speaker 1: It was a strawberry, I think nineteen dollars, something like that, 461 00:23:15,480 --> 00:23:16,760 Speaker 1: a single strawberry. 462 00:23:16,920 --> 00:23:19,119 Speaker 2: And what that tells me is there's a market for it. 463 00:23:19,160 --> 00:23:21,120 Speaker 2: But whatever, I'm sorry to interrupt you. Well, what did you want 464 00:23:21,160 --> 00:23:21,840 Speaker 2: to talk about? 465 00:23:21,800 --> 00:23:25,800 Speaker 1: The founder of Fruitist actually directly pushes back on what you've 466 00:23:25,840 --> 00:23:27,520 Speaker 1: just said and said what I'm doing has nothing to 467 00:23:27,600 --> 00:23:30,399 Speaker 1: do with champagne strawberries. But let me read you the 468 00:23:30,440 --> 00:23:34,240 Speaker 1: headline from Fortune. Ray Dalio is backing a one billion 469 00:23:34,280 --> 00:23:37,240 Speaker 1: dollar blueberry unicorn that sells berries nearly the size of 470 00:23:37,280 --> 00:23:40,040 Speaker 1: golf balls. That's this huge.
I want to live in. 471 00:23:40,480 --> 00:23:43,359 Speaker 2: The fact that we can report on big fruit in 472 00:23:43,400 --> 00:23:47,640 Speaker 2: the same breath as companies literally using large language models 473 00:23:47,640 --> 00:23:50,199 Speaker 2: on the basis of prisoner phone calls is where technology 474 00:23:50,240 --> 00:23:52,720 Speaker 2: is truly the craziest, or where we 475 00:23:52,680 --> 00:23:54,240 Speaker 1: have the poorest taste of anyone in the world. 476 00:23:54,280 --> 00:23:54,760 Speaker 2: Possibly. 477 00:23:55,080 --> 00:23:56,720 Speaker 1: So when I first saw this, I was, oh my god, 478 00:23:56,720 --> 00:24:01,119 Speaker 1: they made GM fruit. Wrong. These are actually normal fruit, but 479 00:24:01,400 --> 00:24:05,000 Speaker 1: super optimized by data. They've bought farms around the world. 480 00:24:05,200 --> 00:24:12,760 Speaker 1: They've used AI to basically totally optimize the production, distribution, storage, etc., 481 00:24:13,560 --> 00:24:18,200 Speaker 1: of the blueberries. Founder Steve Magami has said, quote, they 482 00:24:18,240 --> 00:24:20,160 Speaker 1: actually pop when you bite into them. 483 00:24:21,240 --> 00:24:25,160 Speaker 2: So we have gotten to a point in our culture 484 00:24:25,480 --> 00:24:29,600 Speaker 2: where we literally can't stomach having any uncertainty around a 485 00:24:29,600 --> 00:24:32,720 Speaker 2: blueberry, like we need, we need to actually use data 486 00:24:32,960 --> 00:24:35,200 Speaker 2: to make sure that our blueberry is perfect, exactly. 487 00:24:35,280 --> 00:24:38,200 Speaker 1: That's exactly it. And it's not just blueberries. There are raspberries 488 00:24:38,240 --> 00:24:42,159 Speaker 1: and blackberries in development, and Fruitist has struck a cherry 489 00:24:42,160 --> 00:24:44,560 Speaker 1: deal in China. 490 00:24:44,680 --> 00:24:45,919 Speaker 2: I don't even know what that means. 491 00:24:46,119 --> 00:24:49,000 Speaker 1: To your point though, about consistency, that is exactly the goal. 492 00:24:49,520 --> 00:24:53,760 Speaker 1: CEO Magami refers to ending the problem of quote berry roulette, 493 00:24:54,119 --> 00:24:58,040 Speaker 1: meaning no more inconsistency in berries. And we can laugh. 494 00:24:58,119 --> 00:25:00,439 Speaker 1: But Fruitist sales have tripled in the last year, 495 00:25:00,520 --> 00:25:04,080 Speaker 1: surpassing four hundred million dollars. The company 496 00:25:04,200 --> 00:25:06,919 Speaker 1: was considering going public, but with tariffs they decided to 497 00:25:06,960 --> 00:25:07,399 Speaker 1: pause that. 498 00:25:07,680 --> 00:25:09,400 Speaker 2: Can I just ask a question? Where do you buy 499 00:25:09,400 --> 00:25:12,919 Speaker 2: these fruit? Like everywhere, like Whole Foods? But is the 500 00:25:12,960 --> 00:25:14,520 Speaker 2: brand called Fruitist or are they 501 00:25:14,400 --> 00:25:17,560 Speaker 1: They say Fruitist, but actually, it doesn't... It's not 502 00:25:17,720 --> 00:25:20,720 Speaker 1: like the nineteen dollar strawberry. It comes in regular packaging. 503 00:25:21,440 --> 00:25:23,479 Speaker 1: I see. So you can get like a pack of 504 00:25:23,520 --> 00:25:28,200 Speaker 1: like golf ball sized blueberries which is like eight dollars 505 00:25:28,200 --> 00:25:30,880 Speaker 1: for ten, let's say, which is still expensive, but it's 506 00:25:30,920 --> 00:25:33,679 Speaker 1: different from nineteen dollars for one. You know.
507 00:25:33,720 --> 00:25:36,000 Speaker 2: I think one of the interesting things about this piece 508 00:25:36,040 --> 00:25:38,480 Speaker 2: to me, and people have been writing a lot about 509 00:25:38,520 --> 00:25:42,680 Speaker 2: this vis-à-vis restaurants and menus, is that, like, GLP 510 00:25:42,920 --> 00:25:47,880 Speaker 2: ones and MAHA culture have really influenced what people demand 511 00:25:48,240 --> 00:25:50,920 Speaker 2: in terms of the products that they eat or buy. 512 00:25:50,800 --> 00:25:53,199 Speaker 1: One hundred percent. So, and this is, you know, you're 513 00:25:53,280 --> 00:25:56,440 Speaker 1: singing from CEO Magami's hymn book here, because he said 514 00:25:56,480 --> 00:25:58,720 Speaker 1: these are not berries to go on your muffins or 515 00:25:58,720 --> 00:26:03,240 Speaker 1: your oatmeal. These are standalone, snackable berries. All 516 00:26:03,240 --> 00:26:06,159 Speaker 1: other snack categories are seriously down since the launch of 517 00:26:06,359 --> 00:26:09,480 Speaker 1: Ozempic, and berries I think are up. So, yeah, I like 518 00:26:09,520 --> 00:26:11,920 Speaker 1: these stories that have all the different, yeah, all the 519 00:26:11,960 --> 00:26:14,080 Speaker 1: different things we talk about come together in a golf 520 00:26:14,160 --> 00:26:16,919 Speaker 1: ball size blueberry that pops when you bite into it. 521 00:26:27,320 --> 00:26:29,600 Speaker 1: So this week for ChatGP-Me, I'm bringing us a 522 00:26:29,640 --> 00:26:32,040 Speaker 1: story that I heard at a conference last week called 523 00:26:32,040 --> 00:26:35,760 Speaker 1: the Doha Forum. And Yalda Hakim, who is the lead 524 00:26:36,119 --> 00:26:39,400 Speaker 1: world news presenter for Sky News in Britain, was there 525 00:26:39,800 --> 00:26:42,399 Speaker 1: and she was moderating a panel. But she opened it 526 00:26:42,760 --> 00:26:46,520 Speaker 1: talking about a very strange experience of going viral for 527 00:26:46,560 --> 00:26:47,520 Speaker 1: all the wrong reasons. 528 00:26:48,240 --> 00:26:51,800 Speaker 3: So this interview was shared over a million times. I 529 00:26:51,920 --> 00:26:55,680 Speaker 3: was seeing the interview itself just going completely viral within 530 00:26:55,960 --> 00:26:59,560 Speaker 3: twelve hours. Then my producer sent me a message saying 531 00:26:59,640 --> 00:27:02,040 Speaker 3: that this is the fake version of the interview. 532 00:27:02,200 --> 00:27:04,160 Speaker 2: What do you mean? I thought she recorded an interview. 533 00:27:04,680 --> 00:27:08,440 Speaker 1: She did, but the interview, not just the person's responses, 534 00:27:08,720 --> 00:27:11,320 Speaker 1: but her questions, were both deep faked. 535 00:27:11,640 --> 00:27:16,040 Speaker 3: What terrified me is the fact that these deep fakes 536 00:27:16,080 --> 00:27:18,440 Speaker 3: have become better and smarter. For the last seven or 537 00:27:18,480 --> 00:27:21,000 Speaker 3: ten years, we've been hearing about deep fakes, and what 538 00:27:21,040 --> 00:27:23,600 Speaker 3: they could do to society and, you know, the impact 539 00:27:23,680 --> 00:27:27,440 Speaker 3: it's going to have. But I suddenly saw something that, frankly, 540 00:27:27,520 --> 00:27:29,800 Speaker 3: if you didn't know my voice, if you didn't know me, 541 00:27:30,200 --> 00:27:33,040 Speaker 3: and if you didn't know my mannerisms, you would think 542 00:27:33,280 --> 00:27:34,280 Speaker 3: that that clip was
543 00:27:34,240 --> 00:27:36,640 Speaker 1: real. And it wasn't any old interview that was being 544 00:27:36,680 --> 00:27:40,159 Speaker 1: deep faked. Yalda's original conversation was with the sister of 545 00:27:40,240 --> 00:27:43,240 Speaker 1: the former Prime Minister of Pakistan, Imran Khan, and he's 546 00:27:43,280 --> 00:27:46,080 Speaker 1: been in jail since twenty twenty three on charges of 547 00:27:46,119 --> 00:27:49,360 Speaker 1: corruption that his supporters say are all politically motivated, and 548 00:27:49,440 --> 00:27:51,520 Speaker 1: the conversation was about that and how he's being treated 549 00:27:51,600 --> 00:27:55,320 Speaker 1: in jail. The way it was manipulated, however, was much 550 00:27:55,359 --> 00:27:58,720 Speaker 1: more provocative, much more potentially dangerous, and that's what went 551 00:27:58,640 --> 00:28:04,480 Speaker 3: viral. The questions that I had asked her had been completely changed, fabricated, manipulated. 552 00:28:05,000 --> 00:28:09,320 Speaker 3: The focus was aggression towards India and an attack on 553 00:28:09,800 --> 00:28:13,359 Speaker 3: the Army chief Asim Munir of Pakistan. And I think what 554 00:28:13,480 --> 00:28:16,200 Speaker 3: was terrifying is the fact that we know that India 555 00:28:16,200 --> 00:28:21,080 Speaker 3: and Pakistan went essentially to war in May. They went 556 00:28:21,119 --> 00:28:24,320 Speaker 3: head to head following a terrorist attack in Kashmir. There 557 00:28:24,440 --> 00:28:27,360 Speaker 3: was concern, you know, when two nuclear armed states are 558 00:28:27,520 --> 00:28:30,400 Speaker 3: on the brink that way, that this could escalate and explode, 559 00:28:30,440 --> 00:28:34,400 Speaker 3: not just regionally, but internationally. And this was further fueling 560 00:28:34,400 --> 00:28:37,879 Speaker 3: the flame to the point where the Defense Minister of 561 00:28:37,880 --> 00:28:41,760 Speaker 3: Pakistan responded to the interview, the fake version, as though 562 00:28:41,760 --> 00:28:45,560 Speaker 3: it was real, and mainstream media in India picked it 563 00:28:45,680 --> 00:28:47,680 Speaker 3: up and it made headlines across the country. 564 00:28:47,800 --> 00:28:50,160 Speaker 2: This is what experts have been warning us about for years, 565 00:28:50,200 --> 00:28:53,400 Speaker 2: that deep fakes can be used to actually escalate tensions 566 00:28:53,440 --> 00:28:56,000 Speaker 2: and break down trust between people and their leaders. I mean, 567 00:28:56,240 --> 00:28:59,760 Speaker 2: you think about, we have a US president who reposts 568 00:29:00,160 --> 00:29:01,120 Speaker 2: deep fakes all the time. 569 00:29:01,160 --> 00:29:03,480 Speaker 1: Well, that's right, and we've become accustomed, I think, because 570 00:29:03,520 --> 00:29:07,400 Speaker 1: of our president, to viewing deep fakes as kind of 571 00:29:07,920 --> 00:29:12,080 Speaker 1: memes on steroids or comedy. Yeah, and almost a little 572 00:29:12,080 --> 00:29:14,440 Speaker 1: bit desensitized to the kind of threat that people have 573 00:29:14,520 --> 00:29:16,640 Speaker 1: been warning of for years, that they could be used 574 00:29:16,680 --> 00:29:21,160 Speaker 1: to manipulate public opinion and cause political catastrophe.
And that's 575 00:29:21,160 --> 00:29:22,680 Speaker 1: why I wanted to get Yalda on the show to 576 00:29:22,720 --> 00:29:26,760 Speaker 1: talk about this, because it seemed like this deep fake 577 00:29:27,160 --> 00:29:32,160 Speaker 1: really was taken seriously by major politicians in two countries 578 00:29:32,240 --> 00:29:35,040 Speaker 1: that are nuclear armed and have been earlier this year 579 00:29:35,240 --> 00:29:36,000 Speaker 1: on the brink of war. 580 00:29:36,520 --> 00:29:39,120 Speaker 2: So how did she actually rein in the deep fake interview? 581 00:29:39,320 --> 00:29:40,960 Speaker 1: Well, it's a bit of a to-do, but she 582 00:29:41,080 --> 00:29:43,720 Speaker 1: was able to use her platform to indeed rein it in. 583 00:29:44,200 --> 00:29:46,240 Speaker 3: The way that I dealt with it was, first of all, 584 00:29:46,360 --> 00:29:48,600 Speaker 3: putting a post up on my social media saying that 585 00:29:48,640 --> 00:29:53,120 Speaker 3: this deep fake is fake and it's AI generated. That 586 00:29:53,280 --> 00:29:56,440 Speaker 3: also got picked up. I also did some interviews on 587 00:29:56,440 --> 00:30:00,520 Speaker 3: Sky News platforms. They ran the original and the fake version, 588 00:30:00,800 --> 00:30:03,640 Speaker 3: and I talked around it and clarified. But I think 589 00:30:03,760 --> 00:30:07,600 Speaker 3: what is terrifying is the fact that this really does 590 00:30:08,160 --> 00:30:12,880 Speaker 3: test our democracy, journalism, our work, and how we're going 591 00:30:12,920 --> 00:30:14,280 Speaker 3: to have to deal with things in the future. 592 00:30:14,360 --> 00:30:17,800 Speaker 1: This is quite a dramatic story of the intersection of 593 00:30:18,040 --> 00:30:21,120 Speaker 1: humanity and AI, and not something that most of us 594 00:30:21,240 --> 00:30:23,400 Speaker 1: encounter in our everyday lives. But we do want to 595 00:30:23,440 --> 00:30:26,760 Speaker 1: hear about your everyday lives and your interactions with technology. 596 00:30:26,760 --> 00:30:30,360 Speaker 1: How are you using chatbots? How are you interacting with synthetic media? 597 00:30:30,880 --> 00:30:33,320 Speaker 1: Anything you can share with us about how your life 598 00:30:33,720 --> 00:30:36,480 Speaker 1: is being changed in real time by your interaction with 599 00:30:36,760 --> 00:30:39,440 Speaker 1: new technology is what we want to feature on this show. 600 00:30:39,480 --> 00:30:41,720 Speaker 1: So please write to us at tech Stuff podcast at 601 00:30:41,720 --> 00:30:42,720 Speaker 1: gmail dot com. 602 00:30:42,880 --> 00:30:45,560 Speaker 4: We'll feature your stories and we'll send you a free 603 00:30:45,560 --> 00:30:53,680 Speaker 4: T-shirt. 604 00:30:55,440 --> 00:30:57,000 Speaker 2: That's it for this week for Tech Stuff. 605 00:30:57,040 --> 00:31:00,280 Speaker 1: I'm Cara Price and I'm Oz Woloshyn. This episode was produced 606 00:31:00,280 --> 00:31:03,840 Speaker 1: by Eliza Dennis and Melissa Slaughter. It was executive produced 607 00:31:03,880 --> 00:31:06,560 Speaker 1: by me, Cara Price, Julia Nutter, and Kate Osborne for 608 00:31:06,640 --> 00:31:11,120 Speaker 1: Kaleidoscope and Katrina Norvell for iHeart Podcasts. The engineer is 609 00:31:11,120 --> 00:31:14,680 Speaker 1: Behid Fraser and Jack Insley mixed this episode. Kyle Murdoch 610 00:31:14,720 --> 00:31:15,520 Speaker 1: wrote our theme song.
611 00:31:15,960 --> 00:31:18,840 Speaker 2: Please rate, review and reach out to us at tech 612 00:31:18,880 --> 00:31:21,840 Speaker 2: Stuff podcast at gmail dot com. We want to hear 613 00:31:21,880 --> 00:31:22,160 Speaker 2: from you.