1 00:00:13,960 --> 00:00:17,600 Speaker 1: From Kaleidoscope and iHeart Podcasts, this is Tech Stuff. I'm 2 00:00:17,640 --> 00:00:18,560 Speaker 1: Oz Woloshyn. And, 3 00:00:18,520 --> 00:00:19,360 Speaker 2: I'm Cara Price. 4 00:00:19,840 --> 00:00:24,160 Speaker 1: Today we'll get into why websites claiming to catch cheaters 5 00:00:24,400 --> 00:00:29,800 Speaker 1: using facial recognition should frighten us all, and Trump's crypto empire. 6 00:00:30,360 --> 00:00:31,440 Speaker 2: Then, on Chat and Me. 7 00:00:32,240 --> 00:00:37,120 Speaker 3: Every story has two sides, even Chat. This time she 8 00:00:37,240 --> 00:00:38,600 Speaker 3: has shown her green side. 9 00:00:38,440 --> 00:00:41,760 Speaker 2: All of that on The Week in Tech. It's Friday, 10 00:00:41,880 --> 00:00:43,000 Speaker 2: October twenty fourth. 11 00:00:47,400 --> 00:00:51,040 Speaker 1: So, Cara, often you come in wearing a baseball cap. 12 00:00:51,159 --> 00:00:53,880 Speaker 1: That's right. Not today. Today, I come in wearing a 13 00:00:54,000 --> 00:00:57,000 Speaker 1: black windbreaker, which I don't normally wear. What does it 14 00:00:57,120 --> 00:00:57,920 Speaker 1: say on it? 15 00:00:57,920 --> 00:01:02,040 Speaker 2: It says: the science of genetics, the business of discovery. 16 00:01:02,240 --> 00:01:03,400 Speaker 2: Colossal Labs. 17 00:01:03,480 --> 00:01:06,319 Speaker 1: Colossal Labs. So, a few weeks ago you sent me 18 00:01:06,480 --> 00:01:09,800 Speaker 1: an article about the company trying to revive the dodo. 19 00:01:09,959 --> 00:01:12,080 Speaker 1: I did. What did you say? 20 00:01:12,200 --> 00:01:13,760 Speaker 2: I said, we should have this guy on. 21 00:01:15,160 --> 00:01:17,160 Speaker 1: I'm so glad you're such an avid listener when you're not 22 00:01:17,200 --> 00:01:19,720 Speaker 1: on the show, Cara. I interviewed him a few weeks ago. 23 00:01:19,959 --> 00:01:22,000 Speaker 2: Yes, I was like, well, that's great, I'm glad you 24 00:01:22,040 --> 00:01:22,880 Speaker 2: did that. 25 00:01:23,720 --> 00:01:27,800 Speaker 1: This is Ben Lamm, the CEO and founder of Colossal Biosciences. 26 00:01:28,360 --> 00:01:31,120 Speaker 1: Just for some background, this is the company whose mission 27 00:01:31,200 --> 00:01:32,640 Speaker 1: is to revive the woolly mammoth. 28 00:01:32,959 --> 00:01:34,000 Speaker 2: Have they been successful? 29 00:01:34,240 --> 00:01:37,039 Speaker 1: They're getting there. So one of their big milestones along 30 00:01:37,040 --> 00:01:39,920 Speaker 1: the way was to gene edit mice to have woolly 31 00:01:40,040 --> 00:01:44,280 Speaker 1: mammoth fur. These were the so-called woolly mice. They broke the 32 00:01:44,280 --> 00:01:46,960 Speaker 1: internet around March, April. And then of course there were 33 00:01:47,040 --> 00:01:51,120 Speaker 1: the dire wolves. Yes, majestic white wolves, beautiful wolves staring 34 00:01:51,200 --> 00:01:53,920 Speaker 1: out from the cover of Time magazine. Yes, was that 35 00:01:53,920 --> 00:01:55,120 Speaker 1: Colossal as well? 36 00:01:55,160 --> 00:01:56,080 Speaker 2: That was Colossal as well. 37 00:01:56,120 --> 00:01:59,160 Speaker 1: And now the dodo is on the agenda. So incredible. 38 00:02:00,120 --> 00:02:03,320 Speaker 1: It gives you something to believe in about just how powerful the 39 00:02:03,440 --> 00:02:06,800 Speaker 1: new science and tech revolution really is. 
Now, the 40 00:02:06,800 --> 00:02:10,680 Speaker 1: reason I'm wearing Colossal Biosciences swag is not because 41 00:02:10,680 --> 00:02:14,480 Speaker 1: I've become an employee of Colossal Biosciences. 42 00:02:13,720 --> 00:02:15,280 Speaker 2: Or a hype... well, you're kind of a hype man. 43 00:02:15,600 --> 00:02:17,440 Speaker 1: I got to go to the lab yesterday in Dallas. 44 00:02:17,680 --> 00:02:17,880 Speaker 2: Cool. 45 00:02:17,919 --> 00:02:19,080 Speaker 1: It was really really cool. 46 00:02:19,080 --> 00:02:21,239 Speaker 2: Can you describe some of the environs? 47 00:02:21,560 --> 00:02:23,880 Speaker 1: Yeah, I think it's a little on the top secret side, 48 00:02:24,240 --> 00:02:26,639 Speaker 1: but you basically go, you put on a lab coat, 49 00:02:26,720 --> 00:02:28,520 Speaker 1: and then you walk past all these different labs, where 50 00:02:28,560 --> 00:02:31,000 Speaker 1: in each there's a team of scientists working on different problems. 51 00:02:31,040 --> 00:02:35,639 Speaker 1: One is extracting DNA from ancient bones, another is implanting 52 00:02:35,760 --> 00:02:40,040 Speaker 1: DNA into eggs. Another is working on an artificial womb, 53 00:02:40,160 --> 00:02:46,040 Speaker 1: essentially growing animals without needing a parent animal. They're trying 54 00:02:46,080 --> 00:02:49,880 Speaker 1: to grow animals from embryo to maturity. I mean, 55 00:02:49,919 --> 00:02:53,799 Speaker 1: it really is totally mind-boggling. There are two big 56 00:02:53,880 --> 00:02:58,000 Speaker 1: questions I think about this story. One is: is it real? 57 00:02:58,520 --> 00:03:00,720 Speaker 1: And some people, when the dire wolves came out, said, well, 58 00:03:00,800 --> 00:03:03,600 Speaker 1: you know, you've basically gene edited normal wolves to give 59 00:03:03,600 --> 00:03:06,280 Speaker 1: them white fur and make them a little bit bigger, right? 60 00:03:06,400 --> 00:03:09,679 Speaker 1: And Ben Lamm, the CEO of Colossal Biosciences, pushes back, 61 00:03:09,800 --> 00:03:15,079 Speaker 1: including by saying, well, have you done that, exactly? The 62 00:03:15,160 --> 00:03:17,520 Speaker 1: other thing people say is: what is the business case here? 63 00:03:17,639 --> 00:03:20,359 Speaker 1: And when Ben Lamm came on the show, I asked 64 00:03:20,440 --> 00:03:24,240 Speaker 1: him about exactly this: is this really, in your heart 65 00:03:24,240 --> 00:03:28,120 Speaker 1: of hearts, more about the de-extinction project? Or is 66 00:03:28,160 --> 00:03:31,720 Speaker 1: the de-extinction project the best meme in the world 67 00:03:32,160 --> 00:03:35,560 Speaker 1: to allow you to advance the cause of synthetic biology? 68 00:03:35,960 --> 00:03:39,080 Speaker 2: So it depends on who you ask, right? 69 00:03:40,880 --> 00:03:44,240 Speaker 4: Yeah, you know, because I'm very passionate about, you know, conservation, 70 00:03:44,280 --> 00:03:46,520 Speaker 4: which I hope we get into, but you know, for me, 71 00:03:46,600 --> 00:03:49,120 Speaker 4: it is about the de-extinction, right. I think that 72 00:03:49,120 --> 00:03:52,720 Speaker 4: we are truly at the 73 00:03:52,760 --> 00:03:58,200 Speaker 4: doorway of what we can do with synthetic biology. 
And 74 00:03:58,280 --> 00:04:00,960 Speaker 4: I think that if Colossal word is your every major 75 00:04:01,000 --> 00:04:05,960 Speaker 4: disease and help conservation and build technologies that could radically 76 00:04:05,960 --> 00:04:10,480 Speaker 4: transform di directed evolution and genetic engineering in all species, 77 00:04:10,560 --> 00:04:14,200 Speaker 4: including humans, and make every single species that we wanted 78 00:04:14,920 --> 00:04:17,760 Speaker 4: except the mammoth, I would consider us a failure. 79 00:04:18,320 --> 00:04:20,440 Speaker 1: So it's a little bit of both. 80 00:04:20,760 --> 00:04:23,440 Speaker 4: But my heart is still kind of pretty much in 81 00:04:23,480 --> 00:04:26,559 Speaker 4: love with the pursuit of some of the really large 82 00:04:26,600 --> 00:04:27,720 Speaker 4: de extinction projects. 83 00:04:27,839 --> 00:04:30,240 Speaker 1: And just for the sake of clarity, the term de 84 00:04:30,360 --> 00:04:34,360 Speaker 1: extinction is defined by Colossal as quote, the process of 85 00:04:34,480 --> 00:04:39,560 Speaker 1: generating an organism that both resembles and is genetically similar 86 00:04:39,640 --> 00:04:40,680 Speaker 1: to an extinct species. 87 00:04:41,480 --> 00:04:43,400 Speaker 2: All right, as I want to move on to the 88 00:04:43,400 --> 00:04:46,960 Speaker 2: big headlines of the week. As you know, my beat 89 00:04:46,960 --> 00:04:49,320 Speaker 2: on this show is dating in the age of tech, 90 00:04:49,400 --> 00:04:51,960 Speaker 2: and so of course I have to bring you a 91 00:04:52,040 --> 00:04:55,719 Speaker 2: video that's been replicated many times over. So I'm going 92 00:04:55,800 --> 00:04:58,480 Speaker 2: to show you a video of a girl who's been 93 00:04:58,560 --> 00:04:59,599 Speaker 2: interviewed on the street. 94 00:05:00,000 --> 00:05:01,359 Speaker 5: You've been engaged for six months? 95 00:05:01,440 --> 00:05:02,440 Speaker 1: Yeah, you're out here? 96 00:05:02,480 --> 00:05:03,040 Speaker 5: Where's he? 97 00:05:03,040 --> 00:05:03,320 Speaker 2: He's in? 98 00:05:04,120 --> 00:05:06,239 Speaker 5: That never caused conflict. I know a lot of times 99 00:05:06,279 --> 00:05:08,120 Speaker 5: military men are like a red flag. 100 00:05:08,400 --> 00:05:10,440 Speaker 1: I mean I always have those thous in my mind. 101 00:05:10,680 --> 00:05:11,920 Speaker 5: Well, yeah, you're a woman, of course. 102 00:05:12,000 --> 00:05:12,360 Speaker 1: Of course. 103 00:05:13,800 --> 00:05:16,760 Speaker 5: I have an AI software called Cheeterbuster, and we can 104 00:05:16,880 --> 00:05:19,840 Speaker 5: find out using spacial recognition if anyone's on a dating app. 105 00:05:20,560 --> 00:05:21,760 Speaker 2: What's his name, Cameron? 106 00:05:21,960 --> 00:05:25,560 Speaker 1: How old is he? City? He said, Jacksonville, North Carolina. 107 00:05:25,640 --> 00:05:27,159 Speaker 5: Can you have a picture of him on your phone 108 00:05:27,400 --> 00:05:29,520 Speaker 5: right now? Cheetabuster eye is gonna do its thing. And 109 00:05:29,680 --> 00:05:33,080 Speaker 5: remember it uses a marified photo recognition. 110 00:05:32,680 --> 00:05:34,120 Speaker 2: So that selled me. That's on his ID. 111 00:05:34,360 --> 00:05:36,600 Speaker 5: It uses that it's real. He's gonna find him. If 112 00:05:36,640 --> 00:05:38,440 Speaker 5: he's on here, we're gonna find him. Okay, I did 113 00:05:38,520 --> 00:05:39,360 Speaker 5: find an account. 114 00:05:39,480 --> 00:05:40,279 Speaker 1: Is this him? 
115 00:05:40,920 --> 00:05:43,560 Speaker 2: Yeah? 116 00:05:44,120 --> 00:05:46,840 Speaker 1: So that's it, searching for men? Is this real? 117 00:05:47,080 --> 00:05:50,080 Speaker 2: Well, the app is real. I can't promise that the 118 00:05:50,080 --> 00:05:51,360 Speaker 2: girl in it isn't a plant. 119 00:05:51,800 --> 00:05:55,520 Speaker 1: I have to say, I would definitely avoid the situation this 120 00:05:55,560 --> 00:05:58,040 Speaker 1: woman found herself in, of a man-on-the-street 121 00:05:58,040 --> 00:06:01,480 Speaker 1: interview being taped for social media. 122 00:06:01,560 --> 00:06:02,800 Speaker 2: Do it. Let's see what Cameron's... 123 00:06:02,600 --> 00:06:03,440 Speaker 1: This is insane. 124 00:06:03,560 --> 00:06:07,400 Speaker 2: Yes. So I just want to start by saying, this 125 00:06:07,480 --> 00:06:12,679 Speaker 2: is a real app called Cheaterbuster AI. So Joseph Cox, 126 00:06:12,680 --> 00:06:15,239 Speaker 2: who's a reporter for 404 Media, noticed that videos 127 00:06:15,320 --> 00:06:17,560 Speaker 2: like the one I just showed you were, like, flooding 128 00:06:17,600 --> 00:06:20,159 Speaker 2: feeds on Instagram and TikTok, and of course was like, 129 00:06:21,000 --> 00:06:22,080 Speaker 2: what the hell is this? 130 00:06:22,080 --> 00:06:24,480 Speaker 1: This is, well, this is Jerry Springer for 131 00:06:24,520 --> 00:06:27,840 Speaker 1: the age of Jerry. Jerry, exactly. Everyone's Jerry now. 132 00:06:27,960 --> 00:06:31,360 Speaker 2: Well, now Jerry is using AI, which is crazy. 133 00:06:31,440 --> 00:06:34,599 Speaker 2: So let me just break down how Cheaterbuster works. The 134 00:06:34,680 --> 00:06:39,040 Speaker 2: site asks for a name, an age, and a city. I would say: Oz, thirty-six, New York. 135 00:06:40,160 --> 00:06:43,360 Speaker 2: Then I would upload a photo of you. Not all of 136 00:06:43,400 --> 00:06:45,000 Speaker 2: it has to be precise, by the way, which is 137 00:06:45,000 --> 00:06:45,680 Speaker 2: really interesting. 138 00:06:46,080 --> 00:06:47,400 Speaker 1: You mean you could say, I don't know... 139 00:06:49,320 --> 00:06:51,920 Speaker 2: Whatever, it doesn't have to be super precise. Then there's 140 00:06:51,920 --> 00:06:55,719 Speaker 2: an option called face match, which searches different names to see results 141 00:06:55,760 --> 00:06:57,760 Speaker 2: of anyone who looks like the photo uploaded. 142 00:06:57,920 --> 00:07:00,160 Speaker 1: So, in other words, the name and the age and 143 00:07:00,200 --> 00:07:05,240 Speaker 1: the location are essentially narrowing criteria, that's right, to allow 144 00:07:05,320 --> 00:07:08,160 Speaker 1: the facial recognition matching software to do a better 145 00:07:08,120 --> 00:07:11,160 Speaker 2: job. Yes. Okay, then you get a list of Tinder 146 00:07:11,160 --> 00:07:13,640 Speaker 2: profiles and you can look to see if your person 147 00:07:13,880 --> 00:07:18,320 Speaker 2: is cheating. All of this is twenty dollars a month, wow, 148 00:07:18,320 --> 00:07:18,720 Speaker 2: to use. 149 00:07:18,920 --> 00:07:21,920 Speaker 1: Wow. What about really good-looking people whose photos other people 150 00:07:21,960 --> 00:07:24,200 Speaker 1: are using to catfish on Tinder? Their lives could be 151 00:07:24,280 --> 00:07:24,840 Speaker 1: ruined by this. 152 00:07:25,000 --> 00:07:28,000 Speaker 2: That's... absolutely, that's a very good point. I didn't think 153 00:07:28,040 --> 00:07:31,640 Speaker 2: of that. So facial recognition, to me, is the scariest 154 00:07:31,640 --> 00:07:33,680 Speaker 2: part of all of this. 
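To make the flow Cara just described concrete, here is a minimal sketch, in Python, of how a tool like this could combine the narrowing criteria (name, age, city) with face similarity. This is not Cheaterbuster's actual code; the Profile class, the embedding comparison, the two-year age tolerance, and the face_match helper are all assumptions for illustration.

```python
# Hypothetical sketch: narrowing criteria first, face similarity second.
# None of these names come from Cheaterbuster; they are illustrative only.
from dataclasses import dataclass

import numpy as np


@dataclass
class Profile:
    name: str
    age: int
    city: str
    embedding: np.ndarray  # face embedding produced by some recognition model


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 = same direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def face_match(query: np.ndarray, profiles: list[Profile],
               name: str | None = None, age: int | None = None,
               city: str | None = None, age_tolerance: int = 2) -> list[Profile]:
    # Step 1: name, age, and city shrink the candidate pool, which is
    # why the details you type in don't have to be super precise.
    candidates = [
        p for p in profiles
        if (name is None or p.name.lower() == name.lower())
        and (age is None or abs(p.age - age) <= age_tolerance)
        and (city is None or p.city.lower() == city.lower())
    ]
    # Step 2: facial similarity ranks whoever is left.
    return sorted(candidates,
                  key=lambda p: cosine_similarity(query, p.embedding),
                  reverse=True)
```

The narrowing step is what makes the uploaded details forgiving: even a rough age or a nickname cuts the pool down enough for the embedding comparison to surface lookalikes.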
I mean, we obviously know 155 00:07:33,680 --> 00:07:37,040 Speaker 2: what facial recognition is. It's the biometric technology that identifies 156 00:07:37,080 --> 00:07:41,080 Speaker 2: a person by analyzing their features, like the distance between 157 00:07:41,080 --> 00:07:43,400 Speaker 2: their eyes or the shape of their nose, and so on. 158 00:07:44,520 --> 00:07:47,760 Speaker 2: Cheaterbuster is a consumer app, and it's one of many 159 00:07:47,800 --> 00:07:51,840 Speaker 2: apps out there that use facial recognition to expose a cheater. 160 00:07:52,040 --> 00:07:54,280 Speaker 2: But this is also a technology that's used by the 161 00:07:54,280 --> 00:07:56,080 Speaker 2: Department of Defense. It just trips me out. 162 00:07:55,960 --> 00:07:59,640 Speaker 1: And the police. And the police. It's funny, apropos the police. 163 00:07:59,640 --> 00:08:02,400 Speaker 1: I was talking to somebody who had attended the No 164 00:08:02,560 --> 00:08:05,720 Speaker 1: Kings protests this weekend, and she said to me, 165 00:08:05,760 --> 00:08:08,120 Speaker 1: oh my goodness, I realized I probably should have worn 166 00:08:08,160 --> 00:08:13,240 Speaker 1: a mask, right? No, no, because I don't want to 167 00:08:13,280 --> 00:08:15,840 Speaker 1: be in a database of people matched to my identity 168 00:08:15,840 --> 00:08:18,240 Speaker 1: who attended. Totally. Can you imagine? I mean, 169 00:08:19,040 --> 00:08:21,320 Speaker 1: like five, ten years ago, people said, you know, this 170 00:08:21,360 --> 00:08:23,760 Speaker 1: is the nightmare of living in a techno-authoritarian state. 171 00:08:24,240 --> 00:08:27,240 Speaker 1: Now it's our life. Yes. But it's not just being 172 00:08:27,320 --> 00:08:31,280 Speaker 1: used by the authorities. It's bleeding into wider 173 00:08:31,320 --> 00:08:35,480 Speaker 1: culture, for cheater busting and of course great social videos. Exactly. 174 00:08:35,600 --> 00:08:39,120 Speaker 1: And the thing is, I mean, obviously cheater busting is one 175 00:08:39,760 --> 00:08:42,160 Speaker 1: use case, but far and away not the only one. 176 00:08:42,160 --> 00:08:43,960 Speaker 1: I mean, I could theoretically take a picture of someone 177 00:08:44,000 --> 00:08:46,400 Speaker 1: I saw on the subway and then trace it to 178 00:08:46,400 --> 00:08:50,319 Speaker 1: their Tinder profile, figure out from geolocation where they're based, and, 179 00:08:50,360 --> 00:08:51,440 Speaker 1: like, hunt them down. 180 00:08:51,920 --> 00:08:55,600 Speaker 2: Eva Galperin, who is the director of cybersecurity at 181 00:08:55,640 --> 00:08:59,520 Speaker 2: the Electronic Frontier Foundation, said, quote, I think that an 182 00:08:59,520 --> 00:09:01,960 Speaker 2: app that allows you to find the dating app profile 183 00:09:02,040 --> 00:09:04,760 Speaker 2: and general location of a person based on just one 184 00:09:04,760 --> 00:09:08,280 Speaker 2: photo is an excellent tool for stalkers. And I'm like, 185 00:09:08,360 --> 00:09:11,080 Speaker 2: you think? Well, because you make a great point, which 186 00:09:11,080 --> 00:09:13,960 Speaker 2: is that, like, I could come into an office any 187 00:09:14,040 --> 00:09:17,679 Speaker 2: day, con someone into giving me a photo of them, 188 00:09:17,960 --> 00:09:19,400 Speaker 2: or just take one. Or take one with... 189 00:09:19,600 --> 00:09:21,920 Speaker 1: You put on your Meta Ray-Bans or your... 
190 00:09:22,120 --> 00:09:24,320 Speaker 2: The Meta Ray-Bans, that's right. It opens us up to 191 00:09:24,360 --> 00:09:24,800 Speaker 2: something new. 192 00:09:25,000 --> 00:09:27,160 Speaker 1: I'm glad you brought this story because I think 193 00:09:27,960 --> 00:09:31,000 Speaker 1: it's a bigger story. Obviously, facial recognition technology in the 194 00:09:31,080 --> 00:09:35,960 Speaker 1: hands of authorities to do authoritarianism is really bad, and 195 00:09:36,000 --> 00:09:38,240 Speaker 1: something we've been talking about for a while, and something 196 00:09:38,280 --> 00:09:40,480 Speaker 1: we should all be scared of, and it's basically 197 00:09:40,480 --> 00:09:41,400 Speaker 1: already here. Yeah. 198 00:09:41,880 --> 00:09:44,720 Speaker 2: Joseph Cox from 404 Media has actually done multiple 199 00:09:44,720 --> 00:09:48,199 Speaker 2: stories on how ICE and law enforcement are currently using 200 00:09:48,240 --> 00:09:50,280 Speaker 2: this exact kind of technology to track people. 201 00:09:50,760 --> 00:09:55,000 Speaker 1: But when any normal person walking around the street can 202 00:09:55,040 --> 00:10:02,439 Speaker 1: be identified and geolocated by anybody with a camera, man. 203 00:10:02,360 --> 00:10:06,560 Speaker 2: I mean, it makes you feel not like a member 204 00:10:06,559 --> 00:10:08,640 Speaker 2: of the human race, but a member of the database, 205 00:10:09,040 --> 00:10:09,920 Speaker 2: in a strange way. 206 00:10:10,120 --> 00:10:14,040 Speaker 1: Would you use Cheaterbuster, yeah? 207 00:10:14,080 --> 00:10:17,400 Speaker 2: Probably. I mean, that's the thing. It's like, 208 00:10:17,559 --> 00:10:20,240 Speaker 2: the lure of this technology is, like, too strong. It's 209 00:10:20,240 --> 00:10:21,960 Speaker 2: the same thing as, would I give away 210 00:10:21,960 --> 00:10:25,160 Speaker 2: my data in exchange for something else? Yeah, sure. Why not? 211 00:10:25,800 --> 00:10:26,160 Speaker 1: So, Cara. 212 00:10:26,360 --> 00:10:29,880 Speaker 1: We haven't done too many stories on Tech Stuff about crypto. 213 00:10:31,360 --> 00:10:35,320 Speaker 2: Crypto is something that I should be interested in, but 214 00:10:35,520 --> 00:10:36,079 Speaker 2: I'm just not. 215 00:10:36,280 --> 00:10:39,200 Speaker 1: It's not, honestly, where my curiosity naturally leads me. I 216 00:10:39,240 --> 00:10:43,839 Speaker 1: prefer woolly mammoths and Cheaterbuster. But obviously 217 00:10:43,960 --> 00:10:46,560 Speaker 1: it's impossible to ignore. Yeah, yeah. And I saw a 218 00:10:46,600 --> 00:10:49,240 Speaker 1: story in the Financial Times that will be a perfect 219 00:10:49,280 --> 00:10:50,480 Speaker 1: way for us to get into it a little bit. 220 00:10:50,640 --> 00:10:53,280 Speaker 2: And it's in your heart? 221 00:10:52,920 --> 00:10:56,280 Speaker 1: In my heart, exactly. The headline is: how the Trump 222 00:10:56,320 --> 00:11:00,480 Speaker 1: companies made a billion dollars from crypto. A billion dollars, 223 00:11:00,480 --> 00:11:02,839 Speaker 1: by the way, that's a billion dollars of pre-tax 224 00:11:03,000 --> 00:11:07,480 Speaker 1: profits last year. The FT asked Eric Trump for some 225 00:11:07,520 --> 00:11:14,920 Speaker 1: comment on the story. He said, quote, it's probably more. 226 00:11:15,200 --> 00:11:16,400 Speaker 1: Did you have any idea about this? 227 00:11:16,880 --> 00:11:18,400 Speaker 2: I did not realize they made this much money. 
228 00:11:18,400 --> 00:11:18,880 Speaker 1: But you knew that... 229 00:11:19,120 --> 00:11:21,480 Speaker 2: I knew that they were involved in crypto. Yeah, but 230 00:11:21,520 --> 00:11:23,120 Speaker 2: I did not know it was a billion. 231 00:11:23,200 --> 00:11:24,640 Speaker 1: And that's a lot. That's a lot of money. 232 00:11:24,679 --> 00:11:25,480 Speaker 2: That's a ton of money. 233 00:11:25,640 --> 00:11:27,840 Speaker 1: So before we kind of get into the whole story, 234 00:11:28,000 --> 00:11:29,400 Speaker 1: I want to give you a little bit of background 235 00:11:29,440 --> 00:11:33,120 Speaker 1: on Trump's own relationship with crypto. As recently as twenty 236 00:11:33,120 --> 00:11:36,080 Speaker 1: twenty four, he was actually in our camp. He labeled 237 00:11:36,080 --> 00:11:38,839 Speaker 1: crypto as, quote, based on thin air, and called 238 00:11:38,880 --> 00:11:42,680 Speaker 1: bitcoin a scam. But Trump, obviously he has a better 239 00:11:42,920 --> 00:11:45,320 Speaker 1: risk appetite than I do, and so he's all in 240 00:11:45,360 --> 00:11:47,400 Speaker 1: on crypto at the moment. The way it started, though, 241 00:11:47,480 --> 00:11:49,720 Speaker 1: was in twenty twenty four, Trump was in all kinds 242 00:11:49,840 --> 00:11:53,680 Speaker 1: of legal trouble, and this translated to financial trouble because 243 00:11:53,720 --> 00:11:56,720 Speaker 1: of huge civil penalties, and he actually claimed at a 244 00:11:56,760 --> 00:11:59,000 Speaker 1: certain moment that he would have to start selling his 245 00:11:59,040 --> 00:12:01,520 Speaker 1: real estate assets to meet this five hundred million dollar 246 00:12:01,920 --> 00:12:06,080 Speaker 1: civil penalty, so he needed money. And then separately, he 247 00:12:06,200 --> 00:12:09,040 Speaker 1: was also claiming that he was being debanked, in other words, 248 00:12:09,080 --> 00:12:13,560 Speaker 1: that big major financial institutions refused to carry his accounts, 249 00:12:13,600 --> 00:12:15,679 Speaker 1: which is something that normally happens to people who are 250 00:12:15,920 --> 00:12:18,880 Speaker 1: criminals or have very bad credit, or people who the 251 00:12:18,960 --> 00:12:21,920 Speaker 1: banking system essentially doesn't want to integrate. 252 00:12:22,200 --> 00:12:24,400 Speaker 2: Didn't he have a coin? Wasn't there some sort of 253 00:12:24,480 --> 00:12:25,720 Speaker 2: coin related to him? 254 00:12:25,880 --> 00:12:30,000 Speaker 1: You're exactly right. So there's actually a Melania coin, right? 255 00:12:30,040 --> 00:12:32,520 Speaker 1: That's what I thought. And a Trump coin. Before we 256 00:12:32,559 --> 00:12:35,480 Speaker 1: get to those, there are two quick sort of definitional things. 257 00:12:35,520 --> 00:12:38,280 Speaker 1: We're going to talk about meme coins, which are like 258 00:12:38,320 --> 00:12:39,840 Speaker 1: the Trump coin and the Melania coin. 259 00:12:39,960 --> 00:12:41,560 Speaker 2: Trump coin, like Trump coin. 260 00:12:41,640 --> 00:12:44,920 Speaker 1: Yeah, yeah, and they're cryptocurrencies, but anyone can issue them, 261 00:12:45,080 --> 00:12:48,120 Speaker 1: and they're very much based on, like, internet hype. And 262 00:12:48,160 --> 00:12:50,480 Speaker 1: then you have stablecoins. Do you know what stablecoins 263 00:12:50,520 --> 00:12:51,160 Speaker 1: are? 264 00:12:51,200 --> 00:12:52,120 Speaker 2: I've heard of them, but I don't know. 
265 00:12:52,200 --> 00:12:55,200 Speaker 1: So a stablecoin is a coin that's tied to 266 00:12:55,280 --> 00:12:59,800 Speaker 1: a real asset that the stablecoin issuer has to 267 00:12:59,840 --> 00:13:02,640 Speaker 1: also own. So, in other words, a stablecoin 268 00:13:02,720 --> 00:13:04,760 Speaker 1: is backed by gold or by US dollars. And if 269 00:13:04,800 --> 00:13:07,040 Speaker 1: I own a stablecoin, I can redeem it at 270 00:13:07,040 --> 00:13:10,000 Speaker 1: any point for the real asset. Oh interesting. 271 00:13:10,280 --> 00:13:14,319 Speaker 1: That's why it's stable. Let's start with those meme coins, dollar 272 00:13:14,360 --> 00:13:19,800 Speaker 1: Trump and dollar Melania. So these coins were both issued 273 00:13:20,080 --> 00:13:23,520 Speaker 1: right before the inauguration and have since lost over ninety 274 00:13:23,520 --> 00:13:27,320 Speaker 1: percent of their valuation. Really? But two important points here, 275 00:13:27,640 --> 00:13:31,040 Speaker 1: really important points. The Trump and Melania coins, whenever they're 276 00:13:31,040 --> 00:13:34,880 Speaker 1: bought or sold, generate fees for the meme coin issuer. 277 00:13:35,240 --> 00:13:39,480 Speaker 1: So the trading volume, regardless of the price, enriches the 278 00:13:39,520 --> 00:13:43,840 Speaker 1: meme coin issuer. Guess how much the FT estimates that 279 00:13:43,920 --> 00:13:46,080 Speaker 1: the Trumps have made from the buying and selling of 280 00:13:46,080 --> 00:13:49,240 Speaker 1: these meme coins. How much? Four hundred million. 281 00:13:48,920 --> 00:13:52,720 Speaker 2: Dollars? Just on fees? Fees. 282 00:13:53,160 --> 00:13:56,760 Speaker 1: Bear in mind, these Trump and Melania coins made all 283 00:13:56,760 --> 00:13:59,080 Speaker 1: the headlines, but they aren't actually the real play. The 284 00:13:59,160 --> 00:14:03,079 Speaker 1: FT reports on World Liberty Financial, a company set 285 00:14:03,120 --> 00:14:07,000 Speaker 1: up by Trump's sons and Steve Witkoff's sons. Witkoff is 286 00:14:07,000 --> 00:14:09,880 Speaker 1: the Special Envoy to the Middle East, as well as kind 287 00:14:09,920 --> 00:14:12,920 Speaker 1: of an ambassador-at-large for Trump. For the sons, 288 00:14:13,040 --> 00:14:17,480 Speaker 1: it's the nepo baby of crypto. This company, World Liberty Financial, 289 00:14:17,679 --> 00:14:23,480 Speaker 1: has released two major tokens: WLFI, which gives you governance 290 00:14:23,640 --> 00:14:27,200 Speaker 1: rights in World Liberty Financial if you're a holder. This 291 00:14:27,280 --> 00:14:29,560 Speaker 1: has earned five hundred and fifty million dollars so far. 292 00:14:30,240 --> 00:14:33,360 Speaker 1: And then USD1, which is a stablecoin whose 293 00:14:33,400 --> 00:14:35,800 Speaker 1: price is pegged to the US dollar. They have sold 294 00:14:35,960 --> 00:14:38,840 Speaker 1: two point seven one billion dollars of this USD1 295 00:14:38,880 --> 00:14:41,280 Speaker 1: stablecoin. Bear in mind, twenty twenty five is the 296 00:14:41,320 --> 00:14:44,800 Speaker 1: real boom kickoff of this stuff. In twenty twenty four, 297 00:14:45,400 --> 00:14:48,920 Speaker 1: Trump declared a personal income of fifty seven point three 298 00:14:48,960 --> 00:14:51,200 Speaker 1: million dollars just from World Liberty Financial. 299 00:14:51,960 --> 00:14:54,120 Speaker 2: Is it legal, what he's doing? 300 00:14:54,760 --> 00:14:57,040 Speaker 1: I mean, I couldn't opine on that. 
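Since the fee mechanics are the crux of the meme coin story, here is a quick worked sketch of why trading volume, not price direction, is what pays the issuer. The one percent fee rate and the trade numbers below are made up for illustration; the FT's four-hundred-million-dollar figure is an estimate of the total, not a disclosed fee schedule.

```python
# Hypothetical illustration: issuer fees accrue on every trade's notional
# value, so a collapsing price can still enrich the issuer if volume is high.
def issuer_fee_revenue(trades: list[tuple[float, float]],
                       fee_rate: float = 0.01) -> float:
    """Sum the issuer's cut across (price, quantity) trades."""
    return sum(price * quantity * fee_rate for price, quantity in trades)


# The price falls roughly 90% across these made-up trades,
# yet the fees keep flowing on the way down, too.
trades = [(70.0, 10_000), (30.0, 40_000), (7.0, 100_000)]
print(issuer_fee_revenue(trades))  # 26000.0
```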
I'm sure that 301 00:14:57,080 --> 00:15:00,360 Speaker 1: there will be plenty of legal challenges to this, 302 00:15:00,440 --> 00:15:02,240 Speaker 1: but he's also in a position of changing the laws. 303 00:15:02,400 --> 00:15:05,640 Speaker 1: Drawing on that famous line from Frost/Nixon: if the president 304 00:15:05,720 --> 00:15:06,640 Speaker 1: does it, it is the 305 00:15:06,640 --> 00:15:09,960 Speaker 2: law. That was a good Frank Langella, thank you. 306 00:15:10,800 --> 00:15:13,920 Speaker 1: So we're in that era, I think, again, to say 307 00:15:13,920 --> 00:15:17,080 Speaker 1: the least. This crypto thing has two levers in terms 308 00:15:17,120 --> 00:15:21,160 Speaker 1: of exerting influence potentially on the president. On the domestic side, 309 00:15:21,280 --> 00:15:25,080 Speaker 1: American crypto companies donate heavily to the Trump campaign and 310 00:15:25,200 --> 00:15:30,520 Speaker 1: the World Liberty Digital Financial Freedom PAC. And, go figure, basically, 311 00:15:30,520 --> 00:15:33,400 Speaker 1: post Trump's presidency, there's been a lot of inflow 312 00:15:33,440 --> 00:15:36,800 Speaker 1: of, like, institutional capital into crypto, which is bolstering the 313 00:15:36,800 --> 00:15:42,320 Speaker 1: whole industry. Internationally, an Abu Dhabi based investment firm bought 314 00:15:42,400 --> 00:15:46,320 Speaker 1: two billion dollars of a Trump-backed stablecoin. A Chinese 315 00:15:46,360 --> 00:15:49,200 Speaker 1: company called GD Culture Group announced it had raised three 316 00:15:49,280 --> 00:15:52,200 Speaker 1: hundred million dollars to spend on bitcoin and the Trump 317 00:15:52,240 --> 00:15:57,400 Speaker 1: meme coin. And Trump's SEC stopped a fraud investigation into 318 00:15:57,440 --> 00:16:01,160 Speaker 1: the Chinese-born crypto billionaire Justin Sun after he 319 00:16:01,200 --> 00:16:04,400 Speaker 1: put seventy five million dollars into World Liberty Financial. 320 00:16:04,280 --> 00:16:06,160 Speaker 2: Again, it just feels... 321 00:16:06,480 --> 00:16:08,920 Speaker 1: I mean, it's a lot, to say the least. 322 00:16:09,320 --> 00:16:11,680 Speaker 1: But I've got some really good, reassuring news, actually. Please. 323 00:16:12,760 --> 00:16:16,400 Speaker 1: The president's crypto adventures are actually in a trust, really, 324 00:16:16,560 --> 00:16:22,560 Speaker 1: managed by Donald Trump Junior. Now, this is a quote 325 00:16:22,560 --> 00:16:25,480 Speaker 1: in the Financial Times from Richard Painter, the 326 00:16:25,520 --> 00:16:30,440 Speaker 1: former chief White House ethics lawyer to President George W. Bush. Quote: 327 00:16:30,480 --> 00:16:33,920 Speaker 1: Every other president since the Civil War has avoided any 328 00:16:33,960 --> 00:16:38,680 Speaker 1: significant financial conflict of interest with their official duties. You know, 329 00:16:38,800 --> 00:16:42,080 Speaker 1: going back to the original Trump administration in twenty sixteen 330 00:16:42,120 --> 00:16:45,240 Speaker 1: to twenty twenty, people fixated on, oh my god, he's 331 00:16:45,280 --> 00:16:48,320 Speaker 1: getting people to spend money in the hotel. Right, right. 332 00:16:48,440 --> 00:16:51,360 Speaker 1: Now it's, you know, the Trump family are going 333 00:16:51,440 --> 00:16:55,520 Speaker 1: to be billionaires for generations just from this. 334 00:16:55,520 --> 00:17:02,000 Speaker 2: From this. Yeah, that's incredible. 
After the break, satellites reveal 335 00:17:02,080 --> 00:17:05,399 Speaker 2: private phone calls, Uber has drivers training AI, and a 336 00:17:05,440 --> 00:17:09,080 Speaker 2: new health tracker guaranteed to make a splash. Then, on 337 00:17:09,200 --> 00:17:12,000 Speaker 2: Chat and Me, a listener's life gets a little greener. 338 00:17:12,280 --> 00:17:20,960 Speaker 1: Stay with us. Welcome back. Hello again, Cara. Do you 339 00:17:20,960 --> 00:17:24,959 Speaker 1: remember this sort of phrase, the gig economy? Yeah, of course. 340 00:17:25,200 --> 00:17:27,720 Speaker 1: I mean, this idea that you could, like, also be 341 00:17:27,800 --> 00:17:30,600 Speaker 1: an Uber driver for like two hours, or deliver a 342 00:17:30,640 --> 00:17:33,959 Speaker 1: package, or whatever it may be. So we're now in 343 00:17:34,000 --> 00:17:38,240 Speaker 1: the meta gig economy, the giga economy. I saw a 344 00:17:38,240 --> 00:17:42,800 Speaker 1: headline in Axios: Uber wants drivers to train AI in 345 00:17:42,840 --> 00:17:46,439 Speaker 1: their free time. No. Yeah. So basically the idea is 346 00:17:46,920 --> 00:17:49,800 Speaker 1: you're driving for Uber and you might have some time 347 00:17:49,840 --> 00:17:52,720 Speaker 1: between picking up passengers. So what do you do? 348 00:17:52,760 --> 00:17:53,600 Speaker 2: You train AI? 349 00:17:54,000 --> 00:17:56,159 Speaker 1: Basically, you can do like a one to three minute 350 00:17:56,240 --> 00:17:59,520 Speaker 1: task in the Uber app and get paid, like, you know, 351 00:17:59,560 --> 00:18:02,320 Speaker 1: a dollar. So average time per task is one to 352 00:18:02,359 --> 00:18:05,800 Speaker 1: three minutes. Pay is between fifty cents and a dollar 353 00:18:05,880 --> 00:18:09,040 Speaker 1: per task. So let's say you pull over, time for 354 00:18:09,080 --> 00:18:11,720 Speaker 1: your break, you're having your lunch. Basically, you get 355 00:18:11,720 --> 00:18:14,399 Speaker 1: a notification saying, would you like to do some work 356 00:18:14,480 --> 00:18:15,919 Speaker 1: to get additional income? 357 00:18:16,240 --> 00:18:17,200 Speaker 2: And what are they training, 358 00:18:17,320 --> 00:18:17,439 Speaker 2: like, 359 00:18:17,480 --> 00:18:19,159 Speaker 2: what is the data for? 360 00:18:19,960 --> 00:18:23,639 Speaker 1: So there's a separate AI company which is part of 361 00:18:23,720 --> 00:18:27,280 Speaker 1: Uber, called Uber AI Solutions. It's really to do with 362 00:18:27,920 --> 00:18:31,919 Speaker 1: the kind of human labeling aspect of data that is 363 00:18:32,000 --> 00:18:33,360 Speaker 1: required to train. 364 00:18:33,280 --> 00:18:34,960 Speaker 2: That a computer can't do, exactly. 365 00:18:35,000 --> 00:18:40,920 Speaker 1: And then Uber sells their human-driven AI data labeling 366 00:18:41,000 --> 00:18:44,800 Speaker 1: and data to other AI companies. The other thing is, 367 00:18:45,080 --> 00:18:48,719 Speaker 1: some of this data is being sold on to self-driving 368 00:18:48,760 --> 00:18:53,520 Speaker 1: tech companies like Aurora and Tier Four. So essentially, the 369 00:18:53,600 --> 00:18:57,359 Speaker 1: Uber drivers, in their free time, are training AI companies 370 00:18:57,400 --> 00:18:59,919 Speaker 1: to replace human drivers. 371 00:19:00,119 --> 00:19:02,240 Speaker 2: So they're basically training themselves out of a job. 372 00:19:02,840 --> 00:19:05,400 Speaker 1: Do you remember Wile E. Coyote? 373 00:19:04,840 --> 00:19:08,360 Speaker 2: Who's a coyote? 
What is that? That's like peyote? 374 00:19:08,440 --> 00:19:08,960 Speaker 1: What did you say? 375 00:19:09,440 --> 00:19:09,920 Speaker 2: Coyote? 376 00:19:10,040 --> 00:19:11,800 Speaker 1: Coyote, coyote. 377 00:19:11,920 --> 00:19:15,040 Speaker 2: It's like a coy little coyote. 378 00:19:15,760 --> 00:19:18,760 Speaker 1: Well, I feel suitably embarrassed now. But you 379 00:19:18,800 --> 00:19:21,000 Speaker 1: know the image of the, yeah, of course, 380 00:19:21,040 --> 00:19:24,000 Speaker 1: coyote running off the cliff, and he's pedaling his 381 00:19:24,000 --> 00:19:26,119 Speaker 1: feet like crazy, and underneath him it's nothing but air, 382 00:19:26,160 --> 00:19:27,760 Speaker 1: and then he realizes, and all of a sudden he's 383 00:19:27,800 --> 00:19:29,160 Speaker 1: on the floor. Yes, that's us. 384 00:19:29,320 --> 00:19:30,399 Speaker 2: That's a very good image. 385 00:19:30,480 --> 00:19:32,159 Speaker 1: I saw this week as well a story in 386 00:19:32,200 --> 00:19:36,200 Speaker 1: Bloomberg about how OpenAI has hired one hundred bankers. 387 00:19:36,480 --> 00:19:38,880 Speaker 1: Did you see this? No. They're paying them one hundred 388 00:19:38,920 --> 00:19:42,879 Speaker 1: and fifty dollars an hour, former investment bankers, to train 389 00:19:43,080 --> 00:19:47,159 Speaker 1: OpenAI's new product, a financial modeling tool, to do financial modeling. 390 00:19:47,200 --> 00:19:49,640 Speaker 1: So bankers are now being paid one hundred and fifty 391 00:19:49,640 --> 00:19:51,679 Speaker 1: dollars an hour to train an OpenAI product to 392 00:19:51,680 --> 00:19:52,840 Speaker 1: do this instead of them. 393 00:19:52,960 --> 00:19:55,440 Speaker 2: So I mean, that's putting themselves out of... 394 00:19:55,359 --> 00:19:57,680 Speaker 1: I mean, one hundred and fifty dollars an hour is nice. Yeah, 395 00:19:57,680 --> 00:20:01,560 Speaker 1: it's better than fifty cents a task. Yeah. But it's 396 00:20:01,560 --> 00:20:03,320 Speaker 1: the same thing, where it's like, when are we going 397 00:20:03,400 --> 00:20:07,600 Speaker 1: to wake up and realize that this intermediate opportunity of 398 00:20:07,640 --> 00:20:11,560 Speaker 1: getting some extra cash to train AI is potentially at 399 00:20:11,600 --> 00:20:14,040 Speaker 1: the expense of putting the entire human workforce out of 400 00:20:14,040 --> 00:20:14,840 Speaker 1: business? 401 00:20:14,880 --> 00:20:18,520 Speaker 2: It's really interesting. I mean, I guess it's a 402 00:20:18,560 --> 00:20:22,280 Speaker 2: short-term gain for a long-term problem for these drivers. 403 00:20:22,359 --> 00:20:24,199 Speaker 1: I mean, I think that's exactly right. If you have 404 00:20:24,240 --> 00:20:26,639 Speaker 1: an opportunity to make some extra money out of 405 00:20:26,680 --> 00:20:29,080 Speaker 1: your time while you're in your car, more power to 406 00:20:29,160 --> 00:20:32,520 Speaker 1: you as an individual. As a society, is it the 407 00:20:32,600 --> 00:20:33,359 Speaker 1: right thing to be doing? 408 00:20:33,840 --> 00:20:37,760 Speaker 2: Well, as you know, I like stories about things that 409 00:20:38,320 --> 00:20:40,280 Speaker 2: should not be happening that are happening, that make me go, 410 00:20:40,359 --> 00:20:42,840 Speaker 2: wait a minute, that doesn't feel right. And so the 411 00:20:42,840 --> 00:20:45,399 Speaker 2: story that I want to bring you today is the 412 00:20:45,440 --> 00:20:50,560 Speaker 2: following. Quote: 
Satellites beam data down to Earth all around 413 00:20:50,640 --> 00:20:53,400 Speaker 2: us all the time, so you might expect that those 414 00:20:53,440 --> 00:20:57,000 Speaker 2: space-based radio communications would be encrypted to prevent any 415 00:20:57,080 --> 00:20:59,679 Speaker 2: snoop with a satellite dish from accessing the torrent of 416 00:20:59,680 --> 00:21:04,119 Speaker 2: secret information constantly raining from the sky. You would, to 417 00:21:04,160 --> 00:21:07,719 Speaker 2: a surprising and troubling degree, be wrong. What? There are 418 00:21:07,720 --> 00:21:10,720 Speaker 2: sky snoops? So researchers at UC San Diego and 419 00:21:10,760 --> 00:21:14,080 Speaker 2: the University of Maryland just found that roughly half of 420 00:21:14,200 --> 00:21:19,080 Speaker 2: geostationary satellite signals are entirely vulnerable to eavesdropping. 421 00:21:19,400 --> 00:21:20,879 Speaker 1: Half? Wow. 422 00:21:21,280 --> 00:21:25,640 Speaker 2: For three years, researchers have developed and used an off- 423 00:21:25,680 --> 00:21:29,200 Speaker 2: the-shelf, meaning we could buy it, eight-hundred-dollar 424 00:21:29,200 --> 00:21:32,800 Speaker 2: satellite receiver system. They put it on the roof of 425 00:21:32,800 --> 00:21:36,200 Speaker 2: a university building in San Diego, pointed the dish at 426 00:21:36,200 --> 00:21:42,040 Speaker 2: different satellites, and interpreted the obscure but unprotected signals. And 427 00:21:42,080 --> 00:21:46,120 Speaker 2: this team accessed all kinds of unencrypted consumer, corporate, and 428 00:21:46,160 --> 00:21:49,399 Speaker 2: government communications from these satellites. Wow. It's a ton. 429 00:21:49,720 --> 00:21:51,960 Speaker 1: What kind of communications are these? I mean, is it like 430 00:21:52,000 --> 00:21:54,359 Speaker 1: phone calls, texts, or all of the above? 431 00:21:54,480 --> 00:21:57,440 Speaker 2: It's all of the above: calls and texts from T-Mobile's 432 00:21:57,520 --> 00:22:02,160 Speaker 2: cell network, data from airline passengers' in-flight Wi- 433 00:22:02,320 --> 00:22:05,960 Speaker 2: Fi browsing. And this is the one that, I mean, 434 00:22:06,080 --> 00:22:10,080 Speaker 2: it's all scary, but US and Mexican military and law 435 00:22:10,160 --> 00:22:15,680 Speaker 2: enforcement communications, which revealed the locations of personnel, equipment, and facilities. 436 00:22:15,920 --> 00:22:18,040 Speaker 1: This is really remarkable. 437 00:22:18,280 --> 00:22:22,280 Speaker 2: But wait, there's more. I should say that some companies 438 00:22:22,280 --> 00:22:24,040 Speaker 2: in the study, like T-Mobile and AT&T 439 00:22:24,200 --> 00:22:27,439 Speaker 2: Mexico, have actually already addressed the issue, but there is, 440 00:22:27,480 --> 00:22:31,080 Speaker 2: like, a lot of data still out there. Researchers estimated 441 00:22:31,080 --> 00:22:32,959 Speaker 2: that what they were able to pick up was actually 442 00:22:32,960 --> 00:22:36,600 Speaker 2: only fifteen percent of global satellite transponder communications. 443 00:22:36,720 --> 00:22:37,000 Speaker 1: Wow. 444 00:22:37,280 --> 00:22:40,560 Speaker 2: To me, the craziest part of this story is that 445 00:22:41,400 --> 00:22:44,640 Speaker 2: anyone can access this data, and that's because of two things. 446 00:22:44,960 --> 00:22:47,920 Speaker 2: The tech, the actual satellite receiver system, is eight hundred 447 00:22:47,960 --> 00:22:51,760 Speaker 2: dollars. And this is the craziest part. 
The researchers are 448 00:22:51,800 --> 00:22:56,639 Speaker 2: releasing their software tool for interpreting satellite data on GitHub. 449 00:22:56,680 --> 00:22:59,840 Speaker 2: So even though this data from satellites comes in obscured, 450 00:23:00,320 --> 00:23:04,640 Speaker 2: now anyone with access to GitHub, meaning everyone, can interpret 451 00:23:04,680 --> 00:23:05,280 Speaker 2: the data. 452 00:23:05,320 --> 00:23:06,760 Speaker 1: Why? Why would they release it? 453 00:23:07,080 --> 00:23:11,000 Speaker 2: So researchers actually argue that this could push companies to 454 00:23:11,119 --> 00:23:13,000 Speaker 2: secure their data, so they're sort of doing it to 455 00:23:13,160 --> 00:23:16,680 Speaker 2: force the hand of companies, to be like, guys, this 456 00:23:16,720 --> 00:23:19,880 Speaker 2: is something that's possible. Get your act together. Well, they're 457 00:23:19,920 --> 00:23:20,840 Speaker 2: kind of like white hats. 458 00:23:21,040 --> 00:23:23,560 Speaker 1: Actually, that's a good framing. White hat hackers are 459 00:23:23,640 --> 00:23:29,040 Speaker 1: people who basically find the vulnerabilities in systems in order 460 00:23:29,280 --> 00:23:32,359 Speaker 1: that those systems can be repaired and made less vulnerable, 461 00:23:32,840 --> 00:23:36,320 Speaker 1: rather than to blackmail people or steal their data 462 00:23:36,440 --> 00:23:40,359 Speaker 1: for nefarious purposes. Exactly. In other words, if these 463 00:23:40,400 --> 00:23:44,600 Speaker 1: researchers hadn't released the open source tool to interpret the data, 464 00:23:45,240 --> 00:23:48,600 Speaker 1: maybe Wired magazine wouldn't have covered the story. Cara, what 465 00:23:48,640 --> 00:23:51,200 Speaker 1: if I told you you could turn your bathroom into 466 00:23:51,280 --> 00:23:54,800 Speaker 1: a connected, data-informed health and wellness hub? 467 00:23:54,680 --> 00:23:57,639 Speaker 2: I'd say absolutely no thank you. Let's stay out of 468 00:23:57,680 --> 00:23:58,240 Speaker 2: my bathroom. 469 00:23:58,960 --> 00:24:01,720 Speaker 1: And this is quite a fascinating story for me. It's 470 00:24:01,760 --> 00:24:05,520 Speaker 1: about Kohler and a new health tracker they've announced called 471 00:24:05,560 --> 00:24:09,040 Speaker 1: the Dekoda, which lives in your toilet bowl. 472 00:24:09,880 --> 00:24:12,480 Speaker 2: I get it, as in decoder. Exactly. 473 00:24:14,040 --> 00:24:16,159 Speaker 1: This is a very interesting story to me because it 474 00:24:16,160 --> 00:24:20,320 Speaker 1: actually harkens back to my childhood watching daytime television in Britain. 475 00:24:20,840 --> 00:24:25,320 Speaker 1: There was a TV doctor called Gillian McKeith whose concept 476 00:24:25,640 --> 00:24:29,560 Speaker 1: was to go into people's houses and examine their bathroom 477 00:24:31,280 --> 00:24:33,879 Speaker 1: to tell them about what they could do to improve 478 00:24:33,920 --> 00:24:38,480 Speaker 1: their health. Gillian McKeith has actually had a second life. I 479 00:24:38,480 --> 00:24:43,720 Speaker 1: recently saw a story from my favorite showbiz publication, Bang Showbiz, 480 00:24:43,760 --> 00:24:47,000 Speaker 1: with the headline, my poo TikToks have saved lives. 481 00:24:48,080 --> 00:24:50,399 Speaker 3: Someone just told me their poo bobs around in the 482 00:24:50,400 --> 00:24:51,560 Speaker 3: bottom of the toilet bowl. 483 00:24:51,600 --> 00:24:54,200 Speaker 1: It does not flush. Your toilet's not broken. 
484 00:24:54,440 --> 00:24:58,119 Speaker 3: No, no, no, but your digestion is waving a big 485 00:24:58,160 --> 00:25:00,399 Speaker 3: brown flag and begging for attention. 486 00:25:01,040 --> 00:25:04,600 Speaker 1: But now everyone can have a Gillian McKeith in their toilet, 487 00:25:04,760 --> 00:25:08,800 Speaker 1: or at their toilet. So the Dekoda tracks a few things: 488 00:25:09,160 --> 00:25:14,199 Speaker 1: the frequency, consistency, and shape of your waste, which in 489 00:25:14,240 --> 00:25:17,520 Speaker 1: turn can give you recommendations, like: are you properly hydrated, 490 00:25:18,080 --> 00:25:21,480 Speaker 1: are you absorbing nutrients? And of course, to Gillian's point, 491 00:25:22,000 --> 00:25:23,520 Speaker 1: you know, are you actually sick? Do you have some 492 00:25:23,600 --> 00:25:26,200 Speaker 1: kind of bowel issue that could be anything up to deadly? 493 00:25:26,320 --> 00:25:27,800 Speaker 2: Is it effective in that way? 494 00:25:27,720 --> 00:25:31,840 Speaker 1: It can be effective. The product comes in three parts. There 495 00:25:31,880 --> 00:25:35,280 Speaker 1: is a sensor that clamps to most toilet bowls, you don't 496 00:25:35,320 --> 00:25:39,280 Speaker 1: have to have a Kohler; a wall mount that communicates 497 00:25:39,480 --> 00:25:42,760 Speaker 1: with the bowl sensor and has a fingerprint sensor to 498 00:25:42,880 --> 00:25:44,640 Speaker 1: track different users. 499 00:25:44,600 --> 00:25:50,280 Speaker 2: Oh, so it's not just you, anyone can log into the bowl. 500 00:25:51,359 --> 00:25:54,359 Speaker 2: That's just stuff I don't want to get stolen, data 501 00:25:54,359 --> 00:25:55,280 Speaker 2: that I don't want out there. 502 00:25:55,480 --> 00:25:58,480 Speaker 1: There's also a health app for phones. And, per The 503 00:25:58,600 --> 00:26:04,240 Speaker 1: Verge, advanced sensors that use spectroscopy to observe how light 504 00:26:04,560 --> 00:26:08,040 Speaker 1: interacts with your waste. The sensors are angled down so 505 00:26:08,080 --> 00:26:10,120 Speaker 1: they can only see the inside of the bowl. They're 506 00:26:10,160 --> 00:26:14,800 Speaker 1: not looking up, so to speak. And this is seven dollars 507 00:26:14,800 --> 00:26:17,080 Speaker 1: a month, six ninety-nine per month, for single users, 508 00:26:17,640 --> 00:26:21,760 Speaker 1: or twelve ninety-nine per month for family plans. Cheaterbuster! 509 00:26:22,080 --> 00:26:23,240 Speaker 1: It's cheaper than Cheaterbuster. 510 00:26:23,280 --> 00:26:25,720 Speaker 2: It's just crazy, especially for the whole family. 511 00:26:25,760 --> 00:26:27,760 Speaker 1: And to your point about privacy, and I'm not surprised you're 512 00:26:27,760 --> 00:26:30,480 Speaker 1: on the privacy track after our satellite story: the data 513 00:26:30,560 --> 00:26:32,200 Speaker 1: is end-to-end encrypted. 514 00:26:32,680 --> 00:26:35,640 Speaker 2: It's as hard as the stool that's in it. I'm 515 00:26:35,640 --> 00:26:36,960 Speaker 2: sorry, I had to. 516 00:26:36,840 --> 00:26:39,880 Speaker 1: So, I mean, look, this is funny. 517 00:26:40,000 --> 00:26:42,119 Speaker 1: Health tracking is obviously a huge trend. I mean, I 518 00:26:42,119 --> 00:26:44,280 Speaker 1: wonder how many different trackers you get? The 519 00:26:44,359 --> 00:26:49,639 Speaker 1: Oura Ring, the Whoop band, the Dekoda, the mattress insert... 
520 00:26:49,560 --> 00:26:52,840 Speaker 2: You're just a moving biometric, yeah, at that point. Which, I mean, 521 00:26:53,880 --> 00:26:55,439 Speaker 2: I don't know if I'm that interested in that. Like, 522 00:26:55,520 --> 00:26:57,359 Speaker 2: do I, every day, want to know that something is 523 00:26:57,400 --> 00:27:12,840 Speaker 2: inspecting my excrement? It feels like Inspector Gadget. 524 00:27:14,280 --> 00:27:16,199 Speaker 1: And now it's time for Chat and Me. 525 00:27:16,480 --> 00:27:19,000 Speaker 2: This week we've got a listener submission, all the way 526 00:27:19,040 --> 00:27:23,840 Speaker 2: from the Netherlands. Anina sent a recording called The Green 527 00:27:24,040 --> 00:27:25,000 Speaker 2: Side of Chat. 528 00:27:25,480 --> 00:27:28,760 Speaker 3: It is often said that the enormous rise of AI 529 00:27:29,000 --> 00:27:32,200 Speaker 3: poses a threat to the environment due to its high 530 00:27:32,240 --> 00:27:37,159 Speaker 3: consumption of energy and water, but perhaps you'd like to 531 00:27:37,200 --> 00:27:42,320 Speaker 3: hear that AI can sometimes also contribute to improving our 532 00:27:42,400 --> 00:27:43,879 Speaker 3: living environment. 533 00:27:44,080 --> 00:27:48,000 Speaker 2: So lovely. Again, Anina said that the Netherlands is very 534 00:27:48,040 --> 00:27:51,240 Speaker 2: densely populated, which has caused a lack of green space. 535 00:27:51,720 --> 00:27:54,679 Speaker 2: So the government has actually been encouraging people for a 536 00:27:54,720 --> 00:27:57,680 Speaker 2: few years to start replacing the pavement in their yards 537 00:27:57,760 --> 00:27:58,520 Speaker 2: with more greenery. 538 00:27:58,960 --> 00:28:02,760 Speaker 3: This summer, I decided to tackle my garden and replace quite a 539 00:28:02,840 --> 00:28:07,280 Speaker 3: few square meters of stones with beautiful shrubs, plants, and flowers. 540 00:28:07,880 --> 00:28:12,840 Speaker 3: But I didn't want to just plant any greenery. I thought, 541 00:28:13,119 --> 00:28:16,159 Speaker 3: why not kill two birds with one stone and choose 542 00:28:16,280 --> 00:28:22,119 Speaker 3: native plants that also attract bees, butterflies, and birds. I 543 00:28:22,240 --> 00:28:25,720 Speaker 3: just didn't know where to start, because I also wanted 544 00:28:25,760 --> 00:28:27,560 Speaker 3: it all to look nice, of course. 545 00:28:28,119 --> 00:28:29,800 Speaker 1: Well, Anina, thank you for sending this in. I just 546 00:28:29,840 --> 00:28:32,400 Speaker 1: want to say to you directly, if you're not narrating 547 00:28:32,440 --> 00:28:34,440 Speaker 1: books already, you should be. You'd be great. 548 00:28:34,840 --> 00:28:36,960 Speaker 1: You've got a great voice and a great delivery. And 549 00:28:37,000 --> 00:28:39,440 Speaker 1: this really hits home for me, Cara, because I have 550 00:28:39,480 --> 00:28:42,080 Speaker 1: just moved into a house with a yard. With a yard! 551 00:28:42,200 --> 00:28:46,360 Speaker 1: And it's a landlord-owned property, and the landlord has 552 00:28:46,400 --> 00:28:49,320 Speaker 1: moved to another country, and part of the deal was 553 00:28:49,360 --> 00:28:52,080 Speaker 1: that I would take care of the yard. And David, 554 00:28:52,120 --> 00:28:54,200 Speaker 1: if you're listening, I promise I'm gonna do my best, 555 00:28:54,200 --> 00:28:56,120 Speaker 1: but I am pretty overwhelmed. 556 00:28:56,240 --> 00:28:57,400 Speaker 2: Do you have a green thumb, Oz? 
557 00:28:57,760 --> 00:29:00,320 Speaker 1: Well, my grandmother and my father, they were 558 00:29:00,320 --> 00:29:02,920 Speaker 1: both big gardeners, so I kind of grew up with 559 00:29:03,280 --> 00:29:07,440 Speaker 1: caring for gardens in mind. But, like, oh my goodness, 560 00:29:07,440 --> 00:29:09,120 Speaker 1: when to mow the grass, when to prune 561 00:29:09,120 --> 00:29:11,480 Speaker 1: the roses, like, I don't know. The last thing I want 562 00:29:11,560 --> 00:29:14,720 Speaker 1: is for this beautiful garden to turn into a hellscape. 563 00:29:15,080 --> 00:29:18,440 Speaker 1: And the same goes for David. So, Anina, I'm going to take 564 00:29:18,480 --> 00:29:21,479 Speaker 1: a leaf out of your book, haha, so to speak, 565 00:29:21,560 --> 00:29:24,080 Speaker 1: and follow your lead here. I assume what we're going 566 00:29:24,120 --> 00:29:27,320 Speaker 1: to hear about is how Anina used Chat to 567 00:29:27,360 --> 00:29:28,360 Speaker 1: develop a green thumb. 568 00:29:28,440 --> 00:29:30,160 Speaker 2: Indeed. Look what she says. 569 00:29:30,880 --> 00:29:34,000 Speaker 3: I gave ChatGPT a description of the layout of 570 00:29:34,040 --> 00:29:38,000 Speaker 3: the garden, how much sun the different parts get each day, 571 00:29:38,600 --> 00:29:41,640 Speaker 3: and my wishes regarding the types of plants I wanted. 572 00:29:42,760 --> 00:29:45,360 Speaker 3: To my surprise, I not only received a list of 573 00:29:45,440 --> 00:29:50,080 Speaker 3: plants that would fit well in my new ecologically responsible garden, 574 00:29:50,560 --> 00:29:53,800 Speaker 3: but Chat also offered to make a planting plan for me, 575 00:29:54,400 --> 00:29:58,520 Speaker 3: taking into account the height, color, and structure of the 576 00:29:58,600 --> 00:29:59,520 Speaker 3: various species. 577 00:30:00,280 --> 00:30:02,760 Speaker 1: I'm very curious if this was a Chat hallucination or 578 00:30:02,800 --> 00:30:05,000 Speaker 1: if the garden was transformed into a paradise. 579 00:30:05,080 --> 00:30:07,240 Speaker 2: Well, I will let Anina tell you, because I just 580 00:30:07,280 --> 00:30:09,000 Speaker 2: never want to stop listening to her voice. 581 00:30:09,600 --> 00:30:13,600 Speaker 3: Well, I have now created my AI-based garden, and 582 00:30:13,800 --> 00:30:19,640 Speaker 3: it looks completely wonderful. It meets my expectations in every way. 583 00:30:20,240 --> 00:30:23,880 Speaker 3: It has become a beautiful green garden where butterflies and 584 00:30:23,960 --> 00:30:27,400 Speaker 3: bees come and go, and all kinds of birds forage 585 00:30:27,440 --> 00:30:32,040 Speaker 3: for food. So you see, every story has two sides, 586 00:30:32,480 --> 00:30:36,600 Speaker 3: even Chat. This time she has shown her green side. 587 00:30:36,960 --> 00:30:38,600 Speaker 3: Thank you, Chat. 588 00:30:38,640 --> 00:30:39,520 Speaker 2: I like that Chat's gendered. 589 00:30:39,600 --> 00:30:43,440 Speaker 1: Chat's a she. She takes a leaf out of your book. That's 590 00:30:43,480 --> 00:30:45,840 Speaker 1: true. And thank you. That was one of my favorite 591 00:30:45,880 --> 00:30:49,960 Speaker 1: ever Chat and Me submissions. That is incredible. It'll be good and personally 592 00:30:50,040 --> 00:30:52,160 Speaker 1: useful as well, which was part of the idea for 593 00:30:52,240 --> 00:30:55,000 Speaker 1: doing the segment. So thank you. Just to close, we 594 00:30:55,080 --> 00:30:57,840 Speaker 1: really want to hear from you, our listeners. 
Please send 595 00:30:57,920 --> 00:30:59,960 Speaker 1: us all of your chat stories, the good, the bad, 596 00:31:00,600 --> 00:31:03,240 Speaker 1: and the ugly, Tech Green and the Green, the good, 597 00:31:03,240 --> 00:31:06,120 Speaker 1: the bear, the green and the ugly. Tech Stuff podcast 598 00:31:06,360 --> 00:31:16,840 Speaker 1: at gmail dot com. 599 00:31:16,960 --> 00:31:18,640 Speaker 2: That's it for this week for tech Stuff. 600 00:31:18,640 --> 00:31:21,880 Speaker 1: I'm Cara Price and I'm as Voloshin. This episode was 601 00:31:21,920 --> 00:31:25,400 Speaker 1: produced by Eliza Dennis, Tyler Hill and Melissa Slaughter. It 602 00:31:25,480 --> 00:31:28,640 Speaker 1: was executive produced by me Kara Price, Julian Nutter, and 603 00:31:28,680 --> 00:31:32,720 Speaker 1: Kate Osborne for Kaleidoscope and Katrian Norvell for iHeart Podcasts. 604 00:31:33,200 --> 00:31:36,960 Speaker 1: The engineer is Bihid Fraser and Jack Insley mix this episode. 605 00:31:37,400 --> 00:31:39,120 Speaker 1: Kyle Murdoch wrotear theme song. 606 00:31:39,520 --> 00:31:42,440 Speaker 2: Join us next Wednesday for a conversation all about tech 607 00:31:42,520 --> 00:31:43,600 Speaker 2: induced paranoia. 608 00:31:44,000 --> 00:31:46,760 Speaker 1: Please rate, review and reach out to us at tech 609 00:31:46,800 --> 00:31:49,960 Speaker 1: Stuff podcast at gmail dot com. We love hearing from you.