1 00:00:00,920 --> 00:00:03,600 Speaker 1: Thanks for tuning in to Tech Stuff. If you don't recognize 2 00:00:03,600 --> 00:00:06,720 Speaker 1: my voice, my name is Oz Woloshyn, and I'm here because 3 00:00:06,760 --> 00:00:09,840 Speaker 1: the inimitable Jonathan Strickland has passed the baton to Kara 4 00:00:09,960 --> 00:00:12,959 Speaker 1: Price and myself to host Tech Stuff. The show will 5 00:00:12,960 --> 00:00:15,720 Speaker 1: remain your home for all things tech, and all the 6 00:00:15,760 --> 00:00:19,480 Speaker 1: old episodes will remain available in this feed. Thanks for listening. 7 00:00:21,280 --> 00:00:25,080 Speaker 1: Welcome to Tech Stuff, a production of iHeart Podcasts and Kaleidoscope. 8 00:00:25,160 --> 00:00:31,520 Speaker 1: I'm Oz Woloshyn. Today we'll bring you the headlines, including supersonic planes, 9 00:00:31,880 --> 00:00:34,680 Speaker 1: a conversation with 404 Media's Joseph Cox about 10 00:00:34,680 --> 00:00:38,239 Speaker 1: the aftermath of quote the largest sting operation ever on 11 00:00:38,280 --> 00:00:41,279 Speaker 1: today's Tech Support. Then we're back with another When Did 12 00:00:41,280 --> 00:00:43,760 Speaker 1: This Become a Thing? This time we're looking into Google's 13 00:00:43,760 --> 00:00:47,280 Speaker 1: new assistant Gemini. All of that on The Week in Tech. 14 00:00:47,680 --> 00:00:53,960 Speaker 1: It's Friday, February seventh. So this week I'm hosting solo again. 15 00:00:54,040 --> 00:00:56,480 Speaker 1: Kara Price is out for a few weeks, but our 16 00:00:56,520 --> 00:00:59,560 Speaker 1: producer Eliza Dennis is here to help me through the headlines. Eliza, 17 00:00:59,560 --> 00:01:00,200 Speaker 1: welcome back. 18 00:01:00,520 --> 00:01:01,680 Speaker 2: Thanks, happy to be here. 19 00:01:01,880 --> 00:01:05,680 Speaker 1: We've got an exciting potpourri. We've got planes, fridges, and 20 00:01:05,840 --> 00:01:06,840 Speaker 1: Roman Catholicism.
21 00:01:08,000 --> 00:01:10,080 Speaker 2: This list is giving me Jeopardy vibes. 22 00:01:11,760 --> 00:01:14,000 Speaker 1: Let's start with fridges, but we're going to do all three. 23 00:01:14,040 --> 00:01:17,440 Speaker 1: But we'll start with fridges. These aren't ordinary fridges. They're 24 00:01:17,480 --> 00:01:20,480 Speaker 1: smart fridges. And I read a story in Bloomberg by 25 00:01:20,520 --> 00:01:24,480 Speaker 1: Austin Carr under the headline Walgreens replaced fridge doors with 26 00:01:24,560 --> 00:01:28,280 Speaker 1: smart screens. It's now a two hundred million dollar fiasco. 27 00:01:28,920 --> 00:01:31,880 Speaker 1: It all started a few years ago when Walgreens began 28 00:01:32,040 --> 00:01:35,120 Speaker 1: installing smart refrigerators in some of their stores. 29 00:01:35,840 --> 00:01:38,440 Speaker 2: I don't think I ever saw one of these in 30 00:01:38,520 --> 00:01:41,920 Speaker 2: the many, many Walgreens I've been in. How about you, Oz? 31 00:01:42,000 --> 00:01:44,120 Speaker 1: I haven't actually seen them in person either, but the 32 00:01:44,120 --> 00:01:48,000 Speaker 1: Bloomberg article was very well written and visually described. Most 33 00:01:48,040 --> 00:01:51,960 Speaker 1: of these fridges are actually in Chicago, and they're not 34 00:01:52,000 --> 00:01:55,720 Speaker 1: really smart fridges, they're actually smart doors. Picture a door-sized 35 00:01:55,880 --> 00:02:00,360 Speaker 1: computer screen, and the screen mimics exactly what you'd 36 00:02:00,400 --> 00:02:04,320 Speaker 1: see through a glass door into the space, so basically 37 00:02:04,680 --> 00:02:07,520 Speaker 1: rows of Gatorade or energy drinks. But instead of the 38 00:02:07,520 --> 00:02:10,040 Speaker 1: actual product, you're now looking at a digital image of 39 00:02:10,080 --> 00:02:13,880 Speaker 1: the product that's inside the fridge.
And that means digital prices, 40 00:02:13,919 --> 00:02:15,280 Speaker 1: no more sticker price tags. 41 00:02:15,440 --> 00:02:18,680 Speaker 2: Okay, so is this just trying to make sure that 42 00:02:18,720 --> 00:02:21,280 Speaker 2: employees don't have to walk around with those guns and 43 00:02:21,480 --> 00:02:24,359 Speaker 2: re-stick prices every time there's a sale? 44 00:02:24,680 --> 00:02:27,280 Speaker 1: Probably a bit of that. And also dynamic pricing, so 45 00:02:27,320 --> 00:02:29,040 Speaker 1: the ability to change prices in real time. 46 00:02:29,680 --> 00:02:32,440 Speaker 2: Kara talked about this a few weeks ago, this dynamic 47 00:02:32,480 --> 00:02:35,799 Speaker 2: pricing where prices can fluctuate at a moment's notice. And 48 00:02:36,480 --> 00:02:38,919 Speaker 2: it's also a side effect of the smart shopping carts 49 00:02:38,919 --> 00:02:42,000 Speaker 2: that she talked about, you know, those being piloted by Instacart. 50 00:02:42,240 --> 00:02:44,640 Speaker 1: Yes, the screens, both on the Caper Carts and on 51 00:02:44,720 --> 00:02:48,679 Speaker 1: these fridges, can do smart pricing, but they also play ads, 52 00:02:48,720 --> 00:02:53,440 Speaker 1: and per Bloomberg, this is called quote unquote retail media advertising, 53 00:02:53,720 --> 00:02:56,560 Speaker 1: which is a gold rush right now. So you know, 54 00:02:56,600 --> 00:03:00,120 Speaker 1: the fridges are interesting, but there's a legal aspect here too. 55 00:03:00,240 --> 00:03:03,440 Speaker 1: So Walgreens signed a ten year contract with the startup 56 00:03:03,480 --> 00:03:07,320 Speaker 1: behind the screens, Cooler Screens, Inc., after an early pilot 57 00:03:07,400 --> 00:03:11,080 Speaker 1: showed that the screens could boost sales by about five percent. However, 58 00:03:11,280 --> 00:03:15,000 Speaker 1: early gains turned into a months-long contentious legal slog. 59 00:03:15,440 --> 00:03:19,080 Speaker 1: Here's the overview.
So Walgreens installed ten thousand smart doors 60 00:03:19,440 --> 00:03:22,880 Speaker 1: and planned to install thirty five thousand more, but back 61 00:03:22,919 --> 00:03:26,239 Speaker 1: in twenty twenty three, things started to go a little wrong. 62 00:03:27,200 --> 00:03:31,400 Speaker 1: The screens started advertising the wrong items, they flickered, they 63 00:03:31,400 --> 00:03:35,800 Speaker 1: went blank, some even caught on fire. Yeah, and they 64 00:03:35,840 --> 00:03:40,520 Speaker 1: weren't popular with the higher ups at Walgreens. According to Bloomberg, 65 00:03:40,600 --> 00:03:43,080 Speaker 1: the CEO of Walgreens at the time asked her team, 66 00:03:43,640 --> 00:03:49,120 Speaker 1: why do our stores look like an effing casino? And 67 00:03:49,160 --> 00:03:51,200 Speaker 1: not only that, maybe a casino would have been tolerable 68 00:03:51,240 --> 00:03:54,640 Speaker 1: if it also boosted sales hugely, but Walgreens claimed there 69 00:03:54,640 --> 00:03:56,600 Speaker 1: was practically no boost in sales. 70 00:03:56,800 --> 00:04:01,160 Speaker 2: Okay, so ugly and ineffective, you know, that's that's the take. 71 00:04:01,440 --> 00:04:04,400 Speaker 1: And so Walgreens had enough and started this legal process 72 00:04:04,440 --> 00:04:07,160 Speaker 1: to try and get out of the contract early. Cooler 73 00:04:07,200 --> 00:04:09,680 Speaker 1: Screens didn't like this, and back in June twenty twenty three, 74 00:04:10,040 --> 00:04:12,880 Speaker 1: they sued Walgreens for a breach of contract. Walgreens then 75 00:04:12,960 --> 00:04:14,760 Speaker 1: counter-sued for monetary damages. 76 00:04:14,800 --> 00:04:15,160 Speaker 2: This is the... 77 00:04:16,760 --> 00:04:22,320 Speaker 1: Baldoni Lively of smart fridges. But Cooler Screens then found 78 00:04:22,320 --> 00:04:25,800 Speaker 1: another way to fight back.
According to Bloomberg, they secretly 79 00:04:25,839 --> 00:04:28,640 Speaker 1: cut the data feeds to these smart screen doors in 80 00:04:28,680 --> 00:04:31,080 Speaker 1: over one hundred Walgreens locations in Chicago. 81 00:04:31,600 --> 00:04:33,200 Speaker 2: How did that affect Walgreens? 82 00:04:33,440 --> 00:04:36,440 Speaker 1: So, I mean, these stores had dozens of doors that 83 00:04:36,520 --> 00:04:39,840 Speaker 1: looked kind of glazed over, and customers obviously couldn't see 84 00:04:39,839 --> 00:04:44,360 Speaker 1: what was inside anymore, so employees had to physically and 85 00:04:44,440 --> 00:04:47,839 Speaker 1: valiantly but vaguely describe what was behind each door, like 86 00:04:47,880 --> 00:04:51,600 Speaker 1: there would be hand-pasted signs saying assorted sports drinks 87 00:04:51,600 --> 00:04:56,119 Speaker 1: and coffee. This went on for about a week until 88 00:04:56,120 --> 00:04:59,080 Speaker 1: Walgreens got a judge to issue a temporary restraining order 89 00:04:59,120 --> 00:05:02,080 Speaker 1: against Cooler Screens, and this forced the company to restore 90 00:05:02,120 --> 00:05:06,360 Speaker 1: the data feeds. So Walgreens got their unwanted smart doors back, 91 00:05:06,520 --> 00:05:09,040 Speaker 1: but the screens were still not performing as hoped, so 92 00:05:09,080 --> 00:05:12,160 Speaker 1: Walgreens started physically removing them in early twenty twenty four, 93 00:05:12,800 --> 00:05:15,680 Speaker 1: and uninstallation was completed in August last year. 94 00:05:16,200 --> 00:05:18,400 Speaker 2: What's your takeaway, Oz? 95 00:05:18,440 --> 00:05:21,520 Speaker 1: You know, these tech fails are always amusing.
96 00:05:21,760 --> 00:05:24,000 Speaker 1: There was a great little aside in the Bloomberg article 97 00:05:24,560 --> 00:05:27,760 Speaker 1: about a Reddit post which went viral showing a customer 98 00:05:27,880 --> 00:05:31,280 Speaker 1: joking, if only there was some other technology that would 99 00:05:31,320 --> 00:05:36,800 Speaker 1: let us see what's inside there. Ha, exactly. So for me, 100 00:05:36,839 --> 00:05:38,880 Speaker 1: the takeaway here is that, you know, the latest consumer 101 00:05:38,920 --> 00:05:42,200 Speaker 1: technology can be exciting, specifically it can boost sales for corporations, 102 00:05:42,279 --> 00:05:46,839 Speaker 1: exciting for corporations. But sometimes the best tech is centuries old, 103 00:05:47,120 --> 00:05:49,679 Speaker 1: and in this case, the good old glass display case 104 00:05:50,200 --> 00:05:51,039 Speaker 1: still does the job. 105 00:05:51,480 --> 00:05:54,600 Speaker 2: Okay, I love that story. Let's hear about planes. 106 00:05:55,279 --> 00:05:57,640 Speaker 1: Yeah, so this one was... I heard about this last 107 00:05:57,680 --> 00:06:00,000 Speaker 1: week, and I personally spend way too much time on planes, 108 00:06:00,720 --> 00:06:02,640 Speaker 1: so I was very intrigued to read about a team 109 00:06:02,720 --> 00:06:06,000 Speaker 1: who've just had a breakthrough in potentially making flying much, 110 00:06:06,120 --> 00:06:06,799 Speaker 1: much faster. 111 00:06:07,400 --> 00:06:08,720 Speaker 2: How fast are we talking? 112 00:06:09,040 --> 00:06:12,880 Speaker 1: Well, take flying from San Francisco to Tokyo, usually an 113 00:06:12,920 --> 00:06:15,360 Speaker 1: eleven hour flight. What if I told you there was 114 00:06:15,400 --> 00:06:17,480 Speaker 1: a way to make that flight only six hours? 115 00:06:18,120 --> 00:06:20,600 Speaker 2: I mean, I still wouldn't be able to afford it, 116 00:06:20,680 --> 00:06:23,760 Speaker 2: but I would want to go all the time. Fair.
117 00:06:24,320 --> 00:06:27,880 Speaker 1: So this is an aviation startup called Boom, and what 118 00:06:27,880 --> 00:06:30,760 Speaker 1: they're trying to do is to create a modern commercial 119 00:06:30,960 --> 00:06:33,960 Speaker 1: supersonic flight, as in a plane that flies so fast 120 00:06:34,040 --> 00:06:37,600 Speaker 1: it breaks the sound barrier, quote, displacing sound waves so 121 00:06:37,720 --> 00:06:40,960 Speaker 1: quickly that they compress into a single thunderous clap, 122 00:06:41,320 --> 00:06:45,600 Speaker 1: forming the so called sonic boom. Hence the name Boom. 123 00:06:46,640 --> 00:06:48,640 Speaker 1: And this is according to an article this week in 124 00:06:48,720 --> 00:06:50,320 Speaker 1: the San Francisco Standard. 125 00:06:51,000 --> 00:06:54,599 Speaker 2: So how far are we from Boom, kind of, like, 126 00:06:55,160 --> 00:06:56,440 Speaker 2: you know, taking me to London? 127 00:06:57,120 --> 00:06:59,960 Speaker 1: Well, Boom had its first successful test flight that broke 128 00:07:00,040 --> 00:07:03,520 Speaker 1: the sound barrier just last week, flying high over California's 129 00:07:03,560 --> 00:07:08,160 Speaker 1: Mojave Desert. This XB one prototype broke the sound barrier. 130 00:07:08,520 --> 00:07:10,800 Speaker 1: But to be clear, the XB one is just a prototype, 131 00:07:11,000 --> 00:07:13,680 Speaker 1: and it is a scale model. In fact, it's a 132 00:07:13,720 --> 00:07:18,760 Speaker 1: single seater, whereas the planned aircraft, called the Overture, which 133 00:07:18,760 --> 00:07:21,120 Speaker 1: has yet to be built, is going to be eighty seats. Now, 134 00:07:21,160 --> 00:07:23,960 Speaker 1: of course, military jets break the sound barrier all day, 135 00:07:24,000 --> 00:07:26,520 Speaker 1: every day, and this isn't that new in this sense; there 136 00:07:26,560 --> 00:07:30,160 Speaker 1: is a history of supersonic passenger aircraft in commercial use.
137 00:07:30,560 --> 00:07:32,400 Speaker 2: So this is the Concorde, Oz? 138 00:07:32,000 --> 00:07:34,640 Speaker 1: The Concorde exactly, and I think one of the most 139 00:07:34,800 --> 00:07:38,440 Speaker 1: iconic planes in history. The Concorde was first put into 140 00:07:38,480 --> 00:07:41,520 Speaker 1: flight in the sixties. It was a supersonic jet that 141 00:07:41,640 --> 00:07:45,840 Speaker 1: shuttled the likes of Michael Jackson, Phil Collins, Paul McCartney, the 142 00:07:45,880 --> 00:07:49,400 Speaker 1: Pope and even the Queen back and forth across the Atlantic. 143 00:07:49,600 --> 00:07:52,320 Speaker 1: But per the San Francisco Standard, it cost twenty thousand 144 00:07:52,360 --> 00:07:55,840 Speaker 1: dollars per round trip adjusted for inflation. So yeah, not 145 00:07:56,560 --> 00:08:01,480 Speaker 1: for us. There were also environmental concerns. I think it consumed, 146 00:08:01,520 --> 00:08:04,559 Speaker 1: I read, four times as much fuel as a seven 147 00:08:04,600 --> 00:08:07,520 Speaker 1: four seven and carried only a quarter of the people. 148 00:08:08,000 --> 00:08:11,520 Speaker 1: So that's sixteen times less efficient, for any mental math nerds, 149 00:08:12,480 --> 00:08:14,040 Speaker 1: who would have been able to get there much quicker than me. 150 00:08:14,400 --> 00:08:17,160 Speaker 1: And then there were also some crashes and fires. Okay, 151 00:08:17,680 --> 00:08:19,480 Speaker 1: so all of this meant that the Concorde flew its 152 00:08:19,560 --> 00:08:21,800 Speaker 1: last flight more than twenty years ago, in two thousand 153 00:08:21,840 --> 00:08:24,920 Speaker 1: and three. But now there's Overture gunning to take its place. 154 00:08:25,280 --> 00:08:27,560 Speaker 1: And what they're trying to do over at Boom is 155 00:08:27,600 --> 00:08:30,280 Speaker 1: to build a supersonic plane that would be accessible to 156 00:08:30,360 --> 00:08:32,880 Speaker 1: regular people.
And they're trying to do this by using 157 00:08:32,920 --> 00:08:35,199 Speaker 1: modern tech that will cost less than the Concordes of 158 00:08:36,040 --> 00:08:39,880 Speaker 1: yesteryear, and also using more sustainable fuel. So this, 159 00:08:39,920 --> 00:08:43,800 Speaker 1: perhaps unsurprisingly, has caught the eye of Silicon Valley luminaries; 160 00:08:44,040 --> 00:08:47,520 Speaker 1: people like Sam Altman, Reid Hoffman, Mike Moritz have all invested, 161 00:08:48,200 --> 00:08:51,360 Speaker 1: and if things go to plan, Boom says that Overture 162 00:08:51,400 --> 00:08:53,440 Speaker 1: could begin commercial flights in five years. 163 00:08:53,720 --> 00:08:55,679 Speaker 2: Oh wow, okay, how is that going to happen? 164 00:08:55,800 --> 00:08:58,400 Speaker 1: Well, they're not going to operate the planes themselves. They're 165 00:08:58,400 --> 00:09:01,920 Speaker 1: doing deals with the airline industry, and according to the Standard, 166 00:09:01,960 --> 00:09:05,920 Speaker 1: they've actually made deals with three airlines, including American Airlines, 167 00:09:05,960 --> 00:09:09,280 Speaker 1: who have put down a nonrefundable deposit on twenty Overtures, 168 00:09:09,800 --> 00:09:12,960 Speaker 1: which is, I guess, a bold move given that one 169 00:09:12,960 --> 00:09:16,240 Speaker 1: hasn't been built yet. Yes, and of course there's a 170 00:09:16,240 --> 00:09:19,040 Speaker 1: big question about the boom, with the clue being in 171 00:09:19,080 --> 00:09:22,160 Speaker 1: the name: flying over land, whether or not that will 172 00:09:22,200 --> 00:09:22,640 Speaker 1: be allowed. 173 00:09:23,120 --> 00:09:27,400 Speaker 2: Yeah, I'm not ready for that sound. Let's move on 174 00:09:27,440 --> 00:09:30,720 Speaker 2: to the next thing. You already mentioned the Pope, but 175 00:09:30,800 --> 00:09:31,960 Speaker 2: you have something else to tell me.
176 00:09:32,280 --> 00:09:35,679 Speaker 1: Yes. So it was actually John Paul the Second, may 177 00:09:35,679 --> 00:09:39,040 Speaker 1: he rest in peace, who was the Concorde passenger, but 178 00:09:39,080 --> 00:09:41,560 Speaker 1: that said, this story is about the current Pope Francis. 179 00:09:41,600 --> 00:09:44,120 Speaker 1: It's not my all-time favorite papal AI story. Do 180 00:09:44,160 --> 00:09:46,560 Speaker 1: you remember the deepfake Pope, Eliza? 181 00:09:46,160 --> 00:09:50,040 Speaker 2: In the... Yes, absolutely, wearing the... 182 00:09:50,000 --> 00:09:54,040 Speaker 1: Coat, wearing the white Moncler puffer jacket in the mountains, 183 00:09:54,240 --> 00:09:55,120 Speaker 1: the papal puffer. 184 00:09:55,280 --> 00:09:56,200 Speaker 2: What a fashionista. 185 00:09:56,600 --> 00:09:58,679 Speaker 1: But this is a more serious one, because right now 186 00:09:58,720 --> 00:10:02,120 Speaker 1: it seems like there's a throwdown brewing between God and 187 00:10:02,360 --> 00:10:06,720 Speaker 1: the god in the machine, a.k.a. AI. Pope Francis has raised concerns 188 00:10:06,760 --> 00:10:09,800 Speaker 1: previously about the ethical and moral implications of AI and 189 00:10:09,840 --> 00:10:13,040 Speaker 1: its effect on relationships and education, and warfare and work. 190 00:10:13,320 --> 00:10:15,400 Speaker 1: According to a story in The New York Times with 191 00:10:15,559 --> 00:10:20,400 Speaker 1: the fabulous headline Citing Shadow of Evil, Vatican Warns About 192 00:10:20,400 --> 00:10:23,800 Speaker 1: the Risks of AI, those statements have now been synthesized 193 00:10:23,840 --> 00:10:27,960 Speaker 1: into a document called Antiqua et Nova, Ancient and New, 194 00:10:28,480 --> 00:10:31,160 Speaker 1: and it's meant to warn those of the Catholic faith 195 00:10:31,400 --> 00:10:35,040 Speaker 1: that AI is a source of both tremendous opportunities and 196 00:10:35,080 --> 00:10:36,200 Speaker 1: also profound risks.
197 00:10:36,480 --> 00:10:39,280 Speaker 2: Okay, so kind of threading the needle there. What else 198 00:10:39,280 --> 00:10:40,280 Speaker 2: did it say? 199 00:10:40,320 --> 00:10:43,160 Speaker 1: Yeah, I mean, the paper pointed out that there is potential 200 00:10:43,160 --> 00:10:47,960 Speaker 1: for misinformation, eroded trust, harm in the social development of children, 201 00:10:48,400 --> 00:10:52,679 Speaker 1: the replacement of human work, and the development of autonomous weapons, 202 00:10:53,120 --> 00:10:55,600 Speaker 1: all of which, you know, frankly are themes of this show. 203 00:10:56,000 --> 00:10:58,640 Speaker 1: But the papal warning tried to bring into focus what 204 00:10:58,679 --> 00:11:01,679 Speaker 1: it means to be human and the ethical considerations we 205 00:11:01,720 --> 00:11:05,320 Speaker 1: should bear in mind when using AI. For example, the 206 00:11:05,360 --> 00:11:08,840 Speaker 1: document says, quote, in all areas where humans are called 207 00:11:08,880 --> 00:11:12,440 Speaker 1: to make decisions, the shadow of evil also looms here. 208 00:11:13,200 --> 00:11:15,080 Speaker 1: So what this kind of says to me is that 209 00:11:15,080 --> 00:11:17,360 Speaker 1: the paper is as much about how we choose to 210 00:11:17,440 --> 00:11:20,920 Speaker 1: use AI as about AI itself. And you know what's 211 00:11:20,920 --> 00:11:23,960 Speaker 1: interesting is the Vatican of yore, you know, may have 212 00:11:24,000 --> 00:11:26,480 Speaker 1: tried to ban this, like the Catholic Church banning the 213 00:11:26,480 --> 00:11:30,640 Speaker 1: printing press and burning the people who tried to print stuff. Happily, 214 00:11:30,679 --> 00:11:33,200 Speaker 1: five hundred years later, you know, the Church has 215 00:11:33,280 --> 00:11:35,800 Speaker 1: moved on, and they're not trying to say, you know, 216 00:11:35,920 --> 00:11:38,600 Speaker 1: don't use AI.
They're trying to say, think about what 217 00:11:38,640 --> 00:11:40,160 Speaker 1: makes you human when you're using this. 218 00:11:40,600 --> 00:11:44,600 Speaker 2: Yeah, it's actually a really nice reminder that we can 219 00:11:44,840 --> 00:11:46,840 Speaker 2: all decide how we want to move through the world 220 00:11:46,920 --> 00:11:50,480 Speaker 2: and also how we interact with both human and machine. 221 00:11:50,800 --> 00:11:53,760 Speaker 1: I think that's well put. Well, thanks for joining me, Eliza. 222 00:11:54,040 --> 00:11:56,320 Speaker 1: I'm missing Kara, but I'm enjoying chatting to you each week. 223 00:11:56,400 --> 00:11:57,240 Speaker 2: It's my pleasure. 224 00:11:57,360 --> 00:12:03,960 Speaker 1: Coming up, we hear from our friends at 404 225 00:12:04,040 --> 00:12:06,760 Speaker 1: Media about a plea deal in one of 226 00:12:06,800 --> 00:12:10,640 Speaker 1: the largest sting operations ever. That's next on Tech Support. 227 00:12:11,040 --> 00:12:20,000 Speaker 1: Stay with us. Today, for our Tech Support segment, we're 228 00:12:20,000 --> 00:12:23,520 Speaker 1: turning to 404 Media's Joseph Cox to discuss 229 00:12:23,600 --> 00:12:27,320 Speaker 1: a story he's been reporting on for years. It's about 230 00:12:27,360 --> 00:12:30,200 Speaker 1: a sting operation with ripple effects throughout some of the 231 00:12:30,200 --> 00:12:35,160 Speaker 1: biggest criminal organizations globally, and a raid made possible by 232 00:12:35,200 --> 00:12:38,240 Speaker 1: the tech that we just can't live without: our phones. 233 00:12:38,960 --> 00:12:42,559 Speaker 1: But this isn't about your iPhone, Pixel or Galaxy. It's 234 00:12:42,559 --> 00:12:46,480 Speaker 1: about encrypted phones.
Joseph wrote a book called Dark Wire 235 00:12:46,880 --> 00:12:50,839 Speaker 1: about the encrypted device company Anom, its unique relationship 236 00:12:50,840 --> 00:12:54,920 Speaker 1: with the FBI, and the biggest criminal raid in recorded history. 237 00:12:55,480 --> 00:12:58,040 Speaker 1: And now some of the people arrested in that sting 238 00:12:58,080 --> 00:13:01,800 Speaker 1: operation are pleading guilty, which calls into question if there'll 239 00:13:01,840 --> 00:13:04,199 Speaker 1: be a trial. Joseph, it's great to see you. 240 00:13:04,040 --> 00:13:05,280 Speaker 3: Thanks for having me. 241 00:13:05,720 --> 00:13:08,000 Speaker 1: So if you don't mind, let's start at the beginning. 242 00:13:08,320 --> 00:13:09,199 Speaker 1: What is Anom? 243 00:13:09,880 --> 00:13:14,040 Speaker 3: Yes. So, Anom was part of the encrypted phone industry. 244 00:13:14,520 --> 00:13:17,640 Speaker 3: And as you say, it's not like the normal Apple 245 00:13:17,640 --> 00:13:20,320 Speaker 3: iPhones or Google Androids that we all use. These 246 00:13:20,360 --> 00:13:23,880 Speaker 3: are very special phones that have sometimes a microphone removed, 247 00:13:23,960 --> 00:13:27,320 Speaker 3: the camera taken out, sometimes even the GPS removed, because 248 00:13:27,360 --> 00:13:31,719 Speaker 3: basically they're marketed to serious drug traffickers and criminals. 249 00:13:31,840 --> 00:13:33,960 Speaker 1: And do they look like old, you know, like flip 250 00:13:34,040 --> 00:13:36,800 Speaker 1: phones or Nokia old phones, or what's the physical appearance 251 00:13:36,840 --> 00:13:37,520 Speaker 1: of these phones? 252 00:13:37,880 --> 00:13:41,680 Speaker 3: Yeah, that's the funny thing. They do look like ordinary phones.
253 00:13:41,800 --> 00:13:43,480 Speaker 3: Back in the day there was a company called 254 00:13:43,480 --> 00:13:46,160 Speaker 3: Phantom Secure that was a sort of BlackBerry with a 255 00:13:46,240 --> 00:13:51,080 Speaker 3: QWERTY keyboard. Nowadays they will look like a Google Pixel, 256 00:13:51,280 --> 00:13:53,280 Speaker 3: but they will operate very differently. 257 00:13:53,440 --> 00:13:56,600 Speaker 1: And Anom was basically a handset manufacturer? 258 00:13:57,040 --> 00:14:00,000 Speaker 3: It's hard to describe. It was almost like a software developer, 259 00:14:00,320 --> 00:14:05,760 Speaker 3: a tech startup, a hacker phone as well. But basically what 260 00:14:05,800 --> 00:14:08,600 Speaker 3: they would do is that they would make their own software, 261 00:14:08,600 --> 00:14:12,240 Speaker 3: which handles sending encrypted messages, wiping data from the phone, 262 00:14:12,720 --> 00:14:16,480 Speaker 3: redacting parts of photos you may take and send across 263 00:14:16,480 --> 00:14:20,000 Speaker 3: the device, and then installing that onto Android phones. So 264 00:14:20,040 --> 00:14:24,400 Speaker 3: they're almost like an operating system developer and app developer 265 00:14:24,440 --> 00:14:25,960 Speaker 3: as well. But I would say it's mostly on the 266 00:14:25,960 --> 00:14:26,720 Speaker 3: software side. 267 00:14:26,800 --> 00:14:29,720 Speaker 1: And you mentioned that these were very popular with criminals, 268 00:14:29,760 --> 00:14:32,480 Speaker 1: these Anom phones. What was the advantage of them? 269 00:14:32,520 --> 00:14:33,760 Speaker 1: Why would you want these phones? 270 00:14:34,520 --> 00:14:38,040 Speaker 3: So criminals are of course always trying to stay one 271 00:14:38,080 --> 00:14:41,680 Speaker 3: step ahead of law enforcement.
Back in the day, that 272 00:14:41,720 --> 00:14:44,080 Speaker 3: would have been something like the TV series The Wire, 273 00:14:44,200 --> 00:14:47,160 Speaker 3: right, where they're using payphones or they're using pagers. But 274 00:14:47,280 --> 00:14:51,320 Speaker 3: of course cops got wind of that, so then criminals 275 00:14:51,360 --> 00:14:55,440 Speaker 3: turned to encryption. And to me, it's the biggest technological 276 00:14:55,560 --> 00:14:58,520 Speaker 3: leap for criminals since the invention of the mobile phone. If 277 00:14:58,560 --> 00:15:01,120 Speaker 3: you are a serious criminal, you're only going to be 278 00:15:01,160 --> 00:15:03,440 Speaker 3: taken seriously if you have one of these phones. It's 279 00:15:03,480 --> 00:15:06,120 Speaker 3: almost essential to be a cocaine or a heroin or 280 00:15:06,120 --> 00:15:07,680 Speaker 3: a methamphetamine trafficker today. 281 00:15:08,080 --> 00:15:11,920 Speaker 1: Wow. Obviously it's not just criminals who rely on encryption. 282 00:15:12,040 --> 00:15:14,960 Speaker 1: We had Meredith Whittaker on the podcast, and you know, 283 00:15:15,000 --> 00:15:17,440 Speaker 1: she made the case for the benefits of encryption. 284 00:15:17,880 --> 00:15:20,600 Speaker 3: Yeah, absolutely. And I think that's the important thing, in 285 00:15:20,680 --> 00:15:24,960 Speaker 3: that encryption is beneficial to everybody. I use it every 286 00:15:25,000 --> 00:15:29,880 Speaker 3: single day. Ironically, I use Signal and other encrypted messaging 287 00:15:29,880 --> 00:15:33,200 Speaker 3: apps to talk to criminals who are using these encrypted 288 00:15:33,200 --> 00:15:36,640 Speaker 3: phones as well. It's a mess of an issue, but 289 00:15:36,640 --> 00:15:38,320 Speaker 3: I think we can all agree that, you know, there 290 00:15:38,360 --> 00:15:41,360 Speaker 3: is a net positive, an overwhelming net positive, for us 291 00:15:41,400 --> 00:15:45,520 Speaker 3: being able to use encryption.
That being said, law enforcement 292 00:15:45,680 --> 00:15:47,960 Speaker 3: are not just going to sit by and let that happen. 293 00:15:48,200 --> 00:15:50,280 Speaker 3: They're going to think of new ways to sort of 294 00:15:50,320 --> 00:15:51,680 Speaker 3: penetrate these secret systems. 295 00:15:51,920 --> 00:15:55,040 Speaker 1: So you mentioned, you know, the old mafia techniques 296 00:15:55,080 --> 00:15:59,640 Speaker 1: of payphones and pagers. Basically, law enforcement got ahead of them. 297 00:16:00,040 --> 00:16:02,720 Speaker 1: How did the FBI find out about Anom? 298 00:16:03,480 --> 00:16:07,280 Speaker 3: So what happened first was the FBI shut down Phantom Secure, 299 00:16:07,440 --> 00:16:10,880 Speaker 3: this other encrypted phone company, very, very popular with the 300 00:16:10,920 --> 00:16:15,160 Speaker 3: Sinaloa drug cartel, Australian bikers, that sort of thing, and 301 00:16:15,240 --> 00:16:17,440 Speaker 3: they wanted to get a backdoor into it, which would 302 00:16:17,440 --> 00:16:20,040 Speaker 3: have allowed them to read all of the messages. That 303 00:16:20,080 --> 00:16:23,720 Speaker 3: didn't happen. But then what did basically fall into their 304 00:16:23,800 --> 00:16:27,120 Speaker 3: lap was that someone who was selling phones for Phantom 305 00:16:27,160 --> 00:16:30,000 Speaker 3: Secure and some other companies, they were making their own 306 00:16:30,280 --> 00:16:34,000 Speaker 3: next generation of encrypted phone called Anom, and they 307 00:16:34,040 --> 00:16:36,400 Speaker 3: offered it to the FBI, and they said, look, would 308 00:16:36,440 --> 00:16:38,120 Speaker 3: you like to take this and use it for your 309 00:16:38,120 --> 00:16:41,480 Speaker 3: own investigations? Of course, in return, I would like not 310 00:16:41,600 --> 00:16:45,040 Speaker 3: to be prosecuted. Ideally, I would also like to get 311 00:16:45,080 --> 00:16:48,880 Speaker 3: some money for it.
But I can't overstate how exciting 312 00:16:48,960 --> 00:16:51,480 Speaker 3: that was to the FBI. Rather than trying to put 313 00:16:51,520 --> 00:16:54,640 Speaker 3: a backdoor into an existing app or into an existing company, 314 00:16:55,080 --> 00:16:57,360 Speaker 3: they could just make their own encrypted phone company and 315 00:16:57,400 --> 00:16:58,680 Speaker 3: then read the messages of criminals. 316 00:16:58,320 --> 00:17:01,400 Speaker 1: Incredible, drinking directly from the fire hose, so 317 00:17:01,480 --> 00:17:05,960 Speaker 1: to speak. Do you know exactly how the deal 318 00:17:06,040 --> 00:17:06,439 Speaker 1: was struck? 319 00:17:06,800 --> 00:17:12,159 Speaker 3: So their pseudonym is Afgoo, A F G double O. And my 320 00:17:12,320 --> 00:17:17,800 Speaker 3: understanding is that the deal came around fairly quickly, and 321 00:17:18,040 --> 00:17:21,159 Speaker 3: the main thing for the FBI and the Australian Federal 322 00:17:21,200 --> 00:17:24,480 Speaker 3: Police, who were also part of this investigation, they just 323 00:17:24,560 --> 00:17:27,880 Speaker 3: wanted to make sure that it worked. Basically, it's one 324 00:17:27,920 --> 00:17:30,800 Speaker 3: thing for Afgoo to promise, oh, sure, you can backdoor 325 00:17:30,840 --> 00:17:33,720 Speaker 3: this phone, it'll all be okay. It's another for it 326 00:17:33,800 --> 00:17:36,639 Speaker 3: to actually, you know, work out in practice in the 327 00:17:36,680 --> 00:17:39,320 Speaker 3: real world. I go into this in much more detail 328 00:17:39,359 --> 00:17:41,359 Speaker 3: in the book. There's a series of events where they 329 00:17:41,359 --> 00:17:44,200 Speaker 3: do verify it is working.
There's a scene where someone 330 00:17:44,240 --> 00:17:46,840 Speaker 3: from the AFP is lying down on their sofa at home 331 00:17:47,000 --> 00:17:49,879 Speaker 3: and they're seeing messages flash up on their laptop screen 332 00:17:50,119 --> 00:17:52,720 Speaker 3: as they're being intercepted in real time. So this wasn't 333 00:17:52,760 --> 00:17:55,560 Speaker 3: just a pipe dream anymore. It was going to become 334 00:17:55,680 --> 00:17:57,920 Speaker 3: a reality, you know, if the criminals bought the phones. 335 00:17:57,960 --> 00:18:00,600 Speaker 1: At least on the cover of your book, which is 336 00:18:00,640 --> 00:18:03,920 Speaker 1: called Dark Wire, this is described as quote the largest sting 337 00:18:03,960 --> 00:18:08,560 Speaker 1: operation ever, which makes you wonder just how large was 338 00:18:08,600 --> 00:18:12,480 Speaker 1: it and how ubiquitous were ANOM phones in the underworld. 339 00:18:13,200 --> 00:18:16,480 Speaker 3: I mean, ANOM grew to the same size, if 340 00:18:16,520 --> 00:18:20,600 Speaker 3: not bigger, than some of the actual encrypted phone companies 341 00:18:20,600 --> 00:18:24,080 Speaker 3: for criminals.
They got to something like twelve thousand devices, 342 00:18:24,240 --> 00:18:27,040 Speaker 3: which doesn't sound like that much to us ordinary people, 343 00:18:27,320 --> 00:18:30,359 Speaker 3: but Phantom Secure was only something between seven and ten thousand, 344 00:18:30,480 --> 00:18:33,919 Speaker 3: So the FBI was almost better than some of the 345 00:18:33,960 --> 00:18:38,280 Speaker 3: criminal organizations at this as well. As for 346 00:18:38,359 --> 00:18:40,880 Speaker 3: why I called it the largest sting operation ever, it's 347 00:18:40,920 --> 00:18:43,560 Speaker 3: not just sort of the audacity of it, but when 348 00:18:44,119 --> 00:18:47,240 Speaker 3: law enforcement around the world eventually acted on this information, 349 00:18:47,440 --> 00:18:51,399 Speaker 3: they did sort of a worldwide relay race of raids 350 00:18:51,440 --> 00:18:54,880 Speaker 3: all across the planet. Something like nine thousand law enforcement 351 00:18:54,920 --> 00:18:58,960 Speaker 3: officers acted in one single day, which is almost difficult 352 00:18:58,960 --> 00:19:01,919 Speaker 3: to put into words. And that's the challenge, because 353 00:19:02,320 --> 00:19:07,119 Speaker 3: you can't even imagine law enforcement activity on that scale. 354 00:19:07,359 --> 00:19:11,320 Speaker 1: No, what an incredible scene. How did it rise to prominence? 355 00:19:11,320 --> 00:19:13,159 Speaker 1: I mean, did the FBI have other people who were 356 00:19:13,200 --> 00:19:16,199 Speaker 1: kind of promoting it in the criminal underworld? Was it entirely organic? Like, 357 00:19:16,440 --> 00:19:20,480 Speaker 1: how did this become the encrypted phone of choice?
358 00:19:20,160 --> 00:19:22,480 Speaker 3: With ANOM, you can only speak to other 359 00:19:22,560 --> 00:19:24,280 Speaker 3: ANOM users. In the same way, if you're on a 360 00:19:24,280 --> 00:19:26,719 Speaker 3: Phantom Secure phone, you can only speak to Phantom Secure 361 00:19:26,800 --> 00:19:30,800 Speaker 3: users as well. That's especially interesting when it comes to 362 00:19:30,880 --> 00:19:34,000 Speaker 3: the world of drug trafficking, because if you get somebody 363 00:19:34,000 --> 00:19:37,679 Speaker 3: who's really, really big in the world of cocaine trafficking, 364 00:19:38,040 --> 00:19:40,840 Speaker 3: or maybe one of these kingpins who's hiding out in Dubai, 365 00:19:41,080 --> 00:19:44,120 Speaker 3: if they switch over to ANOM, then everybody underneath 366 00:19:44,200 --> 00:19:47,199 Speaker 3: them needs to switch as well, and you basically have 367 00:19:47,320 --> 00:19:52,520 Speaker 3: this multi-level marketing scheme, essentially, where it's the people 368 00:19:52,600 --> 00:19:55,520 Speaker 3: at the top who dictate what everybody else underneath has 369 00:19:55,560 --> 00:19:57,960 Speaker 3: to use as well. And that's how they did it. 370 00:19:58,320 --> 00:20:02,560 Speaker 3: The FBI wasn't sending out agents itself to sell the phones. 371 00:20:02,680 --> 00:20:07,080 Speaker 3: They let the phones organically spread among serious drug traffickers. 372 00:20:07,480 --> 00:20:09,720 Speaker 3: And one of the most important people for that was 373 00:20:09,720 --> 00:20:12,720 Speaker 3: a guy called Hakan Ayik. He was Australia's most 374 00:20:12,760 --> 00:20:16,560 Speaker 3: wanted man, and he's a top-tier drug trafficker, I mean, 375 00:20:16,600 --> 00:20:19,760 Speaker 3: getting billions of dollars worth of product into Australia.
And 376 00:20:19,800 --> 00:20:23,680 Speaker 3: when he switched over to ANOM and started using 377 00:20:23,680 --> 00:20:26,720 Speaker 3: the phones and selling the phones as well, I mean, 378 00:20:26,760 --> 00:20:30,520 Speaker 3: what better ambassador for your phone company than one of 379 00:20:30,560 --> 00:20:32,359 Speaker 3: the most wanted traffickers 380 00:20:31,840 --> 00:20:32,280 Speaker 2: In the world. 381 00:20:33,400 --> 00:20:35,320 Speaker 1: The raids were in twenty twenty one, and there were 382 00:20:35,400 --> 00:20:38,040 Speaker 1: arrests made. But you published a story just this week. 383 00:20:38,080 --> 00:20:40,639 Speaker 1: What's happened since twenty twenty one, and what was the 384 00:20:40,680 --> 00:20:41,600 Speaker 1: story from this week? 385 00:20:42,320 --> 00:20:46,159 Speaker 3: Yes. So, in a sort of cruel twist, the FBI 386 00:20:46,240 --> 00:20:48,960 Speaker 3: in the US and the other authorities didn't just charge 387 00:20:49,440 --> 00:20:52,080 Speaker 3: the criminal users of ANOM, as in the drug traffickers, 388 00:20:52,080 --> 00:20:54,560 Speaker 3: the hit men, the money launderers, that sort of thing. 389 00:20:55,119 --> 00:20:58,560 Speaker 3: US authorities also charged the people who sold ANOM 390 00:20:58,680 --> 00:21:02,720 Speaker 3: phones at the behest of the FBI. And the reason was, 391 00:21:02,840 --> 00:21:05,879 Speaker 3: in the FBI's eyes, that, well, ANOM became a 392 00:21:05,880 --> 00:21:07,800 Speaker 3: criminal entity in its own right, so we're going to 393 00:21:07,920 --> 00:21:10,640 Speaker 3: charge you as well. So there was almost a double 394 00:21:10,640 --> 00:21:13,720 Speaker 3: whammy there.
And we were all getting ready for this 395 00:21:13,800 --> 00:21:17,119 Speaker 3: big trial coming up in March, where a few of 396 00:21:17,160 --> 00:21:19,480 Speaker 3: them were going to go on the stand essentially and 397 00:21:19,480 --> 00:21:25,360 Speaker 3: try to defend themselves. But now they are pleading guilty, essentially. 398 00:21:25,680 --> 00:21:29,840 Speaker 3: And that is really important because at the trial, the 399 00:21:29,960 --> 00:21:33,760 Speaker 3: US authorities were preparing to name Afgoo. They were going 400 00:21:33,840 --> 00:21:36,440 Speaker 3: to name their informant, which to me is crazy, but 401 00:21:36,560 --> 00:21:37,959 Speaker 3: that was something they were going to have to do, 402 00:21:38,000 --> 00:21:41,200 Speaker 3: and now Afgoo's identity will probably remain secret. 403 00:21:41,920 --> 00:21:44,280 Speaker 1: I'm fascinated by this story, Joseph, and grateful to you 404 00:21:44,440 --> 00:21:46,240 Speaker 1: for coming on today to talk about it. 405 00:21:46,720 --> 00:21:48,760 Speaker 1: I'm also interested in it because there's this kind of 406 00:21:49,200 --> 00:21:51,280 Speaker 1: ironic element to it, right, where 407 00:21:51,320 --> 00:21:54,200 Speaker 1: the very thing which is supposed to keep these criminals 408 00:21:54,240 --> 00:21:57,720 Speaker 1: safe ended up getting them caught.
It makes me think 409 00:21:57,760 --> 00:22:03,320 Speaker 1: about the Hezbollah exploding pagers story, where Israeli intelligence had 410 00:22:03,480 --> 00:22:07,239 Speaker 1: understood that Hezbollah were using pagers because they thought they 411 00:22:07,280 --> 00:22:11,240 Speaker 1: were safer than mobile phones, and then Israel essentially inserted 412 00:22:11,280 --> 00:22:15,440 Speaker 1: themselves into the production of these pagers and inserted explosives, 413 00:22:15,440 --> 00:22:20,240 Speaker 1: and in so doing killed Hezbollah operatives, with collateral damage 414 00:22:20,280 --> 00:22:23,280 Speaker 1: as well. These two stories have an interesting connection, which 415 00:22:23,320 --> 00:22:26,080 Speaker 1: is about how the tech we use and how the 416 00:22:26,080 --> 00:22:30,040 Speaker 1: tech we trust can be turned against us. And I'm curious, 417 00:22:30,080 --> 00:22:33,040 Speaker 1: if you're not a Hezbollah operative or a drug dealer, 418 00:22:33,440 --> 00:22:35,560 Speaker 1: what should you be thinking about when you read this story? 419 00:22:36,080 --> 00:22:37,840 Speaker 3: I mean, in the case of ANOM, it was 420 00:22:37,880 --> 00:22:40,679 Speaker 3: drug traffickers. In the case of the pagers, it was 421 00:22:40,720 --> 00:22:45,320 Speaker 3: a designated terrorist group. It can, for us, though, I 422 00:22:45,359 --> 00:22:49,440 Speaker 3: think, make ordinary people suspicious of software they should 423 00:22:49,480 --> 00:22:52,520 Speaker 3: be able to trust. I want people to still have 424 00:22:52,600 --> 00:22:55,200 Speaker 3: that trust. I don't want people to feel like, well, 425 00:22:55,200 --> 00:22:57,400 Speaker 3: who do I know who's actually running this? Maybe it's 426 00:22:57,480 --> 00:23:00,120 Speaker 3: a whole honeypot or something like that. And I think 427 00:23:00,160 --> 00:23:05,720 Speaker 3: these operations where authorities attack the supply chain could sow doubt.
428 00:23:06,000 --> 00:23:08,080 Speaker 3: And I mean, from the drug traffickers I've spoken to 429 00:23:08,080 --> 00:23:11,000 Speaker 3: and the people who sell these phones to criminals, more 430 00:23:11,040 --> 00:23:14,160 Speaker 3: and more of the drug traffickers are moving to ordinary 431 00:23:14,160 --> 00:23:15,920 Speaker 3: apps that we all use, because they don't trust the 432 00:23:16,000 --> 00:23:19,159 Speaker 3: encrypted phones anymore. There's been so many operations where even the 433 00:23:19,240 --> 00:23:21,720 Speaker 3: drug traffickers are saying, well, I'm just going to move 434 00:23:21,760 --> 00:23:24,520 Speaker 3: to Signal or something. And what does that mean for 435 00:23:24,600 --> 00:23:26,560 Speaker 3: us now? Because I don't think law enforcement is just 436 00:23:26,600 --> 00:23:28,280 Speaker 3: going to pat themselves on the back and go home. 437 00:23:28,520 --> 00:23:31,080 Speaker 3: They will be looking for the next thing to compromise 438 00:23:31,160 --> 00:23:31,879 Speaker 3: or investigate. 439 00:23:32,440 --> 00:23:42,359 Speaker 1: Joseph, thank you, thank you so much. Coming up, a 440 00:23:42,400 --> 00:23:45,119 Speaker 1: look at the uses of Gemini in your Gmail and 441 00:23:45,200 --> 00:23:49,399 Speaker 1: its similarities to a polarizing office assistant from decades past. 442 00:23:49,920 --> 00:23:52,440 Speaker 1: That's on When Did This Become a Thing? Stay with us. 443 00:23:59,520 --> 00:24:01,920 Speaker 1: It's time for our next segment, When Did This Become 444 00:24:01,960 --> 00:24:04,720 Speaker 1: a Thing?, where we bring you a story or observation 445 00:24:04,920 --> 00:24:07,800 Speaker 1: from our lives and try to figure it out. In 446 00:24:07,840 --> 00:24:10,640 Speaker 1: other words, ask the question: when did this become a thing?
447 00:24:11,119 --> 00:24:13,639 Speaker 1: And if you've got an email account with Google, which, 448 00:24:13,760 --> 00:24:16,639 Speaker 1: according to Google, around one point eight billion people do, 449 00:24:17,480 --> 00:24:20,400 Speaker 1: you might have noticed something pop up on your screen recently, 450 00:24:21,000 --> 00:24:25,320 Speaker 1: a sparkly little feature called Gemini. Now, Gemini isn't new, 451 00:24:25,440 --> 00:24:28,280 Speaker 1: per se. It's a new name for Google's AI tool. 452 00:24:28,640 --> 00:24:32,200 Speaker 1: You may remember Bard, which was Gemini's predecessor, but about 453 00:24:32,240 --> 00:24:34,840 Speaker 1: a year ago Google gave their chatbot a glow up. 454 00:24:35,280 --> 00:24:38,199 Speaker 1: And now we're being aided by Gemini, which is a 455 00:24:38,240 --> 00:24:42,720 Speaker 1: constellation of stars, of course, and many people's least favorite 456 00:24:42,760 --> 00:24:46,119 Speaker 1: sign of the zodiac. I went to Google's own blog, 457 00:24:46,320 --> 00:24:49,800 Speaker 1: called The Keyword, to learn more, and I found out 458 00:24:49,840 --> 00:24:53,480 Speaker 1: that once again, studying Latin pays dividends. I wish I had. 459 00:24:54,200 --> 00:24:58,119 Speaker 1: Gemini is actually Latin for twins. And then, to my amazement, 460 00:24:58,400 --> 00:25:01,560 Speaker 1: the blog made reference to astrology, quote: A key 461 00:25:01,640 --> 00:25:04,879 Speaker 1: characteristic of the Gemini zodiac sign is a dual-natured 462 00:25:04,920 --> 00:25:08,840 Speaker 1: personality, capable of adapting quickly, connecting to a wide range 463 00:25:08,840 --> 00:25:13,479 Speaker 1: of people, and seeing things from multiple perspectives.
Apparently, earlier 464 00:25:13,640 --> 00:25:17,240 Speaker 1: versions of the name for this product included Titan, which 465 00:25:17,280 --> 00:25:20,480 Speaker 1: is the name of Saturn's largest moon, but also a 466 00:25:20,560 --> 00:25:24,760 Speaker 1: rather more intimidating name. Remember, this AI is your friend 467 00:25:24,800 --> 00:25:25,400 Speaker 1: and partner, 468 00:25:25,480 --> 00:25:25,840 Speaker 2: Kids. 469 00:25:26,600 --> 00:25:29,800 Speaker 1: If you haven't seen Gemini in your inbox yet, you've 470 00:25:29,800 --> 00:25:33,480 Speaker 1: definitely seen it on Google dot com. Whenever that AI Overview 471 00:25:33,520 --> 00:25:36,880 Speaker 1: pops up after performing a Google search, that's Gemini doing 472 00:25:36,920 --> 00:25:40,679 Speaker 1: its thing. When I first saw those AI Overviews, they 473 00:25:40,760 --> 00:25:43,600 Speaker 1: kind of shocked me. I do find them pretty useful 474 00:25:43,640 --> 00:25:46,280 Speaker 1: when I want to know how to do something, like, 475 00:25:46,680 --> 00:25:50,080 Speaker 1: I'm having trouble getting audio output from my microphone into 476 00:25:50,080 --> 00:25:53,520 Speaker 1: my recorder, what should I do? But if I'm researching 477 00:25:53,520 --> 00:25:55,560 Speaker 1: in a more open-ended way, I don't find it 478 00:25:55,640 --> 00:25:59,760 Speaker 1: very useful at all. However, recently, Gemini actually entered my 479 00:25:59,760 --> 00:26:04,560 Speaker 1: work email inbox, and it felt a little intrusive. When 480 00:26:04,600 --> 00:26:06,680 Speaker 1: I would go to write or reply to an email, 481 00:26:07,160 --> 00:26:10,440 Speaker 1: next to my cursor, a little message appeared saying help 482 00:26:10,520 --> 00:26:13,280 Speaker 1: me write. I've actually never tried it before, but I 483 00:26:13,280 --> 00:26:15,520 Speaker 1: thought today would be a good day to give it 484 00:26:15,560 --> 00:26:21,200 Speaker 1: a whirl.
So I'm pulling up my work email, clicking compose, 485 00:26:21,880 --> 00:26:25,200 Speaker 1: and there's a little thing saying help me write, 486 00:26:25,240 --> 00:26:27,520 Speaker 1: which I'm going to click on, and I'm going to 487 00:26:27,600 --> 00:26:32,159 Speaker 1: say: I would like to interview the Pope about his 488 00:26:32,320 --> 00:26:37,159 Speaker 1: views on AI for the Tech Stuff Podcast. 489 00:26:38,200 --> 00:26:44,879 Speaker 1: Please write an email to the Vatican Communications Office asking 490 00:26:45,200 --> 00:26:50,200 Speaker 1: for an interview and making the most compelling case as 491 00:26:50,200 --> 00:26:53,600 Speaker 1: to why he should do it. Let's see what happens. 492 00:26:54,200 --> 00:27:00,720 Speaker 1: Create. Just thinking. Okay. Dear Vatican Communications, my name is 493 00:27:00,760 --> 00:27:02,600 Speaker 1: Oz Woloshyn and I am reaching out from the Tech 494 00:27:02,600 --> 00:27:05,520 Speaker 1: Stuff Podcast. We would be honored to have the opportunity 495 00:27:05,600 --> 00:27:08,480 Speaker 1: to interview His Holiness, Pope Francis, about his views on AI. 496 00:27:09,280 --> 00:27:11,800 Speaker 1: We believe that His Holiness's perspective on AI would be 497 00:27:11,800 --> 00:27:15,080 Speaker 1: incredibly valuable to our audience. As a global leader and 498 00:27:15,160 --> 00:27:17,959 Speaker 1: voice of moral authority, his insights on the ethical and 499 00:27:17,960 --> 00:27:22,360 Speaker 1: societal implications of AI will be particularly impactful. The Tech 500 00:27:22,359 --> 00:27:26,520 Speaker 1: Stuff Podcast reaches a wide audience of tech enthusiasts, industry professionals, 501 00:27:26,520 --> 00:27:30,919 Speaker 1: and policymakers. An interview with His Holiness would undoubtedly generate significant 502 00:27:30,920 --> 00:27:34,920 Speaker 1: interest and spark important conversations around AI.
We would be 503 00:27:34,960 --> 00:27:38,320 Speaker 1: happy to accommodate His Holiness's schedule and preferences for the 504 00:27:38,320 --> 00:27:41,840 Speaker 1: interview format. We're also open to discussing any specific topics 505 00:27:41,920 --> 00:27:44,199 Speaker 1: or themes that he would like to address. Thank you 506 00:27:44,200 --> 00:27:47,960 Speaker 1: for your time and consideration. We eagerly await your response. Sincerely, 507 00:27:48,040 --> 00:27:51,840 Speaker 1: Oz. Well, I do like... I wouldn't have thought about 508 00:27:51,880 --> 00:27:54,399 Speaker 1: referring to him as His Holiness, but I like the 509 00:27:54,480 --> 00:27:57,680 Speaker 1: use of his correct title. I'm not sure that that 510 00:27:57,760 --> 00:28:01,080 Speaker 1: email would quite do the trick. The tone is so obsequious, 511 00:28:01,119 --> 00:28:04,640 Speaker 1: I don't know how many His Holinesses you can fit 512 00:28:04,640 --> 00:28:07,639 Speaker 1: into one email, but also so grandiose. I mean, I 513 00:28:07,680 --> 00:28:10,840 Speaker 1: love hosting Tech Stuff, but the idea that Tech Stuff 514 00:28:10,840 --> 00:28:13,040 Speaker 1: could be the perfect platform for the Pope to start 515 00:28:13,080 --> 00:28:16,720 Speaker 1: a global conversation is also kind of absurd. So there's 516 00:28:16,720 --> 00:28:20,520 Speaker 1: this kind of mismatch of tones and a demonstration of 517 00:28:20,560 --> 00:28:24,480 Speaker 1: a fundamental misunderstanding of the task at hand. Now, Gemini 518 00:28:24,520 --> 00:28:26,840 Speaker 1: doesn't just write emails. It's supposed to be a kind 519 00:28:26,840 --> 00:28:30,280 Speaker 1: of full-suite virtual office assistant. And don't get me wrong, 520 00:28:30,280 --> 00:28:31,760 Speaker 1: I would love it if it really helped me be 521 00:28:31,800 --> 00:28:34,919 Speaker 1: more productive. But either I'm not using it right or 522 00:28:35,000 --> 00:28:38,280 Speaker 1: it's just not there yet.
Because in all of these capacities, 523 00:28:38,360 --> 00:28:41,640 Speaker 1: it's a little meh. But why did Gemini just show 524 00:28:41,680 --> 00:28:43,960 Speaker 1: up in my work email one day? And the answer 525 00:28:44,000 --> 00:28:46,640 Speaker 1: to that is that Google had originally hoped that businesses 526 00:28:46,680 --> 00:28:49,200 Speaker 1: would pay twenty dollars per month on top of their 527 00:28:49,200 --> 00:28:53,080 Speaker 1: Google Workspace enterprise price, but the adoption of that was 528 00:28:53,240 --> 00:28:57,360 Speaker 1: very low. So instead, Google decided to increase Workspace costs 529 00:28:57,440 --> 00:29:00,600 Speaker 1: by two dollars per month for all users and integrate 530 00:29:00,680 --> 00:29:04,640 Speaker 1: Gemini everywhere, a case of the good old maxim: 531 00:29:04,720 --> 00:29:07,880 Speaker 1: if you build it, make them come. All of this 532 00:29:08,040 --> 00:29:12,160 Speaker 1: made me think about my best friend growing up: Clippy. 533 00:29:12,280 --> 00:29:16,320 Speaker 1: For those too young to remember, Clippy was an anthropomorphized 534 00:29:16,360 --> 00:29:20,000 Speaker 1: paper clip that was introduced into Microsoft Office in nineteen 535 00:29:20,080 --> 00:29:24,480 Speaker 1: ninety six as part of a project spearheaded by Melinda French Gates, 536 00:29:24,560 --> 00:29:28,520 Speaker 1: Bill Gates's wife at the time.
And by an anthropomorphized 537 00:29:28,640 --> 00:29:32,000 Speaker 1: paper clip, I mean he was an animated cartoon paper 538 00:29:32,000 --> 00:29:36,200 Speaker 1: clip with large, shifty eyes and bushy floating eyebrows. And 539 00:29:36,280 --> 00:29:38,600 Speaker 1: Clippy would be there, bouncing on a piece of ruled 540 00:29:38,640 --> 00:29:41,880 Speaker 1: paper, which almost looked like a magic carpet, making eyes 541 00:29:41,920 --> 00:29:44,880 Speaker 1: at you, and every so often a little cartoon text 542 00:29:44,880 --> 00:29:47,600 Speaker 1: bubble would appear and ask if you needed any help. 543 00:29:48,480 --> 00:29:52,520 Speaker 1: This happened over and over and over, constantly, throughout the 544 00:29:52,600 --> 00:29:56,400 Speaker 1: day when using the Microsoft Office suite. And this might shock you, 545 00:29:56,440 --> 00:29:59,720 Speaker 1: but Clippy isn't his real name. It's Clippit. I 546 00:29:59,800 --> 00:30:01,800 Speaker 1: think we all just decided he needed to be cuter, 547 00:30:02,400 --> 00:30:04,960 Speaker 1: perhaps because he was so annoying. I don't know the 548 00:30:04,960 --> 00:30:08,200 Speaker 1: psychology behind it, but we all mind-melded and decided 549 00:30:08,200 --> 00:30:10,760 Speaker 1: Clippy was Clippy and that he was a he. And 550 00:30:10,840 --> 00:30:15,600 Speaker 1: this actually came up during the preliminary focus groups. During testing, 551 00:30:16,040 --> 00:30:19,920 Speaker 1: women in particular didn't like Clippy's appearance because they found 552 00:30:19,920 --> 00:30:24,880 Speaker 1: his stare unsettling, and by unsettling, I mean unsettling. There 553 00:30:24,920 --> 00:30:26,760 Speaker 1: was a story in The Verge a couple of years 554 00:30:26,800 --> 00:30:31,200 Speaker 1: ago under the headline Microsoft ignored focus group feedback about 555 00:30:31,360 --> 00:30:35,240 Speaker 1: leering Clippy.
Women did not like leering Clippy, according to 556 00:30:35,280 --> 00:30:38,040 Speaker 1: focus groups. But in the nineties, the guys greenlighting 557 00:30:38,080 --> 00:30:40,920 Speaker 1: the bouncing paper clip didn't care. They were willing to 558 00:30:40,920 --> 00:30:43,800 Speaker 1: throw away the negative focus group data to get Clippy 559 00:30:43,840 --> 00:30:47,280 Speaker 1: onto people's computer screens, and they did. So Clippy was 560 00:30:47,280 --> 00:30:51,320 Speaker 1: included in Microsoft Office for Windows ninety seven, and like Gemini, 561 00:30:51,520 --> 00:30:54,520 Speaker 1: Clippy was supposed to be helpful. For example, if you 562 00:30:54,600 --> 00:30:58,240 Speaker 1: typed the word dear into a Word document, Clippy would say: 563 00:30:58,600 --> 00:31:01,080 Speaker 1: it looks like you're writing a letter. Would you like help? 564 00:31:01,320 --> 00:31:03,120 Speaker 1: And you could either opt to get help or say no, 565 00:31:03,360 --> 00:31:05,760 Speaker 1: I'd like to type out this letter without help. But 566 00:31:05,800 --> 00:31:08,040 Speaker 1: if you took Clippy up on his offer, he'd point 567 00:31:08,040 --> 00:31:10,600 Speaker 1: you towards introducing yourself in the letter and give you 568 00:31:10,640 --> 00:31:14,400 Speaker 1: pointers about how to phrase sentences. The problem was, most 569 00:31:14,400 --> 00:31:18,000 Speaker 1: people didn't actually need or want help writing letters or 570 00:31:18,120 --> 00:31:20,520 Speaker 1: with the other basic tasks that Clippy was offering to 571 00:31:20,560 --> 00:31:24,040 Speaker 1: assist with, at least not more than once.
So people 572 00:31:24,040 --> 00:31:27,760 Speaker 1: got really, really frustrated with this smiling, bouncing, possibly creepy 573 00:31:27,800 --> 00:31:30,760 Speaker 1: paper clip in the corner of their Word documents, and 574 00:31:30,920 --> 00:31:34,120 Speaker 1: the vitriol did finally get to Microsoft, with Clippy put 575 00:31:34,200 --> 00:31:37,720 Speaker 1: to rest in two thousand and seven. He certainly was annoying, 576 00:31:38,000 --> 00:31:40,120 Speaker 1: but I do understand why there's so much Clippy fan 577 00:31:40,240 --> 00:31:43,280 Speaker 1: art out there to this day. Clippy truly was a 578 00:31:43,320 --> 00:31:46,320 Speaker 1: touchstone of my youth. But why we're talking about Clippy 579 00:31:46,320 --> 00:31:49,120 Speaker 1: today, and why it's relevant, is that Clippy was an 580 00:31:49,160 --> 00:31:53,360 Speaker 1: early example of consumer-facing AI. And while Gemini is 581 00:31:53,440 --> 00:31:56,760 Speaker 1: much smarter than Clippy, and less creepy, at least visually, 582 00:31:57,320 --> 00:31:59,920 Speaker 1: there is the same sense of intrusion. The way Gemini 583 00:32:00,000 --> 00:32:01,600 Speaker 1: shows up and asks me if I want my 584 00:32:01,680 --> 00:32:05,600 Speaker 1: emails to be summarized is giving me flashbacks. Help me write 585 00:32:05,640 --> 00:32:08,160 Speaker 1: may as well be: it looks like you're writing an email, 586 00:32:08,280 --> 00:32:11,120 Speaker 1: would you like help? People have tried to find ways 587 00:32:11,120 --> 00:32:14,440 Speaker 1: to opt out of Gemini's presence on Gmail, citing concerns 588 00:32:14,440 --> 00:32:17,520 Speaker 1: about privacy. I mean, if you let your emails be 589 00:32:17,560 --> 00:32:21,480 Speaker 1: summarized by Gemini, where does that data go? Google has 590 00:32:21,480 --> 00:32:24,520 Speaker 1: a set of privacy commitments online that says, quote, your 591 00:32:24,600 --> 00:32:27,000 Speaker 1: data is your data.
The content that you put into 592 00:32:27,040 --> 00:32:31,480 Speaker 1: Google Workspace services, emails, documents, etc., is yours. We never 593 00:32:31,520 --> 00:32:34,080 Speaker 1: sell your data, and you can delete your content or 594 00:32:34,120 --> 00:32:38,920 Speaker 1: export it. But it's still being processed on Google servers. Now, sure, 595 00:32:39,280 --> 00:32:42,040 Speaker 1: all my emails are there anyway, but there's another layer 596 00:32:42,080 --> 00:32:44,560 Speaker 1: of unease which comes with them being read over by 597 00:32:44,560 --> 00:32:48,320 Speaker 1: a machine and interpreted by a machine without my request 598 00:32:48,400 --> 00:32:51,480 Speaker 1: or consent. And Google just can't seem to get the 599 00:32:51,480 --> 00:32:55,200 Speaker 1: optics quite right. During the twenty twenty four Olympics, Google 600 00:32:55,280 --> 00:32:58,160 Speaker 1: ran an ad for Gemini where a dad used AI 601 00:32:58,240 --> 00:33:01,160 Speaker 1: to help his daughter write a letter to an American 602 00:33:01,200 --> 00:33:04,280 Speaker 1: track star. And the idea of using a chatbot to 603 00:33:04,280 --> 00:33:07,280 Speaker 1: help write a fan letter, this kind of innocent form 604 00:33:07,320 --> 00:33:10,280 Speaker 1: of human connection between a kid and her idol, was 605 00:33:10,280 --> 00:33:13,160 Speaker 1: off-putting to a lot of people. Google ended up 606 00:33:13,200 --> 00:33:16,720 Speaker 1: pulling that ad, much like Apple did its ad, which 607 00:33:16,760 --> 00:33:19,360 Speaker 1: crushed a whole bunch of musical instruments and paint pots 608 00:33:19,400 --> 00:33:23,600 Speaker 1: and other artifacts of human creativity into a sleek iPad.
609 00:33:23,760 --> 00:33:26,400 Speaker 1: That ad also caused uproar, because it revealed a deep 610 00:33:26,440 --> 00:33:30,320 Speaker 1: dissonance between how engineers think about their creations and how 611 00:33:30,400 --> 00:33:33,600 Speaker 1: real people think about their lives. Take a look through 612 00:33:33,640 --> 00:33:37,040 Speaker 1: the r slash Google community on Reddit and you'll see 613 00:33:37,080 --> 00:33:40,560 Speaker 1: what real people think. There are users calling Gemini quote 614 00:33:40,600 --> 00:33:45,080 Speaker 1: absolutely worthless, and others said quote mostly nonsense, but occasionally 615 00:33:45,080 --> 00:33:48,520 Speaker 1: it gets lucky and says something correct. Maybe Reddit's 616 00:33:48,520 --> 00:33:51,160 Speaker 1: being a bit too hard on Gemini here. I mean, 617 00:33:51,320 --> 00:33:54,080 Speaker 1: looking back at the Clippy era, so many people are nostalgic. 618 00:33:54,160 --> 00:33:56,440 Speaker 1: I mean, no one actually wants Clippy back in their 619 00:33:56,480 --> 00:33:59,680 Speaker 1: Word documents, but there's a kind of yearning for a 620 00:33:59,720 --> 00:34:02,800 Speaker 1: simpler time, and I do wonder how we'll look back 621 00:34:02,880 --> 00:34:05,520 Speaker 1: at the Gemini era. I mean, after all, it's still 622 00:34:05,520 --> 00:34:07,640 Speaker 1: a time when we sit in front of our computers, 623 00:34:07,680 --> 00:34:10,440 Speaker 1: typing on our keys, telling the computer what we want. 624 00:34:11,040 --> 00:34:14,080 Speaker 1: And you know, fast forward twenty five years, when perhaps 625 00:34:14,360 --> 00:34:17,400 Speaker 1: I'll be cryogenically frozen and my brain will be uploaded 626 00:34:17,400 --> 00:34:20,239 Speaker 1: to the cloud, and that mind will wistfully think back 627 00:34:20,280 --> 00:34:22,320 Speaker 1: to sitting in front of my computer in my apartment, 628 00:34:22,400 --> 00:34:32,719 Speaker 1: thinking, I remember Gemini.
That's it for this week for 629 00:34:32,800 --> 00:34:35,800 Speaker 1: Tech Stuff. I'm Oz Woloshyn, and this episode was produced 630 00:34:35,800 --> 00:34:39,319 Speaker 1: by Eliza Dennis, Victoria Dominguez, and Lizzie Jacobs. It was 631 00:34:39,360 --> 00:34:42,680 Speaker 1: executive produced by me, Kara Price, and Kate Osborne for 632 00:34:42,760 --> 00:34:47,640 Speaker 1: Kaleidoscope and Katrina Norvell for iHeart Podcasts. Fraser 633 00:34:47,680 --> 00:34:50,680 Speaker 1: is our engineer. Kyle Murdoch mixed this episode, and he 634 00:34:50,719 --> 00:34:53,600 Speaker 1: also wrote our theme song. Join us next Wednesday for 635 00:34:53,680 --> 00:34:57,120 Speaker 1: Tech Stuff: The Story, when we have a very special 636 00:34:57,160 --> 00:35:01,360 Speaker 1: conversation with the godfather of AI and the twenty twenty 637 00:35:01,400 --> 00:35:06,239 Speaker 1: four Nobel laureate Geoffrey Hinton about his remarkable life and 638 00:35:06,280 --> 00:35:10,440 Speaker 1: how it intersects with building AI. Please rate, review, and 639 00:35:10,520 --> 00:35:12,920 Speaker 1: reach out to us at Tech Stuff Podcast at gmail 640 00:35:12,960 --> 00:35:14,640 Speaker 1: dot com. We want to hear from you.