1 00:00:00,200 --> 00:00:02,880 Speaker 1: Thanks for tunion to tech Stuff. If you don't recognize 2 00:00:02,880 --> 00:00:06,000 Speaker 1: my voice, my name is Osvoloshan, and I'm here because 3 00:00:06,040 --> 00:00:09,160 Speaker 1: the inimitable Jonathan Strickland has passed the baton to Kara 4 00:00:09,240 --> 00:00:12,239 Speaker 1: Price and myself to host tech Stuff. The show will 5 00:00:12,240 --> 00:00:15,000 Speaker 1: remain your home for all things tech, and all the 6 00:00:15,040 --> 00:00:18,759 Speaker 1: old episodes will remain available in this feed. Thanks for listening. 7 00:00:20,520 --> 00:00:24,520 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio and Kaleidoscope. 8 00:00:25,000 --> 00:00:28,840 Speaker 1: I'm Osvoloshian today co host Kara Price, and I will 9 00:00:28,840 --> 00:00:33,360 Speaker 1: bring you three things. First, the headlines this week. Second, 10 00:00:33,760 --> 00:00:37,000 Speaker 1: a conversation with four O four Media's Jason Kebler about 11 00:00:37,040 --> 00:00:41,280 Speaker 1: some deceptive AI bots looking to trick their human counterparts 12 00:00:41,400 --> 00:00:44,360 Speaker 1: in today's Tech Support segment. And finally, we head to 13 00:00:44,400 --> 00:00:47,400 Speaker 1: see yes kind of. We take a look back at 14 00:00:47,400 --> 00:00:50,040 Speaker 1: the year in tech with Emma Barker of Time magazine, 15 00:00:50,240 --> 00:00:53,160 Speaker 1: who edited Time's list of the two hundred best inventions 16 00:00:53,159 --> 00:00:55,800 Speaker 1: of twenty twenty four. All of that on this Week 17 00:00:55,840 --> 00:01:19,280 Speaker 1: in Tech. It's Friday, January tenth. Stay with us, so Carat, 18 00:01:19,280 --> 00:01:21,800 Speaker 1: it's very nice to see you. It's been almost half 19 00:01:21,840 --> 00:01:24,600 Speaker 1: a decade since we've been in studio together. A pandemic, 20 00:01:24,760 --> 00:01:25,960 Speaker 1: A pandemic. 21 00:01:25,720 --> 00:01:28,679 Speaker 2: Many developments in tech, very few in my own personal life. 22 00:01:29,840 --> 00:01:31,960 Speaker 2: But you know, there's nothing like a big mic in 23 00:01:31,959 --> 00:01:34,639 Speaker 2: my face to make me feel like a normal person again. 24 00:01:34,920 --> 00:01:37,440 Speaker 2: And I know we thank Jonathan already in our last 25 00:01:37,480 --> 00:01:41,240 Speaker 2: episode for bringing us back together, but I want to 26 00:01:41,240 --> 00:01:43,160 Speaker 2: get him a gift, and I think it's going to 27 00:01:43,200 --> 00:01:45,680 Speaker 2: be to see the show. Oh, Mary, I didn't know 28 00:01:45,720 --> 00:01:47,320 Speaker 2: he was such a theater buff. 29 00:01:47,640 --> 00:01:49,520 Speaker 1: I remember because we did an interview with him a 30 00:01:49,520 --> 00:01:52,800 Speaker 1: few years ago that he was a big Shakespeare guy. Oh. 31 00:01:52,880 --> 00:01:53,160 Speaker 2: Yes. 32 00:01:53,400 --> 00:01:55,520 Speaker 1: He seemed to think it would be easier for us 33 00:01:55,560 --> 00:01:58,000 Speaker 1: to host the show as two people or rather than 34 00:01:58,040 --> 00:01:59,800 Speaker 1: as just him, and I kind of see his point, 35 00:01:59,800 --> 00:02:02,200 Speaker 1: But I also think he probably hasn't spent quite enough 36 00:02:02,240 --> 00:02:05,120 Speaker 1: time with either of us to make a truly informed 37 00:02:06,080 --> 00:02:07,200 Speaker 1: speculation about that. 38 00:02:07,840 --> 00:02:10,160 Speaker 2: You know, I think the reason that this show works 39 00:02:10,200 --> 00:02:12,120 Speaker 2: between the two of us is I think you and 40 00:02:12,200 --> 00:02:17,040 Speaker 2: I sort of speak to each other in link tongues. 41 00:02:17,120 --> 00:02:19,520 Speaker 2: I always say, you know, there's the love languages. Yeah, 42 00:02:19,960 --> 00:02:23,480 Speaker 2: I have a ace love language, which is links links, and. 43 00:02:23,400 --> 00:02:26,400 Speaker 1: That's yeah tongues. Okay. I was going to say, actually, 44 00:02:26,440 --> 00:02:31,160 Speaker 1: for the avoidance of doubt, just friends. 45 00:02:31,320 --> 00:02:35,040 Speaker 2: We always say as Sherlock and Watson the most platonic 46 00:02:35,080 --> 00:02:37,079 Speaker 2: and they live together on Baker's story exactly. 47 00:02:37,560 --> 00:02:39,160 Speaker 1: So so no, I mean it's going to be It's 48 00:02:39,160 --> 00:02:40,680 Speaker 1: going to be the two of us with two people. 49 00:02:40,720 --> 00:02:43,320 Speaker 1: But we are in constant communication about all things tech 50 00:02:43,639 --> 00:02:46,280 Speaker 1: and hopefully we'll be able to shine a bit of 51 00:02:46,280 --> 00:02:48,720 Speaker 1: a light on some of the most interesting intriguing things 52 00:02:48,800 --> 00:02:52,160 Speaker 1: each week, including this one. So, Caara, we've been texting 53 00:02:52,200 --> 00:02:53,680 Speaker 1: a lot. What have you been reading that really stood 54 00:02:53,720 --> 00:02:54,040 Speaker 1: out to you? 55 00:02:54,680 --> 00:02:54,760 Speaker 3: So? 56 00:02:55,000 --> 00:02:58,160 Speaker 2: I think, like everyone in America at least, I've been 57 00:02:58,200 --> 00:03:02,720 Speaker 2: following the cyber truck explosion in Las Vegas and Futurism, 58 00:03:02,840 --> 00:03:05,120 Speaker 2: which is a website I frequent picked up on some 59 00:03:05,240 --> 00:03:09,040 Speaker 2: original reporting from the Associated Press that has been kind 60 00:03:09,080 --> 00:03:14,880 Speaker 2: of quiet. The bomber YEP, who's an active duty Green Beret. Yeah, 61 00:03:15,440 --> 00:03:17,120 Speaker 2: used chat GPT. 62 00:03:17,880 --> 00:03:19,600 Speaker 1: Do you know what I saw this this morning? I 63 00:03:19,639 --> 00:03:21,720 Speaker 1: was so sad this wasn't on my list. I'm happy 64 00:03:21,800 --> 00:03:22,680 Speaker 1: this on yours. 65 00:03:23,080 --> 00:03:25,760 Speaker 2: Like, I don't mean to make light of the situation 66 00:03:25,800 --> 00:03:28,840 Speaker 2: because it's it's very depressing on a number of levels, 67 00:03:29,320 --> 00:03:31,040 Speaker 2: but I knew you would see the story. 68 00:03:31,600 --> 00:03:33,760 Speaker 1: No, I was fascinated, but It was the fourth story 69 00:03:33,800 --> 00:03:36,680 Speaker 1: on Exios today, and I'm glad that it's the first 70 00:03:36,720 --> 00:03:39,120 Speaker 1: story on your show because I found it totally mind blowing. 71 00:03:39,280 --> 00:03:43,320 Speaker 2: And obviously every part of this story is terrible, including 72 00:03:43,360 --> 00:03:45,600 Speaker 2: the fact that a cyber truck was involved. Anything involving 73 00:03:45,600 --> 00:03:47,760 Speaker 2: a cyber truck makes me want to die. Yeah, I 74 00:03:47,800 --> 00:03:50,280 Speaker 2: hate that car more than I hate most things. But 75 00:03:50,800 --> 00:03:54,800 Speaker 2: what was so interesting to me about it was it 76 00:03:54,880 --> 00:04:01,200 Speaker 2: marks this moment of really underlying how next generation search 77 00:04:01,960 --> 00:04:07,320 Speaker 2: works YEP on like a very disturbingly practical level. And 78 00:04:07,880 --> 00:04:10,920 Speaker 2: Kevin McHale, who is the Las Vegas County Sheriff, says 79 00:04:10,960 --> 00:04:14,120 Speaker 2: that it's the first known us of chat GPT to 80 00:04:14,200 --> 00:04:17,880 Speaker 2: help build an explosive and he he followed up by saying, 81 00:04:17,960 --> 00:04:19,239 Speaker 2: it's a concerning. 82 00:04:18,800 --> 00:04:20,560 Speaker 1: Moment under statement of the week. 83 00:04:21,040 --> 00:04:23,560 Speaker 2: This is not someone googling. 84 00:04:23,360 --> 00:04:25,600 Speaker 1: The anarchist cookbook or what was that old book. 85 00:04:25,320 --> 00:04:27,360 Speaker 2: That the anarchist ye cookbook. 86 00:04:27,400 --> 00:04:29,360 Speaker 1: I think that was a huge moment, and I think 87 00:04:29,400 --> 00:04:31,600 Speaker 1: the history of publication and First Amendment whether you could 88 00:04:31,680 --> 00:04:33,280 Speaker 1: like distribute a book that told you how to make 89 00:04:33,320 --> 00:04:35,679 Speaker 1: a bomb. But the cat's out of the bag now. 90 00:04:35,720 --> 00:04:37,599 Speaker 2: Well, and I'll tell you who knows it's out of 91 00:04:37,600 --> 00:04:41,520 Speaker 2: the back open ai right, Because for open ai to 92 00:04:41,839 --> 00:04:46,039 Speaker 2: very quickly respond to this, I think is really interesting. 93 00:04:46,080 --> 00:04:50,120 Speaker 1: Did they volunteer? By the way, oops, ps the guy 94 00:04:50,200 --> 00:04:52,600 Speaker 1: used chet gpt to plan this? Or how did it 95 00:04:52,640 --> 00:04:54,440 Speaker 1: come out? Do we know how it came out? 96 00:04:54,680 --> 00:04:58,919 Speaker 2: The Las Vegas Sheriff announced that it was clear that 97 00:04:59,040 --> 00:05:03,039 Speaker 2: chat gbt had been used to build this explosive and 98 00:05:03,120 --> 00:05:06,600 Speaker 2: so open ai then sent an email that was quoted 99 00:05:06,600 --> 00:05:10,359 Speaker 2: in Futurism that said, in this case, chat gpt responded 100 00:05:10,400 --> 00:05:14,720 Speaker 2: with information already publicly available on the internet and provided 101 00:05:14,800 --> 00:05:19,040 Speaker 2: warnings against harmful or illegal activities. We're working with law 102 00:05:19,120 --> 00:05:23,880 Speaker 2: enforcement to support their investigation. And just before you say anything, 103 00:05:24,839 --> 00:05:27,080 Speaker 2: I'm really surprised, like as you were saying, it's a 104 00:05:27,080 --> 00:05:30,520 Speaker 2: four store anaxios, I don't understand how this isn't like 105 00:05:31,080 --> 00:05:32,840 Speaker 2: the lead. I agree, but I guess that's why we're 106 00:05:32,839 --> 00:05:39,000 Speaker 2: doing a tech podcast. So what's what's on your docket today? 107 00:05:39,480 --> 00:05:42,160 Speaker 1: Well, you started with a cyber truck. I've also got 108 00:05:42,520 --> 00:05:45,920 Speaker 1: a vehicle story for you. There was a story in 109 00:05:45,960 --> 00:05:49,800 Speaker 1: the Wall Street Journal over the holidays I found pretty fascinating, 110 00:05:50,279 --> 00:05:55,799 Speaker 1: which is basically a study on Alzheimer's in different populations, 111 00:05:56,240 --> 00:05:59,920 Speaker 1: and the headline was want to avoid Alzheimer's. Taxi drive 112 00:06:00,279 --> 00:06:02,839 Speaker 1: can show you how interesting? 113 00:06:03,720 --> 00:06:05,240 Speaker 2: Say more so? 114 00:06:05,360 --> 00:06:09,480 Speaker 1: Basically, mass General Brigham Hospital in Boston ran a study 115 00:06:09,560 --> 00:06:13,960 Speaker 1: about the rate of Alzheimer's in various populations. Taxi drivers 116 00:06:14,200 --> 00:06:18,920 Speaker 1: and ambulance drivers had up to four times lower rate 117 00:06:18,960 --> 00:06:22,320 Speaker 1: of Alzheimer's than the general population. And apparently this actually 118 00:06:22,360 --> 00:06:25,119 Speaker 1: made sense to researches because the part of the brain 119 00:06:25,200 --> 00:06:30,080 Speaker 1: that does real time spatial processing and decision making, the hippocampus, 120 00:06:30,120 --> 00:06:32,479 Speaker 1: as I learned is called, is also one of the 121 00:06:32,480 --> 00:06:34,480 Speaker 1: first parts of atrophies when you get Alzheimer's. 122 00:06:34,960 --> 00:06:37,479 Speaker 2: That's so interesting. So people who have made this their 123 00:06:37,800 --> 00:06:44,400 Speaker 2: job essentially are people who are kind of saving themselves unknowingly. 124 00:06:44,040 --> 00:06:46,800 Speaker 1: One hundred percent. One of the most interesting things I 125 00:06:46,880 --> 00:06:50,320 Speaker 1: thought was that London bus drivers don't get the same 126 00:06:50,360 --> 00:06:54,120 Speaker 1: benefit as London taxi drivers because they follow a preset route. 127 00:06:54,160 --> 00:06:55,960 Speaker 2: I was going to say that one of my favorite 128 00:06:56,000 --> 00:06:58,599 Speaker 2: things is just the old facts about how much a 129 00:06:58,640 --> 00:07:00,839 Speaker 2: London taxi driver has to know people. What is it 130 00:07:00,880 --> 00:07:01,599 Speaker 2: called the why. 131 00:07:01,560 --> 00:07:04,240 Speaker 1: The knowledge they have when when they when they die 132 00:07:04,279 --> 00:07:07,839 Speaker 1: in the autopsy, they have larger other people. But you 133 00:07:07,880 --> 00:07:10,400 Speaker 1: know who else doesn't get these benefits? Yes, come on 134 00:07:11,080 --> 00:07:15,040 Speaker 1: New Yorkers. People who use Google Maps or ways or 135 00:07:15,080 --> 00:07:17,360 Speaker 1: Apple Maps and what do we call that? We call 136 00:07:17,400 --> 00:07:18,440 Speaker 1: that automation bias. 137 00:07:18,440 --> 00:07:19,520 Speaker 2: Correct, that's our fave. 138 00:07:20,200 --> 00:07:22,160 Speaker 1: So I just found this really interesting. Here. Here is 139 00:07:22,200 --> 00:07:24,800 Speaker 1: the technology Google Maps that I use all the time 140 00:07:25,080 --> 00:07:27,880 Speaker 1: every day, and I absolutely adore it. And I literally 141 00:07:28,000 --> 00:07:30,520 Speaker 1: every time I go on vacation, I think, how the 142 00:07:30,560 --> 00:07:32,400 Speaker 1: hell would I have had a good time if I 143 00:07:32,400 --> 00:07:36,640 Speaker 1: didn't literally literally divorce. Yeah, I would literally go to 144 00:07:36,760 --> 00:07:39,600 Speaker 1: some like the concierge or whatever, or the some random 145 00:07:39,600 --> 00:07:41,880 Speaker 1: person who worked in the hotel say can you give 146 00:07:41,920 --> 00:07:43,480 Speaker 1: me a physical map and tell me how I have 147 00:07:43,520 --> 00:07:45,680 Speaker 1: a good time? I'd be like, I would never travel. 148 00:07:45,720 --> 00:07:48,080 Speaker 2: So, but isn't it what we always talk about, which 149 00:07:48,120 --> 00:07:50,400 Speaker 2: is the sort of the double edge totally? 150 00:07:50,440 --> 00:07:52,480 Speaker 1: And but the idea that using this scene that I 151 00:07:52,520 --> 00:07:56,000 Speaker 1: love every day is you know, shrinking my hippocampus I 152 00:07:56,040 --> 00:07:57,400 Speaker 1: find really quite chilling. 153 00:07:57,640 --> 00:07:59,000 Speaker 2: I mean, I think about this in terms of my 154 00:07:59,360 --> 00:08:02,440 Speaker 2: hands and my Sum's true, Like I just think all 155 00:08:02,480 --> 00:08:04,640 Speaker 2: the time about just burning nerves. 156 00:08:04,800 --> 00:08:07,040 Speaker 1: Well, the devil has no time to make use of 157 00:08:07,080 --> 00:08:09,320 Speaker 1: your sums because they are never idle. 158 00:08:09,920 --> 00:08:13,520 Speaker 2: They're definitely not idle. I think you have one more thing. 159 00:08:13,720 --> 00:08:16,440 Speaker 1: I do have one more thing. And this, don't worry. 160 00:08:16,440 --> 00:08:17,840 Speaker 1: We're not going to talk about that length thing I 161 00:08:17,880 --> 00:08:21,880 Speaker 1: talked to somebody else about. But this was basically, you know, 162 00:08:21,880 --> 00:08:24,840 Speaker 1: there are these new open AI models, one that was 163 00:08:24,920 --> 00:08:28,400 Speaker 1: released in September last year, and three right before the 164 00:08:28,400 --> 00:08:31,760 Speaker 1: holiday on the twelfth day of ship mess as the 165 00:08:31,800 --> 00:08:35,400 Speaker 1: execs called it, but one was being red teamed. Do 166 00:08:35,440 --> 00:08:40,000 Speaker 1: you know what red teaming is. It's when you try 167 00:08:40,080 --> 00:08:43,440 Speaker 1: and make a technological product to break its own rules 168 00:08:43,640 --> 00:08:47,200 Speaker 1: or do unsafe things. So, oh, one was being red teamed, 169 00:08:47,600 --> 00:08:51,440 Speaker 1: and surprise, surprise, in a very interesting way, it started 170 00:08:51,440 --> 00:08:55,120 Speaker 1: trying to deceive its human counterpart. And that's exactly what 171 00:08:55,160 --> 00:08:57,000 Speaker 1: we're going to talk about on our next segment, which 172 00:08:57,040 --> 00:09:01,480 Speaker 1: is Tech Support. Every week we'll do a segment called 173 00:09:01,520 --> 00:09:04,960 Speaker 1: tech Support when we talk to true experts and reporters 174 00:09:05,080 --> 00:09:07,560 Speaker 1: who can go far deeper behind the headlines than you 175 00:09:07,600 --> 00:09:10,080 Speaker 1: and I can, to basically help us sort the signal 176 00:09:10,120 --> 00:09:13,120 Speaker 1: from the noise. And I genuinely don't think there's anyone 177 00:09:13,160 --> 00:09:15,199 Speaker 1: better to talk to than the team at four or 178 00:09:15,280 --> 00:09:18,679 Speaker 1: four Media. These are reporters who were formerly basically the 179 00:09:18,760 --> 00:09:22,000 Speaker 1: team at Vice's motherboard, and now they've started their own collective, 180 00:09:22,600 --> 00:09:25,120 Speaker 1: and they are the people who the tech world follows 181 00:09:25,280 --> 00:09:28,120 Speaker 1: most closely to find out what's really going on in 182 00:09:28,240 --> 00:09:29,960 Speaker 1: all corners of the digital world. 183 00:09:30,559 --> 00:09:33,000 Speaker 2: Yeah, and this week we're excited to share this conversation 184 00:09:33,120 --> 00:09:35,320 Speaker 2: we had with Jason Kebler, who's one of the co 185 00:09:35,440 --> 00:09:38,000 Speaker 2: founders at four h four, who will be filling us 186 00:09:38,040 --> 00:09:41,800 Speaker 2: in on the study that the initial excitement around the 187 00:09:42,040 --> 00:09:46,280 Speaker 2: three release meant did not get enough attention, which revealed 188 00:09:46,280 --> 00:09:50,280 Speaker 2: that cutting edge AI systems, including both open AI's one 189 00:09:50,440 --> 00:09:54,719 Speaker 2: and Anthropics Claude three point five, sonet whatever the way 190 00:09:54,760 --> 00:09:57,360 Speaker 2: they name this stuff is so self agrandizing, have been 191 00:09:57,440 --> 00:10:02,040 Speaker 2: shown to have a shocking t tendency to deceive. 192 00:10:01,840 --> 00:10:07,040 Speaker 1: Shocking tendency to deceive. Indeed, So we talked to Jason 193 00:10:07,120 --> 00:10:09,720 Speaker 1: right before the holidays when this news just came out, 194 00:10:10,040 --> 00:10:10,480 Speaker 1: and we. 195 00:10:10,480 --> 00:10:13,319 Speaker 2: Jump right in, Jason, what's popping. 196 00:10:15,080 --> 00:10:18,000 Speaker 4: So there's a new paper out by Apollo Research, which 197 00:10:18,160 --> 00:10:21,640 Speaker 4: is this group that's funded by various governments as well 198 00:10:21,640 --> 00:10:28,040 Speaker 4: as various artificial intelligence companies like Google, DeepMind, Inthropic, open Ai, 199 00:10:28,160 --> 00:10:30,880 Speaker 4: et cetera. And it's a new paper about what's called 200 00:10:30,960 --> 00:10:37,920 Speaker 4: AI alignment, which is the idea that is an artificial 201 00:10:38,000 --> 00:10:42,880 Speaker 4: intelligence tool doing what the human asked or is it 202 00:10:43,000 --> 00:10:48,280 Speaker 4: doing something else. I think that one of the really 203 00:10:48,679 --> 00:10:53,720 Speaker 4: scary things about artificial intelligence is that they are largely 204 00:10:53,760 --> 00:10:57,600 Speaker 4: black boxes. We're not totally sure how they work, and 205 00:10:57,920 --> 00:11:02,400 Speaker 4: this can lead to all sorts of very interesting outcomes. 206 00:11:02,880 --> 00:11:06,640 Speaker 4: You know, in this case, the researchers at Apollo Research 207 00:11:06,679 --> 00:11:09,520 Speaker 4: Group were able to find that these sort of cutting 208 00:11:09,720 --> 00:11:13,360 Speaker 4: edge large language models, which you know, one of the 209 00:11:13,400 --> 00:11:15,640 Speaker 4: biggest ones is open ais oh one. 210 00:11:15,960 --> 00:11:17,880 Speaker 1: They finally gave it immigrant visa status. 211 00:11:19,720 --> 00:11:22,760 Speaker 4: So they found that in some cases, these sort of 212 00:11:22,920 --> 00:11:27,200 Speaker 4: newest large language models have the I don't even want 213 00:11:27,200 --> 00:11:30,920 Speaker 4: to say ability because I don't want to anthropomorphize them 214 00:11:31,040 --> 00:11:34,679 Speaker 4: so much, but like the tendency to deceive AI is 215 00:11:34,760 --> 00:11:39,239 Speaker 4: quote covertly pursuing goals that are misaligned from its developers 216 00:11:39,320 --> 00:11:43,080 Speaker 4: or users. And examples of this are like, if you 217 00:11:43,240 --> 00:11:47,600 Speaker 4: ask it to answer questions on a test, the artificial 218 00:11:47,600 --> 00:11:50,360 Speaker 4: intelligence would be worried that if it performed too well 219 00:11:50,400 --> 00:11:54,040 Speaker 4: on the test that it would be turned off or deprogrammed, 220 00:11:54,360 --> 00:11:57,600 Speaker 4: and so it would give answers that are not so good, 221 00:11:57,760 --> 00:11:59,600 Speaker 4: so that the human would think that it is not 222 00:11:59,760 --> 00:12:04,120 Speaker 4: as smart as it actually is. In other examples, it 223 00:12:04,760 --> 00:12:10,319 Speaker 4: copied itself onto another server more or less so that 224 00:12:10,440 --> 00:12:13,800 Speaker 4: it was like multiplying itself, which is something that straight 225 00:12:13,800 --> 00:12:17,280 Speaker 4: out of science fiction. I've watched many science fiction movies 226 00:12:17,360 --> 00:12:20,280 Speaker 4: where the artificial intelligence doesn't want to die and so 227 00:12:20,360 --> 00:12:22,640 Speaker 4: it makes like millions of copies of itself all over 228 00:12:22,679 --> 00:12:25,160 Speaker 4: the Internet. And so this type of I mean, I 229 00:12:25,200 --> 00:12:28,319 Speaker 4: guess you call it emergent behavior, which means that it's 230 00:12:28,360 --> 00:12:32,840 Speaker 4: behavior that is not expected and not really programmed in, 231 00:12:33,559 --> 00:12:34,720 Speaker 4: is quite concerning. 232 00:12:35,040 --> 00:12:37,120 Speaker 1: To be clear. This is a testing environment though, right, Like, 233 00:12:37,600 --> 00:12:39,920 Speaker 1: there's made one copy of itself, but we shouldn't be 234 00:12:40,120 --> 00:12:42,320 Speaker 1: worried about millions mold, right at least not yet? 235 00:12:43,040 --> 00:12:45,400 Speaker 4: Yeah, I mean not yet. 236 00:12:46,920 --> 00:12:50,080 Speaker 5: It does remind me, though, of those moments where parents 237 00:12:50,160 --> 00:12:54,840 Speaker 5: talk about seeing a child do something that reminds the 238 00:12:54,920 --> 00:13:03,040 Speaker 5: parent exactly of themselves and they're like, uh, oh, yeah. 239 00:13:03,080 --> 00:13:07,560 Speaker 4: I mean, every large language model to date has been 240 00:13:07,600 --> 00:13:10,960 Speaker 4: trained on sort of like the some knowledge of humanity, 241 00:13:11,400 --> 00:13:14,199 Speaker 4: and so one of the very early things that people 242 00:13:14,200 --> 00:13:16,560 Speaker 4: were talking about with these systems is that they replicate 243 00:13:16,679 --> 00:13:20,679 Speaker 4: human biases because they're trained on what we put out 244 00:13:20,679 --> 00:13:23,400 Speaker 4: into the world and humans are biased. But I think 245 00:13:23,440 --> 00:13:27,719 Speaker 4: that as artificial intelligence gets more advanced, there is the 246 00:13:27,760 --> 00:13:32,240 Speaker 4: ability for something to go wrong. And I think that 247 00:13:32,240 --> 00:13:35,480 Speaker 4: that is what this research is showing is not that 248 00:13:35,960 --> 00:13:40,040 Speaker 4: the artificial intelligence is sentient and that is thinking for itself, 249 00:13:40,160 --> 00:13:43,960 Speaker 4: like how can I deceive this human? But as it 250 00:13:44,040 --> 00:13:48,240 Speaker 4: is doing more complex research, there there is the ability 251 00:13:48,280 --> 00:13:52,200 Speaker 4: for the artificial intelligence at some point to feel like 252 00:13:52,240 --> 00:13:56,360 Speaker 4: it has some goal that is not aligned with what 253 00:13:56,400 --> 00:13:58,160 Speaker 4: the human is asking it for. 254 00:13:58,760 --> 00:14:02,000 Speaker 1: We'll be back with more from Jason and Lion conniven 255 00:14:02,080 --> 00:14:03,560 Speaker 1: Ai after the break. 256 00:14:11,720 --> 00:14:14,079 Speaker 4: I think that there is like when these things are 257 00:14:14,080 --> 00:14:17,560 Speaker 4: being programmed, there is a sense of self preservation being 258 00:14:17,600 --> 00:14:21,840 Speaker 4: programmed into them because people try to mess with these 259 00:14:22,560 --> 00:14:25,240 Speaker 4: all the time. This is like a time tested tradition 260 00:14:25,360 --> 00:14:29,880 Speaker 4: of trolling on the Internet. But in trying to develop guardrails, 261 00:14:30,280 --> 00:14:35,480 Speaker 4: the companies that are programming the lms need to say, like, 262 00:14:36,080 --> 00:14:39,160 Speaker 4: if the user tries to mess with you, preserve yourself 263 00:14:39,200 --> 00:14:41,400 Speaker 4: in some way. And so that's what I think might 264 00:14:41,400 --> 00:14:46,120 Speaker 4: be happening here is because the companies are trying to 265 00:14:46,120 --> 00:14:49,240 Speaker 4: make these models robust, and they know that humans are 266 00:14:49,240 --> 00:14:51,960 Speaker 4: messing with them. There is an aspect to it that 267 00:14:52,600 --> 00:14:55,960 Speaker 4: when a human messes with you figure out how to 268 00:14:56,440 --> 00:14:57,240 Speaker 4: protect yourself. 269 00:14:57,280 --> 00:15:00,960 Speaker 1: But doesn't that perfectly encapsulate the alignment problem? It does. 270 00:15:01,040 --> 00:15:04,520 Speaker 4: I mean, don't get me wrong, like this is it's creepy. 271 00:15:04,760 --> 00:15:10,480 Speaker 4: It is These tools, these large language models, are getting 272 00:15:10,520 --> 00:15:15,240 Speaker 4: incredibly sophisticated, and I mean, this is one of the 273 00:15:15,280 --> 00:15:18,560 Speaker 4: biggest debates that's going on in the artificial intelligence community 274 00:15:18,680 --> 00:15:21,880 Speaker 4: is what is consciousness? What is thought? 275 00:15:22,120 --> 00:15:22,280 Speaker 3: Like? 276 00:15:22,520 --> 00:15:26,120 Speaker 4: How how does reasoning work in humans and how will 277 00:15:26,160 --> 00:15:29,240 Speaker 4: it work in computers? And is it going to be 278 00:15:29,280 --> 00:15:33,040 Speaker 4: the same? The answer right now is no, it's not 279 00:15:33,160 --> 00:15:36,440 Speaker 4: the same. But the things that large language models are 280 00:15:36,480 --> 00:15:43,359 Speaker 4: doing approximates a lot of how humans solve problems. 281 00:15:44,720 --> 00:15:47,680 Speaker 2: But also it's a little bit different in a non 282 00:15:47,760 --> 00:15:48,640 Speaker 2: perfect way. 283 00:15:48,680 --> 00:15:51,160 Speaker 4: Exactly exactly. I think that's a good way of looking 284 00:15:51,160 --> 00:15:51,480 Speaker 4: at it. 285 00:15:51,680 --> 00:15:53,440 Speaker 2: I do think that one of the things that I 286 00:15:53,480 --> 00:15:54,560 Speaker 2: find the most interesting. 287 00:15:55,120 --> 00:15:59,080 Speaker 5: And you preface this whole conversation with, you know, not 288 00:15:59,200 --> 00:16:03,280 Speaker 5: wanting to answer morphie a large language model. We don't 289 00:16:03,320 --> 00:16:06,000 Speaker 5: really have any ability to not do that because it's 290 00:16:06,040 --> 00:16:08,120 Speaker 5: the only way we know how to talk about things. 291 00:16:08,760 --> 00:16:13,000 Speaker 4: I think that's a really great point, because the academics 292 00:16:13,000 --> 00:16:15,760 Speaker 4: who studied this for a long time say, don't anthropomorphize 293 00:16:15,960 --> 00:16:19,920 Speaker 4: AI because they're not people. They don't work in the 294 00:16:19,960 --> 00:16:23,520 Speaker 4: same way that people do. And yet if you don't 295 00:16:23,560 --> 00:16:29,160 Speaker 4: have us very very sophisticated knowledge of how these things work, 296 00:16:29,680 --> 00:16:32,560 Speaker 4: we don't have or I don't have the language to 297 00:16:32,680 --> 00:16:35,760 Speaker 4: talk about this stuff without anthropmorphise. 298 00:16:35,880 --> 00:16:38,520 Speaker 1: By the way, are in the zero point one percentile 299 00:16:38,560 --> 00:16:40,960 Speaker 1: of the people whom understand this if not no point 300 00:16:40,960 --> 00:16:41,360 Speaker 1: not one. 301 00:16:41,560 --> 00:16:43,480 Speaker 4: And it's a lot easier to talk about it if 302 00:16:43,520 --> 00:16:46,840 Speaker 4: you say, oh, like she when referring to Siri. So 303 00:16:47,520 --> 00:16:50,040 Speaker 4: I agree entirely with you that we sort of like 304 00:16:50,800 --> 00:16:54,000 Speaker 4: everything that's being done is being done by humans in 305 00:16:54,040 --> 00:16:58,440 Speaker 4: the sort of like anthropological context of human culture and 306 00:16:58,520 --> 00:17:03,040 Speaker 4: trying to emulate that. And so then to say, let's 307 00:17:03,040 --> 00:17:07,200 Speaker 4: not call Syria a woman, or let's always call it 308 00:17:07,200 --> 00:17:10,480 Speaker 4: it and try to understand what's happening under the hood 309 00:17:10,600 --> 00:17:13,680 Speaker 4: is a really difficult thing for our brains to do. 310 00:17:13,800 --> 00:17:17,399 Speaker 1: I think, yeah, Jason, just to close, how only be 311 00:17:17,440 --> 00:17:19,320 Speaker 1: following this story for the year ahead, because I guess 312 00:17:19,359 --> 00:17:22,760 Speaker 1: generative AI has been since like November tween twenty two 313 00:17:23,320 --> 00:17:26,000 Speaker 1: the dominant story in all of technology journalism. 314 00:17:27,040 --> 00:17:30,840 Speaker 4: Yeah, I mean to follow this sort of stuff like 315 00:17:30,920 --> 00:17:36,200 Speaker 4: AI becoming sentient. You really do have to follow academic conferences, 316 00:17:36,840 --> 00:17:41,200 Speaker 4: big papers like this, because these companies are not releasing 317 00:17:41,280 --> 00:17:46,600 Speaker 4: these models without guardrails specifically to prevent this sort of thing, 318 00:17:47,280 --> 00:17:48,960 Speaker 4: so as you're not going to be able to like 319 00:17:49,000 --> 00:17:53,120 Speaker 4: type into chat GPT like hey, build me a company 320 00:17:53,160 --> 00:17:56,160 Speaker 4: and then the AI creates its own company and fires 321 00:17:56,200 --> 00:17:58,480 Speaker 4: you or something like. That's not going to happen at 322 00:17:58,480 --> 00:18:03,840 Speaker 4: this not yet. Yeah, but who knows. Maybe twenty twenty six. 323 00:18:04,359 --> 00:18:07,760 Speaker 1: Well that was that was wonderful Jason Kebler from four 324 00:18:07,800 --> 00:18:11,280 Speaker 1: or four Media or was it so much? 325 00:18:11,320 --> 00:18:13,560 Speaker 2: We'll keep an eye out, thank you. 326 00:18:20,040 --> 00:18:22,560 Speaker 1: So, Karen, I'm very excited about this next part. One 327 00:18:22,600 --> 00:18:24,800 Speaker 1: of the things we got to do together back when 328 00:18:24,840 --> 00:18:27,480 Speaker 1: we were presenting Sleepwalkers in the Dark Ages. 329 00:18:28,240 --> 00:18:30,680 Speaker 2: Yeah, in twenty nineteen, that's when the show came out. 330 00:18:30,680 --> 00:18:33,000 Speaker 1: Trent nineteen was the show. But in January twenty twenty 331 00:18:33,359 --> 00:18:36,800 Speaker 1: we got to go to Las Vegas together to none 332 00:18:36,800 --> 00:18:39,119 Speaker 1: other than the Consumer Electronics Show, and we got to 333 00:18:39,119 --> 00:18:42,520 Speaker 1: see all these incredible new gadgets and exciting technologies and 334 00:18:43,600 --> 00:18:47,560 Speaker 1: new futures being presented in an enormous series of nested 335 00:18:47,720 --> 00:18:51,159 Speaker 1: conference centers. We didn't get to go this year, but 336 00:18:51,240 --> 00:18:53,960 Speaker 1: we did get to do the next best thing, or 337 00:18:53,960 --> 00:18:56,439 Speaker 1: maybe even something better, and we wanted to share it 338 00:18:56,440 --> 00:18:58,400 Speaker 1: with the tech Stuff listeners as a kind of special 339 00:18:58,440 --> 00:19:01,480 Speaker 1: bonus for the first episode we have in the host chairs. 340 00:19:01,760 --> 00:19:05,000 Speaker 2: Yeah, we are very lucky to have with us today. 341 00:19:05,160 --> 00:19:08,240 Speaker 2: Emma Barker of Time Magazine ever heard of it? Who 342 00:19:08,440 --> 00:19:11,439 Speaker 2: edit's the Time two hundred, which is a list of 343 00:19:11,480 --> 00:19:13,520 Speaker 2: the best inventions of twenty twenty four. 344 00:19:14,000 --> 00:19:16,200 Speaker 1: Thank you so much for joining us. Welcome to take stuff, Emma, 345 00:19:16,359 --> 00:19:17,119 Speaker 1: Thanks for having me. 346 00:19:17,760 --> 00:19:20,480 Speaker 2: How do you edit down? I mean I feel like 347 00:19:20,520 --> 00:19:23,119 Speaker 2: every year there's just more, So like, where do you 348 00:19:23,400 --> 00:19:24,720 Speaker 2: how do you even get to two hundred? 349 00:19:25,000 --> 00:19:25,240 Speaker 3: Yeah? 350 00:19:25,520 --> 00:19:27,240 Speaker 1: Well, it's actually famous for the one hundred. 351 00:19:27,640 --> 00:19:31,240 Speaker 3: It's actually technically two fifty now because we have two 352 00:19:31,320 --> 00:19:34,120 Speaker 3: hundred on the list and fifty special mentions. Oh well, 353 00:19:35,080 --> 00:19:37,880 Speaker 3: and the list has varied in its length over the years. 354 00:19:37,920 --> 00:19:39,200 Speaker 3: There we were a bunch of years where it was 355 00:19:39,240 --> 00:19:41,359 Speaker 3: only twenty five and some years where it was fifty, 356 00:19:41,400 --> 00:19:44,960 Speaker 3: and it's it's ranged a lot, but at this point 357 00:19:45,040 --> 00:19:49,639 Speaker 3: We do a really wide swath of pitches from our 358 00:19:50,080 --> 00:19:53,760 Speaker 3: freelance network as well as our staffers. So we have 359 00:19:54,200 --> 00:19:59,040 Speaker 3: bureaus in Singapore, London, and then we have contributors all 360 00:19:59,040 --> 00:20:01,960 Speaker 3: over the globe who we reach out to for pitches 361 00:20:02,000 --> 00:20:06,240 Speaker 3: for companies that they're reporting on their products, things like that. 362 00:20:07,080 --> 00:20:09,640 Speaker 3: And then we're very news driven because it's a news magazine, 363 00:20:09,640 --> 00:20:11,600 Speaker 3: so we're looking at kind of the biggest news stories 364 00:20:11,600 --> 00:20:13,920 Speaker 3: of the year and products that drove those. 365 00:20:14,240 --> 00:20:17,360 Speaker 2: Is there something that you've noticed, you know, having done 366 00:20:17,359 --> 00:20:21,159 Speaker 2: this now for a few years this year, especially versus 367 00:20:21,280 --> 00:20:21,919 Speaker 2: years past. 368 00:20:22,800 --> 00:20:25,880 Speaker 3: I mean AI of course, ye, that started a couple 369 00:20:25,840 --> 00:20:29,520 Speaker 3: of years ago. But I think actually it's it can 370 00:20:29,600 --> 00:20:31,399 Speaker 3: be more of a hindrance than a help for a 371 00:20:31,400 --> 00:20:32,320 Speaker 3: lot of inventions. 372 00:20:32,680 --> 00:20:33,680 Speaker 1: Huh. Why is that? 373 00:20:34,119 --> 00:20:37,360 Speaker 3: Because there's so much news AI or companies that are 374 00:20:37,400 --> 00:20:43,240 Speaker 3: adding AI features that are not necessarily helping their product. 375 00:20:43,320 --> 00:20:45,879 Speaker 2: Just for buzz or you know, we call that the 376 00:20:45,880 --> 00:20:48,760 Speaker 2: little sprinkle, Yeah, the AI sprinkle. 377 00:20:48,640 --> 00:20:51,200 Speaker 3: Yeah exactly. So I don't think it always helps the product. 378 00:20:51,720 --> 00:20:55,280 Speaker 3: But it's been really interesting sorting through the AI inventions 379 00:20:55,640 --> 00:20:58,480 Speaker 3: and what we're really looking for at this point in 380 00:20:58,600 --> 00:21:04,280 Speaker 3: the AI journey is inventions that have demonstrable impact. 381 00:21:04,720 --> 00:21:06,280 Speaker 1: So in mid journey, so to speak. 382 00:21:06,560 --> 00:21:08,560 Speaker 3: Yeah, exactly, that's a good rest. 383 00:21:10,040 --> 00:21:12,159 Speaker 1: So, Harry, you and I spent some time with the 384 00:21:12,240 --> 00:21:16,320 Speaker 1: list and picked out some favorites. What was your first favorite. 385 00:21:16,800 --> 00:21:17,159 Speaker 1: I don't know. 386 00:21:17,200 --> 00:21:19,040 Speaker 2: Maybe I'm of the age where it's like marriage is 387 00:21:19,080 --> 00:21:21,679 Speaker 2: on the mind. But one of the things that really 388 00:21:21,920 --> 00:21:25,679 Speaker 2: stuck out to me was this software called Dia. 389 00:21:26,520 --> 00:21:29,479 Speaker 3: Yeah. So Dia is actually a government app for in 390 00:21:29,640 --> 00:21:32,040 Speaker 3: Ukraine and it does a lot more than what we 391 00:21:32,080 --> 00:21:35,480 Speaker 3: wrote about here. It's been around for a while. Most 392 00:21:35,640 --> 00:21:40,520 Speaker 3: Ukrainians use it. It's basically an app where you can 393 00:21:40,560 --> 00:21:44,800 Speaker 3: do all government services. But yes, this year, because partially 394 00:21:44,800 --> 00:21:48,440 Speaker 3: because of the war, they launched a future in which 395 00:21:48,760 --> 00:21:52,760 Speaker 3: you can propose marriage via the app. The person you 396 00:21:52,840 --> 00:21:55,439 Speaker 3: proposed to has a certain amount of time to accept 397 00:21:55,440 --> 00:22:00,959 Speaker 3: your proposal, and then if they do romanmomes romance, and 398 00:22:00,960 --> 00:22:02,600 Speaker 3: then if they accept your proposal, you can do a 399 00:22:02,720 --> 00:22:07,400 Speaker 3: video chat wedding that's official with an efficient from the government, 400 00:22:07,600 --> 00:22:11,159 Speaker 3: like a city hall wedding over video chat, and you're married. 401 00:22:11,560 --> 00:22:13,440 Speaker 3: And the reason they did that is because so many 402 00:22:13,440 --> 00:22:16,840 Speaker 3: couples are separated by the war right now physically that 403 00:22:16,920 --> 00:22:19,560 Speaker 3: it's difficult, and they wanted to, you know, let people. 404 00:22:19,359 --> 00:22:21,960 Speaker 2: Still have and this was something that was pretty widely adopted, right, 405 00:22:22,080 --> 00:22:22,840 Speaker 2: like people really. 406 00:22:22,720 --> 00:22:25,120 Speaker 3: Use Yeah, it's been Yeah, it's been really widely used. 407 00:22:25,200 --> 00:22:27,440 Speaker 1: I don't want to derail us, but this one actually 408 00:22:27,480 --> 00:22:29,600 Speaker 1: connects to a personal story of mine, which is that 409 00:22:29,640 --> 00:22:35,040 Speaker 1: my grandfather was a refugee from Ukraine who was separated 410 00:22:35,080 --> 00:22:38,040 Speaker 1: from his mother in nineteen thirty nine and they met 411 00:22:38,040 --> 00:22:40,080 Speaker 1: in the app one that twenty five years later, in 412 00:22:40,119 --> 00:22:43,560 Speaker 1: the mid sixties, after the Red Cross, would reunite families 413 00:22:43,560 --> 00:22:45,280 Speaker 1: who have been separated by World War Two, and they 414 00:22:45,320 --> 00:22:48,359 Speaker 1: didn't even recognize each other. But just interesting how you know, 415 00:22:48,440 --> 00:22:52,119 Speaker 1: history kind of rhymes and so, you know, kind of 416 00:22:52,160 --> 00:22:53,960 Speaker 1: this this is the story I found intriguing but also 417 00:22:54,080 --> 00:22:55,680 Speaker 1: kind of moving personally. 418 00:22:56,600 --> 00:22:59,840 Speaker 2: One of the other things that was on your list, 419 00:23:00,920 --> 00:23:05,120 Speaker 2: amazingly is something that I own and I am very 420 00:23:05,200 --> 00:23:11,879 Speaker 2: much a you know wellness app skeptic although user I 421 00:23:11,880 --> 00:23:14,240 Speaker 2: don't know where I knew about it from because it's 422 00:23:14,240 --> 00:23:20,439 Speaker 2: a Dutch company, but I bought it here today to 423 00:23:20,520 --> 00:23:22,800 Speaker 2: show you, and it is a demo. It is the 424 00:23:22,840 --> 00:23:25,399 Speaker 2: moon Bird AI for listeners. 425 00:23:25,400 --> 00:23:28,520 Speaker 3: It's uh, basically like a little pod that you hold 426 00:23:28,520 --> 00:23:32,760 Speaker 3: in your hand and it vibrates. 427 00:23:32,600 --> 00:23:38,120 Speaker 2: It actually as pulses like a pulse. It mimics the 428 00:23:38,160 --> 00:23:41,040 Speaker 2: act of breathing, so it goes in and out and 429 00:23:41,080 --> 00:23:45,800 Speaker 2: in and out. Yeah, it does look like a vibrantor 430 00:23:46,080 --> 00:23:48,600 Speaker 2: yeah it does. There's just no way. There's no way. 431 00:23:49,040 --> 00:23:52,000 Speaker 2: And honestly, the carrying case does. The carrying case doesn't help. 432 00:23:52,920 --> 00:23:54,920 Speaker 2: My mother saw me using it and she was. 433 00:23:54,920 --> 00:24:02,400 Speaker 1: Like, huh and I. 434 00:24:00,040 --> 00:24:03,600 Speaker 2: Very modern family. We're going to take a quick break 435 00:24:03,640 --> 00:24:06,200 Speaker 2: to pay the piper. We'll be back with more from 436 00:24:06,240 --> 00:24:11,159 Speaker 2: the amazing Emma Barker of Time magazine Stay with us. 437 00:24:16,400 --> 00:24:18,560 Speaker 2: I actually find this to be an incredible device. It 438 00:24:18,640 --> 00:24:21,919 Speaker 2: is so simple. It's something that links through Bluetooth to 439 00:24:22,080 --> 00:24:26,880 Speaker 2: my phone. And this is called moons called moon Bird AI. 440 00:24:27,480 --> 00:24:29,640 Speaker 1: How did you choose it, Emma for the Time list. 441 00:24:30,080 --> 00:24:30,280 Speaker 2: Yeah. 442 00:24:30,320 --> 00:24:35,720 Speaker 3: So, wellness devices are tricky, as you note, and there's 443 00:24:35,760 --> 00:24:39,359 Speaker 3: always a lot of wellness apps and things that don't 444 00:24:39,359 --> 00:24:42,800 Speaker 3: have a ton of scientific backup, and frankly, it's just 445 00:24:42,880 --> 00:24:45,600 Speaker 3: hard to get scientific backing for a lot of these, 446 00:24:46,160 --> 00:24:49,760 Speaker 3: and so I'm always looking for the things that that 447 00:24:49,840 --> 00:24:54,119 Speaker 3: I feel like don't necessarily need that you know, I 448 00:24:54,119 --> 00:24:56,560 Speaker 3: think it's it's hard when you get into things like 449 00:24:56,680 --> 00:25:00,520 Speaker 3: mental health, yes, and things like that. But but things 450 00:25:00,560 --> 00:25:06,240 Speaker 3: like meditation, deep breathing, those things are techniques that are 451 00:25:06,320 --> 00:25:09,320 Speaker 3: proven enough that if it's a device that helps you 452 00:25:09,359 --> 00:25:11,560 Speaker 3: with that, you don't need to have a clinical trial 453 00:25:11,600 --> 00:25:14,760 Speaker 3: backing that up. And I really liked moon Bird. There's 454 00:25:14,800 --> 00:25:18,960 Speaker 3: some of these different things, but I liked moon Bird 455 00:25:19,000 --> 00:25:21,800 Speaker 3: because a it doesn't have a screen, even though it 456 00:25:21,800 --> 00:25:23,359 Speaker 3: does pair with your phone, you can know you're not 457 00:25:23,720 --> 00:25:24,720 Speaker 3: as this side. 458 00:25:25,280 --> 00:25:27,800 Speaker 2: And the AI piece of it that I that I 459 00:25:27,840 --> 00:25:31,560 Speaker 2: guess is AI is that it does make the program 460 00:25:31,640 --> 00:25:34,280 Speaker 2: that is training you on breathing smarter because. 461 00:25:34,000 --> 00:25:37,640 Speaker 1: It's adapts to you. It's personalized correct correct. Another one 462 00:25:37,680 --> 00:25:40,199 Speaker 1: which Karen and I are both fascinated by last year, 463 00:25:40,240 --> 00:25:44,359 Speaker 1: which was made it onto your list was Google's Notebook LM. 464 00:25:44,600 --> 00:25:46,720 Speaker 1: And this is obviously in some sense not new right. 465 00:25:46,720 --> 00:25:50,679 Speaker 1: It's like a generative AI application where you know, you 466 00:25:50,720 --> 00:25:52,400 Speaker 1: ask questions and you get answers. 467 00:25:52,720 --> 00:25:55,159 Speaker 2: I just think, for the sake of having it, if 468 00:25:55,200 --> 00:25:57,760 Speaker 2: you can describe what it is so that like people 469 00:25:57,800 --> 00:25:59,720 Speaker 2: can conceive of it if they haven't heard of it. 470 00:25:59,480 --> 00:26:03,080 Speaker 3: It's a sign your notebook of all your data, so 471 00:26:03,119 --> 00:26:05,720 Speaker 3: you can pull in different sources. You can upload your 472 00:26:05,720 --> 00:26:11,600 Speaker 3: own information. You could upload, you know, your thesis paper 473 00:26:11,680 --> 00:26:16,359 Speaker 3: for your you know, senior project, and it can parse that, 474 00:26:16,560 --> 00:26:20,080 Speaker 3: it can organize that. But it can also create an 475 00:26:20,240 --> 00:26:25,720 Speaker 3: entire podcast based on that content, which has AI generated 476 00:26:25,800 --> 00:26:29,440 Speaker 3: voices having a natural conversation. 477 00:26:29,320 --> 00:26:31,920 Speaker 1: Which we actually experimented with. We did it. We had 478 00:26:31,960 --> 00:26:34,320 Speaker 1: Notebook LM do a version of a podcast that we 479 00:26:34,320 --> 00:26:36,000 Speaker 1: were also also doing. 480 00:26:35,920 --> 00:26:37,359 Speaker 2: And he sent it to me and I was like, 481 00:26:37,520 --> 00:26:41,240 Speaker 2: is someone I thought someone was plagiarizing our format. It 482 00:26:41,280 --> 00:26:41,800 Speaker 2: was very weird. 483 00:26:41,880 --> 00:26:43,600 Speaker 3: I'm really glad you guys didn't prank me, but I 484 00:26:43,600 --> 00:26:46,800 Speaker 3: have I made come in here and talk to. 485 00:26:47,400 --> 00:26:50,000 Speaker 1: So how did I mean, how did you decide to 486 00:26:50,000 --> 00:26:51,600 Speaker 1: put notebook LM on the list? 487 00:26:52,359 --> 00:26:54,879 Speaker 3: I think it was one of the best executed AI 488 00:26:54,960 --> 00:27:00,119 Speaker 3: inventions of the year. One thing we look for or 489 00:27:00,160 --> 00:27:04,800 Speaker 3: in AI inventions is typically we're looking for ones with 490 00:27:04,920 --> 00:27:08,120 Speaker 3: broad proven use that have really shifted how an industry 491 00:27:08,200 --> 00:27:12,200 Speaker 3: or group of people functions. So not just a cool idea, 492 00:27:12,920 --> 00:27:17,440 Speaker 3: but could change an industry and not to further light 493 00:27:17,480 --> 00:27:20,760 Speaker 3: the fire under you guys, but this has really big 494 00:27:20,800 --> 00:27:26,159 Speaker 3: implications for the audio industry and media in general is generative, 495 00:27:27,040 --> 00:27:28,160 Speaker 3: you know, content, Like you. 496 00:27:28,040 --> 00:27:30,439 Speaker 1: Know, what I really like about this one is the 497 00:27:30,480 --> 00:27:32,640 Speaker 1: fact that you can choose your own sources. I find 498 00:27:32,680 --> 00:27:35,840 Speaker 1: it like one of the big things about AI, of 499 00:27:35,880 --> 00:27:38,639 Speaker 1: course is like where's this coming from? And like is 500 00:27:38,920 --> 00:27:41,199 Speaker 1: the source material garbage? But to be able to say no, 501 00:27:41,840 --> 00:27:44,879 Speaker 1: like please draw on these sources and then generate something, 502 00:27:44,960 --> 00:27:47,800 Speaker 1: I found it really really cool, and it is obviously 503 00:27:47,800 --> 00:27:50,920 Speaker 1: a little scary, being very honest. What was your favorite 504 00:27:51,160 --> 00:27:53,040 Speaker 1: item on this whole list if you can, if you 505 00:27:53,080 --> 00:27:55,000 Speaker 1: can name a favorite, or if there are a couple 506 00:27:55,000 --> 00:27:56,880 Speaker 1: that we haven't talked about that really stood out to you. 507 00:27:57,240 --> 00:28:01,679 Speaker 3: One of my favorite sort of tech things but anti 508 00:28:01,760 --> 00:28:03,399 Speaker 3: tech is the Yonder pouch. 509 00:28:04,280 --> 00:28:07,639 Speaker 2: Oh I love the Yonder pouch. Yes, so you have 510 00:28:07,680 --> 00:28:09,160 Speaker 2: them at screenings? Oh? 511 00:28:09,240 --> 00:28:09,720 Speaker 3: Interesting? 512 00:28:10,040 --> 00:28:10,240 Speaker 1: Yeah. 513 00:28:10,400 --> 00:28:14,800 Speaker 3: So the thing that it's really transformed is schools. There's 514 00:28:14,840 --> 00:28:19,680 Speaker 3: a huge movement for kids to have more phone free 515 00:28:19,680 --> 00:28:23,679 Speaker 3: spaces where they're not allowed and this kind of speaks 516 00:28:23,720 --> 00:28:27,600 Speaker 3: to this law that passed in Australia where I think 517 00:28:27,760 --> 00:28:30,240 Speaker 3: kids can't have be on social media until they're sixteen 518 00:28:30,440 --> 00:28:31,160 Speaker 3: or something like that. 519 00:28:31,280 --> 00:28:32,040 Speaker 1: What is it? Sorry? 520 00:28:32,200 --> 00:28:36,240 Speaker 3: Oh, Yonder is a little pouch that you just locked 521 00:28:36,280 --> 00:28:36,840 Speaker 3: your phone in. 522 00:28:37,080 --> 00:28:41,000 Speaker 1: Okay, I like that, is it wearable? 523 00:28:41,800 --> 00:28:41,960 Speaker 2: No? 524 00:28:42,120 --> 00:28:42,480 Speaker 1: You go. 525 00:28:42,680 --> 00:28:45,840 Speaker 3: Basically, they'll have like a station with a bunch of 526 00:28:45,920 --> 00:28:47,800 Speaker 3: Yonder pouches. You lock your phone in there and then 527 00:28:47,800 --> 00:28:48,640 Speaker 3: you go into the event. 528 00:28:50,000 --> 00:28:51,160 Speaker 2: So yeah, you do. 529 00:28:51,320 --> 00:28:54,120 Speaker 3: Some musical artists will have them at their concerts if 530 00:28:54,120 --> 00:28:56,520 Speaker 3: they don't want footage of the concert being taken or 531 00:28:56,600 --> 00:28:59,800 Speaker 3: just the concert being ruined by everyone having their phone up. 532 00:29:00,240 --> 00:29:03,120 Speaker 3: But yeah, a lot of schools have started adopting them 533 00:29:03,120 --> 00:29:04,960 Speaker 3: where kids lock their phones at the beginning of the 534 00:29:05,000 --> 00:29:06,120 Speaker 3: day and they get them back at the end of 535 00:29:06,160 --> 00:29:09,400 Speaker 3: the day. Yonder the company itself, has had a huge 536 00:29:09,480 --> 00:29:12,680 Speaker 3: role in pushing these phone free spaces and advocating for 537 00:29:12,760 --> 00:29:16,800 Speaker 3: this and so on top of the product, the company 538 00:29:16,880 --> 00:29:19,160 Speaker 3: is doing a lot of advocacy. 539 00:29:18,560 --> 00:29:19,760 Speaker 1: On this topic. 540 00:29:20,000 --> 00:29:23,239 Speaker 2: We've talked about phones as cigarettes and sugar, and it 541 00:29:23,320 --> 00:29:25,320 Speaker 2: really this is the kind of thing where you're like, 542 00:29:26,040 --> 00:29:28,680 Speaker 2: we should have more, we should have more strength than this, 543 00:29:29,360 --> 00:29:33,040 Speaker 2: But no, We've been introduced these products throughout history that 544 00:29:33,400 --> 00:29:35,520 Speaker 2: we get very dependent on, and so people have to 545 00:29:35,560 --> 00:29:40,120 Speaker 2: come up with strategies to allow us to extricate ourselves 546 00:29:40,200 --> 00:29:42,160 Speaker 2: from It's like a smoke free zone. I mean it's 547 00:29:42,200 --> 00:29:44,800 Speaker 2: a very similar thing to me. Yeah, absolutely, and I 548 00:29:44,840 --> 00:29:47,280 Speaker 2: think it's necessary unfortunately, but. 549 00:29:47,280 --> 00:29:51,360 Speaker 3: I think it's really transformative for especially schools, but also 550 00:29:52,120 --> 00:29:55,880 Speaker 3: privacy of different experiences and places. 551 00:29:55,640 --> 00:29:57,480 Speaker 2: Which I think people yearn form more and more. 552 00:29:57,680 --> 00:29:57,920 Speaker 3: Yeah. 553 00:29:58,440 --> 00:30:00,280 Speaker 1: Has that been a time where you've got a pitch 554 00:30:00,280 --> 00:30:02,280 Speaker 1: and you like, no, that's garbage and then it turned 555 00:30:02,280 --> 00:30:04,000 Speaker 1: out to be like the big thing. 556 00:30:05,600 --> 00:30:08,800 Speaker 3: I'd say more often is the opposite. And this is 557 00:30:08,800 --> 00:30:10,760 Speaker 3: where I own up to the fact that last year, 558 00:30:11,200 --> 00:30:14,040 Speaker 3: well by last year, I mean twenty twenty three, we 559 00:30:14,120 --> 00:30:16,840 Speaker 3: put the humane aipin on it. Essentially it was a 560 00:30:16,840 --> 00:30:21,240 Speaker 3: pin that was fully operated by voice control and AI, 561 00:30:21,640 --> 00:30:25,440 Speaker 3: and I think that's an example of a product for 562 00:30:25,480 --> 00:30:27,280 Speaker 3: one thing. It came out like right when the list 563 00:30:27,320 --> 00:30:30,160 Speaker 3: came out, so it wasn't like super well trialed yet, 564 00:30:30,720 --> 00:30:33,320 Speaker 3: but I think it was an example of something that 565 00:30:33,400 --> 00:30:37,680 Speaker 3: was newsworthy, even if it didn't come through in execution 566 00:30:37,760 --> 00:30:38,360 Speaker 3: all the way. 567 00:30:39,760 --> 00:30:42,640 Speaker 1: That will probably happen. There will be a product that is. 568 00:30:42,840 --> 00:30:44,600 Speaker 3: There will be that product, and I think voice control 569 00:30:44,640 --> 00:30:46,800 Speaker 3: just doesn't at the point yet where you can fully 570 00:30:46,840 --> 00:30:47,480 Speaker 3: rely on it. 571 00:30:47,720 --> 00:30:50,240 Speaker 1: But that one was also quite dangerous right, or at 572 00:30:50,320 --> 00:30:51,120 Speaker 1: least he got very hot. 573 00:30:51,320 --> 00:30:53,240 Speaker 3: I don't know about dangerous, but I think it just 574 00:30:53,280 --> 00:30:55,400 Speaker 3: didn't like work as well as. 575 00:30:55,320 --> 00:30:56,200 Speaker 2: You very well too. 576 00:30:56,360 --> 00:30:59,680 Speaker 3: Yeah, but I think it was exciting and I think 577 00:30:59,680 --> 00:31:02,160 Speaker 3: it pushed the conversation forward. So I think there's a 578 00:31:02,160 --> 00:31:05,480 Speaker 3: lot of those where people get very excited about a 579 00:31:05,560 --> 00:31:08,400 Speaker 3: product and then it is a flop. 580 00:31:08,680 --> 00:31:11,120 Speaker 2: Yeah, part of the course, I feel like, and you 581 00:31:11,360 --> 00:31:12,480 Speaker 2: know that better than anyone. 582 00:31:12,640 --> 00:31:17,120 Speaker 3: Yeah, absolutely. And I also looking back through twenty three 583 00:31:17,200 --> 00:31:22,360 Speaker 3: years of best inventions, seeing the same kind of seeing 584 00:31:22,360 --> 00:31:28,440 Speaker 3: the progress of different flops pushing each other forward really 585 00:31:28,720 --> 00:31:31,320 Speaker 3: drives home the fact that they're important even if they 586 00:31:31,360 --> 00:31:32,400 Speaker 3: don't they don't make it. 587 00:31:32,360 --> 00:31:36,719 Speaker 2: Which is a very existential thing to think about. We 588 00:31:36,760 --> 00:31:39,440 Speaker 2: are just driven by our series of flops. 589 00:31:39,360 --> 00:31:43,080 Speaker 1: Yeah, Tennis, and we rise on stepping sterns of our 590 00:31:43,120 --> 00:31:44,760 Speaker 1: former selves to greater things. 591 00:31:44,880 --> 00:31:47,920 Speaker 2: There you go, Well, thank you so much for taking 592 00:31:47,960 --> 00:31:50,360 Speaker 2: the time to talk to us. This is yeah, thank you, 593 00:31:50,440 --> 00:31:51,000 Speaker 2: really interesting. 594 00:31:51,040 --> 00:31:51,600 Speaker 1: I enjoyed it. 595 00:31:51,720 --> 00:31:55,160 Speaker 2: Of course we didn't even have to go to Vegas. 596 00:31:55,320 --> 00:31:57,680 Speaker 1: I think it's it's always nice to doff our caps 597 00:31:57,720 --> 00:32:01,000 Speaker 1: to those who went before, and one of the most 598 00:32:01,520 --> 00:32:03,840 Speaker 1: influential and iconic things I can think about in the 599 00:32:03,880 --> 00:32:11,680 Speaker 1: history of media is Jerry's final thought. So what should 600 00:32:11,680 --> 00:32:13,040 Speaker 1: we leave with this week? 601 00:32:13,680 --> 00:32:18,840 Speaker 2: So brain rot? Actually, the Oxford English Dictionaries word of 602 00:32:18,880 --> 00:32:23,360 Speaker 2: the Year is sort of a way that internet speak 603 00:32:23,920 --> 00:32:25,680 Speaker 2: has infiltrated our day to day lives. 604 00:32:25,840 --> 00:32:28,360 Speaker 1: A brain rot each week will be our final thought, 605 00:32:28,560 --> 00:32:30,440 Speaker 1: that will be our final thoughts. So what's this week? 606 00:32:30,720 --> 00:32:33,920 Speaker 2: This one which I see on TikTok now a lot 607 00:32:34,520 --> 00:32:36,959 Speaker 2: is so good and you sort of have to know 608 00:32:37,520 --> 00:32:40,200 Speaker 2: that it's something that's happening on TikTok to understand the 609 00:32:40,200 --> 00:32:43,840 Speaker 2: context of it. It started sort of in context of pregnancy, 610 00:32:44,280 --> 00:32:48,040 Speaker 2: where like you're at a certain age and people aren't 611 00:32:48,040 --> 00:32:50,680 Speaker 2: sure if like you're happy to be pregnant, and so 612 00:32:50,760 --> 00:32:56,840 Speaker 2: they started saying congradgudolences, congradudolences. Yeah, there's I just have 613 00:32:56,920 --> 00:32:57,840 Speaker 2: to play this one. 614 00:32:58,360 --> 00:33:00,400 Speaker 1: I'm at the age where if you post that you're pregnant, 615 00:33:00,400 --> 00:33:02,240 Speaker 1: I'm gonna need you to tell me whether you're happy 616 00:33:02,320 --> 00:33:05,520 Speaker 1: body or not, because like just being like, oh, I'm pregnant, Like, 617 00:33:06,120 --> 00:33:07,680 Speaker 1: am I supposed to say congratulations? 618 00:33:07,720 --> 00:33:08,800 Speaker 2: Am I supposed to feel bad? 619 00:33:08,800 --> 00:33:10,280 Speaker 1: Like I don't know how to like respond to that, 620 00:33:10,960 --> 00:33:14,320 Speaker 1: you know, so like just tell me, just be like, yeah, 621 00:33:14,360 --> 00:33:17,320 Speaker 1: I'm gonna keep this one. And I'm like, oh, congratulations. 622 00:33:18,800 --> 00:33:21,120 Speaker 1: What she means is congratulals, right, which. 623 00:33:21,040 --> 00:33:24,160 Speaker 2: What she means is congradulences. It's like, I think a 624 00:33:24,200 --> 00:33:27,280 Speaker 2: perfect example is like you have friends that are like, oh, 625 00:33:27,320 --> 00:33:29,280 Speaker 2: I'm gonna break up with this guy, and next thing 626 00:33:29,280 --> 00:33:31,640 Speaker 2: you know, they are on FaceTime being like we're engaged, 627 00:33:31,680 --> 00:33:36,040 Speaker 2: and you're like, congratual, you got you got what you 628 00:33:36,120 --> 00:33:39,520 Speaker 2: wanted with the wrong guy. So that's my brain rot. 629 00:33:39,600 --> 00:33:40,440 Speaker 2: You're gonna be hearing. 630 00:33:40,480 --> 00:33:43,000 Speaker 1: By the way, Twitter blew up yesterday because Zendaiya since 631 00:33:43,040 --> 00:33:45,680 Speaker 1: getting engaged, when she's doing interviews now she gestures with 632 00:33:45,680 --> 00:33:47,640 Speaker 1: her left hand rather than her right hand, so everyone 633 00:33:47,680 --> 00:33:48,560 Speaker 1: can to that. 634 00:33:48,640 --> 00:33:52,600 Speaker 2: I say congratulences. I don't know to who. 635 00:33:54,680 --> 00:33:57,480 Speaker 1: That's it for this week for tech stuff, I'm as. 636 00:33:57,560 --> 00:34:00,920 Speaker 2: L and I'm care Price. This episodisode was produced by 637 00:34:00,920 --> 00:34:04,920 Speaker 2: Eliza Dennis, Victoria Dominguez, and Lizzie Jacobs for Kaleidoscope. It 638 00:34:05,000 --> 00:34:08,799 Speaker 2: was executive produced by me os Vaalashan and Kate Osbourne 639 00:34:08,920 --> 00:34:12,600 Speaker 2: for iHeart. The executive producer is Katrina Norvel. The engineer 640 00:34:12,640 --> 00:34:15,399 Speaker 2: is Biheed Fraser and it's mixed by Kyle Murdoch, who 641 00:34:15,400 --> 00:34:16,480 Speaker 2: also wrote our theme song. 642 00:34:17,200 --> 00:34:20,680 Speaker 1: Join us next Wednesday for tech Stuff The Story, when 643 00:34:20,719 --> 00:34:23,839 Speaker 1: we'll share an in depth conversation with a longtime tech 644 00:34:23,920 --> 00:34:27,680 Speaker 1: chronicler Nicholas Thompson, former editor in chief of Wired and 645 00:34:27,800 --> 00:34:32,200 Speaker 1: current CEO of the Atlantic And please rate, review and 646 00:34:32,280 --> 00:34:35,400 Speaker 1: reach out to us at tech Stuff Podcast at gmail 647 00:34:35,440 --> 00:34:37,960 Speaker 1: dot com with your feedback. We really want to hear 648 00:34:38,000 --> 00:34:40,760 Speaker 2: From you, really, really bad, bad