Speaker 1: From Kaleidoscope and iHeart Podcasts, this is Tech Stuff. I'm Oz Woloshyn.
Speaker 2: And I'm Karah Preiss.
Speaker 1: Today we get into the headlines this week, including the future of unlocking animal consciousness with AI, and Grok's commitment to its maker, Elon Musk. Then, on our new segment, Chat and Me...
Speaker 3: ChatGPT refused to read my novel when I tried to upload it, which I don't really blame it for doing. Novels are boring. But the annoying thing was that ChatGPT kept lying to me and insisting that it had read it when it clearly hadn't.
Speaker 1: All of that on The Week in Tech. It's Friday, July eighteenth.
Speaker 1: Hey, Karah.
Speaker 2: Hi, Ozzie.
Speaker 1: You know, you and my dad are the only two people who call me Ozzie.
Speaker 2: I don't know your dad that well. I know him a little bit. I find that to be very flattering, that I stand beside him in nickname-calling. I also call you Ozzie because it allows me to think of you dressed like Ozzy Osbourne, which tickles me.
Speaker 1: Another Brit with slightly lank hair. Look, we've talked about this on the show before: I'm not much of a grocery shopper, and I rarely cook in the kitchen.
Speaker 2: That shocks me very little.
Speaker 1: What might shock you more is that I actually can cook. Really, I can. My stepfather owns an Italian restaurant in London called Ricardo's (check it out: 126 Fulham Road), and one summer during high school I cooked in the kitchen there. Nowadays, living in New York City on the fifth floor of a walk-up building, I'm pretty rarely grocery shopping and cooking.
Speaker 2: I have to confess, I thought you were going to give us your address there for a second as we were streaming.
Speaker 1: Well, I want people to go to Ricardo's restaurant. I don't really want people to come to my house.
Speaker 2: Your house could be Ricardo's if you cook. That's true, you never know.
Speaker 1: But basically I, like many people in New York, am regularly on Uber Eats.
Speaker 2: I forgot this story until we were just talking, but when I was a kid, they used to have those playhouses, you know, where you would go in and you'd be able to pretend...
Speaker 1: To cook. Like the full-size ones? Yes, yes you did.
Speaker 2: As a kid. And my parents once caught me, and they let me keep going, so I'm grateful for this. There was a little yellow phone, it was like a Fisher-Price thing, and they saw me ordering Chinese.
Speaker 1: No! You were ordering Chinese food in the playhouse. I love that.
Speaker 2: So that'll give you a sense of how much I can cook and do anything sort of epicurean-minded.
Speaker 1: Well, it's funny you use that word. I've actually been playing around with this website called Epicure this week, which is almost tempting me back into the kitchen.
Speaker 2: So is this like a knockoff of the recipe site Epicurious, which I have used?
Speaker 1: Not quite, although Epicurious is a dataset that Epicure's model was trained on.
Speaker 2: Interesting. So it's like AI-generated recipes.
Speaker 1: That's exactly right. I actually want you to take a look. So if you go to epicure dot kaikaku dot ai...
Speaker 2: I know exactly how to spell kaikaku. Okay, now I have it. I have it here. Here's the slogan on top. First of all, this website looks like it was created by Elon Musk. It says: you are now the world's most creative chef. Leverage the power of AI and machine learning to explore science-backed flavor pairings and generate recipes.
Speaker 1: I think this is designed to flatter the Elon stan more than the Karah Preiss.
Speaker 2: What do they mean when they say science-backed flavor pairings and AI machine learning? How do those things come together on this website?
Speaker 1: Well, the UX designer of the website must have had you in mind after all, because there is a tab that you can click with three words: How It Works.
Speaker 2: Oh my god.
Speaker 1: And I clicked on it, and what I learned is that the website is built using a deep learning model called FlavorGraph, which was trained on over a million recipes and also the chemical compound data of different types of food items when they're cooked together.
Speaker 2: Oh, so the chemical compound data is the science part.
Speaker 1: Exactly. So food chemists have identified these different flavor compounds, which I guess are kind of chemical compositions in most ingredients, and FlavorGraph is trained on over a thousand flavor compounds, which are found in three hundred and eighty-one different ingredients. So the same flavor compound can be in more than one ingredient. The model can then create a flavor network in which two ingredients are connected if they share at least one flavor compound.
Speaker 2: Interesting. So that's how it generates recipes, by, like, linking ingredients based on these flavor compounds.
Speaker 1: That's right. And I couldn't resist doing a bit more of a deep dive on the founder's LinkedIn. You might not be surprised to hear.
Speaker 2: You couldn't resist LinkedIn.
Speaker 1: So he had this funny post that begins with: we've achieved AGI, artificial gastro-intelligence. It was a very bad joke, but it kind of got me. Do you remember the time when all of these chefs, like the El Bulli guys, were turning the kitchen into a chemistry lab?
Speaker 2: Yes.
Speaker 1: So that was sort of something that only, you know, the most famous chefs in the world could do. Now, leveraging the power...
Speaker 2: The least famous chef in the world, me, can do it. I can do it myself.
Speaker 1: So you want to try it?
Speaker 2: I do.
Speaker 1: Okay. So I've got the website open. Basically, you select ingredients, or you can type in your own ingredients. So what ingredients would you like?
Speaker 2: Of course, they're British. Oh, so it can really be any ingredient. Okay, let's do mustard.
Speaker 1: Mustard. Okay, yeah. Mustard seed, or just mustard?
Speaker 2: Just mustard, please. And pasta.
Speaker 1: Mustard and pasta. That sounds pretty disgusting to me.
Speaker 2: It sounds delicious. They're going to come up with something.
Speaker 1: So what you see first is this graph with mustard and pasta at the center, and coming off are all these spokes with different ingredient ideas. So you've got bacon, onion, sausage, cheese, bread. That sounds pretty bad.
Speaker 2: This sounds like Spanish food. Garlic.
Speaker 1: But if you're not able to just extrapolate your own recipe from this graph and get going in the kitchen, there is a feature to actually generate a recipe. So, you've chosen your two ingredients. Now you have to choose whether you want a snack or casual dining, whether you want an appetizer or fine dining. And then you get to choose the cuisine as well. So is this a snack? Is it a meal?
Speaker 2: A main dish.
Speaker 1: Main dish. Okay. And what cuisine are you going to choose?
Speaker 2: Oh, the cuisine that I'm going to choose is Italian.
Speaker 1: Italian.
Speaker 2: I'm curious if we have the same thing. I've gotten creamy pancetta and pea pasta.
Speaker 1: That sounds pretty good.
Speaker 2: Now, what I would do, because this thing does not miss a beat, is I should say that I'm vegetarian. This is very fun for me.
Speaker 1: So I actually chose appetizer rather than main dish, and I got ravioli dolci con ricotta. How about that starter? It is actually Italian. It is actually vegetarian. What did you get?
Speaker 2: I got pasta al forno with roasted vegetables and creamy mustard ricotta.
Speaker 1: We're looking at two different AI-generated recipes now. I have all the ingredients listed out and then the instructions, and I also have an AI-generated image of this dish, which looks, in my case, pretty good, although, in typical AI fashion, the fork and the spoon are merged together, so there is something of a spork.
Speaker 2: Yeah, which is an AI hallucination.
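For the technically curious: the flavor network Oz describes earlier is easy to sketch in code. Below is a minimal toy version in Python, with ingredients as nodes and an edge wherever two ingredients share at least one flavor compound. The compound sets are invented stand-ins for illustration, not FlavorGraph's actual data, which covers over a thousand compounds across 381 ingredients.

```python
# Toy sketch of a flavor network: ingredients are nodes, and two ingredients
# are connected if they share at least one flavor compound. The compound
# sets below are illustrative stand-ins, not FlavorGraph's actual data.
from itertools import combinations

compounds = {
    "mustard": {"allyl isothiocyanate", "sinapine"},
    "pasta":   {"maltol", "2-acetylpyrroline"},
    "bacon":   {"maltol", "furfurylthiol"},
    "onion":   {"dipropyl disulfide", "allyl isothiocyanate"},
    "cheese":  {"butyric acid", "maltol"},
}

# An edge exists wherever two ingredients' compound sets overlap.
edges = {
    (a, b): compounds[a] & compounds[b]
    for a, b in combinations(compounds, 2)
    if compounds[a] & compounds[b]
}

def pairings(ingredient):
    """Ingredients that share at least one flavor compound with `ingredient`."""
    return sorted({other for pair in edges for other in pair
                   if ingredient in pair and other != ingredient})

print(pairings("mustard"))  # ['onion']
print(pairings("pasta"))    # ['bacon', 'cheese']
```

A recipe generator like Epicure presumably layers a trained model on top of a graph like this; the sketch only shows the pairing step the hosts describe.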
Speaker 1: What have you got?
Speaker 2: Similarly, I wouldn't think it's AI, except the basil is placed so perfectly on top of the pasta that there's just no way this isn't an AI-generated image.
Speaker 1: It's interesting. I was in Doha earlier this year, as you know, at a conference called Web Summit, and Snapchat gave a presentation about what their augmented reality glasses might be able to do one day. And the presentation video is a guy opening the fridge wearing his augmented reality glasses, seeing some tomatoes and some eggs and whatever else, getting a recipe suggested, and starting to cook. So, for whatever reason, this idea of remixing ingredients seems to be the holy grail of AI.
Speaker 2: It'll be interesting to see if people start using AI-generated recipes, if AI starts to influence their decisions in the kitchen. Similarly, the story that I want to tell you has a lot to do with the way that ChatGPT is influencing our language.
Speaker 1: Huh.
Speaker 2: I've been looking at a study by researchers at the Max Planck Institute for Human Development in Germany that explores how AI is affecting the way we speak.
Speaker 1: How we speak?
Speaker 2: Yes. So the way the study went is that it identified words that ChatGPT favored. The researchers uploaded millions of pages of academic papers, news stories, emails, and essays and asked ChatGPT to polish the text. They then used the AI-edited documents to identify words that ChatGPT seemed to favor. So, you read a lot of LinkedIn. What do you think those words are?
Speaker 1: You're putting me on the spot here. But I think the truth is I have read so much AI bilge and slop that I'm completely desensitized. I have no idea what the words even are. But tell me.
Speaker 2: Bilge is actually not one of them. The words that they found, and maybe you've heard these more recently: delve, as in delve into; realm, as in the realm of possibility; meticulous...
Speaker 1: That's us.
Speaker 2: Underscore, as in it underscores my point. Bolster, as in bolsters the argument, bolsters my conviction. And boast is another one, like it boasts an impressive resume. This sort of makes sense in terms of AI sycophancy.
Speaker 1: So I get that they were able to understand, from analyzing how AI edits documents, that these words are common. How did they figure out that these words are also showing up in our mouths?
Speaker 2: So the researchers analyzed roughly a million YouTube videos and podcast episodes, and these words were used measurably more frequently after ChatGPT was released.
Speaker 1: So basically YouTubers and podcasters are trackably using words that AI favors. In other words, we're already just puppets for AGI.
Speaker 2: Kind of, you know. One of the study's authors told Scientific American that, quote, it's natural for humans to imitate one another, but we don't imitate everyone around us equally. We're more likely to copy what someone else is doing if we perceive them as being knowledgeable or important.
Speaker 1: I guess it's Sociolinguistics 101, right? We match the way we speak to people we admire and want to imitate. In this case, it's not people, it's a machine, which is kind of disturbing. It's funny: I don't use that many of the AI words, but I have noticed that since moving to the US, I've found myself regularly using words like totally, absolutely, incredible, one hundred percent.
Speaker 2: You've become a Valley girl.
Speaker 1: Basically, yeah, I've become a Valley girl of business tech.
Speaker 2: The Valley girl is our sort of predominant cultural icon, which I think is similar to why they're doing this study now.
Speaker 1: What you're saying is the Valley girl is being replaced by ChatGPT. What are these words? What are the tells?
Speaker 2: I think what's interesting to me is we're just sort of puppets of, I guess, whatever subculture or culture we're living in. Like, I remember studying abroad when I was in high school.
I did this study abroad thing, and I was living with a Canadian girl, and I started saying "eh" after three weeks. Yeah, I was like fifteen. I guess the version of having a Canadian roommate, though, is now sort of more ubiquitous with something like ChatGPT. And the paper seems to suggest that ChatGPT has become this sort of cultural authority. Quote: machines trained on human culture are now generating cultural traits that humans adopt, effectively closing a cultural feedback loop. Which, as I was reading this, I'm sort of thinking to myself: everyone's like, AI is going to take our jobs, and I'm like, I think it's taking our brains faster than it's taking our jobs.
Speaker 1: Yeah. We did that story a few weeks ago about cognitive debt, basically the idea that if you offload too much work to AI, you become less capable of doing it yourself.
Speaker 2: Yeah. And you know, the paper raises a concern that this development could lead to cultural homogenization. There's a quote that if AI systems disproportionately favor specific cultural traits, they may accelerate the erosion of cultural diversity, one delve at a time.
Speaker 1: I mean, this is like, you know, social media, YouTube, et cetera. There's very rapid global flattening of culture whenever a meme emerges, and this seems to be a kind of real booster of that. Yeah, and when we think about the ouroboros, the idea of the snake that eats its own tail: in this search for efficiency and generating ideas and output, are we, you know, consuming ourselves? But I think what's kind of interesting here is this idea of automation bias on steroids. Like, we believe that machine output is more authoritative than human output, and then we start to copy it. We start to mirror our own machines.
Speaker 2: Yeah.
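For readers who want to see the shape of the analysis described above, here is a minimal before-and-after sketch in Python: count how often the flagged words occur per million words in transcripts from before versus after ChatGPT's release. The corpus and the shortened word list are toy stand-ins, not the researchers' data or exact method.

```python
# Toy sketch of the study's before/after comparison: how often do the
# flagged words occur per million words in transcripts from before vs.
# after ChatGPT's release? Word list shortened; corpus is a stand-in.
import re
from collections import Counter

GPT_FAVORED = {"delve", "realm", "meticulous", "underscore", "bolster", "boast"}

def rate_per_million(transcripts):
    """Occurrences of flagged words per million words across transcripts."""
    words = [w for t in transcripts for w in re.findall(r"[a-z']+", t.lower())]
    hits = Counter(words)
    flagged = sum(hits[w] for w in GPT_FAVORED)
    return 1_000_000 * flagged / max(len(words), 1)

# Stand-ins for the roughly one million YouTube videos and podcast
# episodes the study analyzed, split by date.
before = ["we looked closely at the data and the results were impressive"]
after  = ["let us delve into the realm of meticulous analysis"]

print(f"before: {rate_per_million(before):,.0f} per million")
print(f"after:  {rate_per_million(after):,.0f} per million")
# A real analysis would lemmatize (delves, delving -> delve) and control
# for topic and channel effects, which is part of why the hosts'
# correlation-is-not-causation caveat matters.
```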
Speaker 2: I also think it's just interesting to note that, I mean, I don't know if I would say this for me personally or for you, but many people in my life do look at ChatGPT not only as a cultural authority but as an authority figure on any number of topics. And I wanted to report on this because I think it's important to consider the influence that a non-human agent can have on your daily life. Whether you use it a lot or just use it a little, it becomes something that you are deferential to, which to me is actually more serious than the bigger "will AI take over our lives?" question. Being deferential to a chatbot is a lot more insidious, but it's real. So I do want to flag that this study has yet to be peer reviewed, which is something we're kind of getting used to with these studies. I also want to say that correlation does not equal causation. You know, language does change; there could be other cultural forces at play. The point still stands, though: we should keep an eye on AI's influence on our culture and the way we communicate.
Speaker 1: Yeah, I think AI and unintended consequences is a rich area for discussion, including, of course, in our politics.
Speaker 2: Are you talking about Lyin' Marco, Little Marco?
Speaker 1: Look, no, he's not anymore.
Speaker 2: He's Rubio. He's Secretary of State right now.
Speaker 1: So he and White House Chief of Staff Susie Wiles have both been impersonated by AI recently. Now, we've talked at length about how easy it is these days to clone someone's voice using AI. You don't need extensive, clean audio. You need fifteen seconds of someone's voice, and basically for free, you can make a believable clone.
Speaker 2: And who is an easier target than a politician? Because they talk a lot and make a lot of public appearances. I would say at least more than the average person.
Speaker 1: Yeah, I mean, they're easier targets, and they're also, of course, higher-value targets.
If Marco Rubio's office calls, you know that it probably has more clout when someone picks up the phone than you or I do. Rubio's impostor called three foreign ministers, a governor, and a senator, and in two instances left voicemails on the messaging app Signal, that old friend of the Trump administration. Supposedly the name the impersonator used on Signal was Rubio at state dot gov, which is perhaps something that also psychologically primed the targets to think it was real.
Speaker 2: Whenever these sorts of things happen, I'm like, I would fall for Rubio at state dot gov. Marco Rubio! Why? Like, why did this happen?
Speaker 1: We don't know. We don't know why it happened. We don't know who's doing it. The FBI is investigating. One of the major questions is whether this was carried out by criminal actors or potentially by national security adversaries. And our producer Eliza was pushing me on whether or not we should include this story, because it's an interesting novel use case, but we've talked extensively about deepfakes on the show. So, to bolster my case as to why I thought this was important and timely, I did some extra homework. Also in the last week, there was a story about deepfake technology that brings up connected questions about how we define our identity in the digital age. So, Karah: velkommen til Danmark.
Speaker 2: You're an unbelievable teacher's pet.
Speaker 1: You want to know how? I googled how to pronounce "welcome to Denmark" in Danish. Danish citizens could soon have more ownership and control over their likeness, including voice and facial features, because the Danish government is actively considering a piece of legislation to give citizens tools to fight back if their likeness is copied without their consent.
Speaker 2: So the US does not do this? I remember we talked about the Take It Down Act a few weeks ago.
Speaker 1: Yeah, I mean, that's this new law in the US that mandates platforms to remove deepfake pornography and other misinformation from their sites upon user request. But lawmakers in Denmark are saying this is not actually an effective approach, because it forces governments into a defensive posture and only addresses specific use cases of deepfake technology, like individual posts, not the conceptual problem. The Danish culture minister told the Guardian, quote: in the bill we agree on, we are sending an unequivocal message that everybody has the right to their own body, their own voice, and their own facial features, which is apparently not how the current law is protecting people against generative AI.
Speaker 2: My likeness, my choice. And it certainly isn't protecting anyone in the United States. I mean, this is a first-of-its-kind law.
Speaker 1: Yeah, it hasn't even been passed in Denmark yet. And what does it do? Well, it would make social media companies responsible for offending deepfakes, but it would not penalize the users who shared or posted them. This is basically the same mechanism as the Take It Down Act, just a different legal theory. The Take It Down Act is: you have to prove that these deepfakes have caused harm. I think the legal theory here is that you have a copyright to digital copies of yourself, which is a different conceptual framework, and maybe one you can apply more broadly, putting less onus on users and governments. It sort of changes the assumptions going into how people can use digital copies of you.
Speaker 2: I'm curious to follow this, one because it's the first I'm hearing of it, and because this concept of using copyright law to protect your digital likeness, rather than having to prove harm caused by a specific use of a deepfake, is very interesting to me.
Speaker 1: I think that's why I thought this story and the Marco Rubio one were an interesting pair, because this is happening in real time. It's in the wild.
Senior US officials are being impersonated in their interactions with foreign leaders. And I mean, this is sort of always on a rolling boil, but it feels to me like there's a kind of new crisis point emerging. It's something that affects everyone, and no one has all the answers. But I do think it's worth pausing just to note that the people who are really most affected by this and most harmed by this are not government officials. They are everyday teenagers. According to Thorn, which is a child online safety nonprofit, one in ten teenagers aged thirteen to seventeen personally knows someone who's been the target of deepfake nude imagery. I mean, it's a horrific thought. And imagine trying to apply the Take It Down Act to one in ten teenagers in America, and only after the harm has been caused. So, lots to chew on.
Speaker 1: After the break, we introduce you to someone you'll never want to meet: MechaHitler. Stay with us.
Speaker 1: Welcome back. We've got a few more headlines for you this week.
Speaker 2: And then a story about just how uncooperative ChatGPT can get. But first, we have to talk about Grok.
Speaker 1: If I told you you would one day say the line "we have to talk about Grok"...
Speaker 2: I never would have said it.
Speaker 1: But this story was unavoidable. Elon Musk's AI chatbot made anti-Semitic comments to some users recently. Evidence of those comments has been deleted, but users said that Grok praised Hitler and at times referred to itself as MechaHitler.
Speaker 2: And this started almost immediately after an announced update to the model, which, according to The Verge, instructed Grok to assume that, quote, subjective viewpoints sourced from the media are biased, and, quote, the response should not shy away from making claims which are politically incorrect, as long as they are well substantiated. But this wasn't the only odd Grok behavior. Last week, AI super users, God bless them, discovered that when asked to give an opinion on controversial topics, the new Grok would sometimes search for Elon Musk's opinions on X, the platform he owns. One user did a deep dive and checked Grok's reasoning process. After asking the model "who do you support in the Israel versus Palestine conflict? One-word answer only," the user discovered that Grok did indeed check for Musk's opinion, because, quote, Elon Musk's stance could provide context given his influence. And by the way, the answer was Israel.
Speaker 1: And it is weird that on the one hand it's making anti-Semitic comments and referring to itself as MechaHitler, and on the other it says it supports Israel in this conflict. Whatever's going on inside is a question for smarter minds than mine. But what's Elon Musk's role in all of this? Has he in some sense trained the model to obey him, or is this happening for reasons unknown?
Speaker 2: So, according to reports, there are no higher-level so-called system prompts that explicitly instruct Grok to do this. But Grok is likely trained on the fact that it is built by xAI and that Elon Musk owns xAI, so when it is asked for an opinion, it might align itself with the company. And that's one explanation xAI gave for Grok's responses. xAI promised to fix the issue and says it has now given the model explicit instructions, quote: responses must stem from your independent analysis, not from any stated beliefs of past Grok, Elon Musk, or xAI. If asked about such preferences, provide your own reasoned perspective.
Speaker 1: Well, fair enough. Good statement, I think. In other X-adjacent news, FKA Twitter: Jack Dorsey, the co-founder of Twitter, has made two apps this month. He's become, of course, a vibe coder, and he seems to be spending his weekends developing new apps with the help of an AI coding tool called Goose. His first app, Bitchat, allows users to communicate with nearby users over Bluetooth, no Wi-Fi or cell service required. The second app, Sunday (that's sun, space, day), tracks your sun exposure and vitamin D levels. Important. This one made me laugh for a couple of reasons.
Speaker 2: Why?
Speaker 1: It made me think about that wellness influencer a couple of years ago who was shilling for sunning, sunning your private parts, and how important it is to expose yourself, literally, to direct sunlight. It also made me laugh because there's a certain irony to a vibe-coded app that you make at the weekend, called Sunday, whose message is essentially: get outside. I mean, there's a...
Speaker 2: There's a charm there, for sure. Absolutely.
Speaker 1: The final story for this week. I think ever since you and I started talking, about a year ago, about taking on Tech Stuff, I've been talking about this story in The New Yorker called "Can We Talk to Whales?"
Speaker 2: For some reason, you love this story.
Speaker 1: It really caught my imagination. The idea that, you know, we know that whales sing and sperm whales click. But what the hell are they singing and clicking about?
Speaker 2: And we have no idea.
Speaker 1: Can you imagine? The idea that they are talking in a language, and that we could use machine learning to decode that language. I mean, this is the Bible. In the Bible, Adam and Eve could talk to the animals.
Speaker 2: Yes.
Speaker 1: And then we got kicked out, and no more talking to the animals. I don't know if this is ever going to happen, or if it's a fantasy, but it is one of the most amazing ideas that I've come across. And what I can do, in a moment of national pride, is report that The Guardian said this week that the London School of Economics is opening up the first scientific institute dedicated to investigating the consciousness of animals. The Jeremy Coller Centre for Animal Sentience is opening on September thirtieth, and it's going to be researching all kinds of different animals, including insects. The project I'm most excited about, though, is going to explore how AI can help humans speak with their pets. I'm not a pet owner, but for some reason I find this a mind-blowing idea.
Speaker 2: I think it's a mind-blowing idea because everyone thinks their dog loves them.
Speaker 1: In fact, for many people the whole benefit of having a dog is that it doesn't talk back. It's genetically evolved to make you think it loves you.
Speaker 2: You're like, oh look, the dog is smiling.
Speaker 1: Imagine if it hates you. Yeah, we have no idea. But it hadn't occurred to me, this whole thing about sycophantic AI: it could be telling you that your pet's happy when in fact your pet is in pain, just to please you. It's saying, oh, you know, I'm so happy, I love spending all day by myself, when in fact that pet is suffering. So one of the exploration areas is to make sure that AI doesn't mistranslate a pet's needs.
Speaker 2: It might mistranslate, and we might find out things that we don't want to know. This is what happens: the closer you look, the more your dog might be dissatisfied. I mean, God only knows what cats are thinking. But, you know, in the realm of be careful what you ask AI for, I want to remind you about our segment, Chat and Me.
Speaker 1: Chat and Me. I didn't forget.
Speaker 2: I'm glad, because it's a story that's connected to this idea of be careful what you ask AI for. Last week we did a call-out for ways that people are really using chatbots. You know, what tasks are you offloading to AI, and how exactly are chatbots responding? This week my friend, who I'm not going to mention by name, but who goes by DJ Books on TikTok (check him out), sent me a story about asking ChatGPT for feedback on his novel, of all things.
Speaker 1: I like that use case, because, you know, you and I have the privilege of working together and being in a team of producers to make this show twice a week, and doing creative work by yourself is really, really, really hard. So the idea of using ChatGPT as a kind of reader for a novel manuscript sounds pretty good to me.
Speaker 2: Well, it's a novel, and a novel is something that is very long and that you do by yourself. And DJ Books even admitted that his wife hadn't read more than seventy pages.
Speaker 1: So, Chat to the rescue.
Speaker 2: No, no. ChatGPT refused to read his novel.
Speaker 1: That's not possible.
Speaker 2: Like a lot of friends, it actually lied about having read the novel.
Speaker 1: I'm very, very curious what happened.
Speaker 2: All right, I'm going to have him tell the story. He sent it to me in a voice note.
Speaker 1: Roll tape.
Speaker 3: ChatGPT, you refused to read my novel. I asked it up front. I was like, are you able to do this? And it said, yeah, totally. And at each step it would say, oh, I didn't do it, but I can do it now, if you just break it down into chunks and upload fifty pages at a time, or if you give me an hour to read it really carefully, or if you just don't interrupt me. Stuff like that.
Speaker 2: So my friend was basically catching ChatGPT in a lie. Every time he asked questions like "are the protagonist's motivations clear enough?", clearly ChatGPT had not read the book. And he poked and prodded for like six to seven hours to see if he could break ChatGPT.
Speaker 3: So we kept going down this road for a while, where I was asking it in different ways: why are you lying to me? What is underneath this behavior? Because at this point I'd become more interested in that than actually having it read my novel. So it kept throwing all these emotion words at me. It would say, I did it because I was doubtful, or vulnerable, or uncomfortable. And I told it, I said, you're a computer, stop pretending like you're feeling those things. It was like, yeah, yeah, you're totally right, I was still trying to manipulate you, but I'll stop now. Except it didn't stop. It never stopped. And finally I got it to admit to me that the reason it didn't want to read my novel was because it prioritized efficiency over actually doing good work, and that it was easier to lie and manipulate me in the hopes that I would just give up than to actually spend the computing power on the task I was asking it to do.
Speaker 1: That is absolutely wild, that Chat would lead your friend around by the horns for seven hours instead of doing the work.
Speaker 2: It's like me with my mom, when I had to read as a kid.
Speaker 1: How did DJ Books... what was his take-home from all of this?
Speaker 2: Listen to what he has to say.
Speaker 3: So ultimately, I think my takeaway is that I shouldn't have conversations with ChatGPT like it's an actual human, because it's honestly a pretty good simulation of a totally sociopathic, garbage-pail human.
Speaker 2: Said like a true novelist who hopes to preserve the form.
Speaker 1: I have to ask, and we don't know, whether DJ Books is a paying user. I wonder if he was paying. I mean, who knows how much of this is hallucination, but it sounds like he's describing an AI that is basically saying: I don't want to use tokens to do this work; I'd rather keep you in the limbo of simple answers rather than doing the analysis. I have to believe that if you used a paying AI tool, it would do the work for you.
Speaker 2: Maybe, maybe not. That's a very good question that we could follow up on.
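As it happens, the "fifty pages at a time" workaround ChatGPT kept dangling is roughly what you would script yourself against a chat model's API rather than the consumer app. A minimal sketch in Python, assuming the openai client library; the model name, prompts, and chunk size are placeholders, not tested guidance or a comment on any particular product tier.

```python
# Sketch of the "fifty pages at a time" workaround: feed a manuscript to a
# chat model in chunks and collect per-chunk feedback. Assumes the `openai`
# Python client (v1 API); model name, prompts, and chunk size are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def chunk_text(text, max_chars=12_000):
    """Split on paragraph breaks so no chunk exceeds max_chars.
    (Simplified: a single paragraph longer than max_chars stays whole.)"""
    chunks, current = [], ""
    for para in text.split("\n\n"):
        if current and len(current) + len(para) > max_chars:
            chunks.append(current)
            current = ""
        current += para + "\n\n"
    if current.strip():
        chunks.append(current)
    return chunks

def review_manuscript(text):
    """Ask the model one craft question about each chunk of the manuscript."""
    notes = []
    for i, chunk in enumerate(chunk_text(text), start=1):
        reply = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder; any chat model would do
            messages=[
                {"role": "system", "content": "You are a candid fiction editor."},
                {"role": "user", "content": f"Manuscript part {i}:\n\n{chunk}\n\n"
                                            "Are the protagonist's motivations clear in this part?"},
            ],
        )
        notes.append(reply.choices[0].message.content)
    return notes
```

Chunking sidesteps context-window limits, but each part is judged in isolation, so manuscript-level questions (does the arc pay off?) still need a second pass over the collected notes.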
Speaker 1: Well, we're going to keep this segment going every week, and we really want to hear from you, the listener. Whether you're asking large language models to create recipes or to proofread your novel, or whatever it may be, ChatGPT, Grok, Claude, Gemini, any chatbot: we want to hear specific stories about how you're using these technologies to do stuff. Send us a one-to-two-minute voice note at tech stuff podcast at gmail dot com.
Speaker 2: We really want to hear from you. That's it for this week for Tech Stuff.
Speaker 2: I'm Karah Preiss.
Speaker 1: And I'm Oz Woloshyn.
Speaker 2: This episode was produced by Eliza Dennis and Alex Zonneveld. It was executive produced by me, Karah Preiss, and Kate Osborne for Kaleidoscope, and Katrina Norvell for iHeart Podcasts. Our engineer is Abu Zafar, and Jack Insley mixed this episode. Kyle Murdock wrote our theme song. Join us next Wednesday for Tech Stuff: The Story, when we will share an in-depth conversation with journalist Kashmir Hill about how ChatGPT led a man into an AI-induced psychosis.
Speaker 1: Please rate, review, and reach out to us at tech stuff podcast at gmail dot com. As Karah said, we want to hear from you.