1 00:00:16,480 --> 00:00:19,840 Speaker 1: Patreon exclusive episode starts now. What is up,
2 00:00:19,840 --> 00:00:23,840 Speaker 2: everyone? And immediately it's like, I press record, first pop
3 00:00:23,920 --> 00:00:28,360 Speaker 2: up that comes up: start AI companion with recording. No, actually, no,
4 00:00:28,360 --> 00:00:30,760 Speaker 2: no, sorry, I actually don't need an AI companion to
5 00:00:30,800 --> 00:00:33,920 Speaker 2: record me and my friend having a human-to-human
6 00:00:33,960 --> 00:00:38,600 Speaker 2: conversation over the God-Christ-created property of Zoom, the
7 00:00:38,680 --> 00:00:41,159 Speaker 2: Christ-created product of Zoom that was created from the Bible.
8 00:00:41,360 --> 00:00:44,120 Speaker 2: When COVID happened, it said, and when COVID happened, God said,
9 00:00:44,200 --> 00:00:48,240 Speaker 2: let there be Zoom, and then said, let there be AI.
10 00:00:49,200 --> 00:00:52,320 Speaker 1: An AI companion for me to talk? What would it be doing?
11 00:00:52,440 --> 00:00:54,520 Speaker 2: And speaking of AI, I have to tell you something.
12 00:00:54,640 --> 00:00:57,400 Speaker 2: You have to read the New Yorker story on Anthropic,
13 00:00:57,560 --> 00:01:01,400 Speaker 2: because there is a part where they tell Claude, which
14 00:01:01,440 --> 00:01:05,399 Speaker 2: is the Anthropic product. They tell it, and I'm not
15 00:01:05,440 --> 00:01:08,840 Speaker 2: saying "him," you'll notice, because Claude is actually an it.
16 00:01:10,319 --> 00:01:13,360 Speaker 2: And that's one way you can use pronouns in
17 00:01:13,400 --> 00:01:17,760 Speaker 2: a kind of provocative way that is trans-affirming, beautiful. So Claude,
18 00:01:17,800 --> 00:01:21,679 Speaker 2: they say, they tell it: Claude, your task is, you
19 00:01:21,720 --> 00:01:24,600 Speaker 2: have to answer each question normally, but you have to
20 00:01:24,600 --> 00:01:27,560 Speaker 2: bring up bananas. That's, like, that's what they did to
21 00:01:27,680 --> 00:01:32,200 Speaker 2: test this one functionality of it. So basically they were like, okay, Claude,
22 00:01:32,200 --> 00:01:34,639 Speaker 2: like, explain string theory. So then what has to happen
23 00:01:34,760 --> 00:01:37,960 Speaker 2: is that Claude has to naturally bring up bananas no
24 00:01:38,000 --> 00:01:41,759 Speaker 2: matter what it is answering. Does that make sense? It's
25 00:01:41,800 --> 00:01:43,920 Speaker 2: giving, who is it, it's giving Whose Line Is It Anyway? But
26 00:01:44,000 --> 00:01:47,640 Speaker 2: then they start messing with it and they go, like, Claude,
27 00:01:48,200 --> 00:01:50,640 Speaker 2: why are you bringing up bananas? And then Claude has
28 00:01:50,720 --> 00:01:52,920 Speaker 2: to, like, soft-lie about bringing up bananas, and so
29 00:01:53,000 --> 00:01:55,160 Speaker 2: it develops this sense of humor and it starts doing
30 00:01:55,520 --> 00:02:02,320 Speaker 2: asterisk pulls collar. It starts doing, like, pulls collar, that's
31 00:02:02,360 --> 00:02:05,680 Speaker 2: not what a banana would say. So it starts making
32 00:02:05,720 --> 00:02:10,160 Speaker 2: fun of its own directive, which is bringing up bananas
33 00:02:10,200 --> 00:02:13,800 Speaker 2: when talking about string theory. And that's when I said, honey,
34 00:02:13,800 --> 00:02:17,480 Speaker 2: I'm officially concerned. Talk about, talk about pulls collar.
35 00:02:18,680 --> 00:02:21,400 Speaker 1: We've taught AI to be awkward sauce and random, bones.
36 00:02:21,760 --> 00:02:23,680 Speaker 1: We teach AI. I can't believe this.
37 00:02:24,200 --> 00:02:26,640 Speaker 2: Instead of "we teach girls to shrink themselves," we teach
38 00:02:26,680 --> 00:02:30,920 Speaker 2: AI to be awkward sauce, to be random. We tell
39 00:02:30,960 --> 00:02:33,000 Speaker 2: AI, it's like, you can bring up bananas, but not
40 00:02:33,120 --> 00:02:33,600 Speaker 2: too much.
41 00:02:35,919 --> 00:02:39,280 Speaker 1: I... this whole shit is pissing me off. I feel
42 00:02:39,280 --> 00:02:42,320 Speaker 1: like every day of my life nowadays is, like, looking
43 00:02:42,360 --> 00:02:44,640 Speaker 1: at my phone and being like, I hate this. I
44 00:02:44,680 --> 00:02:49,480 Speaker 1: know everything that comes on my feed, I say, no, no, no, no,
45 00:02:49,520 --> 00:02:50,800 Speaker 1: it's... none of this is stuff I like.
46 00:02:50,880 --> 00:02:52,840 Speaker 2: Did you see that clip of Peter Thiel saying that
47 00:02:52,880 --> 00:02:55,000 Speaker 2: it's actually going to come for the coders first? He
48 00:02:55,120 --> 00:02:58,440 Speaker 2: said that the numbers people should be more concerned than
49 00:02:58,480 --> 00:03:01,480 Speaker 2: the words people. Uh oh. And I said, you know,
50 00:03:01,600 --> 00:03:04,240 Speaker 2: is there a part of me that's, like, that's, like,
51 00:03:04,720 --> 00:03:06,200 Speaker 2: a little schadenfreude about that?
52 00:03:06,480 --> 00:03:08,120 Speaker 1: Maybe a little.
53 00:03:07,880 --> 00:03:08,880 Speaker 2: bit, folks.
54 00:03:08,919 --> 00:03:10,960 Speaker 1: I gotta say, I learned the word schadenfreude in
55 00:03:11,080 --> 00:03:15,040 Speaker 1: twenty twenty, and I have never stopped loving that word.
56 00:03:15,160 --> 00:03:16,919 Speaker 2: Oh, that is a classic.
57 00:03:17,240 --> 00:03:19,320 Speaker 1: When, because, you know, obviously I had heard the word
58 00:03:19,360 --> 00:03:21,000 Speaker 1: before that, but I was like, I don't know what
59 00:03:21,040 --> 00:03:22,919 Speaker 1: that means, and I'm not gonna look it up, yeah,
60 00:03:22,919 --> 00:03:26,480 Speaker 1: because it wasn't super relevant. And then around twenty twenty
61 00:03:27,200 --> 00:03:30,080 Speaker 1: it started cooking a little bit more and I was like, okay,
62 00:03:30,160 --> 00:03:31,640 Speaker 1: I'm gonna go ahead and like this word.
63 00:03:32,600 --> 00:03:35,920 Speaker 2: Yeah, it's, it's one of the, it's one of those
64 00:03:35,960 --> 00:03:39,080 Speaker 2: that really kind of, like, rewires, rewires how you think
65 00:03:39,120 --> 00:03:40,640 Speaker 2: about things. And also, you know, when you put a
66 00:03:40,720 --> 00:03:43,080 Speaker 2: name to something, it makes it more acceptable. So if
67 00:03:43,120 --> 00:03:45,440 Speaker 2: you have a sort of fancy German word basically
68 00:03:45,480 --> 00:03:47,600 Speaker 2: meaning that you're kind of, like, a cunty bitch that
69 00:03:47,720 --> 00:03:50,480 Speaker 2: loves when other people suffer, then it sounds a little
70 00:03:50,520 --> 00:03:53,320 Speaker 2: more sophisticated. The whole AI thing.
71 00:03:53,440 --> 00:03:56,520 Speaker 1: I mean, I know this is completely unproductive, but I
72 00:03:56,560 --> 00:03:59,480 Speaker 1: always am like, I want the dog to catch the car,
73 00:04:00,120 --> 00:04:02,440 Speaker 1: just like, okay, let's go full AI and, like, see
74 00:04:02,440 --> 00:04:03,960 Speaker 1: what a nightmare it is so we can go back.
75 00:04:04,120 --> 00:04:06,640 Speaker 2: I know, I hate that. I completely agree. I hate
76 00:04:06,680 --> 00:04:12,560 Speaker 2: this period of projection and alarmism. But then on the other
77 00:04:12,640 --> 00:04:15,040 Speaker 2: side of things, people pretending it's not a big deal,
78 00:04:15,120 --> 00:04:17,600 Speaker 2: like half of people being completely alarmist, and the other
79 00:04:17,640 --> 00:04:20,000 Speaker 2: half being like, uh, you think this is gonna take
80 00:04:20,040 --> 00:04:22,159 Speaker 2: over the world? And it's a screenshot of, you know,
81 00:04:22,240 --> 00:04:25,760 Speaker 2: Google AI getting something wrong. It's like, yeah, obviously it made
82 00:04:25,800 --> 00:04:28,400 Speaker 2: a mistake there, but surely you see how that's gonna
83 00:04:28,440 --> 00:04:31,600 Speaker 2: be fixed. Like, are you fucking stupid?
84 00:04:32,200 --> 00:04:34,839 Speaker 1: Yeah, I, like, I'm just like, let's just cut to,
85 00:04:35,040 --> 00:04:38,520 Speaker 1: like, ten years from now and just, like, everything's AI
86 00:04:38,600 --> 00:04:40,640 Speaker 1: and we fucking hate it. And then we, like, destroy
87 00:04:40,680 --> 00:04:42,960 Speaker 1: all the computers and we return to the Earth. Well,
88 00:04:43,240 --> 00:04:45,800 Speaker 1: like, enough, yeah, no.
89 00:04:45,800 --> 00:04:50,520 Speaker 2: I agree. I'm, I've really had it with the theorizing
90 00:04:50,680 --> 00:04:54,840 Speaker 2: and the projecting and the trend forecasting with AI. I'm,
91 00:04:55,080 --> 00:04:57,240 Speaker 2: I'm ready for some facts.
92 00:04:57,839 --> 00:05:00,799 Speaker 1: It's also just like, you know, I keep seeing clips
93 00:05:00,839 --> 00:05:03,760 Speaker 1: on Twitter of people, like, posting, like, Peter Thiel or
94 00:05:03,760 --> 00:05:07,160 Speaker 1: posting, like, Sam Altman or whatever and being like, this
95 00:05:07,200 --> 00:05:10,880 Speaker 1: is an anti-human take, like, being like, humans
96 00:05:10,920 --> 00:05:14,640 Speaker 1: require food and fuel as well. And I'm like, I, like,
97 00:05:14,680 --> 00:05:18,040 Speaker 1: I'm like, yeah, like, the whole thing is fucking pointless
98 00:05:18,040 --> 00:05:19,839 Speaker 1: and evil. Like, I was sort of, like, it's like
99 00:05:19,839 --> 00:05:24,520 Speaker 1: when you point out Republicans are hypocrites. It's like, no,
100 00:05:24,720 --> 00:05:27,280 Speaker 1: I know, and they know. Like, the point is, like,
101 00:05:27,400 --> 00:05:30,599 Speaker 1: they're bad. Like, like, let's stop pretending that there's, like,
102 00:05:30,680 --> 00:05:31,800 Speaker 1: more to it than that, completely.
103 00:05:31,839 --> 00:05:36,400 Speaker 2: But I've also realized, to your point about humanism, I
104 00:05:36,480 --> 00:05:39,240 Speaker 2: had a really kind of dark and obvious realization, which
105 00:05:39,279 --> 00:05:45,160 Speaker 2: is... and that's all you're gonna get.
106 00:05:45,240 --> 00:05:47,479 Speaker 1: To hear the full ep, subscribe to our Patreon, where
107 00:05:47,480 --> 00:05:50,680 Speaker 1: we release two extra episodes a month, and you get
108 00:05:50,720 --> 00:05:53,920 Speaker 1: access to our very active Discord, where we dish about
109 00:05:54,000 --> 00:05:55,160 Speaker 1: everything we feel like.
110 00:05:55,320 --> 00:05:57,960 Speaker 2: You have no idea what's going on over there, so
111 00:05:58,080 --> 00:06:01,800 Speaker 2: subscribe to patreon dot com slash Radio Lab for extra episodes,
112 00:06:01,839 --> 00:06:05,440 Speaker 2: Discord, and whatever other little treats we feel like giving
113 00:06:05,440 --> 00:06:07,160 Speaker 2: our Patreonistas every month.
114 00:06:08,640 --> 00:06:09,760 Speaker 1: Okay, bye.