1 00:00:13,920 --> 00:00:17,520 Speaker 1: From Kaleidoscope and iHeart Podcasts, this is Tech Stuff. I'm 2 00:00:17,520 --> 00:00:21,079 Speaker 1: Oz Woloshyn, and I'm Cara Price. Today we're going to 3 00:00:21,120 --> 00:00:24,240 Speaker 1: get into how Christianity is on the rise in Silicon Valley. 4 00:00:24,720 --> 00:00:25,720 Speaker 1: Then, on ChatGPT and me. 5 00:00:25,800 --> 00:00:28,960 Speaker 2: I scrolled down to my very first interaction with 6 00:00:29,080 --> 00:00:33,120 Speaker 2: chat, and the first question was, why have I been 7 00:00:33,200 --> 00:00:34,040 Speaker 2: unlucky in love? 8 00:00:34,479 --> 00:00:36,280 Speaker 3: When AI advice goes wrong. 9 00:00:36,600 --> 00:00:44,760 Speaker 1: All of that on the Week in Tech. It's Friday, October seventeenth. 10 00:00:47,400 --> 00:00:51,120 Speaker 1: Hello Cara. Hi Oz. So the day Sora came out, which 11 00:00:51,159 --> 00:00:55,720 Speaker 1: is OpenAI's updated video generation tool and social media app, 12 00:00:56,200 --> 00:00:58,400 Speaker 1: you said to me, people might look back on this 13 00:00:58,480 --> 00:00:59,960 Speaker 1: as the day the Internet changed forever. 14 00:01:00,240 --> 00:01:02,960 Speaker 3: I did say that, and I'm not a Pollyanna. 15 00:01:02,880 --> 00:01:04,320 Speaker 1: But I did think you'd drunk the Kool-Aid. 16 00:01:04,680 --> 00:01:06,160 Speaker 3: No, I drank the truth juice. 17 00:01:06,319 --> 00:01:10,360 Speaker 1: I want some of that. Well, you know, I have 18 00:01:10,480 --> 00:01:13,119 Speaker 1: to give you some credit, because as the news has 19 00:01:13,160 --> 00:01:15,800 Speaker 1: evolved and Sora has kind of taken over the whole 20 00:01:15,800 --> 00:01:18,800 Speaker 1: of social media, I think it's fair to say that 21 00:01:18,840 --> 00:01:21,160 Speaker 1: you were right and I was wrong.
This came home 22 00:01:21,200 --> 00:01:24,000 Speaker 1: to me this week with a story about an AI 23 00:01:24,520 --> 00:01:29,000 Speaker 1: homeless person prank. Gizmodo ran a story about how 24 00:01:29,640 --> 00:01:35,080 Speaker 1: kids are creating fake homeless people in AI images and 25 00:01:35,120 --> 00:01:37,920 Speaker 1: then sending them to their parents. Like, here he is 26 00:01:38,000 --> 00:01:40,240 Speaker 1: arriving at the door. He says he's a friend of yours. 27 00:01:40,319 --> 00:01:43,959 Speaker 1: He says he's a friend of yours. He needed a rest, 28 00:01:44,120 --> 00:01:47,440 Speaker 1: so he's sleeping in your bed. Okay. So, like, are 29 00:01:47,480 --> 00:01:50,800 Speaker 1: parents falling for this? Yeah, and then in many cases 30 00:01:51,360 --> 00:01:54,720 Speaker 1: the parents call the police. What? Well, of course, 31 00:01:54,760 --> 00:01:57,120 Speaker 1: I mean that the kids are saying, here's a strange 32 00:01:57,240 --> 00:01:59,840 Speaker 1: man in our house. He says he's a 33 00:01:59,760 --> 00:02:04,400 Speaker 4: friend. We need to get Jeremy Carrasco, my dear friend, ShowTools 34 00:02:04,400 --> 00:02:07,760 Speaker 4: AI, to give these people some tutorial on how to 35 00:02:07,800 --> 00:02:08,440 Speaker 4: pixel peep. 36 00:02:08,840 --> 00:02:09,560 Speaker 1: What would he say? 37 00:02:10,120 --> 00:02:11,160 Speaker 3: I think he would say 38 00:02:11,080 --> 00:02:13,720 Speaker 4: to look closely at this homeless guy. You can kind 39 00:02:13,760 --> 00:02:16,880 Speaker 4: of look at his teeth and his face and his eyes, 40 00:02:16,960 --> 00:02:20,639 Speaker 4: and there will be things in there that you can 41 00:02:20,680 --> 00:02:21,520 Speaker 4: tell are not real. 42 00:02:21,840 --> 00:02:24,240 Speaker 3: But also, I'm looking at the images, like, they're 43 00:02:24,120 --> 00:02:25,280 Speaker 1: very, super real. 44 00:02:25,639 --> 00:02:28,120 Speaker 4: It's uncanny, though.
I mean, to be a parent and 45 00:02:28,160 --> 00:02:30,480 Speaker 4: to have a kid be like, what's this guy doing 46 00:02:30,520 --> 00:02:32,639 Speaker 4: in our house? I guess I would call the police too. 47 00:02:33,320 --> 00:02:35,440 Speaker 1: Well, I mean, that's the thing. I had some hesitation 48 00:02:35,520 --> 00:02:37,600 Speaker 1: about bringing this story to you today as the first 49 00:02:37,639 --> 00:02:38,720 Speaker 1: story on the show. 50 00:02:38,560 --> 00:02:40,400 Speaker 3: Because... because you knew I'd like it too. 51 00:02:41,080 --> 00:02:42,919 Speaker 1: Before I decided to bring you this first story, 52 00:02:42,960 --> 00:02:44,840 Speaker 1: I asked myself, what is the butt of this joke? 53 00:02:44,960 --> 00:02:47,880 Speaker 1: Is the butt of this joke homeless people? Or is 54 00:02:47,880 --> 00:02:50,960 Speaker 1: the butt of this joke gullible parents who freak out 55 00:02:51,040 --> 00:02:54,399 Speaker 1: and sort of share this cultural terror of homeless people? Yeah, 56 00:02:54,600 --> 00:02:56,639 Speaker 1: I don't know where it is between those two things. 57 00:02:56,919 --> 00:02:59,320 Speaker 1: But what I think is really interesting about this Sora 58 00:02:59,360 --> 00:03:02,080 Speaker 1: moment is it gives you a glimpse into the 59 00:03:02,160 --> 00:03:06,280 Speaker 1: kind of collective subconscious of the Internet. And so I 60 00:03:06,280 --> 00:03:08,440 Speaker 1: think it's not for nothing that the kind of most 61 00:03:08,760 --> 00:03:12,600 Speaker 1: viral meme of the moment is this kind of fantasy 62 00:03:12,880 --> 00:03:19,080 Speaker 1: objectification, creation of homeless people, effectively vampire-like, crossing the 63 00:03:19,120 --> 00:03:22,120 Speaker 1: threshold and invading the house.
But it's also a meatspace 64 00:03:22,160 --> 00:03:26,560 Speaker 1: story because, as I mentioned, parents are calling the cops. 65 00:03:27,040 --> 00:03:31,080 Speaker 1: In Round Rock, north of Austin, the Round Rock Police Patrol Division Commander Andy 66 00:03:31,160 --> 00:03:34,680 Speaker 1: McKinney told NBC getting a call about an intruder quote 67 00:03:34,880 --> 00:03:37,600 Speaker 1: causes a pretty aggressive response for us because we're worried 68 00:03:37,640 --> 00:03:40,440 Speaker 1: about the safety of the individuals in the home, which 69 00:03:40,480 --> 00:03:43,360 Speaker 1: can mean clearing the home with guns out. It could 70 00:03:43,360 --> 00:03:45,080 Speaker 1: also cause a SWAT response. 71 00:03:45,520 --> 00:03:49,400 Speaker 4: Anyone who says that I overreact about manipulated video is wrong. 72 00:03:49,960 --> 00:03:54,520 Speaker 4: Like, the fact that this is taking department resources to 73 00:03:55,520 --> 00:03:59,160 Speaker 4: fend off a prank that is created by teenagers. And 74 00:03:59,200 --> 00:04:01,560 Speaker 4: also, these are not AI masterminds. 75 00:04:01,600 --> 00:04:03,440 Speaker 1: They're kids. And the idea that you would need 76 00:04:03,440 --> 00:04:06,600 Speaker 1: a SWAT team to respond is also kind of mind blowing. 77 00:04:06,920 --> 00:04:09,080 Speaker 1: But this actually brings me to my main story of 78 00:04:09,080 --> 00:04:12,680 Speaker 1: the week, because you know who preached charity and compassion 79 00:04:12,720 --> 00:04:18,360 Speaker 1: and care for the homeless? JC, Jesus Christ. Here's what 80 00:04:18,400 --> 00:04:22,000 Speaker 1: he had to say in the Gospel: Foxes have holes 81 00:04:22,200 --> 00:04:24,599 Speaker 1: and birds of the air have nests, but the Son 82 00:04:24,640 --> 00:04:27,760 Speaker 1: of Man has nowhere to lay his head. Oh my God.
83 00:04:28,080 --> 00:04:30,200 Speaker 1: You may be scratching your head as to why I'm 84 00:04:30,279 --> 00:04:35,080 Speaker 1: quoting the Bible, but it's relevant because Christianity is currently 85 00:04:35,520 --> 00:04:37,560 Speaker 1: all the rage with Silicon Valley elites. 86 00:04:38,040 --> 00:04:43,440 Speaker 4: It still somehow surprises me when people bring up Christianity 87 00:04:43,480 --> 00:04:45,240 Speaker 4: in the same breath as Silicon Valley. 88 00:04:45,800 --> 00:04:48,640 Speaker 1: Why is that? Well, I mean, it's surprising because the 89 00:04:48,720 --> 00:04:53,719 Speaker 1: two were, for a while, seemingly anathema. There's a line 90 00:04:53,800 --> 00:04:56,920 Speaker 1: in the famous TV show about Silicon Valley, called Silicon 91 00:04:57,000 --> 00:05:02,280 Speaker 1: Valley, where somebody says Christianity is borderline illegal in Northern California, 92 00:05:02,880 --> 00:05:06,240 Speaker 1: and Vanity Fair actually uses that quote in a headline 93 00:05:06,600 --> 00:05:09,679 Speaker 1: about today's trend of Christianity rising in Silicon Valley. 94 00:05:10,120 --> 00:05:12,480 Speaker 4: You know, San Francisco has always struck me, and this 95 00:05:12,520 --> 00:05:16,400 Speaker 4: is a joke, you know, as more about communes than communions. 96 00:05:16,640 --> 00:05:18,240 Speaker 1: It's always a good sign for a joke when you 97 00:05:18,279 --> 00:05:19,599 Speaker 1: have to tell someone it's a joke. 98 00:05:20,680 --> 00:05:22,920 Speaker 3: But I mean it, like, what... what's happening here? 99 00:05:23,080 --> 00:05:25,800 Speaker 1: Okay, well, it's a huge question. And actually this is 100 00:05:25,839 --> 00:05:28,680 Speaker 1: a somewhat personal story for me, because a number of 101 00:05:28,680 --> 00:05:31,719 Speaker 1: my friends who are not in Silicon Valley have recently 102 00:05:31,720 --> 00:05:34,520 Speaker 1: turned to Christianity and go to church every week.
I think 103 00:05:34,520 --> 00:05:37,719 Speaker 1: in search, really, of community. Yeah, community, structure. 104 00:05:38,160 --> 00:05:40,280 Speaker 4: I have found myself more drawn to synagogue recently, but 105 00:05:40,279 --> 00:05:41,000 Speaker 4: we can talk about it. 106 00:05:41,200 --> 00:05:43,320 Speaker 1: I want to ask you exactly about that, because I've 107 00:05:43,360 --> 00:05:45,440 Speaker 1: noticed that too. You talk about going to synagogue, and 108 00:05:45,440 --> 00:05:47,040 Speaker 1: you go to synagogue more than you did when I 109 00:05:47,080 --> 00:05:47,560 Speaker 1: knew you. 110 00:05:47,880 --> 00:05:49,120 Speaker 3: Much more, much more. 111 00:05:49,440 --> 00:05:49,600 Speaker 1: Why? 112 00:05:50,360 --> 00:05:54,279 Speaker 4: I think because I'm in search of, like, a single 113 00:05:54,440 --> 00:06:00,359 Speaker 4: place left in the universe that is somehow tech-free. 114 00:06:00,440 --> 00:06:03,560 Speaker 4: Now, the irony is that the rabbis in the synagogue 115 00:06:03,560 --> 00:06:06,120 Speaker 4: that I go to use iPads, but I would imagine 116 00:06:06,160 --> 00:06:07,520 Speaker 4: they don't have notifications on. 117 00:06:08,080 --> 00:06:11,839 Speaker 1: Yeah, it's not a huge surprise that the resurgence of faith, 118 00:06:12,000 --> 00:06:14,200 Speaker 1: partly in response, I think, to tech, as you've mentioned, 119 00:06:14,640 --> 00:06:17,680 Speaker 1: is also happening where tech is made, in Silicon Valley. 120 00:06:18,000 --> 00:06:21,560 Speaker 1: But actually, what I think is particularly interesting is that 121 00:06:21,560 --> 00:06:25,440 Speaker 1: Silicon Valley has always been drawn to countercultures. Yes, and 122 00:06:25,480 --> 00:06:30,320 Speaker 1: the counterculture of ten years ago was psychedelics, Burning Man, polyamory. 123 00:06:30,760 --> 00:06:31,839 Speaker 1: Then that became mainstream. 124 00:06:31,880 --> 00:06:33,760 Speaker 3: Now it's polygamy.
No, I'm kidding. 125 00:06:34,000 --> 00:06:36,200 Speaker 1: Yeah. So now the old has become new, and it's 126 00:06:36,200 --> 00:06:40,160 Speaker 1: revolutionary to be a Christian in Silicon Valley. So obviously 127 00:06:40,200 --> 00:06:41,760 Speaker 1: the person I want to talk to you about this 128 00:06:41,800 --> 00:06:45,560 Speaker 1: week is Peter Thiel, whose faith I think embodies, 129 00:06:45,720 --> 00:06:47,159 Speaker 1: you know, both the swing to the right of the 130 00:06:47,160 --> 00:06:51,240 Speaker 1: tech industry and the rise of Christianity. You're presumably familiar, 131 00:06:51,320 --> 00:06:53,599 Speaker 1: like everybody else on the internet, with his lecture series, 132 00:06:53,680 --> 00:06:57,520 Speaker 1: a four-part lecture on the Antichrist. Let 133 00:06:57,560 --> 00:06:59,960 Speaker 1: me read from the website, quote: you are warmly invited 134 00:07:00,240 --> 00:07:03,240 Speaker 1: to a series of four lectures by Peter Thiel addressing 135 00:07:03,279 --> 00:07:05,400 Speaker 1: the topic of the biblical Antichrist. 136 00:07:05,520 --> 00:07:07,560 Speaker 4: First, can I just ask you who he was inviting? 137 00:07:07,760 --> 00:07:09,359 Speaker 4: Was this on Zoom? 138 00:07:09,560 --> 00:07:11,360 Speaker 1: No, no, you could apply, you could go in person, 139 00:07:11,600 --> 00:07:14,280 Speaker 1: and I think the crowd, as far as 140 00:07:14,320 --> 00:07:19,920 Speaker 1: I understand, was male dominated, mid-twenties guys. And you know, 141 00:07:19,960 --> 00:07:22,840 Speaker 1: Peter Thiel obviously is the guy who founded Palantir 142 00:07:23,120 --> 00:07:26,680 Speaker 1: and PayPal, sued the website Gawker out of existence, and 143 00:07:26,760 --> 00:07:30,400 Speaker 1: personally funded a huge portion of JD Vance's political rise.
144 00:07:30,760 --> 00:07:34,720 Speaker 1: He's also a devout Christian, and in his Antichrist lecture series, 145 00:07:35,160 --> 00:07:38,440 Speaker 1: he posed a few questions about who the Antichrist might be. 146 00:07:39,040 --> 00:07:42,080 Speaker 1: He said it could be Greta Thunberg, poor Greta. It 147 00:07:42,160 --> 00:07:45,560 Speaker 1: might even be Donald Trump. Oh, but it couldn't possibly 148 00:07:45,640 --> 00:07:50,840 Speaker 1: be Marc Andreessen, because, you know what, because the Antichrist 149 00:07:51,160 --> 00:07:53,280 Speaker 1: is supposed to be popular. Isn't that a great diss? 150 00:07:53,640 --> 00:07:55,480 Speaker 3: That's fantastic. That's fantastic. 151 00:07:55,600 --> 00:07:57,680 Speaker 1: So I think it's important to understand a bit more 152 00:07:57,760 --> 00:08:04,680 Speaker 1: about the nonprofit called the ACTS 17 Collective, which 153 00:08:04,720 --> 00:08:08,080 Speaker 1: started holding speaking events and in-person events actually last year, 154 00:08:08,560 --> 00:08:11,680 Speaker 1: and they partnered with Peter Thiel on this event series. 155 00:08:12,280 --> 00:08:15,840 Speaker 1: The group is named after Acts seventeen, which is a 156 00:08:15,920 --> 00:08:18,120 Speaker 1: chapter in the Bible which is all about the apostle 157 00:08:18,160 --> 00:08:21,760 Speaker 1: Paul and how he goes to Greece to preach to intellectuals. Ah, 158 00:08:21,880 --> 00:08:23,680 Speaker 1: I see. So what they're kind of doing 159 00:08:23,800 --> 00:08:26,720 Speaker 1: is referencing this inception of the smartest 160 00:08:26,720 --> 00:08:30,200 Speaker 1: people in the world, which is maybe a little self-congratulatory. 161 00:08:30,240 --> 00:08:32,079 Speaker 3: Yeah, absolutely, absolutely.
162 00:08:32,840 --> 00:08:35,719 Speaker 1: There's also, according to the Vanity Fair piece, a spreadsheet 163 00:08:35,960 --> 00:08:39,240 Speaker 1: of secret Christians in tech that was being passed around 164 00:08:39,559 --> 00:08:42,960 Speaker 1: before Christianity in Silicon Valley became mainstreamed. 165 00:08:43,160 --> 00:08:47,800 Speaker 4: And who is on this secret spreadsheet of Christians in tech? 166 00:08:48,840 --> 00:08:52,080 Speaker 1: I don't know, but I can tell you who some 167 00:08:52,120 --> 00:08:55,960 Speaker 1: of the most prominent Silicon Valley Christians are. Garry Tan. Who, 168 00:08:56,000 --> 00:08:56,679 Speaker 1: Garry Tan? 169 00:08:56,559 --> 00:08:59,880 Speaker 3: Yes, I do. He's the CEO of Y Combinator. 170 00:09:00,640 --> 00:09:03,480 Speaker 1: Look at that. He's the CEO of Y Combinator, which is 171 00:09:03,559 --> 00:09:07,080 Speaker 1: an extremely influential organization, because if you're a young would- 172 00:09:07,120 --> 00:09:09,640 Speaker 1: be founder and you get Y Combinator's seal of approval, 173 00:09:10,040 --> 00:09:13,640 Speaker 1: you are off to the races, right. Another prominent Silicon 174 00:09:13,720 --> 00:09:17,280 Speaker 1: Valley Christian is Pat Gelsinger, who's the former CEO of 175 00:09:17,320 --> 00:09:20,200 Speaker 1: Intel. That's one I didn't know. Maybe also a little 176 00:09:20,240 --> 00:09:22,720 Speaker 1: less, a little less relevant in today's Silicon Valley, to 177 00:09:22,760 --> 00:09:27,240 Speaker 1: be fair. And then, very interestingly, there's Trae Stephens. Trae 178 00:09:27,320 --> 00:09:31,720 Speaker 1: Stephens is a partner at Peter Thiel's Founders Fund, 179 00:09:32,080 --> 00:09:35,080 Speaker 1: but he also co-founded the defense firm Anduril with 180 00:09:35,120 --> 00:09:39,640 Speaker 1: Palmer Luckey. Oh, with old Palmer Luckey. Trae's wife, Michelle, 181 00:09:40,040 --> 00:09:42,360 Speaker 1: actually founded the ACTS 17 Collective.
182 00:09:43,280 --> 00:09:45,000 Speaker 3: His wife founded it. Fascinating. 183 00:09:45,559 --> 00:09:49,000 Speaker 4: So, I mean, obviously these are major players in the 184 00:09:49,040 --> 00:09:52,440 Speaker 4: tech world, but, like, what is the Christianity that they 185 00:09:52,480 --> 00:09:52,960 Speaker 4: believe in? 186 00:09:53,200 --> 00:09:56,480 Speaker 1: Well, I really don't want to speak for them about 187 00:09:56,480 --> 00:09:59,160 Speaker 1: their faith. And also, while I find some of 188 00:09:59,160 --> 00:10:00,840 Speaker 1: this stuff quite amusing, I also don't, you know... 189 00:10:00,880 --> 00:10:02,960 Speaker 1: I think faith plays such an important role in 190 00:10:02,960 --> 00:10:05,920 Speaker 1: so many people's lives. But in the Vanity Fair piece, 191 00:10:06,040 --> 00:10:10,319 Speaker 1: there is some interesting reporting on the broader interaction between 192 00:10:10,360 --> 00:10:13,920 Speaker 1: the tech industry and faith. So Toby Kirth, who's the 193 00:10:13,920 --> 00:10:17,240 Speaker 1: pastor at Peter Thiel's church, said, quote, each person has 194 00:10:17,240 --> 00:10:20,960 Speaker 1: a calling and a vocation, and using your gifts to 195 00:10:21,040 --> 00:10:24,520 Speaker 1: the max is a good thing and it's what God 196 00:10:24,520 --> 00:10:28,079 Speaker 1: would want. Daniel Francis, the Catholic founder of the AI 197 00:10:28,160 --> 00:10:32,000 Speaker 1: startup Able, said, quote, you have a duty as a 198 00:10:32,040 --> 00:10:34,520 Speaker 1: founder to make really good products and get them into 199 00:10:34,520 --> 00:10:38,960 Speaker 1: people's hands. You're making God real in people's lives when 200 00:10:38,960 --> 00:10:43,000 Speaker 1: they experience that.
Now, obviously there's some good old-fashioned 201 00:10:43,080 --> 00:10:46,240 Speaker 1: American intersection of capitalism and faith and the work 202 00:10:46,280 --> 00:10:47,920 Speaker 1: ethic and all of those things here, in that you 203 00:10:48,000 --> 00:10:50,600 Speaker 1: make money from your talents and then you help the 204 00:10:50,679 --> 00:10:52,040 Speaker 1: poor, and all those things. I mean, this is... this 205 00:10:52,080 --> 00:10:54,480 Speaker 1: is not so new. But I was kind of more 206 00:10:54,559 --> 00:10:57,480 Speaker 1: intrigued by the idea that there's this belief that if 207 00:10:57,559 --> 00:11:01,079 Speaker 1: God gives you the ability to create new technology, it's 208 00:11:01,120 --> 00:11:02,480 Speaker 1: your God-given purpose to do so. 209 00:11:02,960 --> 00:11:07,160 Speaker 4: Are other people outside of these elite Silicon Valley types 210 00:11:07,559 --> 00:11:09,199 Speaker 4: involved in this church? 211 00:11:09,559 --> 00:11:12,120 Speaker 1: Well, a pastor at a non-denominational church that many of 212 00:11:12,120 --> 00:11:16,760 Speaker 1: these folks go to said that during COVID his congregation swelled, 213 00:11:17,120 --> 00:11:20,040 Speaker 1: in fact more than doubled in size. From a cynical 214 00:11:20,080 --> 00:11:22,679 Speaker 1: point of view, that might not be surprising, because some 215 00:11:22,760 --> 00:11:24,720 Speaker 1: of the members of the church are also the most 216 00:11:24,720 --> 00:11:28,280 Speaker 1: powerful people in Silicon Valley, and the church isn't necessarily 217 00:11:28,320 --> 00:11:31,320 Speaker 1: the worst place to rub shoulders with the folks in 218 00:11:31,360 --> 00:11:32,040 Speaker 1: high places. 219 00:11:32,360 --> 00:11:37,480 Speaker 4: So basically going to church as a networking opportunity? 220 00:11:36,440 --> 00:11:40,040 Speaker 1: Not for everyone.
I'm sure there are good-faith motivations to 221 00:11:40,080 --> 00:11:42,640 Speaker 1: be there, just for reasons of pure faith. But one 222 00:11:42,720 --> 00:11:46,080 Speaker 1: Christian entrepreneur was quoted in the Vanity Fair piece saying, quote, 223 00:11:46,360 --> 00:11:49,439 Speaker 1: I guarantee you there are people that are leveraging Christianity 224 00:11:49,640 --> 00:11:51,240 Speaker 1: to get closer to Peter Thiel. 225 00:11:51,600 --> 00:11:54,439 Speaker 4: I mean, it is interesting. This is on a sort 226 00:11:54,480 --> 00:11:57,520 Speaker 4: of much grander stage, but, like, everybody goes to church 227 00:11:57,559 --> 00:11:59,559 Speaker 4: to network in a certain way, so it's just, I 228 00:11:59,559 --> 00:12:00,960 Speaker 4: think, the stakes are higher here. 229 00:12:01,440 --> 00:12:02,680 Speaker 3: This story, though, really was 230 00:12:02,640 --> 00:12:04,320 Speaker 4: about something that I don't want you to forget about, 231 00:12:04,320 --> 00:12:05,359 Speaker 4: which is the Antichrist. 232 00:12:05,760 --> 00:12:10,320 Speaker 1: Don't worry, I wouldn't have done so. Peter Thiel gave 233 00:12:10,400 --> 00:12:14,800 Speaker 1: this almost eight-hour lecture series about the Antichrist, which 234 00:12:14,960 --> 00:12:18,520 Speaker 1: was theoretically off the record, but of course immediately leaked to 235 00:12:18,559 --> 00:12:19,199 Speaker 1: the press. 236 00:12:19,520 --> 00:12:21,880 Speaker 4: You know, it's very funny that the guy who helped 237 00:12:21,920 --> 00:12:25,120 Speaker 4: create the surveillance state through technology cannot seem to escape 238 00:12:25,160 --> 00:12:27,600 Speaker 4: being surveilled. He couldn't even do an off-the-record 239 00:12:27,800 --> 00:12:29,640 Speaker 4: lecture series if he tried.
240 00:12:30,760 --> 00:12:33,560 Speaker 1: Well, he may be one step ahead of the game here, 241 00:12:33,600 --> 00:12:36,480 Speaker 1: old Peter, because he said, quote, and this is in the lecture: 242 00:12:36,880 --> 00:12:39,520 Speaker 1: It's a pretty good marketing shtick, if you want everyone 243 00:12:39,559 --> 00:12:42,320 Speaker 1: to hear about something, not to let anyone into the room. 244 00:12:42,760 --> 00:12:45,560 Speaker 1: I'm not bragging, but I'm not totally incompetent either. 245 00:12:47,440 --> 00:12:50,280 Speaker 4: Anybody who says I'm not bragging is the same 246 00:12:50,320 --> 00:12:52,680 Speaker 4: person who says, I have a no-assholes policy. It's like, 247 00:12:52,760 --> 00:12:55,560 Speaker 4: you love assholes, and you are bragging. So 248 00:12:55,880 --> 00:12:57,280 Speaker 4: what were the lectures actually about? 249 00:12:57,679 --> 00:12:59,920 Speaker 1: Well, a lot of it, weirdly or maybe not weirdly, 250 00:13:00,200 --> 00:13:04,800 Speaker 1: actually seemed to revolve around deregulation. The Washington Post got 251 00:13:04,800 --> 00:13:07,640 Speaker 1: access to all eight hours of recordings, and they reported, 252 00:13:07,720 --> 00:13:11,320 Speaker 1: quote, Thiel argued that critiques of technology and calls for 253 00:13:11,360 --> 00:13:17,240 Speaker 1: stricter regulation by Greta Thunberg and others appear to echo 254 00:13:17,480 --> 00:13:22,000 Speaker 1: biblical interpretations of an Antichrist who will win power by 255 00:13:22,040 --> 00:13:26,880 Speaker 1: offering the world quote peace and safety from apocalyptic destruction. 256 00:13:27,640 --> 00:13:30,040 Speaker 4: So what he's saying is that anybody who wants to 257 00:13:30,120 --> 00:13:33,320 Speaker 4: regulate technology or AI is essentially the devil. 258 00:13:33,800 --> 00:13:36,640 Speaker 1: That is the direction of... that is the direction of 259 00:13:36,640 --> 00:13:41,000 Speaker 1: the lecture series, apparently.
But that's not the only possibility 260 00:13:41,160 --> 00:13:44,400 Speaker 1: for what the Antichrist might be up to. Reportedly, Thiel 261 00:13:44,480 --> 00:13:48,040 Speaker 1: also said, quote, it's becoming quite difficult to hide one's money. 262 00:13:50,920 --> 00:13:57,480 Speaker 1: An incredible machinery of tax treaties, financial surveillance, and sanctions 263 00:13:57,559 --> 00:14:00,400 Speaker 1: architecture has been constructed. And he went on to say 264 00:14:00,440 --> 00:14:02,720 Speaker 1: that while being rich might give you the illusion of 265 00:14:02,760 --> 00:14:05,160 Speaker 1: having power, you have the sense that it could be 266 00:14:05,160 --> 00:14:06,520 Speaker 1: taken away at any moment. 267 00:14:06,320 --> 00:14:08,280 Speaker 3: All of a sudden I feel bad for him. 268 00:14:08,559 --> 00:14:11,960 Speaker 1: So the idea that he's driving towards here is that 269 00:14:12,040 --> 00:14:15,920 Speaker 1: this combination of regulation, of financial oversight, of it being 270 00:14:15,920 --> 00:14:20,120 Speaker 1: hard to hide your money, is quote a sign that 271 00:14:20,200 --> 00:14:24,040 Speaker 1: a singular world government has begun to emerge that could 272 00:14:24,080 --> 00:14:27,240 Speaker 1: be taken over by an Antichrist figure who could then 273 00:14:27,360 --> 00:14:28,880 Speaker 1: use it to exert control over people. 274 00:14:28,920 --> 00:14:31,840 Speaker 4: So someone like Donald Trump or Greta Thunberg? Exactly. 275 00:14:31,880 --> 00:14:35,640 Speaker 1: He did clarify that it's not Donald Trump, but he said, interestingly, quote, 276 00:14:35,680 --> 00:14:38,800 Speaker 1: if you, in a sincere, rational, well-reasoned way, are 277 00:14:38,840 --> 00:14:41,280 Speaker 1: willing to make the argument that Trump is the Antichrist, 278 00:14:41,680 --> 00:14:42,240 Speaker 1: I will give you 279 00:14:42,240 --> 00:14:44,720 Speaker 3: a hearing. Unbelievable.
280 00:14:44,840 --> 00:14:46,640 Speaker 1: He went on to say, if you're not willing to 281 00:14:46,680 --> 00:14:49,160 Speaker 1: make that argument, maybe you have to be open to 282 00:14:49,200 --> 00:14:53,800 Speaker 1: the possibility that he's at least relatively good. Unbelievable. Aren't 283 00:14:53,800 --> 00:14:56,680 Speaker 1: these weird quotes? So then the Washington Post called up 284 00:14:56,720 --> 00:14:59,840 Speaker 1: his spokesman Jeremiah Hall, who said Peter doesn't believe Trump 285 00:15:00,120 --> 00:15:01,040 Speaker 1: is the Antichrist. 286 00:15:01,280 --> 00:15:03,160 Speaker 3: He's like, just to clarify, he doesn't believe that. 287 00:15:03,560 --> 00:15:05,960 Speaker 1: His challenge was for Trump's liberal critics to make the 288 00:15:06,000 --> 00:15:07,960 Speaker 1: case. If they want Peter to hear them out, 289 00:15:08,640 --> 00:15:10,920 Speaker 1: they can, but he knows in practice they can't do that, 290 00:15:10,960 --> 00:15:11,840 Speaker 1: and they won't do it. 291 00:15:11,960 --> 00:15:14,200 Speaker 3: So what is the obsession with the Antichrist? 292 00:15:14,720 --> 00:15:16,960 Speaker 1: I'm not sure if I'll be able to properly answer 293 00:15:16,960 --> 00:15:20,240 Speaker 1: that question, but I can tell you that some Christians 294 00:15:20,280 --> 00:15:25,000 Speaker 1: believe the Antichrist is the personal opponent of Christ, and 295 00:15:25,040 --> 00:15:27,920 Speaker 1: they expect the Antichrist to appear before the end of 296 00:15:27,920 --> 00:15:31,880 Speaker 1: the world. Here's what Peter Thiel said, quote: My thesis 297 00:15:31,960 --> 00:15:35,320 Speaker 1: is that in the seventeenth, eighteenth century, the Antichrist would 298 00:15:35,360 --> 00:15:38,760 Speaker 1: have been a Doctor Strangelove, a scientist who did 299 00:15:38,800 --> 00:15:42,000 Speaker 1: all this sort of evil, crazy science.
In the twenty- 300 00:15:42,080 --> 00:15:45,440 Speaker 1: first century, the Antichrist is a Luddite who wants to 301 00:15:45,440 --> 00:15:49,800 Speaker 1: stop all the science. It's someone like Greta. What do 302 00:15:49,800 --> 00:15:50,800 Speaker 1: you think? It's just a 303 00:15:50,800 --> 00:15:54,040 Speaker 4: very Trumpian fixation on, like, a young woman, you 304 00:15:54,080 --> 00:15:54,600 Speaker 4: know what I mean? 305 00:15:54,680 --> 00:15:58,480 Speaker 1: Well, no. Well, maybe that's why he latched onto Greta. Because, 306 00:15:58,760 --> 00:16:01,800 Speaker 1: to be fair, he did surface other candidates. Drum roll: 307 00:16:02,560 --> 00:16:03,120 Speaker 1: Joe Biden. 308 00:16:05,520 --> 00:16:07,240 Speaker 3: So Joe Biden could be the Antichrist? 309 00:16:07,640 --> 00:16:12,240 Speaker 1: No, he's not charismatic enough. Xi Jinping, Antichrist? 310 00:16:12,360 --> 00:16:16,680 Speaker 1: Not charismatic enough, not charismatic enough. Bill Gates, a quote 311 00:16:16,800 --> 00:16:21,680 Speaker 1: very, very awful person, but quote not remotely able to 312 00:16:21,720 --> 00:16:21,880 Speaker 1: be the 313 00:16:21,880 --> 00:16:23,600 Speaker 3: Antichrist. I want to know why he thinks he couldn't 314 00:16:23,600 --> 00:16:26,040 Speaker 3: be the Antichrist. I'm sure... I mean, it's obviously something offensive. 315 00:16:26,120 --> 00:16:28,240 Speaker 1: I think... I think not being the Antichrist 316 00:16:28,440 --> 00:16:30,120 Speaker 1: is a real dig at your charisma. 317 00:16:29,960 --> 00:16:32,320 Speaker 3: It is like, you don't have the balls to be 318 00:16:32,360 --> 00:16:33,040 Speaker 3: the Antichrist. 319 00:16:33,280 --> 00:16:34,600 Speaker 1: You don't have the balls... you don't have the 320 00:16:34,600 --> 00:16:37,520 Speaker 1: balls or the chops, exactly. Look, it's a weird and 321 00:16:37,680 --> 00:16:41,480 Speaker 1: entertaining story, but it's also consequential. I mean, Peter Thiel has
322 00:16:41,360 --> 00:16:43,160 Speaker 3: a lot of power. That's true, he has a lot 323 00:16:43,160 --> 00:16:43,560 Speaker 3: of power. 324 00:16:44,120 --> 00:16:46,760 Speaker 1: Legions of people, you know, hanging on his every word 325 00:16:46,800 --> 00:16:49,400 Speaker 1: and going to this lecture series. But more, to me, 326 00:16:49,480 --> 00:16:53,160 Speaker 1: because it speaks of a broader existential crisis that is 327 00:16:53,200 --> 00:16:54,840 Speaker 1: sweeping many people. 328 00:16:54,880 --> 00:16:55,160 Speaker 2: I know. 329 00:16:55,800 --> 00:16:59,800 Speaker 1: Imagine building AI. Imagine thinking, oh my god, God 330 00:16:59,880 --> 00:17:03,040 Speaker 1: is in the machine, like, I'm actually making this product 331 00:17:03,880 --> 00:17:07,120 Speaker 1: for the first time and it's talking back to me. It's very powerful. 332 00:17:07,440 --> 00:17:11,040 Speaker 1: So this raises questions, for anyone I can imagine, 333 00:17:11,040 --> 00:17:14,440 Speaker 1: you know: what is consciousness? Like, should AI 334 00:17:14,520 --> 00:17:16,960 Speaker 1: have rights? Is it conscious? But on the flip side, 335 00:17:17,080 --> 00:17:18,800 Speaker 1: what does it mean to be human? Somebody said to 336 00:17:18,800 --> 00:17:20,560 Speaker 1: me the other day, we need to stop asking what 337 00:17:20,600 --> 00:17:22,480 Speaker 1: AI is for and ask what humans are for. 338 00:17:22,680 --> 00:17:24,560 Speaker 3: I've heard that before, and I think that it's true. 339 00:17:24,920 --> 00:17:27,000 Speaker 1: I do. Well, yeah, you've got to go to 340 00:17:27,040 --> 00:17:30,600 Speaker 1: church or synagogue. But seriously, yeah, I think this return 341 00:17:30,680 --> 00:17:34,399 Speaker 1: to a system of values that predates this terrifying and 342 00:17:34,480 --> 00:17:38,000 Speaker 1: spectacular moment is not for nothing.
Also, one of 343 00:17:38,000 --> 00:17:41,560 Speaker 1: my hobby horses is that we're living through a new Renaissance. 344 00:17:41,720 --> 00:17:43,240 Speaker 3: Yes, you love to say this. Well. 345 00:17:43,119 --> 00:17:45,720 Speaker 1: In the original Renaissance you had this extraordinary 346 00:17:46,359 --> 00:17:51,560 Speaker 1: explosion of scientific invention, new ways of understanding the world, Leonardo, 347 00:17:51,640 --> 00:17:53,919 Speaker 1: new ways of understanding the human body, the stars, the 348 00:17:53,960 --> 00:17:57,880 Speaker 1: solar system. You also had the extraordinary concentration of wealth 349 00:17:57,920 --> 00:17:59,879 Speaker 1: and power in the hands of people like the Medicis, 350 00:18:00,920 --> 00:18:07,520 Speaker 1: and you had a fundamentalist religious resurgence: witch hunts, the Inquisition, 351 00:18:07,880 --> 00:18:11,560 Speaker 1: burning people at the stake. So you know, in a sense, 352 00:18:11,600 --> 00:18:12,200 Speaker 1: nothing is new. 353 00:18:12,920 --> 00:18:15,360 Speaker 4: Does this seem to be an agenda that he's pushing 354 00:18:15,480 --> 00:18:17,600 Speaker 4: onto people, or is it just like, come listen to 355 00:18:17,640 --> 00:18:18,840 Speaker 4: my lectures if you're interested? 356 00:18:19,080 --> 00:18:21,479 Speaker 1: Well, I mean, I think he's not 357 00:18:21,520 --> 00:18:23,639 Speaker 1: totally incompetent. He knew if he told people it's a secret, 358 00:18:23,640 --> 00:18:27,359 Speaker 1: everyone would want to know about it. I think it's in 359 00:18:27,400 --> 00:18:28,440 Speaker 1: the small print. 360 00:18:30,880 --> 00:18:33,399 Speaker 4: After the break, satellites are falling out of the sky, 361 00:18:33,880 --> 00:18:36,920 Speaker 4: the Dutch seized a microchip company, and ChatGPT helps you 362 00:18:36,960 --> 00:18:37,560 Speaker 4: get a date. 363 00:18:44,640 --> 00:18:49,199 Speaker 1: And we're back.
Cara, have you been following the continued 364 00:18:49,280 --> 00:18:51,440 Speaker 1: drama in the world of microchips? 365 00:18:51,520 --> 00:18:52,600 Speaker 3: No, and you're obsessed. 366 00:18:54,000 --> 00:18:58,280 Speaker 1: I mean, this is geopolitics at the scale of, you 367 00:18:58,560 --> 00:18:58,920 Speaker 1: know. 368 00:18:59,520 --> 00:19:03,480 Speaker 3: Micro, the smallest largest scale possible. 369 00:19:04,200 --> 00:19:06,639 Speaker 1: Macro meets micro, exactly. So there's been a ton of 370 00:19:06,680 --> 00:19:09,720 Speaker 1: microchip news this week. OpenAI announced a deal with the 371 00:19:09,800 --> 00:19:13,280 Speaker 1: chip maker Broadcom to make custom AI microchips just 372 00:19:13,400 --> 00:19:17,880 Speaker 1: for them. Obviously, China announced its ban on the export 373 00:19:18,000 --> 00:19:22,640 Speaker 1: of rare earth metals, which will affect computer chips and 374 00:19:22,680 --> 00:19:25,840 Speaker 1: in turn has got the tariff conversation buzzing again and the 375 00:19:25,880 --> 00:19:29,240 Speaker 1: stock market down. Most of that news has been pretty 376 00:19:29,240 --> 00:19:31,880 Speaker 1: well reported and you can get it anywhere. So I'm 377 00:19:31,920 --> 00:19:36,679 Speaker 1: going to take you to the Netherlands, otherwise known as Holland. Okay, 378 00:19:37,160 --> 00:19:42,240 Speaker 1: where my tulips are. Exactly. So first off, there have 379 00:19:42,280 --> 00:19:46,960 Speaker 1: been reports brewing that Dutch microchips are winding up in 380 00:19:47,119 --> 00:19:51,919 Speaker 1: Russian weapons which are being used against Ukraine, despite the 381 00:19:51,960 --> 00:19:53,919 Speaker 1: fact that Western sanctions are supposed to ban the 382 00:19:53,960 --> 00:19:55,600 Speaker 1: sale of microchips to Russia. 383 00:19:55,960 --> 00:19:58,080 Speaker 3: And how do they know that these are Dutch microchips?
384 00:19:59,080 --> 00:20:04,280 Speaker 1: Well, when the weapons rain down from the sky into Ukraine, 385 00:20:04,760 --> 00:20:07,119 Speaker 1: they break the weapons apart and they pull out the microchip, 386 00:20:07,440 --> 00:20:10,480 Speaker 1: and the microchip has the name of a Dutch manufacturer 387 00:20:10,560 --> 00:20:13,320 Speaker 1: written on it. And this was actually brought as evidence 388 00:20:13,400 --> 00:20:17,360 Speaker 1: recently to The Hague: there were microchips in Russian weapons 389 00:20:17,440 --> 00:20:20,959 Speaker 1: from two Dutch firms. Basically, what's going on is that 390 00:20:21,119 --> 00:20:26,600 Speaker 1: third-party countries are legitimately buying these microchips and then 391 00:20:26,760 --> 00:20:32,679 Speaker 1: exporting them to Russia. These countries include Thailand, Turkey, and 392 00:20:32,800 --> 00:20:34,399 Speaker 1: of course China. 393 00:20:35,160 --> 00:20:35,680 Speaker 3: Huh. 394 00:20:36,160 --> 00:20:42,200 Speaker 1: China has a semiconductor manufacturer headquartered in the Netherlands. What, 395 00:20:43,200 --> 00:20:47,240 Speaker 1: which just this week was seized. The company was taken 396 00:20:47,280 --> 00:20:49,720 Speaker 1: over by the Dutch government. 397 00:20:50,359 --> 00:20:55,439 Speaker 4: The Dutch government has taken over a Chinese factory in 398 00:20:55,640 --> 00:20:56,680 Speaker 4: the Netherlands? 399 00:20:57,119 --> 00:21:01,200 Speaker 1: That's right. How? There is a law in the Netherlands 400 00:21:01,320 --> 00:21:05,480 Speaker 1: called the Goods Availability Act, and this allows the government 401 00:21:05,520 --> 00:21:10,040 Speaker 1: to quote intervene in privately owned companies in exceptional circumstances.
402 00:21:10,880 --> 00:21:16,200 Speaker 1: The circumstances here are that Nexperia, a semiconductor manufacturer headquartered 403 00:21:16,240 --> 00:21:19,919 Speaker 1: in Holland, is owned by a Chinese company called Wingtech, 404 00:21:20,320 --> 00:21:22,880 Speaker 1: and the Dutch government had reason to believe that its CEO, 405 00:21:23,320 --> 00:21:28,560 Speaker 1: Zhang Xuezheng, had some quote serious managerial shortcomings that raised 406 00:21:28,640 --> 00:21:32,320 Speaker 1: broader concerns for the Dutch government about the availability of 407 00:21:32,400 --> 00:21:37,000 Speaker 1: semiconductor products critical to European industry. In other words, 408 00:21:37,240 --> 00:21:40,480 Speaker 1: the Dutch government was worried this Chinese company would turn 409 00:21:40,480 --> 00:21:43,560 Speaker 1: around and say, we're making all these semiconductors which are 410 00:21:43,600 --> 00:21:47,360 Speaker 1: important for European industries, including cars, but we're taking 411 00:21:47,400 --> 00:21:49,280 Speaker 1: them all back to China and you don't get to have 412 00:21:49,320 --> 00:21:51,040 Speaker 1: any of them. There was also a 413 00:21:51,040 --> 00:21:55,199 Speaker 1: backdrop of illegal or third-party exports to Russia powering 414 00:21:55,200 --> 00:21:57,639 Speaker 1: their war machine. But more urgently, the real reason for 415 00:21:57,680 --> 00:22:02,359 Speaker 1: this extraordinary takeover of a private company was the fear 416 00:22:02,600 --> 00:22:06,720 Speaker 1: that its products might be made unavailable to European companies 417 00:22:06,760 --> 00:22:07,400 Speaker 1: who needed them. 418 00:22:07,680 --> 00:22:11,000 Speaker 3: Interesting. So what does this mean exactly? 419 00:22:12,160 --> 00:22:17,000 Speaker 1: Something is happening.
Yeah, where the boundaries between governments and 420 00:22:17,040 --> 00:22:21,480 Speaker 1: private companies, which have been blurred consciously and purposefully in 421 00:22:21,560 --> 00:22:26,359 Speaker 1: China since the eighties, yeah, now in the West are 422 00:22:26,400 --> 00:22:30,040 Speaker 1: also blurring, because these technologies are so critical to the 423 00:22:30,119 --> 00:22:35,720 Speaker 1: national interest that governments are unwilling to let them stay 424 00:22:35,880 --> 00:22:39,199 Speaker 1: purely in private hands if it's at the cost of 425 00:22:39,280 --> 00:22:43,080 Speaker 1: strategically undermining their country or the European Union. I mean, 426 00:22:43,320 --> 00:22:48,119 Speaker 1: the drama of a European country seizing a factory. The 427 00:22:48,160 --> 00:22:50,600 Speaker 1: factory is still allowed to operate, and the Dutch government 428 00:22:50,640 --> 00:22:53,240 Speaker 1: will only intervene if they think something is going wrong, 429 00:22:53,320 --> 00:22:56,840 Speaker 1: but they've taken legal control over this company. I 430 00:22:56,880 --> 00:22:58,720 Speaker 1: just think it's a watershed moment. 431 00:22:58,760 --> 00:23:00,680 Speaker 3: I think you're right. Actually, I think you're right. 432 00:23:01,040 --> 00:23:05,520 Speaker 1: Also, these Russian weapons are not the only things falling 433 00:23:05,520 --> 00:23:07,399 Speaker 1: out of the sky and causing concern. 434 00:23:08,760 --> 00:23:10,160 Speaker 3: What else is falling out of the sky? 435 00:23:11,440 --> 00:23:15,800 Speaker 1: According to Futurism, Starlink and SpaceX satellites are falling 436 00:23:15,840 --> 00:23:16,359 Speaker 1: out of the sky. 437 00:23:16,520 --> 00:23:18,560 Speaker 3: But why are they falling out of the sky now?
438 00:23:19,000 --> 00:23:22,800 Speaker 1: Well, Starlink satellites, I now know, only last about five 439 00:23:22,880 --> 00:23:25,919 Speaker 1: years, and they're supposed to basically burn up as they 440 00:23:25,960 --> 00:23:28,719 Speaker 1: re-enter. But scientists are now concerned that when they 441 00:23:28,760 --> 00:23:32,720 Speaker 1: do burn up, they release materials into the atmosphere, which, 442 00:23:32,720 --> 00:23:36,040 Speaker 1: according to one study, could be devastating for the ozone. 443 00:23:35,600 --> 00:23:37,440 Speaker 3: Oh no, the poor ozone layer. 444 00:23:37,520 --> 00:23:39,080 Speaker 1: I know, it's like the nineties. It's like, oh god, 445 00:23:39,080 --> 00:23:39,440 Speaker 1: we were. 446 00:23:39,359 --> 00:23:41,040 Speaker 3: So worried about ozone in the nineties. 447 00:23:41,080 --> 00:23:43,480 Speaker 1: Remember? Well, I know, and it got a little better, 448 00:23:43,520 --> 00:23:46,120 Speaker 1: but now it could be in jeopardy again. Right now, 449 00:23:46,240 --> 00:23:49,919 Speaker 1: about two to three satellites fall each day out of 450 00:23:49,920 --> 00:23:52,600 Speaker 1: the eight thousand Starlinks. Like, a lot of satellites. Well, 451 00:23:52,600 --> 00:23:55,000 Speaker 1: it's going to get a lot more, because between Starlink, 452 00:23:55,200 --> 00:23:58,080 Speaker 1: Amazon's system, and other companies from around the world, the 453 00:23:58,200 --> 00:24:02,560 Speaker 1: FAA predicts that by twenty thirty five, about twenty-eight thousand 454 00:24:02,720 --> 00:24:04,720 Speaker 1: fragments could survive re-entry. 455 00:24:05,280 --> 00:24:06,720 Speaker 3: This should have been the lead story. 456 00:24:08,960 --> 00:24:11,520 Speaker 1: I knew, you see, I knew. Like, space junkie. 457 00:24:11,280 --> 00:24:14,320 Speaker 3: Twenty thirty five is like tomorrow. That's like ten years.
458 00:24:14,320 --> 00:24:16,440 Speaker 4: If I were doing anything right now, I'd be investing 459 00:24:16,440 --> 00:24:19,879 Speaker 4: in the manufacture of helmets. Like, what kind of helmet 460 00:24:19,920 --> 00:24:21,600 Speaker 4: is going to protect me from a satellite falling from 461 00:24:21,600 --> 00:24:21,920 Speaker 4: the sky? 462 00:24:22,480 --> 00:24:25,360 Speaker 1: That's very funny. Actually, there are private companies who 463 00:24:25,359 --> 00:24:30,040 Speaker 1: are now specializing in hunting the Roombas of the sky, 464 00:24:30,080 --> 00:24:31,080 Speaker 1: hunting space junk. 465 00:24:31,160 --> 00:24:31,920 Speaker 3: That's brilliant. 466 00:24:32,200 --> 00:24:34,200 Speaker 4: I have another crazy story for you, and it has 467 00:24:34,240 --> 00:24:36,960 Speaker 4: nothing to do with satellites and everything to do 468 00:24:37,080 --> 00:24:38,600 Speaker 4: with attachment styles. 469 00:24:38,840 --> 00:24:41,280 Speaker 1: Anxious avoidant, is that one? Yes, exactly. 470 00:24:41,240 --> 00:24:41,880 Speaker 3: Is that what you are? 471 00:24:42,560 --> 00:24:44,800 Speaker 4: Did you just give yourself away a little bit? No, 472 00:24:45,240 --> 00:24:47,040 Speaker 4: I think you're securely attached. 473 00:24:47,119 --> 00:24:48,000 Speaker 1: Okay, good. I like that. 474 00:24:48,640 --> 00:24:51,240 Speaker 4: The Guardian put out this piece about how people are 475 00:24:51,280 --> 00:24:55,919 Speaker 4: turning to ChatGPT to help them with initial text exchanges. 476 00:24:55,960 --> 00:24:57,080 Speaker 4: You know, I mean, I know it's been a long 477 00:24:57,119 --> 00:25:00,200 Speaker 4: time for you, but when you first meet someone, you're 478 00:25:00,200 --> 00:25:01,399 Speaker 4: putting your best foot forward. 479 00:25:01,920 --> 00:25:05,960 Speaker 1: I see. So basically, rather than saying to your friends, hey, 480 00:25:06,000 --> 00:25:08,480 Speaker 1: what should I say?
Or, they said blah blah blah, 481 00:25:08,520 --> 00:25:10,119 Speaker 1: you just plug it in. Forget your friends. 482 00:25:10,480 --> 00:25:12,520 Speaker 3: Forget your friends. A friend of mine always says this 483 00:25:12,560 --> 00:25:13,200 Speaker 3: about ChatGPT 484 00:25:13,359 --> 00:25:14,960 Speaker 4: now. She's like, we haven't spoken in a few days 485 00:25:15,000 --> 00:25:17,440 Speaker 4: because I have Chat. Like, that's it, for no other 486 00:25:17,480 --> 00:25:19,640 Speaker 4: reason than, like, I have somebody else to talk to now. 487 00:25:19,920 --> 00:25:20,919 Speaker 1: So this is dating advice. 488 00:25:21,040 --> 00:25:24,040 Speaker 4: This is dating advice, but it's not dating advice in 489 00:25:24,080 --> 00:25:27,959 Speaker 4: the macro sense. It's hyper-specific dating advice 490 00:25:28,040 --> 00:25:31,159 Speaker 4: on the basis of the actual conversation that you're having 491 00:25:31,240 --> 00:25:33,119 Speaker 4: with a person that you matched with on a dating app. 492 00:25:33,359 --> 00:25:39,280 Speaker 4: So the people, the kids, are calling this chatfishing. Chatfishing, yes, 493 00:25:39,320 --> 00:25:40,880 Speaker 4: which I really like, the 494 00:25:40,840 --> 00:25:44,120 Speaker 1: idea being that ChatGPT is a much 495 00:25:44,160 --> 00:25:47,920 Speaker 1: better conversationalist than you are, so you're essentially chatfishing. 496 00:25:48,200 --> 00:25:51,000 Speaker 1: And then the reality is you meet this other person 497 00:25:51,040 --> 00:25:53,760 Speaker 1: who's completely taciturn and charmless, and then you realize you've 498 00:25:53,800 --> 00:25:56,040 Speaker 1: been chatfished. Is that it? I'm done. 499 00:25:56,040 --> 00:25:57,600 Speaker 4: I don't have to tell any more of the story. 500 00:25:58,040 --> 00:26:01,040 Speaker 4: That's what chatfishing is. That's exactly what chatfishing is.
So there was 501 00:26:01,080 --> 00:26:03,800 Speaker 4: a woman in the story named Rachel, and actually I 502 00:26:03,840 --> 00:26:05,600 Speaker 4: was reading her bio and I'm 503 00:26:05,440 --> 00:26:06,080 Speaker 3: like, is it me? 504 00:26:07,200 --> 00:26:10,840 Speaker 4: She's thirty-six, she's a business owner, and she'd been 505 00:26:10,920 --> 00:26:13,600 Speaker 4: chatting with a match on Hinge for the last three weeks. 506 00:26:14,040 --> 00:26:16,680 Speaker 4: What she said was that the conversations were refreshing. Her 507 00:26:16,720 --> 00:26:20,200 Speaker 4: match kept asking her open-ended questions. Some of them 508 00:26:20,200 --> 00:26:22,119 Speaker 4: felt like they were straight from a self-help book, 509 00:26:22,240 --> 00:26:24,800 Speaker 4: but they also got into ridiculous subjects too, like sharing 510 00:26:24,880 --> 00:26:28,640 Speaker 4: memes and having inconsequential debates, like ketchup 511 00:26:28,720 --> 00:26:30,080 Speaker 4: versus mayo. 512 00:26:30,200 --> 00:26:33,720 Speaker 3: So they decide to meet up. She was very shocked to 513 00:26:33,760 --> 00:26:38,040 Speaker 4: find that when she met him on the in-person date, 514 00:26:39,000 --> 00:26:44,280 Speaker 4: he was so flat. Unfortunately, Rachel had been chatfished before, 515 00:26:44,400 --> 00:26:46,840 Speaker 4: so she kind of knew the signs of someone offloading 516 00:26:46,880 --> 00:26:49,879 Speaker 4: their conversations to ChatGPT. To herself, she was like, 517 00:26:49,880 --> 00:26:54,280 Speaker 4: I'm gonna give this guy one more chance. And she did, 518 00:26:55,080 --> 00:26:57,200 Speaker 4: and they hung out again, and he was as 519 00:26:57,240 --> 00:27:01,399 Speaker 4: flat as ever. She didn't keep trying. She knew. And 520 00:27:01,520 --> 00:27:03,280 Speaker 4: what she said in the article really made me laugh. 521 00:27:03,320 --> 00:27:03,680 Speaker 3: Poor thing.
522 00:27:04,040 --> 00:27:06,679 Speaker 4: She said, I'd already been ChatGPT'ed into bed at 523 00:27:06,760 --> 00:27:09,440 Speaker 4: least once. I didn't want it to happen again. 524 00:27:19,800 --> 00:27:23,280 Speaker 1: This brings us to our final segment each week, which 525 00:27:23,320 --> 00:27:24,600 Speaker 1: is Chat and Me. 526 00:27:24,960 --> 00:27:27,720 Speaker 4: This week we heard from Mary in Florida. Mary is 527 00:27:28,119 --> 00:27:31,280 Speaker 4: thirty-two years old and describes herself as a chronic 528 00:27:31,480 --> 00:27:34,359 Speaker 4: ChatGPT user, or at least she used to be. 529 00:27:35,480 --> 00:27:37,040 Speaker 1: Mmm, she started to tap out. 530 00:27:37,080 --> 00:27:39,879 Speaker 3: I wonder why. Well, she got disillusioned. 531 00:27:40,600 --> 00:27:43,520 Speaker 4: Mary says she started using ChatGPT in late twenty 532 00:27:43,520 --> 00:27:47,000 Speaker 4: twenty four, and she very quickly found herself relying on 533 00:27:47,040 --> 00:27:50,080 Speaker 4: it for just about anything that popped into her mind. 534 00:27:50,600 --> 00:27:53,760 Speaker 2: From learning how to thaw a frozen twelve-pound turkey 535 00:27:53,800 --> 00:27:58,200 Speaker 2: in under twenty-four hours, to fitness advice, to creating 536 00:27:58,280 --> 00:28:00,920 Speaker 2: funny AI-generated images of my dog. 537 00:28:01,680 --> 00:28:04,480 Speaker 4: But Mary started to notice that one topic kept coming 538 00:28:04,560 --> 00:28:07,760 Speaker 4: up over and over, and that's the concept of what 539 00:28:07,800 --> 00:28:08,879 Speaker 4: we were just talking about. 540 00:28:08,920 --> 00:28:09,160 Speaker 1: Love. 541 00:28:09,680 --> 00:28:12,800 Speaker 2: I scrolled down to my very first interaction with Chat, 542 00:28:13,600 --> 00:28:17,680 Speaker 2: and the first question was, why have I been unlucky 543 00:28:17,720 --> 00:28:18,080 Speaker 2: in love?
544 00:28:18,520 --> 00:28:21,159 Speaker 1: Isn't this interesting? I mean, one of 545 00:28:21,160 --> 00:28:25,040 Speaker 1: the great newspaper media inventions of the twentieth century was 546 00:28:25,080 --> 00:28:27,600 Speaker 1: the advice column, the love advice column, the 547 00:28:27,680 --> 00:28:28,639 Speaker 1: love call-in show. 548 00:28:28,920 --> 00:28:32,520 Speaker 4: So when she asked, the answers that Chat gave her 549 00:28:33,400 --> 00:28:36,359 Speaker 4: were a list of seven potential factors that could be 550 00:28:36,440 --> 00:28:40,320 Speaker 4: contributing to her dating struggles, and she initially found them 551 00:28:40,360 --> 00:28:41,040 Speaker 4: really helpful. 552 00:28:41,440 --> 00:28:42,920 Speaker 3: A little while later, though, she 553 00:28:42,920 --> 00:28:46,080 Speaker 4: found herself in a new relationship that started getting tumultuous, 554 00:28:46,240 --> 00:28:47,760 Speaker 4: and so she turned to Chat again. 555 00:28:48,400 --> 00:28:51,240 Speaker 1: Is this when the problem started? Yes. 556 00:28:51,400 --> 00:28:54,320 Speaker 4: Mary found that Chat's advice and its ability to find 557 00:28:54,320 --> 00:28:57,160 Speaker 4: patterns was useful for a while. She says it felt 558 00:28:57,200 --> 00:28:59,800 Speaker 4: like it was validating her intuitions and teaching her to 559 00:28:59,800 --> 00:29:03,280 Speaker 4: trust her gut, which was a good thing. But then, 560 00:29:03,320 --> 00:29:05,720 Speaker 4: she said, she started depending on it way too much, 561 00:29:06,200 --> 00:29:10,080 Speaker 4: and she stopped thinking so critically about her own intuitions, 562 00:29:10,280 --> 00:29:12,840 Speaker 4: and she found herself assuming the worst of her partner, 563 00:29:13,320 --> 00:29:15,840 Speaker 4: even when it went way against the way she'd prefer 564 00:29:15,800 --> 00:29:17,120 Speaker 4: to handle a difficult situation.
565 00:29:17,720 --> 00:29:21,320 Speaker 2: It stripped me of my humanity and led me to 566 00:29:21,480 --> 00:29:27,400 Speaker 2: believe things that were not necessarily true and fed my delusions. 567 00:29:27,720 --> 00:29:31,560 Speaker 1: Well, thank goodness Mary realized this and lived to tell 568 00:29:31,600 --> 00:29:34,880 Speaker 1: the tale. Because, as we've reported on this show in 569 00:29:34,920 --> 00:29:38,000 Speaker 1: that episode we did with Kashmir Hill, this can go 570 00:29:38,240 --> 00:29:39,120 Speaker 1: in a much darker direction. 571 00:29:39,520 --> 00:29:41,560 Speaker 4: Yeah. And one of the reasons she sent us a 572 00:29:41,640 --> 00:29:44,240 Speaker 4: voice memo is because she wanted to warn others of 573 00:29:44,280 --> 00:29:47,400 Speaker 4: the psychological dangers of AI and what can happen when 574 00:29:47,440 --> 00:29:50,560 Speaker 4: you become codependent on a chatbot for self-growth. 575 00:29:50,560 --> 00:29:52,800 Speaker 3: I actually really like this Chat 576 00:29:52,560 --> 00:29:55,520 Speaker 4: and Me, because I think we keep coming back to, 577 00:29:55,600 --> 00:29:59,840 Speaker 4: over and over again, the deficit that arises with cognitive 578 00:30:00,120 --> 00:30:00,960 Speaker 4: offloading. 579 00:30:01,400 --> 00:30:03,320 Speaker 3: You know, and I mean, I know it for myself. 580 00:30:03,560 --> 00:30:06,320 Speaker 1: The truth is, part of growing up, and part of parenting, 581 00:30:06,360 --> 00:30:08,680 Speaker 1: and part of teaching, and part of actually being a 582 00:30:08,760 --> 00:30:15,040 Speaker 1: boss, is encouraging people to know when to ask and 583 00:30:15,080 --> 00:30:17,720 Speaker 1: know when to trust themselves. But the problem with the 584 00:30:17,800 --> 00:30:19,960 Speaker 1: chatbot is its incentives. They want to keep you 585 00:30:19,960 --> 00:30:21,920 Speaker 1: in there. They don't.
The chatbot will never say 586 00:30:21,920 --> 00:30:24,360 Speaker 1: to you, go trust yourself. 587 00:30:24,480 --> 00:30:25,760 Speaker 3: It's time to 588 00:30:25,320 --> 00:30:28,320 Speaker 1: spread your wings. We want to hear from you, 589 00:30:28,440 --> 00:30:31,560 Speaker 1: our listeners, so please do send your Chat stories to our 590 00:30:31,600 --> 00:30:46,360 Speaker 1: inbox, TechStuff Podcast at gmail dot com. 591 00:30:46,520 --> 00:30:48,160 Speaker 3: That's it for this week for TechStuff. 592 00:30:48,160 --> 00:30:51,120 Speaker 1: I'm Kara Price and I'm Oz Woloshyn. This episode was 593 00:30:51,120 --> 00:30:55,040 Speaker 1: produced by Eliza Dennis, Tyler Hill and Melissa Slaughter. It 594 00:30:55,120 --> 00:30:58,200 Speaker 1: was executive produced by me, Kara Price, Julia Nutter, and 595 00:30:58,320 --> 00:31:02,560 Speaker 1: Kate Osborne for Kaleidoscope, and Katrina Norvell for iHeart Podcasts. 596 00:31:03,160 --> 00:31:06,120 Speaker 1: Jack Insley mixed this episode and Kyle Murdoch wrote our 597 00:31:06,160 --> 00:31:06,640 Speaker 1: theme song. 598 00:31:06,960 --> 00:31:09,640 Speaker 4: Join us next week for a conversation about all the ways 599 00:31:09,680 --> 00:31:13,239 Speaker 4: the Internet can fuel our paranoia, especially during pregnancy. 600 00:31:13,160 --> 00:31:15,800 Speaker 1: And please do rate and review the show wherever you listen, 601 00:31:16,040 --> 00:31:18,680 Speaker 1: and reach out to us at TechStuff Podcast at 602 00:31:18,760 --> 00:31:20,880 Speaker 1: gmail dot com. We love hearing from you.