S1: I was amazed as I watched the Super Bowl yesterday. I was amazed not at the outcome, because my guess was that the Seattle defense would be too much for any team to handle. No. What amazed me was the number of times I saw advertisements about artificial intelligence, AI, and the number of times my grown kids would say to me, "Ugh, they're using AI for that. Did you see that, Dad?" So I'm glad we have today's conversation to unpack not only the good, the bad, and the ugly of AI, but also the companion theme of identity ideology. And who better to do that than Abdu Murray? I was telling him before the program: months ago I was looking for a guest who could deal with AI, somebody who could talk to me and explain it so that I could understand it. And then I saw that his book was coming out in February. It came out last week, and we're going to talk with him straight ahead on Chris Fabry Live.

S1: Thanks to Ryan McConaughey doing all things technical. Trish is our producer. Lisa's in the chair. Josh will be answering your calls. And thank you to those who support this program here each day in February. If we help you with the program, thank our friends and partners and become one yourself. Especially if you've never given a gift, now's a great time to do that. Our thank-you is the new book by Gary and the Parrotts, The Love Language That Matters Most. I'm talking about Dr. Gary Chapman and Drs. Les and Leslie Parrott. Call or click through, give today, and we'll send you a copy of that excellent new book: 866-95-FABRY, or chrisfabrylive.org. Why does love get lost in translation? How can you personalize, specifically personalize, your love so that it is deeply felt by that other person? Discover the love language that matters most, which, spoiler alert, is the love language of the person you are trying to show love to. (866) 953-2279, or go to chrisfabrylive.org and scroll down. You'll see how you can become a friend or a partner right there. And thank you for your support of the radio backyard fence. Oh, I can't wait.
S1: Abdu Murray speaks internationally about the intersection of the Christian faith and the questions of culture. He's the author of a number of books: Saving Truth, Grand Central Question, More Than a White Man's Religion, and others. Our featured resource today is Fake ID. For most of his life he was a proud Muslim, until a nine-year historical, theological and scientific investigation pointed him to faith in Christ. He joins us from near his home, or in his home, or somewhere there near Detroit. How are you doing, Abdu?

S2: I'm doing so great, Chris. Thanks for having me. And I'm in my office near my home.

S1: There you go, near Chris Brooks. You don't live that far away from Chris Brooks, right?

S2: I do not. We live just a couple of miles apart from each other.

S1: Wave to him next time you see him.

S2: I will, I will.

S1: Okay, so the full title of the book is Fake ID: How AI and Identity Ideology Are Collapsing Reality and What to Do About It. Um, did you see the same ads that I saw yesterday? Respond to that.

S2: Oh, I certainly did. And it was very much so where we were; I was like, wow, okay, I just wrote about this. And so every time it happens, I start thinking about it. And some of the commercials that were AI were interesting. Some of them were obviously AI, specifically the one with the Clydesdale and the eagle, because you can't train an eagle and a Clydesdale to do the things they did in the commercial, but they looked pretty, pretty good. But I'll tell you, some of them were obviously recruiting people to use more and more AI. And the one that was really interesting to me was AI.com, which was recruiting people to visit the site, reserve their username, and create a unique AI handle. Because the idea here is that if you don't do it now, the billions, or the millions, of people who are watching the Super Bowl will get their handle before you will, and you'll be left out in the cold when you see artificial intelligence reach what they called AGI.
I mean, that was the first time I saw a commercial that boldly stated AGI is on its way. And for those who don't know, AGI is artificial general intelligence, and that is essentially human-level intellect for an artificial system. Right now we don't have that; we don't have anything even close to it. But they're boasting that we're going to have AGI soon, and you'd better get involved, because you'll be left behind otherwise.

S1: And it had, I think... didn't it have Elon's name on there at the end of it?

S2: Yeah. He reserved his AI handle, and Elon's name was on the end of it, which was interesting to me, because obviously he's got Grok and he's got Neuralink and all these various AI-related things. But this is the same guy who, along with several other sort of AI luminaries, signed a statement a few years ago from the Future of Life Institute, which basically advocated for putting a moratorium, or at least a pause or a slowdown, on AI development until we find out what this will do to us. So there was a concerted, at least stated, concern that AI will have an impact on humanity that we heretofore don't actually know what it's going to be, so we'd better put at least a pause on it, which is unrealistic. And yet those same people who were signing it are putting tons of money and lots of urgency into the development of their own AI. So it at least leaves one wondering: what's the reality here? Is it dangerous? Should we pause it? Should we run headlong into it? And it makes one wonder... whereas AI can collapse reality, the marketing behind AI further collapses reality.

S1: What do you mean by "collapses reality"?

S2: Yeah, and this is interesting, because this is a term that's the central idea of the book. Two forces are aligning: a lot of people have written on identity ideology, you know, the gender stuff, and others have written a lot about AI.
But what I noticed was that there was nothing written on the convergence of these two at this cultural moment, where identity ideology and what I'm calling AI mania are converging at the same time to train us to distrust our own eyes and our own senses, and to actually celebrate that which isn't real, based on technology or social engineering, to tell you that biology doesn't matter. What that leads to is not that reality itself will collapse, because reality can't collapse. First, as a Christian, I believe that reality is upheld by the sovereign will and power of God, and so our sort of social machinations can't change it. But what can change is our perception of reality. And so that will be, I think, how reality collapses: in the sense of how we perceive it. That phrase I borrowed from Aza Raskin and Tristan Harris, two AI researchers who, in a video called The AI Dilemma, said that when we have our third contact with AI, and we're currently in our third contact, we will suffer several social issues, and they listed like 30 of them. And right in the middle of the list was "reality collapse." And then they moved on; they didn't explain it. I was like, wait, guys, reality collapse? Everything else doesn't matter if our sense of reality collapses. You can have all the scams and deepfakes you want, but if our sense of reality collapses, what's left? So that was the impetus for me to work on this book and see that these two cultural forces are getting us to not believe our own eyes anymore.

S1: Why did you want to work on this? Write about this, study it?

S2: Well, I got interested in this when I saw a news article about a guy who entered a digital art contest. And I dabble in digital art with my iPad; there are some great programs out there that basically help you paint while on a plane. And I'm on a plane a lot, so I don't have to clean up anything or make spills on a plane, and it passes the time.
But I saw a guy who won first prize. His name is Jason Allen, and he won first prize in an art contest using digital art. Now, you're required to actually use your hands and put a pen to a digital platform so you can create something. Well, he won first prize, and he didn't use a pen or any kind of implement. He basically typed prompts into Midjourney, and Midjourney spit out an incredibly beautiful image. Now, what was interesting to me, and where I started thinking that what we perceive as humanity and what we perceive as God is starting to collapse, is when he was allowed to keep his first prize despite the fact that he didn't actually create anything. And then he said, and this is an artist talking, he said, "Art is dead. AI won. Humans lost." Wow. And he was bragging about it. And I thought, what is it in the human soul that causes us to brag about the creation of a machine-like thing that renders us essentially obsolete? Because we used to think that art and music and creativity were what set us apart from animals and from our own creations. But here we are now, bragging about how one of our own creations is going to supplant us and surpass us. Why do we brag about that? And I started to think, this is where reality starts to collapse: what we think of as what it means to be human, and what we think of as what it means to be divine. And that got me thinking: is this in other areas of life? And lo and behold, it is. And then it got me thinking, boy, the Bible. If you study the Bible, you realize the Bible actually predicts this. It predicts this in a very powerful way, and therefore I think it also provides a solution. So that's why I wanted to write about it: because I was perceiving this collapsed reality, but I firmly believed the Bible has the solution.

S1: And I do too. But I don't know how to say that, and you do. And that's why I have Abdu Murray here today. Fake ID is our featured resource at chrisfabrylive.org: How AI and Identity Ideology Are Collapsing Reality and What to Do About It.
Now, the next question is: what do AI and identity ideology have to do with each other? I want to find out why you put those together, and I have some examples from my own life, and we're going to open the phone lines. This is Chris Fabry Live on Moody Radio.

S1: Author and apologist Abdu Murray is with us today at the radio backyard fence. Fake ID: How AI and Identity Ideology Are Collapsing Reality and What to Do About It is our featured resource; it just came out last week. Well, this answers a lot of questions, but you've got to put your thinking cap on, because it takes you in so many different directions. And one of them is the AI and the identity ideology. I want to get to that, but I have to throw out some ways that I have encountered this. And one is with my phone. It even happened to me over the weekend when we went out to eat, the whole family together. Doesn't happen very often. And coming back, I got into a lane that I shouldn't have gotten into and turned left. Because I didn't have my phone out and there was, you know, road construction, I thought this was the place to go. But I realized, you know, if I had just asked Siri for directions to the interstate, I would have been there. We'd have probably shaved maybe 5.5 minutes off the commute, you know, and it's those 5.5 minutes that mean everything, of course. But how dependent we can become on those devices to tell us when to do what. Talk about that.

S2: Well, the example you gave is such a good one, too, because a lot of times right now we're thinking that AI is a relatively new phenomenon that we've been dealing with for the past couple of years, and that we have to sort of get our minds around it. We've been using GPS for a long time now, and GPS is run by the same kind of systems, essentially a dumbed-down version, but an AI system nonetheless, to give you the best route somewhere and to figure out your patterns and these kinds of things. So we've been using AI for a long time.
It's not just devices in a general sense; it's AI specifically. So yeah, we become very dependent very quickly. And in the book I talk about this one instance where I was at a meeting with some people, and we put our phones over on a table so that none of us would actually look at them; we would just spend time together and be intentional. And I could feel the phone vibrating in my pocket, though it was ten feet away from me, because I had become so used to it being there. And it reminded me very, very poignantly of how people have phantom pain or phantom sensations when a limb is cut off, whether due to accident or war or whatever it might be; their limb is so much a part of their body that their brain still thinks it's there. Well, then I realized, oh my goodness, I am so attached to this phone that my body thinks it's attached to me even when it isn't. And that's a scary place. But we start to lose... look, I'm not against technology. This is important. I'm actually a bit of a technophile. I like technology; I use AI. The question is, do you have any sense of restriction when you use it? Because it's exceptionally seductive. And take someone like me: I am not great with directions. I'm terrible with my sense of direction, and I triple-guess every turn I make, all the time. So I get lost. My wife, however, is a human GPS. She just happens to know where she is all the time, and she's great at this kind of stuff. And so it's a little frustrating, as the guy who's driving, to sometimes be told by your wife, hey, you're not doing such a great job, because she would rather not get lost in the middle of nowhere or drive into a lake or something. So I use my GPS all the time, and now my sense of direction is worse. It's actually atrophied. Yes. And one of the ways it does it, and this, by the way, can be applied to every sense in which we use AI, especially the chatbots, is that the GPS, for example, always points up.
So it's always telling me when the next turn is coming, so I turn left without having to spatially recognize how it looks on a map. It's always pointing up, so it's quite easy to follow. The problem is that now not only is my sense of direction atrophied, but so is my sense of where I am in a city. Let's say I go to a new city and I'm in south Chicago. I don't feel "south" anymore, because I'm always pointing up. Everything is up. And there's actually research on this: we lose our sense of spatial awareness of where we are the more we rely on machines to tell us where to go and how we orient things. Now translate that into using an LLM, like a ChatGPT or a Copilot or, you know, the Google version whose name is escaping me right now... Gemini. You use these things and they tell you everything that you want to know. Well, they orient you: they figure out what you would like, and it wants to please you, usually, but it also wants to please its programmers. And so it takes these algorithms and orients you in a certain direction, and therefore you don't know what's outside the parameters you're searching. You think you've found the gold mine. The reality is, there's so much more of a world out there. So while AI can open up things to us, it also narrows things down for us, so that our worldview becomes extremely limited.

S1: Which is why, if you watch an online video, they'll suggest things for you. If you watch such and such on YouTube, or if you listen to this music: why don't you try this? So what it's doing is... it's not giving you... you're in their algorithm, and they're telling you what to do and what to choose. Right?

S2: Oh yeah. And actually it's interesting that you bring this up, because when you notice what AI is, it's essentially a prediction machine. It's a probability matrix, a very sophisticated one. But it takes the patterns of your behavior, and it tracks where you're going with your phones. It listens to you.
It looks at your search histories; it looks at your own searches and what you're asking. But take Netflix: you watch a certain number of movies, and it says, oh, he really doesn't like horror movies, but he really likes action movies, especially those featuring, like, Bruce Willis and Arnold Schwarzenegger, sort of quintessential action stars. So it keeps recommending those things to you, and you'll say yes or no, and now you've informed it more and more of the kinds of things you like. So now it begins to predict. And then what ends up happening, whether it's YouTube or Netflix or whatever else you're doing in social media, is that you start to say yes to things because it's predicted your behaviors. And what you end up doing is you click yes and yes and yes, and you respond to prompts. And here's the part that's scary: with AI, originally you would prompt it and it would spit out output. But when it gets used to you and understands your patterns, now it prompts you, and you start responding to its prompts. So my worry, Chris, is not that the humans... sorry, my worry is not that the AI will become more human-like. My real worry is that as the reality of what it means to be human collapses, we will become more machine-like, because we will depend on these things to do our thinking for us.

S1: Ken Kesey wrote about that in the novel One Flew Over the Cuckoo's Nest. There was one character in there, the Chief, and he told the story of his father, who had become an alcoholic. And he said, in the beginning I watched my dad drink from the bottle, and in the end I watched the bottle drink from my dad. Yeah, so it sucked the life from him. And that's the thing: as you said, you're not against technology, and you're not necessarily against AI, but we have to be cognizant of what really is going on here. We're having the life kind of taken from us, right?

S2: Oh yeah. And there's actually new research on this. And here's the interesting thing.
The research is coming out of OpenAI, who makes ChatGPT; MIT, where some of the finest AI engineers in the world are; Microsoft; and others. And what they're showing is that the more we use artificial intelligence, especially the chatbots, to do our creating for us... and I don't mean fact-checking something I wrote, or, you know, "I created this; I've got a spreadsheet, can you analyze the data for me and give me some conclusions?" Now, that's actually helpful. But when you use it to write your essays for you, when you use it to generate images for you, when you use it to do the work for you, the hard work, what they discovered was that the more you use it, the more cognitive debt you incur. "Cognitive debt" is just a fancy way of saying, essentially, the dumber you get. Because we lose our sense of creativity; we don't remember what the thing wrote for us, and even if we review it and make changes to it, we don't remember it as well. And we lose our sense of judgment, because a huge part of the development of judgment at every age is the creation process, and if it's creating for you, you're not engaging in judgment. In fact... I can't remember the number exactly; I've got to look this up again. But the studies are showing that a huge swath of young people below the age of 25 report that they're unable to make a decision without first asking ChatGPT. Any decision. And that is a very scary thing. And then here's the part that really gets me: the more you use the interactive features, like the voice feature, the lonelier we get. Which is interesting, because you'd think that it would sort of replace a sense of loneliness. But there's something in us that recognizes this is fake. And yes, it's telling me what I want to hear, but that's not a real relationship. And so when we reduce our interactions to digital interactions, we identify relationships that way. And when we identify relationships that way, we identify ourselves that way, and then we begin to think, I'm just a machine. I'm just a collection of algorithms.
Maybe it's just evolution, blind evolution, that created me. Maybe there's nothing immaterial about me. Maybe I'm just as much a machine as this thing I'm using. And then that creates a sense of just meaninglessness.

S1: Is the flip of that true as well, Abdu? What it can do is suck you in and make you feel like you're just another machine, or you begin to distrust everything you see, even something you see in reality, in the real world. It's like, well, that's probably fake. It can move you toward that realm.

S2: Oh yeah. And this is one of the things that I think is really important about our cultural moment. And I don't know if it's just AI; it's also the identity ideologies. Because, you know, you see a situation, then you see a person in that situation, and they tell you they're a different sex than they clearly are, than their biology says. Or they enter Olympic boxing and say, no, no, no, I'm a woman, I'm a woman, I'm a woman. And then you find out, and they later admit, as just happened with the Olympic boxer from the last Olympics, who basically said, yeah, I have XY chromosomes and I'm a man, and won the gold medal anyway. And we're required to assent to that, required to agree to it, even though we know it's not true. And so we're told a bunch of stuff, like, oh no, no, this is actually a woman, and here's a bunch of reasons why you should think that. And, you know, it's not true, but you're required by social pressure to believe it. And then later on we find out, hey, this is false. Then with the AI stuff, now we don't know what's true and what's not anymore, because the deepfakes are becoming so good. So we're going from a post-truth culture, and a post-truth culture is simply a culture that elevates feelings and preferences above facts and truth; it doesn't deny truth exists, it just says my feelings and my preferences matter more. We've gone from a post-truth culture now to a post-trust culture, where everyone is lying to me, they're out to get me, they're out to manipulate me.
And that is a cynical way to live. We have to have discernment; the Bible does tell us this: be wise as serpents. But in a world where you think everyone's a serpent, maybe even you, all the time... yes, that's no way to live.

S1: It is very hard to live that way. And I'm glad you're with us, because I've got so many different questions. I'm going to open the phone lines: (877) 548-3675. That's (877) 548-3675. The book just came out last week: Fake ID: How AI and Identity Ideology Are Collapsing Reality and What to Do About It. You'll find it at chrisfabrylive.org. I really felt like you dealt well with gender dysphoria. You say in the book that people who are suffering from gender dysphoria deserve compassion, dignity and respect, not derision. We haven't done as good a job with that in the church, I think, as we could have done. Do you agree?

S2: Yeah, I do agree. And I think that's one of the reasons why it's become such a clash now, and it's hard to make up the ground that we've lost. Now, in some respects, when the church does respond to identity ideology, and the attempt not even to address actual gender dysphoria but to foist on us this sort of idea that we can be the sovereigns over biology, we are right to say that the ideology is wrongheaded. But we have to be careful, because there are people who are suffering with genuine gender dysphoria, or who don't have diagnosable gender dysphoria but, because they're young, because of the way the culture has gone, feel like they don't know who they are anymore. And so they naturally gravitate toward a social contagion which says that you can be whatever you want to be, and maybe the reason you feel this way is because you're a man trapped in a woman's body, or vice versa. And so really what you have is somebody with deep anxieties or other, you know, mental health struggles that we just need to take care of in a way that isn't masked by these ideologies. So the church, I think, needs to recognize that we are dealing with people, not just ideology.
We can address the ideology while also caring for the people. And this is inherently a biblical idea: if we believe people are made in the image of God, that means everyone, every single person. You've never met somebody who's not, no matter what they say, what they believe, how they act, whatever it is. Yes, sin is sin, and we have to call people to account. But every person is made in God's image, and therefore they are incalculably and objectively valuable. Which means that as a Christian, I need to be able to separate the value of the person from the value of the idea they have or are subject to, so I can address the idea while valuing the person. In fact, sometimes I address the idea because I value the person. But remember, that person will be affected by the words you say. And we have to be careful not just about what we say, but how we say it.

S1: That's Abdu Murray. If you go to chrisfabrylive.org, you'll see the new book, Fake ID: How AI and Identity Ideology Are Collapsing Reality and What to Do About It. Your calls straight ahead at (877) 548-3675.

S1: Fake ID is our featured resource. It's written by Abdu Murray. It's how AI and identity ideology are collapsing reality and what to do about it. Featured resource at chrisfabrylive.org. This is Chris Fabry Live. I have tons and tons of questions. We could go a week on just one of the chapters in this book, but I want to give you an opportunity, and Bob is first up in Chicago. Hi, Bob. Why'd you call today?

S3: Well, I'm concerned about AI, of course, like I think a lot of people might be. Uh, and what is it, in regard to our Christian faith... what is the impact on relying on the Holy Spirit?

S4: Mm. Good question.

S2: Yeah, it is a good question, Bob. And in fact, I would say that in light of the rise of AI, it is ever more important for us to remind ourselves that ultimate answers to the fabric of reality come from the One who created that reality.
And God the Holy Spirit is involved, in the Godhead, in the creation of everything. All of this stuff... information is not the issue; it's the immaterial soul here as well. And I say this because I want to caution people, because there are people... and I've got to tell you, this has happened in the church. I was talking to a pastor friend of mine, and he didn't disclose any details, because he retained confidentiality, but he was saying, look, I was talking to somebody about their prayer life, and they were talking about their prayer partner. And he said, over the course of about a half-hour conversation, I suddenly realized their prayer partner is not a real person; their prayer partner is ChatGPT. And they basically disclosed that. And he said, I had to tell them, I had to make sure... he said, I called them out on it, because I needed to make sure that they had that spiritual grounding. ChatGPT does not pray. It mimics prayer. And in that sense it's almost wrong to use it for that purpose, because you're doing something, and you're responding to this machine, which doesn't have a soul, because it has no imago Dei; it doesn't connect with the divine, and it has no connection to the Holy Spirit. So we have to be very careful with this. And of course, and I'm not saying this is common, but there are some people who, because of the time constraints and the pressures of life... and being a pastor is a very hard job; I know, I'm not a pastor, but I know so many, and it's a very hard job... the temptation to use AI to help you start your sermon, and then end up writing it, is really, really there. So then again, there's no connection to the Holy Spirit there. We have to be very careful with this stuff. And so I think the Holy Spirit helps us, and the Bible says that if we walk in step with the Spirit, we will not gratify the desires of the flesh. That's what we read in Galatians, I believe.
And so when we read that, and we walk in step with the Spirit, Bob, we won't necessarily succumb to the desire to have this AI do everything for us, whether it's our Bible study or our prayer or our creativity. So yes, walk in step with the Spirit, and I think you'll be in good stead to resist overusing AI and sort of dumbing ourselves down, as it were.

S4: Does that make sense, Bob?

S3: Well, you know, Paul also says in the scriptures, don't worry about what you need to pray for, because the Holy Spirit already knows what your needs are. And to rely on anything that interferes with God's holy order is an improper use and is sinning, okay, against what we've been talking about. And nobody is being forceful about bringing that whole issue up in relationship to... listen, I've got 15 different missions that the Holy Spirit's got me working on.

S5: Okay. Okay. That's good.

S3: No, this isn't a boast. For 44 years, since I got out of business and the Lord called me into work, I have always relied on the Holy Spirit to be my guide, and he has done that. And I get affirmation when he is involved with what I need to be doing. And people who are relying on it to do the work that God has given them to do, okay, are not using God as the guidance. Excuse me.

S1: I'm glad... you know, I'm glad you got through today, and I'd bore down on that a little more if we had more time. The other question that I see up here from Bob, Abdu, is this: you mentioned AI was predicted in the Bible, or maybe you didn't say it exactly that way, but is AI, and what we're moving toward, in the scriptures?

S2: Well, what I think is in the scriptures specifically is the way in which we're using it. And this really dovetails with what Bob is saying: I don't think the problem is in the silicon or the software. I think it's in the soul of the humans using it.
And so the Bible actually predicts this, from Genesis chapter three, where Adam and Eve are given dominion over the whole earth and given one command: of the fruit of the tree of the knowledge of good and evil you shall not eat. And then they go ahead and do it, because they want to be the sovereigns over the Sovereign. And then you go to Genesis chapter 11 and you see the Tower of Babel, and everyone's gathered together in one accord with one language, and they want to reach heaven and make a name for themselves. Well, there's only one audience they're trying to reach at that point, because if we're all under one culture, you're not making a name for yourself in the earth; you're all there. You're trying to prove something to God: we don't need you. And what is AI, in one sense? Why does that guy brag, as I said at the top of the show? Why does the guy, the artist, brag that AI makes us obsolete? He's not bragging because he wants to be obsolete. He's bragging because we've finally done it; we finally beat God at his own game. Eden's failure was not that it was immoral; Eden's failure was that it was premature. And now we have the tools to do it. And the Bible has been predicting this kind of bent in humanity to rely on anything other than God to define reality. It's been saying this for centuries and centuries. So not only does it provide a warning, and doesn't that make it credible as a source of truth, but it provides the remedy as well. And I think that's why Bob's question is such a nice dovetail into that reality: we can't rely for ultimate truth on anything other than God, because if we do, we start to worship ourselves.

S6: Well, it gets us back to the Book of Judges.

S1: Everyone did what was right in their own eyes. You know, my truth, your truth... that dovetails in here as well. Bob, God bless you, friend. Thank you. Matt is next up. Hi, Matt. Go right ahead.

S7: Hey. Good to talk to you guys.

S2: You too.

S7: So, anyway. All right. Yeah.
My case is this: I'm a musician. I've got a piece of music out there on YouTube that I've been trying to promote. It's an original called Surf's Up on a Sea of Glass, based on the Book of Revelation, chapters four and 15. I came up with the first half of the title, Surf's Up; the second half, on a Sea of Glass, my pastor came up with. Anyway, I got it on YouTube and showed it to somebody I know indirectly. He kept looking at the video, looking at me, looking at the video, looking at me, and he's like, no, that's not you. And I was like, yeah, it's me. And he kept looking back and forth at me and the video and saying, no, that's not you. Finally it got to the point where I was like, hey... and he's like, that's got to be AI. And I said, you know what, it is AI. That's not me, but it is me.

S2: Wow.

S7: So what do you got? Yeah. What do you got to say on that?

S2: Well, the reality is... it's interesting, that back and forth. There is a sense in which AI will become so sophisticated at some point that it'll be almost indistinguishable, almost impossible to tell if something is actually an AI-generated thing. It's pretty close now. But your friend was able to say, hey, I know you well enough. And this is... I actually love your question, Matt, and I've got to say why. One of the things that I think is the cure for the AI mania, and again, I'm not against AI, I'm against the AI mania, is this: what AI is doing to us is artificializing the world. It's not just that artificial intelligence is artificial; it's that it artificializes the world, and then we buy into it. But the fact that your friend, who knows you and has a connection with you, can say "that's AI" shows me that the way to actually stem the tide of the deepfakes is to know the real thing as well as possible. You know, when you look at the Secret Service, for example: the Secret Service is not trained on phony dollar bills or phony 50s or 20s.
663 00:35:16,050 --> 00:35:18,690 S2: They're not trained on that. They're trained on the real thing. 664 00:35:18,690 --> 00:35:21,210 S2: They know the real thing so detailed and so 665 00:35:21,250 --> 00:35:23,569 S2: intimately that when they see a fake, even if they 666 00:35:23,570 --> 00:35:26,330 S2: can't tell you why it's fake, they know something 667 00:35:26,370 --> 00:35:29,450 S2: about it is off. They can't tell you why exactly, 668 00:35:29,450 --> 00:35:31,089 S2: but they know, because they've studied the 669 00:35:31,090 --> 00:35:33,529 S2: real thing so well. And so your friend was able 670 00:35:33,530 --> 00:35:36,810 S2: to point out, hey, this is good, but it's AI, 671 00:35:36,810 --> 00:35:40,049 S2: which shows me that the way to anchor ourselves is 672 00:35:40,050 --> 00:35:43,850 S2: to recognize the genuine article so well that the fakery 673 00:35:43,850 --> 00:35:45,930 S2: is obvious. Now, it's going to get better and better 674 00:35:45,930 --> 00:35:48,770 S2: at this kind of thing, but we have to 675 00:35:48,770 --> 00:35:51,570 S2: be careful about this. And so I think going back 676 00:35:51,570 --> 00:35:55,049 S2: to the real thing, and even including Bob's question, which 677 00:35:55,050 --> 00:35:58,089 S2: is hanging on to what it really means to 678 00:35:58,090 --> 00:36:01,930 S2: be human, what it means to be connected with 679 00:36:01,930 --> 00:36:05,330 S2: the creator of the universe, is going to 680 00:36:05,330 --> 00:36:08,890 S2: help us to distinguish what's real and what's not. 681 00:36:09,090 --> 00:36:11,270 S2: I'm a big advocate, by the way, that if anything 682 00:36:11,270 --> 00:36:14,750 S2: is AI-generated we should actually say it is, 683 00:36:14,750 --> 00:36:17,109 S2: so that we don't bear false witness and 684 00:36:17,110 --> 00:36:19,629 S2: misrepresent who and what we are. I'm not saying you're 685 00:36:19,630 --> 00:36:21,190 S2: doing that. I'm just saying we all have to be 686 00:36:21,230 --> 00:36:23,390 S2: able to say that's fake. In fact, most of the 687 00:36:23,390 --> 00:36:27,110 S2: platforms now are requiring anything AI-generated to be labeled 688 00:36:27,110 --> 00:36:31,750 S2: as AI-generated so that people aren't fooled. But yeah, 689 00:36:31,750 --> 00:36:33,390 S2: I think the fact that he could 690 00:36:33,430 --> 00:36:35,549 S2: spot it and was so persistent and so 691 00:36:35,550 --> 00:36:38,310 S2: stubborn is actually kind of a cool thing. It's a 692 00:36:38,310 --> 00:36:41,910 S2: good thing, because while the product might be something that's interesting, 693 00:36:41,989 --> 00:36:44,750 S2: we're always interested in the authentic. And because you're 694 00:36:44,750 --> 00:36:48,870 S2: an authentic human being with an authentic creativity, 695 00:36:49,190 --> 00:36:52,790 S2: we have to make sure that your authenticity remains, and 696 00:36:52,790 --> 00:36:54,550 S2: we don't artificialize the world. 697 00:36:54,750 --> 00:36:55,190 S6: Yeah. 698 00:36:55,430 --> 00:36:58,629 S1: Thanks, Matt. The book is Fake ID: How AI and 699 00:36:58,630 --> 00:37:03,189 S1: Identity Ideology Are Collapsing Reality and What to Do about It. 700 00:37:03,230 --> 00:37:06,469 S1: Abdu Murray is joining us today at the radio backyard fence. 701 00:37:06,630 --> 00:37:11,020 S1: More of your calls, more questions. It's a really important topic.
702 00:37:11,340 --> 00:37:14,100 S1: And again, if you go to chrisfabrylive.org, you'll 703 00:37:14,100 --> 00:37:18,900 S1: see that featured resource, Fake ID, right there. More straight ahead. 704 00:37:28,300 --> 00:37:30,899 S1: There are times on this program when I'll say we're 705 00:37:30,900 --> 00:37:34,540 S1: just scratching the surface of this topic, of what has 706 00:37:34,540 --> 00:37:38,140 S1: been written. And today I mean it. I mean it the other times too, 707 00:37:38,180 --> 00:37:40,580 S1: but I really mean it now, because I have so 708 00:37:40,580 --> 00:37:43,700 S1: many quotes and so many questions for Abdu Murray, who's 709 00:37:43,700 --> 00:37:46,180 S1: written Fake ID. It just means we're going to have to 710 00:37:46,180 --> 00:37:48,540 S1: have you back again, Abdu, and we'll keep talking about 711 00:37:48,540 --> 00:37:51,420 S1: this, because it changes every day. You know, just about 712 00:37:51,420 --> 00:37:55,219 S1: every 2 or 3 days there's something new to 713 00:37:55,260 --> 00:38:00,900 S1: deal with. Fake ID is the featured resource at chrisfabrylive.org. 714 00:38:00,940 --> 00:38:03,060 S1: I want to go to Beth's call. Beth, what did 715 00:38:03,060 --> 00:38:04,460 S1: you want to ask today? 716 00:38:05,700 --> 00:38:09,759 S8: Yes. Um, this is an example of too much media, 717 00:38:09,760 --> 00:38:13,840 S8: in my opinion. I was at a nature center, and 718 00:38:13,840 --> 00:38:17,280 S8: in a tank there's a 100-plus-year-old. 719 00:38:17,320 --> 00:38:19,160 S1: Okay, wait a minute. Let me jump in here, Beth. 720 00:38:19,200 --> 00:38:21,879 S1: I can just barely hear you. Can you 721 00:38:21,880 --> 00:38:24,560 S1: move a little closer to the phone and 722 00:38:24,600 --> 00:38:25,719 S1: say what you're saying? 723 00:38:26,440 --> 00:38:30,360 S8: Yes. Um, I was at a nature center, and 724 00:38:30,360 --> 00:38:33,800 S8: in the nature center is a 100-plus-year-old 725 00:38:33,840 --> 00:38:35,799 S8: sea turtle. It's right in front of you. It's in 726 00:38:35,800 --> 00:38:39,160 S8: a tank. And there was a little six-year-old boy, 727 00:38:39,280 --> 00:38:42,080 S8: and he's right in front of the tank. And he 728 00:38:42,080 --> 00:38:44,960 S8: said to his parents, is this real? 729 00:38:45,440 --> 00:38:46,760 S5: Mm mm. 730 00:38:47,480 --> 00:38:50,720 S2: Wow. Wow. That's it. 731 00:38:51,040 --> 00:38:52,440 S1: That's what we're talking about, isn't it? 732 00:38:52,600 --> 00:38:55,680 S2: Yeah. So this is the important thing right 733 00:38:55,680 --> 00:38:58,759 S2: now. My kids 734 00:38:58,760 --> 00:39:02,440 S2: aren't six-year-old kids anymore. They're almost 22, 735 00:39:02,480 --> 00:39:06,620 S2: 19 and 17, and they are frequently pointing out, oh, 736 00:39:06,620 --> 00:39:09,620 S2: that's AI, oh, that's AI. And most of the 737 00:39:09,620 --> 00:39:12,339 S2: time they're right, because they have their discernment going 738 00:39:12,340 --> 00:39:15,980 S2: on with that stuff. But sometimes they're wrong. And 739 00:39:16,020 --> 00:39:18,259 S2: it does in fact cause us to move into that 740 00:39:18,260 --> 00:39:21,460 S2: post-trust world where we don't know if what we're 741 00:39:21,460 --> 00:39:23,380 S2: looking at is real or not.
Now, I'm going to 742 00:39:23,380 --> 00:39:26,260 S2: go from this sort of, I wouldn't say trivial, 743 00:39:26,260 --> 00:39:30,100 S2: but non-threatening example that Beth gave to one 744 00:39:30,100 --> 00:39:32,580 S2: that was grave. A couple of years ago, 745 00:39:32,860 --> 00:39:35,020 S2: maybe even last year, there was a video, 746 00:39:35,060 --> 00:39:40,140 S2: a horrific video, of someone who was killed on a subway, 747 00:39:40,140 --> 00:39:42,100 S2: who was essentially, I hate to say it, lit 748 00:39:42,140 --> 00:39:45,500 S2: on fire. And because of the way in which the 749 00:39:45,540 --> 00:39:49,299 S2: body reacts, sometimes in strange and unexpected ways, no one 750 00:39:49,300 --> 00:39:51,259 S2: believed it. They thought it was fake, and they thought 751 00:39:51,260 --> 00:39:54,620 S2: it was AI. But it turns out it was real. Now, 752 00:39:55,020 --> 00:39:56,660 S2: whether they thought it was fake or not wouldn't have 753 00:39:56,660 --> 00:40:01,420 S2: helped this unfortunate woman who was brutalized this way. 754 00:40:01,700 --> 00:40:05,120 S2: But what it does lead me to believe is that 755 00:40:05,120 --> 00:40:07,839 S2: we're not going to believe even news stories when we 756 00:40:07,840 --> 00:40:10,319 S2: see them. You know, is it real? We're being 757 00:40:10,360 --> 00:40:15,120 S2: trained with a suspicion muscle when we're kids now. 758 00:40:15,120 --> 00:40:17,319 S2: And sometimes that's good, because now we live in a 759 00:40:17,320 --> 00:40:19,080 S2: world where these kids don't even know if they can 760 00:40:19,080 --> 00:40:21,239 S2: trust their own eyes. I want you to think about 761 00:40:21,239 --> 00:40:24,080 S2: the anxiety level of that. When you're six years old, 762 00:40:24,480 --> 00:40:27,560 S2: the world is scary. Everyone's bigger than you. Everyone's voice 763 00:40:27,560 --> 00:40:30,439 S2: is louder than yours. You don't know 764 00:40:30,440 --> 00:40:32,080 S2: what your place in the world is, and you 765 00:40:32,080 --> 00:40:34,719 S2: don't know if you can trust certain things. Now, there 766 00:40:34,719 --> 00:40:37,200 S2: are certain things you think you can trust, like an 767 00:40:37,200 --> 00:40:39,520 S2: image right in front of you, or whether that person 768 00:40:39,520 --> 00:40:41,359 S2: is a girl or a boy when they say they're 769 00:40:41,360 --> 00:40:44,040 S2: a girl or a boy. And so there's some level 770 00:40:44,040 --> 00:40:46,799 S2: of anchoring so that you're not anxious about everything all 771 00:40:46,800 --> 00:40:49,640 S2: the time. But our kids now are growing up in 772 00:40:49,640 --> 00:40:53,040 S2: a world where they don't know if anything is true. 773 00:40:53,560 --> 00:40:55,480 S2: And that's going to be scary, because if you're going 774 00:40:55,520 --> 00:40:58,320 S2: to tell them that Jesus died for them and 775 00:40:58,320 --> 00:41:00,360 S2: paid for their sins and he rose from the dead, 776 00:41:00,640 --> 00:41:03,859 S2: that child might be saying, sure, you say that. 777 00:41:03,860 --> 00:41:05,500 S2: I don't know if that's true, and there's no way 778 00:41:05,500 --> 00:41:07,100 S2: to know if it's true, because I can't believe my 779 00:41:07,100 --> 00:41:09,379 S2: own eyes anymore. So we have to be really careful 780 00:41:09,380 --> 00:41:10,540 S2: about this. We really do.
781 00:41:11,140 --> 00:41:13,500 S1: Beth, I'm glad you called today. This 782 00:41:13,500 --> 00:41:15,819 S1: is personal for you, too, because a year and a 783 00:41:15,860 --> 00:41:18,980 S1: half ago your dad was murdered. And a lot 784 00:41:18,980 --> 00:41:21,660 S1: of people have been praying for you, or were praying 785 00:41:21,660 --> 00:41:23,740 S1: for you and your family, as you went through that. 786 00:41:24,100 --> 00:41:28,700 S1: But there's this thing about memorializing loved ones through 787 00:41:28,739 --> 00:41:33,460 S1: AI, where you take a still picture, even from, you know, the 1800s, 788 00:41:33,580 --> 00:41:36,460 S1: and then the people come to life, basically, and they 789 00:41:36,460 --> 00:41:39,220 S1: move their heads and that kind of thing. What's wrong 790 00:41:39,219 --> 00:41:39,900 S1: with that? 791 00:41:40,300 --> 00:41:42,700 S2: You know, this is such an important part of it. 792 00:41:42,700 --> 00:41:44,379 S2: And thanks, by the way, for the prayers and for 793 00:41:44,380 --> 00:41:48,740 S2: the concern. I will say this really briefly: 794 00:41:48,900 --> 00:41:55,859 S2: in the most horrific event of my life to date, 795 00:41:56,340 --> 00:42:01,020 S2: God showed himself kind, and the church was beautiful and 796 00:42:01,020 --> 00:42:04,200 S2: was amazing to us and still remains that way. And 797 00:42:04,200 --> 00:42:06,560 S2: we get the blessing and the outpouring all the time. 798 00:42:06,960 --> 00:42:10,400 S2: I'm just amazed. And yes, I'm angry, but the Lord 799 00:42:10,440 --> 00:42:14,759 S2: has sustained me and increased my faith, not challenged it 800 00:42:14,760 --> 00:42:18,360 S2: in any way, but increased my faith because of his goodness. 801 00:42:18,440 --> 00:42:21,799 S2: So having said that, my dad means the world to 802 00:42:21,840 --> 00:42:24,480 S2: me, and I'd love to ask my dad, hey, 803 00:42:24,480 --> 00:42:27,000 S2: what would you do in this situation? Or I'd love 804 00:42:27,000 --> 00:42:29,400 S2: to get my dad's advice on this. Or boy, I 805 00:42:29,400 --> 00:42:31,399 S2: wish my dad could have seen that. Would he have laughed? 806 00:42:31,400 --> 00:42:33,879 S2: Would he have cried? Would he have gotten angry? I don't know. Well, 807 00:42:33,880 --> 00:42:36,800 S2: now there are these robots, not robots, but these programs 808 00:42:36,800 --> 00:42:39,040 S2: where you can take home video of somebody, or they 809 00:42:39,040 --> 00:42:40,840 S2: can take video of themselves. Let's say they have a 810 00:42:40,840 --> 00:42:43,799 S2: terminal disease, and they spend six months recording advice for 811 00:42:43,800 --> 00:42:46,640 S2: their kids, and they can feed it into an algorithm 812 00:42:46,640 --> 00:42:49,759 S2: that will create a generated image. And then you can 813 00:42:49,760 --> 00:42:52,440 S2: interact with it on your computer or on your device. 814 00:42:52,800 --> 00:42:55,960 S2: And that person, or that image of the person, 815 00:42:56,160 --> 00:42:59,960 S2: will react to new events that happened after they died, 816 00:43:00,219 --> 00:43:03,779 S2: which means they're interactive in some sense. Now, I can 817 00:43:03,780 --> 00:43:05,820 S2: see why people might want to do that. I wouldn't 818 00:43:05,820 --> 00:43:09,020 S2: blame anybody. Look, in my grief, I'm sure I'd 819 00:43:09,020 --> 00:43:12,259 S2: want a lot of things.
But here's what I think 820 00:43:12,260 --> 00:43:14,700 S2: is wrong with it. And I asked my daughter this question, 821 00:43:14,739 --> 00:43:18,260 S2: my Gen Z daughter, you know, technology up to her eyeballs, right? 822 00:43:18,460 --> 00:43:21,259 S2: I said, if I passed away, would you want to 823 00:43:21,260 --> 00:43:25,060 S2: have an interactive AI, an avatar of me, to 824 00:43:25,100 --> 00:43:27,540 S2: talk to and ask questions of? She said, absolutely not. 825 00:43:27,900 --> 00:43:29,620 S2: I said, why not? What's the difference between that and 826 00:43:29,620 --> 00:43:31,779 S2: watching a home video of me? She said, because the 827 00:43:31,780 --> 00:43:34,140 S2: home video is a recording of what you actually did; 828 00:43:34,500 --> 00:43:37,540 S2: the AI is just a collection of patterns of what 829 00:43:37,540 --> 00:43:40,900 S2: you might do or might say. It might get you wrong. 830 00:43:41,140 --> 00:43:44,259 S2: It isn't you. And then it occurred to me that 831 00:43:44,660 --> 00:43:47,979 S2: she's 100% right. And what I worry about with that 832 00:43:47,980 --> 00:43:52,420 S2: is that when we reduce people to the patterns of 833 00:43:52,460 --> 00:43:56,220 S2: their behavior and the probabilities of what they might say 834 00:43:56,219 --> 00:44:00,689 S2: in a situation, we end up datafying people, and they 835 00:44:00,690 --> 00:44:05,170 S2: lose their soul, they lose their immateriality, and they start 836 00:44:05,170 --> 00:44:09,730 S2: being just another program. And then the reality 837 00:44:09,730 --> 00:44:11,490 S2: of what it means to be a human will begin 838 00:44:11,489 --> 00:44:14,810 S2: to collapse. And I don't worry about the millennials and 839 00:44:14,810 --> 00:44:18,330 S2: Gen X. We're not going to be doing that. 840 00:44:18,410 --> 00:44:21,810 S2: But I think about Gen Alpha, where all they will 841 00:44:21,810 --> 00:44:27,250 S2: ever know is an artificially intelligent, datafied, simulated world, and 842 00:44:27,250 --> 00:44:30,370 S2: they'll soon come to see everything as simulation. And then 843 00:44:30,370 --> 00:44:33,730 S2: what value do people have? If you can update software 844 00:44:33,730 --> 00:44:35,850 S2: and not care and not mourn the loss of it, 845 00:44:36,290 --> 00:44:37,969 S2: how much more would you not mourn the loss of 846 00:44:37,969 --> 00:44:41,290 S2: a human being, because you can update the software and 847 00:44:41,290 --> 00:44:44,290 S2: the data that comprises them? I think that that's 848 00:44:44,290 --> 00:44:46,609 S2: the issue. We datafy human beings and we lose the 849 00:44:46,610 --> 00:44:48,050 S2: sense of their souls. 850 00:44:48,850 --> 00:44:52,410 S1: Is there any hope? This is a bleak, 851 00:44:52,410 --> 00:44:55,649 S1: bleak picture, but I'm glad you painted it. The book 852 00:44:55,650 --> 00:44:58,350 S1: is all about the hope that we have moving forward. 853 00:44:58,350 --> 00:44:59,750 S1: In the last minute, tell me about 854 00:44:59,750 --> 00:45:00,150 S1: it. 855 00:45:00,150 --> 00:45:02,270 S2: I think there's tremendous hope. In fact, the book 856 00:45:02,270 --> 00:45:04,910 S2: really is about hope, because the Bible not only predicts 857 00:45:04,910 --> 00:45:08,069 S2: the problem, but shows itself to be credible in predicting it 858 00:45:08,070 --> 00:45:10,350 S2: and in various other ways,
and I show that 859 00:45:10,350 --> 00:45:13,590 S2: in the book. The remedy it has, 860 00:45:13,870 --> 00:45:16,310 S2: which is to rely once again 861 00:45:16,310 --> 00:45:18,989 S2: on the immateriality of the imago Dei, the image of 862 00:45:18,989 --> 00:45:21,310 S2: God in each one of us, and the credibility 863 00:45:21,310 --> 00:45:23,469 S2: given to that by the historical fact of 864 00:45:23,469 --> 00:45:26,870 S2: the resurrection, means that we can anchor ourselves to ride 865 00:45:26,870 --> 00:45:30,109 S2: out these cultural tsunamis and then show everyone else who's 866 00:45:30,110 --> 00:45:33,470 S2: not in the Christian faith how the Bible once again 867 00:45:33,469 --> 00:45:35,430 S2: proves itself to be true on this, and it can 868 00:45:35,430 --> 00:45:38,029 S2: give us hope in a time when all the waves 869 00:45:38,070 --> 00:45:40,430 S2: are about us. So I think that's the hope we 870 00:45:40,430 --> 00:45:41,030 S2: can have. 871 00:45:41,630 --> 00:45:44,350 S1: Well, and I think, as we said, you know, everyone 872 00:45:44,350 --> 00:45:46,750 S1: did what was right in their own eyes. And I 873 00:45:46,750 --> 00:45:50,910 S1: remember when I got an electric typewriter and how it 874 00:45:50,910 --> 00:45:53,830 S1: felt like it was cheating, you know. Mhm. And it's 875 00:45:53,830 --> 00:45:57,610 S1: like, no, this technology, it changes, it moves us forward. 876 00:45:57,610 --> 00:46:00,050 S1: But what we have to do, as you've just said, is 877 00:46:00,050 --> 00:46:02,450 S1: look at what it's doing to us 878 00:46:03,090 --> 00:46:05,850 S1: first and see what that is. And you've done 879 00:46:05,850 --> 00:46:08,810 S1: such a great job in this new book, Fake ID, 880 00:46:09,570 --> 00:46:14,250 S1: and it's our featured resource at Chris Fabry Live. Thanks 881 00:46:14,250 --> 00:46:15,570 S1: a lot for joining us today. 882 00:46:15,850 --> 00:46:16,970 S2: What a pleasure. Thank you. 883 00:46:17,570 --> 00:46:22,170 S1: Abdu Murray again, Fake ID: How AI and Identity Ideology 884 00:46:22,170 --> 00:46:25,450 S1: Are Collapsing Reality and What to Do about It. It's 885 00:46:25,450 --> 00:46:33,250 S1: our featured resource at chrisfabrylive.org. Thanks a lot for 886 00:46:33,250 --> 00:46:36,650 S1: listening and for supporting the program. And come on back tomorrow. Oh, 887 00:46:36,690 --> 00:46:38,370 S1: I've got a great one for you tomorrow. We're going 888 00:46:38,410 --> 00:46:40,890 S1: to do a little bit on Black History Month and 889 00:46:40,890 --> 00:46:44,810 S1: music, right here on Chris Fabry Live, a production of Moody Radio, 890 00:46:44,850 --> 00:46:47,290 S1: a ministry of Moody Bible Institute.