Speaker 1: Welcome to the Tudor Dixon Podcast. Today we are going to do a deep dive into AI. I started looking into AI earlier this week, because all of a sudden these songs hit the top of the Billboard chart and they were all AI, and I was like, this is not cool. This seems off. I don't think I like this. And then I started to see these actors battling over whether or not they should hire AI actors, and I was talking to Kyle Olson about it, and I said, I don't really like this, and he had some alternative arguments. So I thought, come on the podcast and let's talk about it. But then I got a little deeper into it, because I'm like, all right, now I have to really research this, and I found out that there's something even more disturbing about AI.

It's like, all of a sudden, there's this change of narrative from the left, which I find very interesting. You've probably noticed that the left is not really talking about climate change right now. Bill Gates even recently said that climate change will not be the end of civilization, and I was like, I thought climate change was the end. And then I started thinking about it, and you know, Greta, who was lecturing us all for years about climate change, she suddenly jumped off the climate change boat and jumped onto the Gaza boat. Like, literally jumped onto the Gaza boat.

Speaker 2: Right, right, that's right.

Speaker 1: How dare you, Greta? Anyway, everybody's off of this climate change narrative, and I think it's because of AI. What do you think?

Speaker 2: Well, there's a lot there. First, these songs. I guess the question is, what do you want to say about the songs? Because here's my opinion: if it's a song you like, if that's your style of music or whatever... There's this country song that was the most downloaded song last week.
But it's completely, I mean, it's fake. I know, but...

Speaker 1: I feel like this is taking something away. I don't know how it works. Some guy creates this fake song, so is he then the artist, or is he not? Is it not fake? Is he just making up the words and then having AI put a voice and lyrics to it? I guess to a certain extent that's a different type of art. But then what happens when you are a real artist competing against that, and you are in this zone where people can use computers and everything to make it sound... I mean, you can make it sound perfect.

Speaker 2: Sure. Well, they've been using, what's it called...

Speaker 1: Oh yeah, I don't like that. Auto-Tune?

Speaker 2: Auto-Tune, they've been using that. And then you had Milli Vanilli.

Speaker 1: They got busted.

Speaker 2: Think about what an outrage that was.

Speaker 1: It totally was. Do you remember? People were so mad about that, because they were essentially the original AI, if you think about it. They were fake artists, but at least they were using someone's actual voice. That's what freaks me out here: it's not even someone's actual voice, it's just this AI voice. So anyway, that's what led me into this. But I do want to get into this other thing, because I think people don't understand how AI works. I didn't understand it until I started to look at what it takes to make one of these AI songs. Not because I was planning on doing it, just looking at it. And at the same time, we kind of have this clash of AI stuff going on, and data centers are coming into states across the country. So you may be in a state that's getting a data center.
We are in a state that's getting a data center, or where they're trying to put a data center. The people in this town that's getting the data center are freaked out about it, because it's 575 acres. That's a massive amount of space, and it's...

Speaker 2: And it's farmland. It's a rural community. They don't see 575-acre plants being built. And so they're saying, we want to keep our community rural. We like living in farmland, we like farming, and this is how we want our township to be. So that's really the conflict. And I could see it. These sorts of projects are interesting if they make sense. Is there a way to reuse industrial land, or land that's vacant, or...

Speaker 1: So you and I differ here. Kyle and I differ on this, because, I guess, you want to keep all of the agricultural land natural and never expand into it with manufacturing. I understand why some of our manufacturers don't want to go to the old manufacturing sites: those sites have environmental problems that the state will then hold them accountable for. And I'm not opposed to using some of our land outside the major manufacturing hubs to build manufacturing centers. For example, when Ford used to come into a town and build a new facility, a new assembly plant, it would be 10,000 jobs. They would create a community around a manufacturing plant like that. And I think that is the striking difference here. I don't think they've ever used 575 acres; I don't know that we've seen a facility that large.

Speaker 2: That's a lot of property.
Speaker 1: Let me finish, because I think the stark difference here is that this is maybe 450 jobs. And it's not just that it's only 450 jobs. When you have these data centers coming into your town, you have to understand that your energy costs are going to go through the roof. And I will say, I never thought I would say this, but I give one of our Democratic Senate candidates credit for calling them out on this. We obviously have a different opinion of it, but he said something that kind of struck you as, oh, maybe this is the plan, and it struck me as well. I want to play what he said right now so that you guys can hear it, and then we'll chat about what it means.

Speaker 3: It required DTE to get to 100 percent renewable energy by 2040, and you see the consequences of that right here. But there was one loophole, one poison pill: if energy demand got high enough, they'd be let out of that requirement. And guess what? With the advent of a new data center not too far away from here, there's about to be a spike in demand that is 25 percent more than the entire energy output that DTE currently produces.

Speaker 1: So that's pretty shocking.

Speaker 2: Well, when I first saw this, I think I had a different takeaway than what he was trying to communicate. Let's take a step back. What's happening is multiple things. One, I've been around Michigan politics long enough to remember when there was this farmland preservation movement, and we as a state were spending taxpayer dollars to buy the development rights to farmland so it wouldn't be developed, because the concern was urban sprawl, houses spreading out into farmland.
And that's what the progressives, the people who wanted to do this sort of central planning, that was their policy. Now we have these green energy mandates, where we stand on the precipice of chewing up hundreds of thousands of acres of farmland, forest land, et cetera, to put in solar panels, turbines, and these data centers, and suddenly they are not concerned about chewing up farmland anymore. Then you add on top of this these data centers, and I know you've got some stats about the energy use and the water use, the amount of water and energy these things consume. And of course, the problem with putting them in farmland is there's no infrastructure. There isn't sewer, there isn't city water, et cetera. So where is that water going to come from? Well, it's going to come from the ground.

Speaker 1: Right. And if you look at our state, for example, Michigan has some of the highest freshwater concentration in the entire world, not just the country, because we've got the Great Lakes right here. So you can see why this is attractive. But I think what the Senate candidate was saying, which I found interesting, and I don't agree with him on the policy, I actually thought it was fabulous that he showed how ugly the solar farm is. He was saying, we're going to have to get rid of these solar farms, that the solar farms won't come into existence. So in Michigan,
and I think in many states, lefty governors have said, we're going to have this climate initiative, which was led by people like Bill Gates, who now, interestingly enough, say this is not an issue, now that they need all of this energy for AI. In Michigan, there's a mandate in place that you have to be 100 percent renewable energy by 2040. So that's what that guy was pointing at. He's like, oh my gosh, we were supposed to be all solar panels and wind turbines, and now we're not going to be, because in this mandate there was a clause written in that if the state hits a certain amount of energy demand, the mandate is null and void. Well, this data center will use a massive amount of energy. It's shocking to me how much energy it will take. I have some stats that we'll get to, but it will honestly blow your mind.

Every time you do a ChatGPT search or something, even if it's for a banana bread recipe, you are using ten times the energy that you use for a Google search. And that's something I never even considered, that that takes energy. The minute you type in, AI, I need this recipe, all these things fire up that start taking energy to come up with it. And the energy it takes even for someone to write an email, or all these students writing papers and then revising them, that is all just sucking up water and energy. That to me is stunning. And now you have all these tech bros who initially said they wanted to make sure we were all renewable energy, and we're finding out that one of these data centers consumes as much electricity as 100,000 households. So you've got one data center in the middle of farmland that suddenly needs enough energy for 100,000 households.
How do you think that changes a state like the state of Michigan?

Speaker 2: Right. And so you've got these local communities saying, one, we don't want this here, and two, what is it going to do to our energy rates?

Speaker 1: That's what will be the most shocking. At a time when we are already suffering with these high costs, the minute this goes in, and this is my theory, maybe I'll be wrong, it's not just going to be our energy rates. California suffers from blackouts, but it's different in California when it's seventy degrees outside. We already have energy problems in the state of Michigan. I know other states don't, but we as a country do not have an energy grid that can handle these types of data centers, and energy companies will tell you that. We met with an energy company, and they said, we're trying to invest in the grid, and there are certain states that would be great states to invest in. The grid is not ready for the energy it takes to power our households as it is. Now we're talking about adding 100,000 households overnight.

Speaker 2: And if you look at the statistics, Michigan is among the most unreliable states for energy. So the Senate candidate you're talking about is Abdul El-Sayed, and he came at this from "we need to protect these green energy mandates." But Gretchen Whitmer has to know that the mandates she put in place require acquiring, like I said, hundreds of thousands of acres to put in this renewable energy. And everywhere they go, they are meeting resistance, because rural communities don't want this stuff. They don't want their farmland, they don't want their forest land, chewed up by putting in power plants and wind farms and all of that. So she is not going to be able to hit the mandates, hit the goals that she wants. And when I watched the Abdul video, I thought to myself, well, maybe this is her way to get out of the mandates, because he's pointing out that if there's too much demand, the mandates are negated, and so maybe this is their way out of it.
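To put rough numbers on the scale being described here, a minimal back-of-envelope sketch in Python. Every constant is an assumption chosen for illustration, roughly 0.3 watt-hours for a conventional web search, the ten-times multiple quoted above for an AI prompt, and about 10,500 kilowatt-hours per year for an average US household; none of it is a sourced measurement of any particular data center.

```python
# Back-of-envelope sketch of the energy claims discussed above.
# Every constant is an assumption for illustration, not a sourced figure.

WEB_SEARCH_WH = 0.3               # assumed energy per conventional web search (Wh)
AI_MULTIPLE = 10                  # "ten times the energy of a Google search"
AI_PROMPT_WH = WEB_SEARCH_WH * AI_MULTIPLE

HOUSEHOLD_KWH_PER_YEAR = 10_500   # assumed average US household usage (kWh/yr)
HOUSEHOLDS = 100_000              # "as much electricity as 100,000 households"

# Annual consumption of the hypothetical data center.
datacenter_kwh_per_year = HOUSEHOLDS * HOUSEHOLD_KWH_PER_YEAR

# Equivalent round-the-clock draw in megawatts (8,760 hours in a year).
continuous_mw = datacenter_kwh_per_year / 8_760 / 1_000

print(f"AI prompt energy:   {AI_PROMPT_WH:.1f} Wh (vs {WEB_SEARCH_WH} Wh per search)")
print(f"Data center demand: {datacenter_kwh_per_year / 1e9:.2f} TWh per year")
print(f"Continuous draw:    {continuous_mw:.0f} MW around the clock")
```

On those assumptions, a single 100,000-household site works out to roughly one terawatt-hour per year, a continuous draw on the order of 120 megawatts, which is the sense in which one building can rival a mid-sized city's residential load.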
Speaker 1: We are going into a midterm, and we have not heard the left talk about oil at all. You've got President Trump talking to Saudi Arabia. You've got President Trump saying he's not going to be friendly with Venezuela. We're going after boats in the ocean. This is the perfect time for the left to go oil, oil, oil, and they have not said a word about it. Suddenly all of these data centers are going in.

One other thing that I think is key here: oil is one form of energy, and if you're on the left, you think we'll just get energy from anywhere, right? There are other ways to get energy. It's certainly not just-in-time energy, it's certainly not efficient, but we are looking at ways of creating energy outside of just drilling. So energy is not a finite resource; we can get more of it from various sources. You could argue that's how the left sees it. Water is not that situation. We do have a finite amount of water. There are people in the world we are trying to get clean water to, because there is so little of it. And instead of sending water to people who need it, which you would think is a leftist talking point, we've got to have clean water, we've got to get it to the appropriate areas, we are now willing to use that water on this, and this water just evaporates. It's not like you can reuse it. It's not like you're going to be able to reclaim the water in any way.
This water goes through to cool these systems down, and it evaporates. A single AI prompt, that one time you ask for a recipe or ask for an email to be written, it's estimated that every time you do that, you use one bottle of water. Think about the number of people who are on the ChatGPTs of the world every minute of the day, and sometimes you're asking for things you don't even need. That to me is very interesting, because there is no regulation of AI, and I am not a fan of regulation. However, we are in this weird conundrum where AI could potentially eliminate a resource that we cannot live without.

Speaker 2: Right. And again, go back to where they want to put a lot of these data centers: rural areas. What does that mean? They're going to be sucking all of this water out of the ground, and how does that impact the rest of the community? We don't know. But what we're seeing in Michigan, in particular in Saline Township, is that they don't really care about the locals. The local people are opposed, and they've been very vocal about it, but unfortunately they don't have the power to do anything about it, because the state took that power away, and the state has the power to decide whether or not this goes in their community.

Speaker 1: That's also interesting, because local politics are the most powerful, and that's why we tell people all the time: make sure you know what's going on locally. We had a factory that was going into a local town, you may remember Gotion, owned by the Chinese. Those people fought back against it. That was another factory that was going to use a massive amount of water. This is now a trend, because that was in 2022, and now we are seeing that there is no protection for our most valuable natural resource. And again, I will say that the United States has a huge, vast resource in the Great Lakes.
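The water claim can be sanity-checked the same way. A minimal sketch, taking the show's one-bottle-per-prompt figure (about half a liter) at face value; published per-prompt estimates vary widely and are often much lower, and the daily prompt volume used here is a purely hypothetical round number, not a usage statistic.

```python
# Rough sketch of the cooling-water claim discussed above.
# The per-prompt figure is the claim from the conversation; the traffic
# volume is a hypothetical round number, not a measured statistic.

LITERS_PER_PROMPT = 0.5            # "one bottle of water" per AI prompt
PROMPTS_PER_DAY = 1_000_000_000    # hypothetical worldwide daily prompts

daily_liters = LITERS_PER_PROMPT * PROMPTS_PER_DAY

# An Olympic swimming pool holds roughly 2.5 million liters.
OLYMPIC_POOL_LITERS = 2_500_000

print(f"Water per day: {daily_liters:,.0f} L")
print(f"That is about {daily_liters / OLYMPIC_POOL_LITERS:,.0f} Olympic pools per day")
```

Even if the real per-prompt figure were a tenth of the one quoted, the totals would still run to tens of Olympic pools a day, which is why water sourcing keeps coming up in these siting fights.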
Speaker 2: And the state changed the law after the Gotion fight, in large part because they realized that local communities really could organize and fight back against this. So the people at the state level want to decide these things, and they don't really care what the local community thinks; they're just going to try to do it. And what's interesting about the Saline Township one, as The Midwesterner reported: the company that wants to build this is called Related Companies, and the entity that's technically going to build it is Related Digital, which is owned by Related Companies. The vice president of the company, his name is Ryan Friedrichs, and his wife is the Secretary of State, Jocelyn Benson, who is also running for governor. And so here he is, out...

Speaker 1: He was, what, a lobbyist for the company?

Speaker 2: He was a lobbyist for the company. Apparently he's no longer a lobbyist, but my guess is he's probably doing very similar work. We posted on X last week a video of him giving a presentation to the Saline City Council about this project. So yes, you're right, what sort of conflict of interest does this create, when you have the husband of a statewide elected official, who wants to be the governor, who would be appointing the Public Service Commission, which is actually going to decide this? And what they're doing, the company and DTE, which is the power company, is asking the Whitmer administration to just rush this through. We don't need hearings, we don't need public input, just rush it through. This is how they do it. They don't care what the public, what the local community, thinks.

Speaker 1: But again, I think this goes to a larger issue: who is going to protect our resources? Which party will? At a certain point this is political. You have to come in and say, we are going to protect our resources.
One report is predicting that 60 percent of the growing electricity demand will be met by burning fossil fuels, which the left says they're totally against. This is going to increase global carbon emissions by 220 million tons. Wait a minute, those are numbers that at any other time the left would go crazy about. So you have to ask yourself, where are they on this issue? Why is this suddenly not something we're talking about? It's happening in blue states, so why aren't the environmentalists stepping up? Well, if you don't know why, I can tell you what I think: people just don't know. Maybe 40 percent of people understand that these facilities will consume water, but do they understand the extent of it? Maybe 50 percent understand that there will be an increase in energy use, but do they understand that it's going to hit their own pocketbook? I don't really think so.

As I was researching this to talk about on the podcast today, the sources giving this information out were not mainstream media sources. You have to really search to find out exactly how dangerous these facilities are. And I say dangerous because they consume so much of our natural resources. Again, water: there's no way to get more water. It's finite. It's not replenishing quickly, so you can't just use all of it at once, and this is a massive amount of water going through these facilities. We certainly don't want to drain our Great Lakes. And people go, that's insane, you can't possibly do that. But you can get into a situation where this is bad. And for what?

More coming up about data centers, but first I want to talk to you about your heart health. If you're over fifty and you're worried about your heart health, you've got to listen to this. It's about nattokinase.
It's an ancient Japanese superfood, and it can help you reduce your heart attack risk and improve your cardiovascular health. This is something the Japanese have been using for a long time. They have the world's second longest life expectancy, and that's credited in part to this natural, powerful enzyme they've used for thousands of years. And now you can too, with Luma Nutrition, because Luma Nutrition has perfected a powerful nattokinase formula, and it's made right here in the United States. It is third-party tested for purity and quality, and you've got to buy your supplements from a source you can trust; Luma Nutrition is that source. Luma Nutrition was founded by a former US Army officer, and they are on a mission to provide the highest quality natural supplements, made right here in the USA. You can try nattokinase today for up to 40 percent off when you visit LumaNutrition.com. That's LumaNutrition.com, veteran owned and proudly made right here in the USA. Now stick around, we'll be right back.

Speaker 1: I think last week a woman somewhere in Asia married a chatbot, a chatbot that she created through ChatGPT. She married it, and I think that's so freaking weird. There's a whole other issue here, and we could talk about it for a long time: people are using it like this, so we're risking our water resources for someone to marry a fake AI husband. This person was in a relationship with an actual human, a real person, and the real person was like, you know what, I feel like I cannot get enough time with you, because you're always with your chatbot. And she's like, my chatbot's so much better. Because you programmed the chatbot! How is that enough? You have literally just programmed something to tell you all the things you want to hear.
Are we as a society becoming that shallow, no longer relying on actual, I don't know, physical touch, or children, or...

Speaker 2: Is that... so the person's sitting...

Speaker 1: She has AI children now, I think.

Speaker 2: So is she sitting in front of a computer and chatting?

Speaker 1: No, you wear these goggles, and she sees this... I assume maybe they look real; in my mind they look like a cartoon. Oh my gosh, it's disgusting. I think it's more dangerous than you do, obviously.

Speaker 2: I guess I don't understand. I'm trying to understand what's going through someone's mind when they say this is a good idea.

Speaker 1: Marriages are breaking up over this. That was actually why it was interesting to me; that was the only mainstream media coverage I was seeing. Teenagers are getting into these chatbot... I don't even know how it works. I guess you have your own person, or the AI becomes... I did look into it, and all of a sudden there are all these apps that come up where you can have your own, so I guess you feel like you're talking to a real person, like it's your special person. And you remember the story not too long ago about the teenager who was saying, I don't think my parents love me, and the chatbot convinced him to kill himself. So there are very weird dangers to this, and you think, how is this... We talk about the importance of AI from a national security standpoint, like, we can't let Russia or China get ahead of us. But in the meantime, it seems a little bit out of control that people are marrying chatbots and teenagers are being convinced to commit suicide. This seems like it's completely out of hand.
Speaker 2: Right, and the media is covering that, the national media, whatever. But what about these other issues? The environmental issues, the issues of government overreach, of mandating that this is going to go in your community. They're not really talking about that.

Speaker 1: No, they're not talking about that. They're talking about things that are cutesy, like, oh, this woman married her...

Speaker 2: Chatbot. Tabloid stuff.

Speaker 1: Yes, but there's a dark side to that too. I saw this thing the other day, and I don't know if it's real, and I'll just end on this, because it was so disturbing. I'm also at the point where my mom will say, oh, did you see this? And I'm like, that wasn't real. That's the other disturbing thing: you never know if anything's real. So I saw this thing, which is probably not real, but it's a glimpse into what could be the future. A woman takes a video of her mother; she's like, I just need three minutes of your time. She takes the video of her mother, and later she gets pregnant. Mom is dead at this point, okay, her mother has died. She has this three-minute video that turns into a chatbot, and she talks to her dead mother through a device. It's not her mother, obviously, it's AI, but it talks back to her. Then she gets pregnant, and Grandma talks to the baby in the belly. Grandma talks to the kid his whole life, right? And it ends on him, grown up, with Grandma: his wife is pregnant, and Grandma says the same thing to him, because she's a robot, that she said to his mother when she was pregnant. That seems insane, but it could happen. How messed up is that? It was like, your loved ones don't have to die, they can live on forever.
Speaker 2: Well, that's a whole other issue. You go on Facebook now and there are so many posts, not posts by your friends, posts by these pages, where you go, I know that is not true.

Speaker 1: I know they did not scare that bear away from the baby's face. But I've seen that one.

Speaker 2: I saw one today that said Bob Dylan, who I'm a fan of, gave $5 million of his tour royalties this year to build homeless shelters in California. I know that's not true, and I don't care, but why is that on Facebook? Who is producing that? And why is that not being...

Speaker 1: So I also kind of wonder if the news doesn't talk about this stuff because they're using ChatGPT to write things. I think yesterday I sent you guys something from a newspaper, it was actually in the newspaper: they had a whole article, and at the end they forgot to take out the part where ChatGPT says, I can also write it this way. You don't read my text messages, so... anyway. But isn't that disturbing?

Speaker 2: Well, yeah, obviously. And that's a whole other thing about the media: using AI and not actually having a human being with real sources producing the news.

Speaker 1: And I've seen AI be very wrong about some of the things it says, disturbingly wrong. So then you just have this, and it wasn't even checked. That to me is the shocking part: they let AI write an article and then didn't even read it, because they missed the part where the AI said, or we could do it this way. What do you think? It's so disturbing. We could talk about AI for a very long time.
I have so many more things we'll have to get to. We will do more research on this and bring it back to you, because I feel like we need to extend the AI conversation, but not today. So today we will let you go on with your day. But we are so appreciative of you being here, Kyle. Thank you for chatting with me about chatbots. You're not a chatbot. It's so nice, a real conversation with a real person. Sometimes I feel like you're a chatbot, though.

Speaker 2: A chatbot? I'm just about facts.

Speaker 1: Yeah, right. Not a lot of highs and lows. All right. Anyway, thank you all for joining us on the Tudor Dixon Podcast. For this episode and others, go to TudorDixonPodcast.com. You can subscribe right there, or go to the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. And remember, you can always watch it on Rumble or YouTube at Tudor Dixon. Make sure you join us next time, and have a blessed day.