Speaker 1: I've lived in LA for a decade, and this whole time, I haven't owned a car. When I tell people that, they usually look at me weird. And yes, riding the bus and walking and using a bike is less convenient, but at this point I'm used to it. But sometimes I do wonder if I should just give in and buy a car like everyone else. So to help me decide, I did what a lot of people do these days when they're weighing options: I asked AI. I opened up Claude.ai and I input all my current transportation costs. I put in my bus fares, my bike, the cost of my occasional Uber, and I asked it to compare that to the average costs of car ownership in Los Angeles: parking, gas, insurance, repairs, all that sort of thing. And I asked it to give me the pros and cons on either side, and it did. The big con is convenience, which I already knew. And on the pro side, it said that I was saving thousands of dollars per year. But it added one extra thing. It said that I could have the nice feeling of knowing that I was also being eco-friendly. And I thought, hold on, wait a second. Eco-friendly? I just spent half an hour running scenarios through an LLM, which I know is built off the back of a massive amount of computing, which in turn means a massive amount of energy. So am I actually helping the environment here? Or am I hurting it?

So this week I set out to answer what seems like a pretty simple question: how bad is AI's environmental impact, really? And yes, before you ask, I did consider asking Claude, and maybe ChatGPT, about AI's own impact on the environment. But then I figured, you know what, maybe this is a question I should ask actual human beings. And I found a couple of people who've been studying this stuff for a while to help me parse all of this.
Speaker 1: Is there a way that I can compare? Is my AI usage worse or better than car usage, or worse or better than my impact on the environment from eating meat or something like that? Are we able to make those kinds of comparisons?

Speaker 2: That's what carbon footprints were kind of invented for, so you can make this type of comparison. If you're driving a fossil fuel based car, you know exactly how much gas you're using and what that might mean in terms of carbon emissions. That's pretty straightforward. It's much harder to do this for AI.

Speaker 1: I'm afraid. From Kaleidoscope and iHeart Podcasts, this is Kill Switch. I'm Dexter Thomas.

Speaker 2: I'm sorry.

Speaker 1: If you use social media, you've probably seen people being criticized for using AI, and depending on who you hang out with, that criticism can be kind of different in how it shows up. When I see someone post an AI-generated image or some AI-generated text and there are angry comments in the comment section, it's usually one of two types. The first is people saying that it's disrespectful, that by posting AI-generated poetry or drawings you're devaluing the original artists who didn't consent to having their work fed into an LLM. That one's pretty easy to understand, even if you don't agree with it or think it's a big deal. The other comment I see a lot of is people saying that using AI is destroying the environment. Figuring out whether that's a big deal or not is a little bit less straightforward.

Speaker 2: What I tried to do in my research, I tried to keep track of how the global electricity consumption of AI is developing.

Speaker 1: Alex de Vries is the founder of Digiconomist and a PhD candidate at the VU Amsterdam. He's been researching the sustainability of new technologies for about a decade.
Speaker 2: The way I do that is by looking at how many machines, these specialized AI devices, are being produced by the AI hardware supply chain, and then considering their power consumption profile, how much power is now being consumed by all of these devices. Which is a very imperfect way of keeping track of this, but it's kind of like the only tool you have available at the moment.

Speaker 1: And even Alex is having a hard time keeping up with this. When I called him, he was in the middle of putting together new research. Back in twenty twenty-three, his data showed that by twenty twenty-seven, new AI servers sold could use the same amount of energy annually as the yearly energy consumption of a country like Argentina or the Netherlands. But things have accelerated. His current research shows that it won't take until twenty twenty-seven for that to happen. At this rate, we're going to hit that mark sometime this year.

Speaker 2: Simply because now the amount of devices that's being produced by the AI hardware supply chain is way higher than it was two years ago.

Speaker 1: So it's even exceeding the pretty bleak estimations that you made a while ago.

Speaker 2: Oh yeah. It's just that the hype is so big, and the demand for this type of hardware is so big, that the numbers are going up much faster than could be anticipated just two years ago.

Speaker 1: But hold on, before we get too much further, let's just clarify what we're even talking about when we say AI. If you could break it down for me, how does artificial intelligence use natural resources?

Speaker 3: Yeah, it's a general umbrella term that includes many different things. But right now, if you're talking to a random person on the street, when they say AI, they're referring to large language models, or maybe image generation models. So these are the generative AI models.
Speaker 1: Shaolei Ren is an associate professor of electrical and computer engineering at the University of California, Riverside, and he's kind of a colleague of mine. Our fields are completely different, but a couple of years ago I taught a class in the building right next to his. I'd had no idea that on the same campus there was an expert who'd been researching the environmental impact of generative AI the whole time. And I thought, perfect, this guy's kind of a colleague, so I can stop doing all this research on my own and just go ask him. Can you give me an idea of how, say, car usage compares to usage of an AI model?

Speaker 3: I would say, having a large language or medium-sized language model write roughly ten short emails could be consuming a quarter of a kilowatt hour of energy. So that's roughly enough to drive a Tesla Model 3 for one mile.

Speaker 1: Or, as Alex puts it:

Speaker 2: ChatGPT must be running on something like five hundred megawatt hours a day, which is enough to power a small city.

Speaker 1: Basically, ChatGPT's overall daily energy use is about the same as powering every home, every grocery store, every street light in a small city like San Luis Obispo in California, or Ithaca in upstate New York. But what does that actually mean for me and you? How much energy does it take to just ask ChatGPT one question?

Speaker 2: On a per-interaction basis, it's actually not that much. You're talking about something like three watt hours maybe per interaction. That's something like a low-lumen LED bulb that you have running for one hour. It's not a lot of power, but it's nevertheless significantly more than a standard Google search.

Speaker 1: In one second. Just as an aside here: we usually don't think of something like a Google search as using electricity. I mean, your phone or your computer is already on, so what does it matter if you're typing stuff into it or not?
But on the other end of that Google search you typed in, there are servers, and those are using energy. So as we keep going in this episode, maybe think about that: on your end you're not seeing any energy used or any environmental effects, but doing a Google search, watching a video, or even downloading this podcast, that does use some amount of energy.

Speaker 2: Even Google's CEO at some point commented, like, hey, interacting with these large language models takes ten times more power than a standard Google search. And that would mean that if you're talking about three watt hours per interaction with a large language model, then for a standard Google search it would be like zero point three watt hours, which is a very, very tiny amount.

Speaker 1: Just to explain here, a watt hour is a unit that tells you how much energy a device uses over time. For example, a sixty-watt light bulb running for an hour uses sixty watt hours. A single Google search uses about zero point three watt hours. That's enough to power that same light bulb for around eighteen seconds. But now there's that AI add-on that comes stacked by default on top of every Google search, which takes that number up ten x, up to three full watt hours per search. That's a little different. Now you're running that same light bulb for three full minutes. And then...

Speaker 2: But it's of course in the number of interactions where these numbers start to stack up quickly. Because if you're talking about Google scale, you're talking about nine billion interactions a day. Going at three watt hours per interaction, then, interestingly, the whole company Google would require as much power as Ireland just to serve its search engine, if that was the case.

Speaker 1: Wow. Using as much power as a small country sounds wild, but if we think about it, it kind of makes sense. We've, by default, ten-xed our energy use overnight, across nine billion searches a day. That energy use is going to add up pretty fast.
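The arithmetic being quoted here is easy to sanity-check. Below is a minimal Python sketch that reproduces it, using the rough figures cited in this episode: about 0.3 watt hours for a classic search, about 3 watt hours with an LLM in the loop, and roughly nine billion queries a day. These are the guests' back-of-the-envelope estimates, not official company disclosures, and the variable names are only for illustration.

```python
# Sanity check of the energy figures quoted in this episode.
# All inputs are rough public estimates, not official disclosures.

BULB_WATTS = 60.0        # the 60 W light bulb used as a reference above
SEARCH_WH = 0.3          # estimated energy per classic Google search (watt hours)
AI_SEARCH_WH = 3.0       # estimated energy per LLM-assisted query (~10x more)
QUERIES_PER_DAY = 9e9    # rough number of Google searches per day

def bulb_seconds(watt_hours: float, bulb_watts: float = BULB_WATTS) -> float:
    """How long the reference bulb could run on this much energy, in seconds."""
    return watt_hours / bulb_watts * 3600

print(f"Classic search:     ~{bulb_seconds(SEARCH_WH):.0f} seconds of bulb time")        # ~18 s
print(f"LLM-assisted query: ~{bulb_seconds(AI_SEARCH_WH) / 60:.0f} minutes of bulb time")  # ~3 min

# Scale the per-query number up to Google-sized traffic.
daily_gwh = QUERIES_PER_DAY * AI_SEARCH_WH / 1e9    # watt hours -> gigawatt hours
yearly_twh = daily_gwh * 365 / 1000                 # gigawatt hours -> terawatt hours
print(f"At 9 billion AI queries/day: ~{daily_gwh:.0f} GWh/day, ~{yearly_twh:.0f} TWh/year")
# Roughly 27 GWh/day, or about 10 TWh/year: the nation-scale order of
# magnitude behind the "as much power as Ireland" comparison above.
```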
Speaker 1: But there's another thing to consider when we talk about AI's energy use: the difference between training the model, that is, giving it a bunch of data to teach it how to work, and using it, like when you ask it to write a cover letter, or I ask it if I should buy a car. When we talk about AI and the energy consumption that can go into AI, there's different phases, right? There's the training phase, and there's me actually sitting down and asking an agent a question. Can you break that down for me?

Speaker 3: The training part, we call it learning. So based on the data, we try to optimize the parameters so that when we see some new queries from the users, we can give you as accurate an answer as possible. And training is really one time. Of course, later we're going to do some updates, fine-tuning. Inference is when the users actually interact with the model, and depending on the popularity of the model, once it gets trained, it could be used many hundreds of millions of times, or even billions of times. If you train a large language model like Llama three point one, according to the data released by Meta, the air pollutants we generated through the training will be roughly equivalent to more than ten thousand round trips by car between LA and New York City.

Speaker 1: Ten thousand round trips by car. Yeah, so that sounds bad. That sounds like a lot. But is that a one-time... it's just the one time?

Speaker 3: It's a one time.

Speaker 1: Let's clear something up here. That number, ten thousand round trips from LA to New York by car: it's not just about carbon, it's about air pollution, specifically things like nitrogen oxides and fine particles that come from power plants and can get deep into your lungs. This isn't theoretical. This is stuff that raises risks of diseases like cancer, and it doesn't just affect people next to the place where all those computers are.
Pollution travels, and it lingers. So what Shaolei's talking about here isn't just numbers. His calculations are showing that training a single model the size of Meta's Llama three point one can produce that level of pollution on its own. So yes, training these models is a one-time hit, but it's a big one. If we're talking just about energy usage, using an LLM to, say, write ten emails might be like driving an electric vehicle for a mile. And since an electric vehicle is maybe three times more efficient than a gas vehicle, figure those ten emails might get you a quarter to a third of a mile in a regular car. And yeah, maybe these are relevant numbers for me and my decision about whether or not my AI usage is counterbalanced by me not having a car. But these numbers are just estimates, and we are going to get to that. The bigger issue here is that running those data centers doesn't just use electricity. And this is where Shaolei's research comes in, because we've heard about AI's carbon footprint, but what about its water footprint, which could be a much bigger concern for us living here on Earth? That's after the break.

Speaker 1: So you had a study come out last year called "Making AI Less Thirsty: Uncovering and Addressing the Secret Water Footprint of AI Models." What made you want to look at this?

Speaker 3: Maybe that was due to my childhood experience. I spent a couple of years in a small town in China where we only had water access for half an hour each day, so we just had to think about how to use water wisely and use every possible means to save water. Then in twenty thirteen, I saw this issue and I wanted to find out more about it: what about the water consumption? And nobody knew at that time.

Speaker 1: A big environmental impact we don't talk about as often as carbon emissions is water usage.
And the impact that water usage has on all of us depends on where that water comes from and where it goes. When it comes to AI, a main use of water is to cool down the data centers, which, as we know, use a lot of energy. This is how they make sure that they don't overheat.

Speaker 3: To prevent the servers from overheating, usually we use water evaporation, and that's a very efficient way to move the heat, to dissipate the heat to the environment. And this water evaporation could be in the cooling towers, which are essentially evaporating water twenty-four seven.

Speaker 1: When water evaporates from a data center's cooling system, it goes out into the air and is basically considered gone, at least from the local supply. You might be thinking of the water that you use when you take a shower: that water goes down the drain, it gets treated, and it can be reused. But evaporated water rises up into the atmosphere and you can't reuse it. It can eventually come back down as rain, but that takes a while.

Speaker 3: Some tech companies, they can use over twenty billion liters of water each year.

Speaker 1: Twenty billion?

Speaker 3: Yeah. That number basically is the same as some major beverage companies' annual water consumption, the water they put into their product, basically the water we drink from a bottle of water. Those are the water consumption for the beverage industry. So in some sense, AI is turning these tech companies into a beverage company in terms of water consumption.

Speaker 1: Nobody's drinking that bottled water or those sodas. It's just evaporating.

Speaker 3: Yes, yes, yes.

Speaker 1: One important thing here is that when Shaolei's talking about water, he's talking about a specific kind of water. For example, you might have heard that for every kilogram of beef, it takes fifteen thousand liters of water. But ninety percent of that water is what's called green water.
That's water that's naturally stored in soil and used by plants, like rainwater. It doesn't have to be clean enough for people to drink. It would be nice if data centers could use that, but that's not really practical for their usage. They rely on what's called blue water, the stuff that's clean enough for humans to drink. So when Shaolei is comparing a tech company's usage of water to, say, Pepsi's global use of water, this is a pretty direct comparison. You used the phrase, when you're evaluating GPT-3, that GPT-3 needs to drink a certain amount of water.

Speaker 3: Roughly ten to fifty queries for five hundred milliliters of water, so basically a bottle of water.

Speaker 1: Let's pause on that number for a second. Ten to fifty queries, the kind of thing you might do in a single session using ChatGPT, that could drink half a liter of water. I'm pretty sure that me going back and forth about buying a car, I probably used about a liter. And that's using conservative estimates. Shaolei and his team were focusing on GPT-3, which was released back in twenty twenty. Even five years later, OpenAI hasn't released all the details researchers would need to give us a clear picture of its environmental impact. Do the companies know how much water they're using?

Speaker 3: Of course I can't really speak on their behalf, but I think they do. They could figure out the water consumption easily, because they know their energy use, they know the water efficiency of the cooling system, they know where they build the data centers. So they have the information, but we're not seeing their own disclosure.

Speaker 1: By this point you might be picking up on a recurring theme here. Putting a specific number on the impact of AI is basically impossible, and it's not because the math is too difficult.

Speaker 2: The thing is, the tech companies are also refusing to tell us exactly what's going on.
So if you take Google's environmental report, it will show you the numbers are bad, because in twenty twenty-three they show that their carbon emissions were up like fifty percent compared to five years before, and they were pointing to AI as the main culprit. They were saying, okay, data center infrastructure is adding to our carbon emissions, we're using more electricity. And at the same time, they just don't specify exactly what's going on with regard to AI. They say that making distinctions is not meaningful at all, even though, weirdly, Google was the company that just three years ago was in fact making this distinction. They were disclosing that ten to fifteen percent of their total energy costs were related to artificial intelligence. Now they've stopped doing that. They don't want to tell us anymore.

Speaker 1: All of a sudden, they... it seems like something changed there. What do you think changed?

Speaker 2: The numbers got big. That's what changed.

Speaker 1: Okay, not to spoil the end here, but it looks like I'm not going to get a direct answer to my question. But at least I have something of a ballpark, even if it's a conservative one. And I also know that we're using AI every day, for everything. We might not know the exact environmental impact of AI, but we do know that it's increasing. So what do we do about it? That's after the break.

So in this episode, we've been having some trouble figuring out the exact environmental costs of AI. But this is a pretty common problem. I mean, my friend Matthew Gault wrote up an article at 404 Media explaining that the Government Accountability Office, which is a nonpartisan group that answers to Congress, is struggling with the exact same thing. They came up with roughly the same numbers that we talked about earlier.
They put together a forty-seven page report that acknowledges that even after interviewing agency officials, researchers, and experts, they're still left with having to do estimates, because, as they said, quote, "generative AI uses significant energy and water resources, but companies are generally not reporting details of these uses." So even the US government has no idea exactly how much carbon we're pumping out or how much water we're pouring into the sand. And this is an issue, because when researchers like Shaolei and Alex were first looking into AI's environmental impact, the biggest concern was training. That's the one-time process of feeding those massive data sets into the powerful machines. That's what was making headlines for energy use. But then came ChatGPT, and suddenly people weren't just training models, they were using them, all the time. And that shift changed everything.

Speaker 2: As an end user, you can't even manage it properly, because the companies are not telling you. So it's not like, when you're interacting with ChatGPT, that ChatGPT is gonna tell you, okay, be aware, now the carbon footprint of this conversation has already exceeded this amount. OpenAI knows this kind of stuff. They could tell you, but they won't. And then other people are left trying to make some kind of estimate to figure out what might be going on. We also see that they are kind of downplaying the impact of what they're doing here. I mean, we see their environmental reports are disasters, the carbon emissions are shooting up, and the only thing they're saying is, like, okay, don't worry about it, AI will solve this in a couple of years from now.

Speaker 1: So the thing that's causing the problem is going to solve the problem?

Speaker 2: Also, yeah, that's the excuse they're using. AI is going to solve it. It's bad right now, but everything will be better in a couple of years, trust us. But it's one hundred percent wishful thinking.
And to be honest, if you look at the whole history of technological developments, even if we do end up realizing a lot of efficiency gains with AI, and this is definitely not a given, it doesn't mean that our resource use in total is going to go down. This is the infamous Jevons paradox.

Speaker 1: Jevons paradox is a concept that comes up a lot around AI recently. Basically, in the Industrial Revolution, coal-powered engines started to get more efficient, and some people assumed that, okay, this is going to mean that now we're going to use less coal overall. But an economist named William Jevons said, no, this is going to have the opposite effect. As coal-powered energy gets cheaper, demand will increase, and total consumption of coal won't go down, it'll go up. He was right, and that effect seems to keep repeating.

Speaker 2: Despite all the efficiency gains that we had, we're not saving on resources, we are using more resources.

Speaker 1: And essentially what you're saying here is, even if we are able to make AI more efficient, we're just going to use it more, and so any efficiency gains are going to be offset by the fact that we're just constantly using this more and more and more.

Speaker 2: One thing that's extra annoying with AI is that there's also this bigger-is-better dynamic going on, where if you make the models bigger, you'll actually end up with a better performing model, but it just means that your efficiency gains are completely negated all the time.

Speaker 1: Every chat, every prompt, every AI-generated Ghibli image adds up. We just don't see that impact directly. So let's all just stop using AI, right? Well, that's probably not realistic at this point, and that's not necessarily what everyone's recommending.

Speaker 3: So I work on optimization, and I think this is a problem we can optimize. We can make it better, reduce the cost, and there are a lot of opportunities, so we should definitely not panic. I hope the model developers can disclose that cost to the users.
And I will figure out: should I use it now, or should I use it later?

Speaker 1: Let's say that I log into ChatGPT and it says: this query is going to use this much energy, this much carbon, and this much water. And if I have that information up front, then I, the user, might decide, maybe I don't need to have it summarize the entirety of the collected works of Shakespeare today.

Speaker 3: Yeah, maybe. Or they could tell you, if you do it later, in one hour or in the evening, the cost will be different. And you figure it out: do you want to do it now, or do it later?

Speaker 1: What Shaolei is proposing here is that developers could build in a system that would alert users that their query is coming in at a high-impact time of day, and it could suggest that there might be a better time to make that request, when data centers have lower usage. They can use optimization techniques to reduce energy consumption. This concept isn't totally new. Google Flights shows carbon emissions estimates for flights, and it will show you which option has the least impact. So something like this for AI is definitely possible, but I'm not totally convinced people would actually care. The last time I booked a flight, I saw the most carbon-friendly option, but I didn't pick it, because it had a long layover and I didn't want to deal with that.

Putting the responsibility on users can sound good in theory, but the flip side is that it can just be a way for companies to avoid doing anything themselves. So should this responsibility really fall on us? I mean, sure, you could decide to skip the chatbot and take notes by hand, but that only really works if you know what the trade-off actually is, and right now we don't, because the companies building these tools aren't giving us the data that we would need to make informed decisions in the first place. So maybe the responsibility should fall elsewhere.
Like policymakers. Shaolei is already thinking about what this could look like and how much of a difference it could make.

Speaker 3: We're informing the policymakers, so hopefully when they make decisions they can take into account this public health burden, water consumption, power strain on their infrastructures. These are the costs the local people will be paying for the companies. I think, especially for those big techs, they already have the systems ready to do this type of optimization; they are doing it for carbon-aware computing. And we used Meta's locations as an example. If they factor the public health burden into their decision making, for example where they route their workload, they can reduce the public health cost by about twenty-five percent, really, and reduce the energy bill by about two percent, and also cut the carbon by about one point three percent.

Speaker 1: So just by being more intentional about where they route digital traffic, a company like Meta could reduce detrimental impacts on public health, and they'd be saving some cash at the same time. This is called geographic load balancing, and for the user it's totally seamless. You log in, your feed loads, you don't notice anything. But behind the scenes, your request is going somewhere where it's cleaner, cheaper, and less harmful to process. Even beyond where companies route traffic, they can also consider where they build the data centers from a public health perspective.

Speaker 3: When they build data centers in the future, they can take these factors into account, because the decisions that we make today will be impacting the public health, the water consumption, the power infrastructure for many years to come.

Speaker 1: Shaolei is thinking about the future, and research on future optimization is a big deal, because the AI boom is already here. Big tech companies are projected to spend three hundred and twenty billion dollars on AI technology and data centers this year, which is nearly one hundred billion more than last year.
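To make the geographic load balancing idea concrete: you can picture it as a scheduler that scores each candidate data center on carbon intensity, public health damage, and electricity price, then routes the next batch of work to the lowest-cost site. The sketch below is a toy illustration of that idea only; the site names, numbers, and weights are invented, and real systems, including the ones Shaolei Ren studies, use live grid, weather, and air quality data and far more sophisticated models.

```python
# Toy sketch of geographic load balancing: route work to the data center
# with the lowest weighted "externality" score. All sites, numbers, and
# weights below are invented for illustration, not real measurements.

from dataclasses import dataclass

@dataclass
class Site:
    name: str
    carbon_g_per_kwh: float     # grid carbon intensity at this location right now
    health_cost_per_kwh: float  # monetized public health damage (dollars per kWh)
    price_per_kwh: float        # electricity price (dollars per kWh)

def score(site: Site, w_carbon: float = 1.0, w_health: float = 1.0, w_price: float = 1.0) -> float:
    """Lower is better. The scaling factors just put the three terms on a comparable footing."""
    return (w_carbon * site.carbon_g_per_kwh
            + w_health * site.health_cost_per_kwh * 1000
            + w_price * site.price_per_kwh * 100)

sites = [
    Site("coal_heavy_region", carbon_g_per_kwh=700, health_cost_per_kwh=0.040, price_per_kwh=0.06),
    Site("hydro_region",      carbon_g_per_kwh=50,  health_cost_per_kwh=0.002, price_per_kwh=0.09),
    Site("gas_region",        carbon_g_per_kwh=400, health_cost_per_kwh=0.020, price_per_kwh=0.07),
]

# Route the next batch of inference requests to the cheapest/cleanest site.
best = min(sites, key=score)
print("Route this batch to:", best.name)   # -> hydro_region in this made-up example
```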
So where we put these data centers, and where we route the traffic, really matters.

Speaker 3: Something I was not expecting to be so widespread, because I was thinking, if I live, let's say, five miles away from a data center or a power plant, I wouldn't be affected. That was wrong. These air pollutants are what the EPA defines as cross-state air pollutants. They do travel hundreds of miles along with the wind. We're going to have a significant impact just by strategically placing the data centers, for the public health.

Speaker 1: What that really highlights is something that we don't think about with tech infrastructure: it doesn't just impact the people who live next door. When air pollution travels hundreds of miles, it turns these data centers into regional issues, not just local ones. I'll give you an example right here. As we were working on this episode, I saw this article in Politico, and I just want to read you the first sentence, quote: "Elon Musk's artificial intelligence company is belching smog-forming pollution into an area of South Memphis that already leads the state in emergency department visits for asthma," end quote. That's probably enough to give you the idea, but just to explain more: xAI, which is the company behind Grok, the AI chatbot that you use on Twitter, set up shop in Memphis with enough methane gas turbines to power two hundred and eighty thousand homes. The company didn't get the required air pollution permits. The turbines run without the emission controls that federal law usually requires, and in under a year of operation, xAI is now one of the largest emitters of smog-producing nitrogen oxides in the entire county. And this facility is located near predominantly Black neighborhoods that are already dealing with high levels of industrial pollution.

These inequalities already existed, and tech development is not making it better, it's making it worse. It is often like this.
There are absolutely people who are feeling the impacts of this right now, and there are people who will feel it in the future. Maybe somebody will write an article about them, maybe not.

So, I was hoping that I could use this podcast to solve all my personal problems, but apparently we're oh-for-one here. Because when I started working on this episode, I was thinking that this section right here, the outro, is where I'd say: wow, now I know exactly what impact my use of AI is having on the planet. But I don't. And that's pretty annoying because, and I guess this is as close to an answer as we're going to get, it's not really about how often I personally decide to use ChatGPT or Gemini or Claude or whatever. It's about what happens when companies build systems that are this powerful but also this resource-hungry, and they refuse to tell us what it really costs. And I think we deserve to know, not just so that we can make individual choices about how often to use ChatGPT or Gemini or whatever, but so that we can hold the right people accountable. Because if AI is really going to change the future like they say it will, we should know how much that future costs.

Thank you so much for listening to Kill Switch. If you've got any ideas or thoughts about the show, you can hit us at killswitch at kaleidoscope dot nyc, or you can hit me at dexdigi, that's d-e-x-d-i-g-i, on Instagram, or on Bluesky if that's more your thing. And if you liked this episode and you're on Apple Podcasts or Spotify, take your phone out of your pocket and leave us a review. It really helps people find the show, and in turn, that helps us keep doing our thing.

Kill Switch is hosted by me, Dexter Thomas. It's produced by Sen Ozaki, Darl Luk Potts, and Kate Osborne. Our theme song is by me and Kyle Murdoch, and Kyle also mixed the show.
From Kaleidoscope, our executive producers are Oz Woloshyn, Mangesh Hattikudur, and Kate Osborne. From iHeart, our executive producers are Katrina Norvell and Nikki Ettore. See you on the next one.