1 00:00:00,160 --> 00:00:02,960 Speaker 1: Hey everyone, it's Robert and Joe here. Today we've got 2 00:00:02,960 --> 00:00:05,240 Speaker 1: something a little bit different to share with you. It's 3 00:00:05,280 --> 00:00:09,400 Speaker 1: a new season of the Smart Talks with IBM podcast series. 4 00:00:09,720 --> 00:00:13,520 Speaker 2: This season on Smart Talks with IBM, Malcolm Gladwell is back, 5 00:00:13,520 --> 00:00:15,760 Speaker 2: and this time he's taking the show on the road. 6 00:00:15,880 --> 00:00:19,160 Speaker 2: Malcolm is stepping outside the studio to explore how IBM 7 00:00:19,239 --> 00:00:22,880 Speaker 2: clients are using artificial intelligence to solve real world challenges 8 00:00:23,079 --> 00:00:25,440 Speaker 2: and transform the way they do business. 9 00:00:25,480 --> 00:00:29,560 Speaker 1: From accelerating scientific breakthroughs to reimagining education. It's a fresh 10 00:00:29,600 --> 00:00:33,040 Speaker 1: look at innovation in action, where big ideas meet cutting 11 00:00:33,120 --> 00:00:33,960 Speaker 1: edge solutions. 12 00:00:34,320 --> 00:00:37,160 Speaker 2: You'll hear from industry leaders, creative thinkers, and of course 13 00:00:37,280 --> 00:00:40,920 Speaker 2: Malcolm Gladwell himself as he guides you through each story. 14 00:00:41,440 --> 00:00:44,680 Speaker 1: New episodes of Smart Talks with IBM drop every month 15 00:00:44,680 --> 00:00:48,279 Speaker 1: on the iHeartRadio app, Apple Podcasts, or wherever you get 16 00:00:48,320 --> 00:00:52,240 Speaker 1: your podcasts. Learn more at IBM dot com slash smart Talks. 17 00:00:52,880 --> 00:00:56,920 Speaker 1: This is a paid advertisement from IBM. 18 00:00:57,080 --> 00:00:59,880 Speaker 3: If I were to go back, I don't know thirty 19 00:01:00,080 --> 00:01:04,200 Speaker 3: years in Kenya. What's the difference between then and now 20 00:01:04,280 --> 00:01:07,880 Speaker 3: in terms of tree cover. 
I'm talking to Philip Thigo 21 00:01:08,440 --> 00:01:10,080 Speaker 3: Special Technology Envoy 22 00:01:10,080 --> 00:01:13,200 Speaker 4: to the Kenyan president. Let's speak to this: if you think 23 00:01:13,240 --> 00:01:16,280 Speaker 4: about it, we are at eleven percent, and previously we were 24 00:01:16,319 --> 00:01:20,120 Speaker 4: more than twenty percent. So we are cutting trees more 25 00:01:20,160 --> 00:01:21,480 Speaker 4: than we're planting them. 26 00:01:21,840 --> 00:01:25,559 Speaker 3: In thirty years, Kenya lost half its tree cover. Half. 27 00:01:26,280 --> 00:01:29,759 Speaker 3: And here's why that matters. Kenya is a mountainous country. 28 00:01:30,280 --> 00:01:33,119 Speaker 3: Dotted throughout the highlands are dozens of what Kenyans call 29 00:01:33,280 --> 00:01:39,479 Speaker 3: water towers: natural reservoirs, densely forested areas capable of absorbing 30 00:01:39,520 --> 00:01:41,760 Speaker 3: the enormous amount of water that falls on the country 31 00:01:41,840 --> 00:01:46,440 Speaker 3: during the rainy seasons. The tree roots and undergrowth secure 32 00:01:46,560 --> 00:01:50,200 Speaker 3: and capture moisture, then slowly release it into the rivers 33 00:01:50,200 --> 00:01:53,560 Speaker 3: that flow down into the country's low-lying coastal areas. 34 00:01:54,200 --> 00:01:59,080 Speaker 3: But in recent years the water towers have been depleted. Settlements 35 00:01:59,240 --> 00:02:03,040 Speaker 3: have encroached on them, trees have been chopped down, thousands 36 00:02:03,080 --> 00:02:06,800 Speaker 3: of acres cleared, and the natural reservoirs ceased to hold nearly 37 00:02:06,840 --> 00:02:10,400 Speaker 3: as much water. So now Kenya is prone to extremes: 38 00:02:10,800 --> 00:02:13,320 Speaker 3: too much water flowing down from the highlands in the 39 00:02:13,400 --> 00:02:17,280 Speaker 3: rainy season and too little water left during the dry season.
40 00:02:17,600 --> 00:02:19,680 Speaker 4: So you have a couple of hours of water, then 41 00:02:19,680 --> 00:02:22,000 Speaker 4: you have a couple of hours with no water, when 42 00:02:22,080 --> 00:02:24,840 Speaker 4: the taps are turned off by the city authority. 43 00:02:25,040 --> 00:02:28,120 Speaker 4: So that's the significance of the water towers we have 44 00:02:28,240 --> 00:02:29,360 Speaker 4: when they cannot hold water. 45 00:02:29,800 --> 00:02:34,000 Speaker 3: Kenya desperately needed to restore its water towers by planting 46 00:02:34,320 --> 00:02:37,799 Speaker 3: as many trees as humanly possible. So in the fall 47 00:02:37,840 --> 00:02:41,680 Speaker 3: of twenty twenty three, the Kenyan government took action. It 48 00:02:41,800 --> 00:02:46,359 Speaker 3: started a national holiday, National Tree Growing Day, a day 49 00:02:46,400 --> 00:02:49,119 Speaker 3: to allow the citizens of Kenya to go out into 50 00:02:49,160 --> 00:02:52,480 Speaker 3: the forests that dominate the Kenyan countryside and plant as 51 00:02:52,560 --> 00:02:56,600 Speaker 3: many trees as they can, and the government decided on 52 00:02:56,680 --> 00:02:57,200 Speaker 3: a number. 53 00:02:58,080 --> 00:03:03,160 Speaker 4: The president's focus, around how to ensure that we 54 00:03:03,200 --> 00:03:05,959 Speaker 4: do not lose more forests, was this very ambitious 55 00:03:06,000 --> 00:03:07,800 Speaker 4: campaign around fifteen billion trees. 56 00:03:08,320 --> 00:03:11,400 Speaker 3: That's right, fifteen billion with a B. 57 00:03:12,400 --> 00:03:14,679 Speaker 4: So imagine that number. It tells you the ambition, and 58 00:03:14,800 --> 00:03:17,400 Speaker 4: it also tells you the deficit. It has to be 59 00:03:17,440 --> 00:03:19,760 Speaker 4: fifteen billion in the next eight years.
60 00:03:20,080 --> 00:03:23,359 Speaker 3: Fifteen billion trees over eight years averages out to more 61 00:03:23,360 --> 00:03:27,440 Speaker 3: than five million trees per day. That's a lot of trees. 62 00:03:28,000 --> 00:03:30,600 Speaker 3: But with such a massive goal, how can you track 63 00:03:30,639 --> 00:03:33,320 Speaker 3: your progress? How do you know where to plant those 64 00:03:33,360 --> 00:03:36,480 Speaker 3: trees so they'll have the most impact? How do you 65 00:03:36,600 --> 00:03:40,520 Speaker 3: monitor where older trees are still being cut down? Well, 66 00:03:40,520 --> 00:03:44,200 Speaker 3: the answer to those questions came from IBM and a 67 00:03:44,240 --> 00:03:50,680 Speaker 3: little space agency called NASA. That's right, folks, Smart Talks 68 00:03:50,800 --> 00:03:54,960 Speaker 3: is going to space. My name is Malcolm Gladwell. You're 69 00:03:54,960 --> 00:03:58,480 Speaker 3: listening to the latest episode of Smart Talks with IBM, 70 00:03:58,840 --> 00:04:01,600 Speaker 3: where we offer our listeners a glimpse behind the curtain 71 00:04:01,720 --> 00:04:05,520 Speaker 3: of the world of technology. In this season, IBM has 72 00:04:05,560 --> 00:04:10,440 Speaker 3: gone inside elementary school classrooms, toured formulation labs at L'Oreal, 73 00:04:10,760 --> 00:04:14,640 Speaker 3: and spoken with the fan development team at Scuderia Ferrari HP. 74 00:04:15,640 --> 00:04:20,159 Speaker 3: In this episode: how IBM is partnering with NASA to 75 00:04:20,200 --> 00:04:24,760 Speaker 3: build geospatial models using data from satellites to better understand 76 00:04:24,760 --> 00:04:30,640 Speaker 3: our Earth and Solar system. 77 00:04:30,680 --> 00:04:39,119 Speaker 5: Five, four, three, two, one, zero. All engines running. Liftoff! 78 00:04:39,160 --> 00:04:42,279 Speaker 5: We have a liftoff, thirty-two minutes past the hour, 79 00:04:42,640 --> 00:04:44,000 Speaker 5: liftoff on Apollo eleven.
80 00:04:44,680 --> 00:04:48,320 Speaker 3: IBM has worked on space-related projects since before I 81 00:04:48,400 --> 00:04:49,000 Speaker 3: was even born. 82 00:04:49,839 --> 00:04:51,479 Speaker 2: One small step for man. 83 00:04:52,480 --> 00:04:55,760 Speaker 3: A team of four thousand IBM engineers helped create the 84 00:04:55,800 --> 00:04:58,600 Speaker 3: Saturn five rocket that took Neil Armstrong to the Moon. 85 00:05:00,360 --> 00:05:01,040 Speaker 5: Buy up plate. 86 00:05:02,920 --> 00:05:05,520 Speaker 3: And when I think of NASA, I tend to picture 87 00:05:05,520 --> 00:05:07,840 Speaker 3: the moon landing, or the team of people back in 88 00:05:07,920 --> 00:05:12,000 Speaker 3: Houston guiding the Apollo mission, or the Hubble telescope, or 89 00:05:12,040 --> 00:05:16,320 Speaker 3: astronauts aboard the International Space Station. What I didn't think 90 00:05:16,360 --> 00:05:19,880 Speaker 3: about until now are NASA's geographers. 91 00:05:20,680 --> 00:05:22,599 Speaker 6: In order to go places, you need to map things. 92 00:05:23,080 --> 00:05:27,120 Speaker 3: This is Kevin Murphy, Chief Science Data Officer at NASA's 93 00:05:27,240 --> 00:05:28,599 Speaker 3: Science Mission Directorate. 94 00:05:29,320 --> 00:05:32,320 Speaker 6: But I think that there's an assumption that NASA's all 95 00:05:32,360 --> 00:05:36,159 Speaker 6: about rockets and astronauts, and certainly that's a really large 96 00:05:36,160 --> 00:05:37,480 Speaker 6: and important part of NASA. 97 00:05:38,279 --> 00:05:41,640 Speaker 3: NASA sends people to space and looks out at the stars, 98 00:05:42,120 --> 00:05:45,720 Speaker 3: but NASA also looks down at the Earth. The agency 99 00:05:45,720 --> 00:05:52,640 Speaker 3: has about one hundred and fifty satellites that use radar and lidar: Landsat, Aqua, Terra, CloudSat, Aura, 100 00:05:53,000 --> 00:05:57,920 Speaker 3: in low Earth orbit, medium Earth orbit, geostationary orbit, on and on.
101 00:05:58,600 --> 00:06:02,760 Speaker 3: In one sense, NASA makes hardware to build rockets and 102 00:06:02,800 --> 00:06:06,279 Speaker 3: spacecraft and all those satellites that circle the Earth. But 103 00:06:06,400 --> 00:06:12,640 Speaker 3: fundamentally NASA also collects data. Its scientists and engineers, people 104 00:06:12,640 --> 00:06:16,160 Speaker 3: like Kevin, want to make the best use possible of 105 00:06:16,200 --> 00:06:20,280 Speaker 3: all the information gathered by all those many dozens of instruments. 106 00:06:20,920 --> 00:06:24,640 Speaker 6: Right now, we gather around twenty five petabytes of new 107 00:06:24,800 --> 00:06:27,920 Speaker 6: observational data per year. In the next couple months, we're 108 00:06:27,960 --> 00:06:33,920 Speaker 6: about to launch a high resolution global radar. When that launches, 109 00:06:34,360 --> 00:06:37,400 Speaker 6: it will double how much we collect every year to about 110 00:06:37,440 --> 00:06:39,000 Speaker 6: fifty petabytes of information. 111 00:06:39,680 --> 00:06:44,080 Speaker 3: Actually, since we recorded this conversation, NASA launched that global radar, 112 00:06:44,279 --> 00:06:48,640 Speaker 3: what they call NISAR. So NASA is already generating new 113 00:06:48,720 --> 00:06:52,159 Speaker 3: data at the rate of fifty petabytes each year. To 114 00:06:52,200 --> 00:06:55,679 Speaker 3: put that in perspective, a single petabyte could hold about 115 00:06:55,680 --> 00:07:00,159 Speaker 3: five hundred billion pages of standard printed text. You know, 116 00:07:00,200 --> 00:07:03,000 Speaker 3: can anyone sort of apply to use this data? 117 00:07:02,880 --> 00:07:05,400 Speaker 6: They don't even have to apply. It's free and 118 00:07:05,480 --> 00:07:09,400 Speaker 6: open data. It advances how we understand what we do 119 00:07:09,480 --> 00:07:12,920 Speaker 6: on Earth and how we see ourselves within the universe.
120 00:07:13,280 --> 00:07:16,200 Speaker 6: People can take it for so many different downstream applications. 121 00:07:16,440 --> 00:07:18,720 Speaker 6: So you can go to our websites today, you can 122 00:07:18,920 --> 00:07:23,160 Speaker 6: search through our tools and you can download information from 123 00:07:23,240 --> 00:07:26,280 Speaker 6: the Mars rovers, you can download information from the Lunar 124 00:07:26,320 --> 00:07:29,480 Speaker 6: Reconnaissance Orbiter or any of the Earth science data satellites. 125 00:07:29,760 --> 00:07:33,120 Speaker 3: And give me an example of a really cool application, 126 00:07:33,840 --> 00:07:36,120 Speaker 3: a really cool use that someone, I don't know, in 127 00:07:36,200 --> 00:07:39,080 Speaker 3: academia or whatever has used your data for. 128 00:07:39,120 --> 00:07:42,360 Speaker 6: Okay. So one of the really kind of cool 129 00:07:42,480 --> 00:07:46,320 Speaker 6: but unexpected observations that we had is that we launched 130 00:07:46,360 --> 00:07:50,720 Speaker 6: a pair of satellites in the early two thousands called GRACE, 131 00:07:51,120 --> 00:07:54,000 Speaker 6: and these satellites orbit the Earth and they can measure 132 00:07:54,120 --> 00:07:56,840 Speaker 6: very precisely the distance that they're away from each other 133 00:07:56,840 --> 00:07:59,880 Speaker 6: as they orbit the Earth, and as you go over changes in gravity, 134 00:08:00,440 --> 00:08:03,280 Speaker 6: you can actually see one satellite accelerate and the other 135 00:08:03,320 --> 00:08:07,280 Speaker 6: one accelerate after it, right. And using that information, we 136 00:08:07,320 --> 00:08:10,640 Speaker 6: were trying to map kind of the gravity fields of Earth. 137 00:08:10,880 --> 00:08:13,280 Speaker 6: What they found is that they can actually map 138 00:08:13,440 --> 00:08:16,960 Speaker 6: below kind of the mass of Earth to where water 139 00:08:17,000 --> 00:08:20,440 Speaker 6: storage is.
For instance, so aquifers, right, so you can 140 00:08:20,840 --> 00:08:25,600 Speaker 6: monitor through gravity how much water is being depleted or 141 00:08:25,760 --> 00:08:28,880 Speaker 6: added to an aquifer, or the density of glaciers. 142 00:08:29,440 --> 00:08:32,800 Speaker 3: So, just to back up for a moment, the presence 143 00:08:32,960 --> 00:08:37,280 Speaker 3: and density of water deposits below the Earth's surface have 144 00:08:37,360 --> 00:08:42,640 Speaker 3: an effect on gravitational fields that are being measured in space. 145 00:08:42,679 --> 00:08:43,040 Speaker 6: Correct. 146 00:08:43,559 --> 00:08:47,120 Speaker 3: Yeah. And so does that tell you, I presume you learn 147 00:08:47,240 --> 00:08:49,160 Speaker 3: things like where there's an aquifer where you didn't think 148 00:08:49,160 --> 00:08:50,600 Speaker 3: there was an aquifer? 149 00:08:50,640 --> 00:08:52,920 Speaker 6: Or if it's being depleted faster. 150 00:08:53,240 --> 00:08:53,480 Speaker 2: Yeah. 151 00:08:53,640 --> 00:08:56,280 Speaker 3: Yeah. So who's using that kind of data? 152 00:08:56,800 --> 00:08:59,800 Speaker 6: All sorts of different organizations, whether they're, you know, NGOs 153 00:09:00,640 --> 00:09:03,960 Speaker 6: or government agencies or people that are planning a large 154 00:09:03,960 --> 00:09:04,960 Speaker 6: agricultural project. 155 00:09:05,040 --> 00:09:07,200 Speaker 3: How did you... was that an intentional decision? 156 00:09:07,200 --> 00:09:08,520 Speaker 6: It wasn't. It was accidental. 157 00:09:09,240 --> 00:09:15,800 Speaker 3: It was accidental. NASA has assembled a historically unprecedented mountain 158 00:09:15,800 --> 00:09:19,640 Speaker 3: of data about the physical world, free and open to anyone, 159 00:09:20,040 --> 00:09:22,800 Speaker 3: and the possibilities for how that information can be used 160 00:09:22,880 --> 00:09:28,679 Speaker 3: are so vast that even NASA is still uncovering them.
161 00:09:29,080 --> 00:09:31,760 Speaker 3: When I was a kid, I loved Legos. I had 162 00:09:31,800 --> 00:09:34,880 Speaker 3: a huge bin full of them. At the time, Legos 163 00:09:34,960 --> 00:09:38,520 Speaker 3: were really just colored bricks of various sizes. They weren't 164 00:09:38,520 --> 00:09:41,440 Speaker 3: as complicated as they are today. And what I realized 165 00:09:41,480 --> 00:09:44,040 Speaker 3: even then was that there were more possibilities in a 166 00:09:44,080 --> 00:09:46,960 Speaker 3: box of Legos than I could ever imagine on my own. 167 00:09:47,800 --> 00:09:49,440 Speaker 3: I played with my brother and he would show me 168 00:09:49,480 --> 00:09:51,760 Speaker 3: something that hadn't occurred to me. And I'd go to 169 00:09:51,760 --> 00:09:54,080 Speaker 3: my friend Bruce's and see that he was off on 170 00:09:54,120 --> 00:09:57,600 Speaker 3: some Legos tangent that I'd never even thought of, like 171 00:09:57,640 --> 00:10:00,400 Speaker 3: a cool bridge or a castle or a truck. I 172 00:10:00,559 --> 00:10:04,080 Speaker 3: used Legos one way. Bruce used his Legos in a 173 00:10:04,160 --> 00:10:09,640 Speaker 3: completely different way. NASA's data treasure trove is like a 174 00:10:09,760 --> 00:10:13,080 Speaker 3: very, very big box of Legos. And here's the question: 175 00:10:13,640 --> 00:10:18,800 Speaker 3: with so much data, containing so many possible connections, could IBM, 176 00:10:19,080 --> 00:10:24,960 Speaker 3: and specifically IBM's artificial intelligence, help NASA scientists uncover patterns 177 00:10:24,960 --> 00:10:28,400 Speaker 3: and connect systems in a way they've never done before? 178 00:10:30,880 --> 00:10:33,360 Speaker 7: Everything started with a question, right? 179 00:10:33,400 --> 00:10:37,520 Speaker 3: I'm talking to Juan Bernabe Moreno, director of IBM Research 180 00:10:37,559 --> 00:10:38,040 Speaker 3: in Europe.
181 00:10:38,840 --> 00:10:42,840 Speaker 7: As we advance AI, we have new tools to understand 182 00:10:43,520 --> 00:10:46,560 Speaker 7: our surroundings, understand the world, understand language, and 183 00:10:46,720 --> 00:10:49,760 Speaker 7: understand our planet. And the question that we were asking 184 00:10:49,800 --> 00:10:52,920 Speaker 7: ourselves was, with all these new advances that we see in language 185 00:10:52,960 --> 00:10:56,280 Speaker 7: (it was a post-GPT moment), could we apply the 186 00:10:56,360 --> 00:10:59,800 Speaker 7: same idea and the same architecture and technology to 187 00:11:00,240 --> 00:11:01,240 Speaker 7: our planet? 188 00:11:01,760 --> 00:11:05,120 Speaker 3: The advent of AI created a new opportunity. What if 189 00:11:05,120 --> 00:11:08,680 Speaker 3: all of NASA's mountain of data could be organized, analyzed, 190 00:11:09,000 --> 00:11:14,280 Speaker 3: and understood by artificial intelligence? The original idea was to create 191 00:11:14,360 --> 00:11:18,000 Speaker 3: a geospatial foundation model for the Earth and from there 192 00:11:18,440 --> 00:11:23,200 Speaker 3: create additional specialized models for other scientific priorities of NASA, 193 00:11:23,920 --> 00:11:27,600 Speaker 3: and finally create an AI system that can understand all 194 00:11:27,720 --> 00:11:31,480 Speaker 3: the data across those specialized models in order to uncover 195 00:11:31,600 --> 00:11:36,760 Speaker 3: hidden insights and relationships. Together, these models could unlock an 196 00:11:36,920 --> 00:11:41,520 Speaker 3: infinite number of potential applications. I asked Kevin Murphy at 197 00:11:41,600 --> 00:11:44,840 Speaker 3: NASA about the beginning of these Earth models.
198 00:11:45,440 --> 00:11:48,280 Speaker 6: I had some colleagues, and we were investigating a number of 199 00:11:48,320 --> 00:11:53,560 Speaker 6: different avenues of using AI with our data, but also 200 00:11:53,800 --> 00:11:56,400 Speaker 6: kind of the management and stewardship of the data, so 201 00:11:56,480 --> 00:11:58,600 Speaker 6: not only like the observations, but how we make it 202 00:11:58,640 --> 00:12:02,320 Speaker 6: available to people, make it discoverable. And they said, hey, 203 00:12:03,000 --> 00:12:05,559 Speaker 6: we see these transformer architectures. We think that they can 204 00:12:05,600 --> 00:12:09,760 Speaker 6: be applicable to some of the sequential observations that we make. 205 00:12:10,320 --> 00:12:12,400 Speaker 6: We'd really like to work with IBM on that. And 206 00:12:12,440 --> 00:12:16,679 Speaker 6: I was like, I'm really skeptical, because I hadn't 207 00:12:16,760 --> 00:12:22,880 Speaker 6: seen those types of tools really produce results that were 208 00:12:23,520 --> 00:12:26,520 Speaker 6: commensurate with the amount of effort you put into them, right? 209 00:12:26,600 --> 00:12:28,960 Speaker 6: So we were getting some really good results with deep 210 00:12:29,040 --> 00:12:32,240 Speaker 6: learning approaches, but they took a lot of effort. 211 00:12:32,559 --> 00:12:34,160 Speaker 3: But Kevin came around quickly. 212 00:12:35,000 --> 00:12:39,680 Speaker 6: When we typically develop a new data product or an algorithm, 213 00:12:40,160 --> 00:12:43,880 Speaker 6: it takes anywhere from, you know, twelve months, eighteen months, 214 00:12:43,880 --> 00:12:49,280 Speaker 6: twenty four months to go from data and hypothesis to 215 00:12:49,480 --> 00:12:53,439 Speaker 6: results which are validated.
We were able to get approximately 216 00:12:53,480 --> 00:12:58,560 Speaker 6: the same precision for some well known types of benchmarks, 217 00:12:58,960 --> 00:13:01,040 Speaker 6: and I think it was about four months 218 00:13:01,040 --> 00:13:02,120 Speaker 6: from starting the work. 219 00:13:02,200 --> 00:13:05,640 Speaker 3: Yeah, yeah. So it happened faster than you thought? 220 00:13:05,840 --> 00:13:10,480 Speaker 3: Much faster. In twenty twenty three, IBM and NASA launched 221 00:13:10,520 --> 00:13:15,079 Speaker 3: a foundation model trained on NASA's Harmonized Landsat Sentinel-2 222 00:13:15,320 --> 00:13:19,720 Speaker 3: satellite data across the continental United States. They named the 223 00:13:19,760 --> 00:13:24,600 Speaker 3: model Prithvi, the Sanskrit word for Earth. The first version 224 00:13:24,600 --> 00:13:28,920 Speaker 3: of Prithvi used only Earth observation images, and just that 225 00:13:29,120 --> 00:13:32,480 Speaker 3: was enough to totally change Kevin's idea of what foundation 226 00:13:32,679 --> 00:13:37,320 Speaker 3: models could do. But they didn't stop there. IBM and 227 00:13:37,400 --> 00:13:40,760 Speaker 3: NASA were encouraged by how well Prithvi worked for Earth 228 00:13:40,880 --> 00:13:45,200 Speaker 3: observation tasks, so they decided to create a more complex 229 00:13:45,280 --> 00:13:49,000 Speaker 3: version of Prithvi that could understand weather and climate data. 230 00:13:49,960 --> 00:13:53,080 Speaker 3: They hoped this new version of Prithvi would allow researchers 231 00:13:53,120 --> 00:13:55,920 Speaker 3: to answer new questions about the Earth, from short term 232 00:13:55,960 --> 00:14:00,680 Speaker 3: weather forecasting to longer term climate effects. Imagine you have 233 00:14:00,720 --> 00:14:05,360 Speaker 3: a map of all the different temperatures, pressures, clouds, rainfall, 234 00:14:05,640 --> 00:14:09,720 Speaker 3: and more from around the globe.
With this map, IBM 235 00:14:09,800 --> 00:14:13,960 Speaker 3: and NASA could implement advanced tasks. They could track the 236 00:14:13,960 --> 00:14:16,599 Speaker 3: formation of El Niño or predict how the path of 237 00:14:16,640 --> 00:14:19,720 Speaker 3: a hurricane would change if the ocean temperature went up 238 00:14:19,800 --> 00:14:21,680 Speaker 3: by half a degree. 239 00:14:21,960 --> 00:14:25,040 Speaker 7: I will always remember this moment. It was when we created 240 00:14:25,080 --> 00:14:30,160 Speaker 7: the Weather and Climate Foundational Model. The senior meteorologist of NASA 241 00:14:30,640 --> 00:14:33,480 Speaker 7: was like, I cannot believe it. It has changed 242 00:14:33,680 --> 00:14:35,800 Speaker 7: the way I think about AI. And ever since, 243 00:14:35,840 --> 00:14:38,800 Speaker 7: he's been kind of preaching with these examples. 244 00:14:38,960 --> 00:14:41,440 Speaker 3: Juan and his team then took the model and decided 245 00:14:41,480 --> 00:14:45,560 Speaker 3: to test it, really test it. They took away ninety nine 246 00:14:45,600 --> 00:14:48,840 Speaker 3: percent of the data points and ran the experiment again. 247 00:14:49,520 --> 00:14:51,600 Speaker 3: What they were trying to figure out is if the 248 00:14:51,640 --> 00:14:54,680 Speaker 3: model had learned enough about the basic principles of the Earth, 249 00:14:55,080 --> 00:14:58,040 Speaker 3: the underlying physics of the way the planet works, to 250 00:14:58,160 --> 00:15:01,480 Speaker 3: fill in the blanks on its own. With just one 251 00:15:01,560 --> 00:15:05,120 Speaker 3: percent of the original data, would it still be accurate 252 00:15:05,200 --> 00:15:11,760 Speaker 3: in its predictions?
What happened? The model crushed it. So 253 00:15:12,000 --> 00:15:14,440 Speaker 3: it was able to extrapolate on the basis of one 254 00:15:14,480 --> 00:15:18,160 Speaker 3: percent of the data what the entire picture looked like? Yes, 255 00:15:19,600 --> 00:15:23,280 Speaker 3: because Prithvi learned everything, right? Yeah, it learned the kind 256 00:15:23,280 --> 00:15:27,920 Speaker 3: of principles of... Exactly. Yeah. Oh wow, that's very, very impressive. 257 00:15:28,040 --> 00:15:30,880 Speaker 3: So at that moment when you realized you could do that, 258 00:15:32,840 --> 00:15:35,600 Speaker 3: I'm just curious about your emotional reaction, I mean, did you 259 00:15:35,680 --> 00:15:37,240 Speaker 3: jump up and down? What did you do? 260 00:15:37,320 --> 00:15:40,760 Speaker 7: So he's like, wow. It was a very emotional meeting, 261 00:15:40,800 --> 00:15:46,160 Speaker 7: because, you know, having this person say now I'm convinced, right? Yeah, 262 00:15:46,280 --> 00:15:49,160 Speaker 7: it was kind of quite a special moment. These 263 00:15:49,160 --> 00:15:50,840 Speaker 7: moments make your life as a researcher. 264 00:15:52,280 --> 00:15:55,480 Speaker 3: IBM and NASA launched Prithvi for Weather and 265 00:15:55,520 --> 00:15:59,000 Speaker 3: Climate in twenty twenty four, and while IBM and NASA 266 00:15:59,000 --> 00:16:02,920 Speaker 3: scientists could use Prithvi to run interesting experiments, they 267 00:16:02,960 --> 00:16:06,040 Speaker 3: were even more excited about how Prithvi could help people 268 00:16:06,520 --> 00:16:13,960 Speaker 3: in the real world. So let's go back to Kenya, 269 00:16:14,280 --> 00:16:18,560 Speaker 3: Ambassador Philip Thigo, and the country's great tree planting project. 270 00:16:19,480 --> 00:16:22,440 Speaker 4: So in those initial months there was a massive effort, 271 00:16:22,520 --> 00:16:25,560 Speaker 4: including a couple of national holidays. 272 00:16:25,400 --> 00:16:26,400 Speaker 3: For tree planting.
273 00:16:27,280 --> 00:16:30,760 Speaker 4: Yes, where the entire cabinet was sent. 274 00:16:31,080 --> 00:16:33,640 Speaker 3: Ah, did you plant trees as I did? 275 00:16:33,720 --> 00:16:35,600 Speaker 4: Oh my god, as I said, the entire cabinet, plus, you know, 276 00:16:35,800 --> 00:16:36,600 Speaker 4: we have to be seen. 277 00:16:37,040 --> 00:16:38,840 Speaker 3: Are you good at it? You planted two weeks ago? 278 00:16:39,200 --> 00:16:41,080 Speaker 4: Well, it's very easy: dig a hole, put a tree 279 00:16:41,160 --> 00:16:42,760 Speaker 4: in the ground. 280 00:16:42,800 --> 00:16:46,600 Speaker 3: Wow. Planting a tree is easy. But remember, 281 00:16:47,040 --> 00:16:51,960 Speaker 3: it has to happen fifteen billion times. IBM Research has 282 00:16:52,000 --> 00:16:56,320 Speaker 3: been operating in Nairobi since twenty thirteen, and what Kenya wanted, 283 00:16:56,440 --> 00:17:00,880 Speaker 3: at least in the beginning, was straightforward. The Prithvi model that 284 00:17:00,960 --> 00:17:04,200 Speaker 3: IBM and NASA built could be used to essentially make 285 00:17:04,240 --> 00:17:08,520 Speaker 3: the world's greatest map, and Kenya, with IBM's help, could 286 00:17:08,680 --> 00:17:11,719 Speaker 3: use that model to make the world's greatest map of Kenya. 287 00:17:12,800 --> 00:17:15,000 Speaker 3: The first step was to lay a grid across the 288 00:17:15,080 --> 00:17:19,240 Speaker 3: topography of the country, break the forest into manageable bite 289 00:17:19,240 --> 00:17:22,560 Speaker 3: sized pieces, each of which could be analyzed separately. 290 00:17:23,400 --> 00:17:25,679 Speaker 4: So, because our forest is massive when you look at 291 00:17:25,720 --> 00:17:28,360 Speaker 4: it in terms of green, but when you overlay it, 292 00:17:28,600 --> 00:17:31,240 Speaker 4: you're able to break it into pieces, like into boxes.
293 00:17:31,480 --> 00:17:35,119 Speaker 4: And for us that was important, because then it's easier 294 00:17:35,200 --> 00:17:37,960 Speaker 4: to tackle it when it's in a grid system than 295 00:17:38,119 --> 00:17:40,600 Speaker 4: just as a massive forest. So that was also what 296 00:17:41,280 --> 00:17:42,680 Speaker 4: the model was able to do. 297 00:17:43,119 --> 00:17:46,680 Speaker 3: Then the model painstakingly sorted through each of those boxes 298 00:17:47,080 --> 00:17:50,280 Speaker 3: and looked for what Philip calls hotspots. 299 00:17:50,200 --> 00:17:52,879 Speaker 4: You can see, for example, very quickly, which areas 300 00:17:52,880 --> 00:17:56,080 Speaker 4: are being eroded very fast and that you need to 301 00:17:56,160 --> 00:17:59,359 Speaker 4: quickly protect. Yeah, because that's where you 302 00:17:59,359 --> 00:18:01,159 Speaker 4: want to target, right? I mean, it's not possible to 303 00:18:01,160 --> 00:18:02,800 Speaker 4: do everything at the same time. 304 00:18:02,960 --> 00:18:04,760 Speaker 3: Do you have a definition of a hotspot? And how 305 00:18:04,760 --> 00:18:08,119 Speaker 3: many hotspots are there according to that definition? Ah, there 306 00:18:08,160 --> 00:18:08,480 Speaker 3: are a lot. 307 00:18:08,560 --> 00:18:11,880 Speaker 4: So we have more than forty water towers, and I'll 308 00:18:11,880 --> 00:18:14,880 Speaker 4: tell you, all of them have hotspots. And the hot 309 00:18:14,880 --> 00:18:19,040 Speaker 4: spots, in my definition, are areas that are being degraded faster 310 00:18:19,160 --> 00:18:22,160 Speaker 4: and in a very unusual way. Right? You can literally 311 00:18:22,200 --> 00:18:25,600 Speaker 4: see how human activity is seriously degrading that particular area, 312 00:18:25,960 --> 00:18:28,160 Speaker 4: such that if you do not have a direct intervention, we'll 313 00:18:28,160 --> 00:18:31,560 Speaker 4: lose the entire forest.
So that's the hotspot for us, 314 00:18:32,119 --> 00:18:34,080 Speaker 4: because think about cutting one hundred trees a day 315 00:18:34,160 --> 00:18:36,159 Speaker 4: versus cutting a million trees a day. So that's a 316 00:18:36,200 --> 00:18:39,240 Speaker 4: hotspot. You want to look at places where there's 317 00:18:39,480 --> 00:18:43,399 Speaker 4: just unusually high activity of deforestation 318 00:18:43,480 --> 00:18:45,600 Speaker 3: in a hotspot. The size of each box in 319 00:18:45,640 --> 00:18:48,560 Speaker 3: the grid was ten by ten meters, about half a 320 00:18:48,560 --> 00:18:52,520 Speaker 3: tennis court. That's how closely they were examining the forest. 321 00:18:53,560 --> 00:18:57,800 Speaker 3: So, very crudely: the model ingests all of this satellite 322 00:18:57,840 --> 00:19:01,879 Speaker 3: data and it helps you answer some very specific questions, 323 00:19:01,960 --> 00:19:06,560 Speaker 3: like where should we prioritize our tree planting efforts? Which 324 00:19:07,000 --> 00:19:11,600 Speaker 3: areas, down to an extraordinary level of specificity, are eroding 325 00:19:11,720 --> 00:19:15,560 Speaker 3: most quickly? You know, all those kinds of practical questions 326 00:19:15,600 --> 00:19:17,159 Speaker 3: about how to direct your strategy. 327 00:19:17,480 --> 00:19:19,560 Speaker 4: So if you think about a smart forest, right, and 328 00:19:19,600 --> 00:19:22,200 Speaker 4: that's really, for us, what we're calling it: smart fencing, smart forests, 329 00:19:22,200 --> 00:19:25,960 Speaker 4: everything that's smart because of AI. If you think about 330 00:19:26,680 --> 00:19:29,359 Speaker 4: your usual view, what you can see with your eyes, and 331 00:19:29,400 --> 00:19:32,439 Speaker 4: then the satellite layer, which just zooms in and you 332 00:19:32,480 --> 00:19:35,320 Speaker 4: see green.
So what the model has been able to 333 00:19:35,320 --> 00:19:37,639 Speaker 4: do is to create a smart layer, right, and then 334 00:19:37,760 --> 00:19:42,000 Speaker 4: in that smart layer you can actually see many things, from analytics, 335 00:19:42,000 --> 00:19:44,879 Speaker 4: to the grids, to a dashboard, and a lot more. So, 336 00:19:45,000 --> 00:19:49,159 Speaker 4: being able to layer those blocks, you can quantify degradation 337 00:19:49,240 --> 00:19:52,960 Speaker 4: by block. You can match interventions, you can match reforestation. 338 00:19:53,440 --> 00:19:55,480 Speaker 3: I asked Philip to imagine what it would have been 339 00:19:55,560 --> 00:19:58,440 Speaker 3: like to attempt the tree planting project in an era 340 00:19:58,640 --> 00:20:04,920 Speaker 3: before AI. His answer was: plant fifteen billion trees, restore 341 00:20:04,920 --> 00:20:09,720 Speaker 3: the water towers? Impossible. With Prithvi on Kenya's side, though, 342 00:20:10,160 --> 00:20:13,520 Speaker 3: it's really happening. What should be clear by now is 343 00:20:13,560 --> 00:20:16,200 Speaker 3: how versatile Prithvi can be. Want to know how to 344 00:20:16,280 --> 00:20:20,320 Speaker 3: combat deforestation? Prithvi can model that. Want to know when 345 00:20:20,320 --> 00:20:22,479 Speaker 3: the best time in the year to plant your crops is? 346 00:20:23,080 --> 00:20:27,280 Speaker 3: Prithvi can help predict that too. Last year, six months 347 00:20:27,320 --> 00:20:31,800 Speaker 3: after IBM started helping Kenya with reforestation, Kenya needed Prithvi's 348 00:20:31,800 --> 00:20:34,720 Speaker 3: help on something else. And it was an emergency. 349 00:20:35,560 --> 00:20:38,280 Speaker 4: So something was happening in the world that we sort 350 00:20:38,320 --> 00:20:41,040 Speaker 4: of had these floods that we didn't expect.
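The grid-and-hotspot idea Philip describes, divide the forest into small cells, measure how fast cover is being lost in each, and flag the cells degrading unusually fast, can be sketched very loosely in code. This is a toy illustration only, not IBM's or Kenya's actual Prithvi pipeline; the function name, the block size in pixels, and the loss threshold are all invented for the example:

```python
import numpy as np

def degradation_hotspots(cover_before, cover_after, block=10, threshold=0.25):
    """Flag grid cells whose average tree-cover loss exceeds a threshold.

    cover_before / cover_after: 2-D arrays of fractional tree cover (0..1),
    e.g. one pixel per 10 m x 10 m patch of ground, taken at two dates.
    Returns a boolean array with one entry per block x block cell.
    """
    h, w = cover_before.shape
    h, w = h - h % block, w - w % block  # trim to a whole number of blocks
    loss = cover_before[:h, :w] - cover_after[:h, :w]
    # Average the per-pixel loss inside each block of the grid.
    cells = loss.reshape(h // block, block, w // block, block).mean(axis=(1, 3))
    return cells > threshold  # True where a cell is a "hotspot"

# Example: one quadrant of a 20x20 map loses half its cover.
before = np.ones((20, 20))
after = np.ones((20, 20))
after[:10, :10] = 0.5
print(degradation_hotspots(before, after))
```

Here only the upper-left cell exceeds the loss threshold, so only that cell is flagged, which mirrors the point in the conversation: the grid lets you target interventions at the few cells eroding fastest instead of treating the forest as one undifferentiated mass.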
351 00:20:41,440 --> 00:20:44,080 Speaker 3: In the spring of twenty twenty four, Kenya was hit 352 00:20:44,119 --> 00:20:48,360 Speaker 3: with thunderstorms and torrential rain, days and days of it. 353 00:20:49,200 --> 00:20:51,440 Speaker 4: And so I got a call from the Red Cross, 354 00:20:51,600 --> 00:20:55,000 Speaker 4: one of my friends, and they're like, Ambassador, we 355 00:20:55,080 --> 00:20:58,120 Speaker 4: need a little bit of help on how we deal 356 00:20:58,240 --> 00:21:01,240 Speaker 4: with response, because what we see is unusual, right? Because 357 00:21:01,280 --> 00:21:04,119 Speaker 4: normally you would only have one area. All of a sudden, 358 00:21:04,240 --> 00:21:07,479 Speaker 4: we had an entire country flooding. In April, we had 359 00:21:07,480 --> 00:21:12,679 Speaker 4: about three thousand, eight hundred square kilometers of total land flooded, 360 00:21:12,880 --> 00:21:16,600 Speaker 4: which is unusual for Kenya. And so when I got 361 00:21:16,600 --> 00:21:18,879 Speaker 4: this call, we were like, okay, there's something we could do 362 00:21:18,960 --> 00:21:21,800 Speaker 4: with IBM. We only did one function for the trees. 363 00:21:21,840 --> 00:21:24,440 Speaker 4: It was actually a climate model, and we said, can 364 00:21:24,480 --> 00:21:29,880 Speaker 4: we use this to help us better respond to floods? 365 00:21:30,440 --> 00:21:33,520 Speaker 4: And so that was how we started having this discussion 366 00:21:33,560 --> 00:21:37,520 Speaker 4: with IBM in terms of repurposing the model to help 367 00:21:37,720 --> 00:21:41,320 Speaker 4: us deal with this new challenge around floods. 368 00:21:41,880 --> 00:21:47,880 Speaker 3: Again, Prithvi is versatile. Prithvi could use everything it knew 369 00:21:47,880 --> 00:21:52,240 Speaker 3: about the land, the forests and infrastructure to analyze how 370 00:21:52,359 --> 00:21:56,560 Speaker 3: and where and when floods would occur.
The Kenyan government 371 00:21:56,600 --> 00:21:59,000 Speaker 3: could then use the model to help the Red Cross 372 00:21:59,119 --> 00:22:02,640 Speaker 3: organize its response, show areas that needed to be 373 00:22:02,720 --> 00:22:06,200 Speaker 3: evacuated or safe places where the Red Cross could set 374 00:22:06,280 --> 00:22:10,119 Speaker 3: up camps. That information was invaluable. 375 00:22:11,080 --> 00:22:13,639 Speaker 4: Historically, what has happened is that they would set up 376 00:22:13,720 --> 00:22:18,919 Speaker 4: camp based on population congregation, right? Where people assemble is 377 00:22:18,920 --> 00:22:21,359 Speaker 4: where they set up a camp, not based on any data, 378 00:22:21,440 --> 00:22:25,000 Speaker 4: right? Simply because people are there, they will come there 379 00:22:25,000 --> 00:22:28,560 Speaker 4: to provide services and emergency response. What we realize is 380 00:22:28,600 --> 00:22:31,080 Speaker 4: that that model doesn't work. So what we've been able 381 00:22:31,119 --> 00:22:33,560 Speaker 4: to do with IBM is be able to sort 382 00:22:33,560 --> 00:22:37,120 Speaker 4: of give the Red Cross very specific locations or options where 383 00:22:37,160 --> 00:22:39,520 Speaker 4: to set up camps. So if people come here, just 384 00:22:39,520 --> 00:22:43,159 Speaker 4: tell them, no, move here. That's the safe place you 385 00:22:43,240 --> 00:22:44,879 Speaker 4: really want to go. So I think for me that 386 00:22:45,000 --> 00:22:46,959 Speaker 4: was really amazing. So we're calling them, we have a very 387 00:22:47,000 --> 00:22:49,159 Speaker 4: funny word for it, flood assembly points. We always have 388 00:22:49,240 --> 00:22:51,760 Speaker 4: fire assembly points, but now we can say we 389 00:22:51,840 --> 00:22:57,280 Speaker 4: have literally flood assembly points that are safe for citizens. 390 00:22:57,680 --> 00:23:02,240 Speaker 3: That's fascinating.
So the model has ingested this incredibly granular 391 00:23:03,080 --> 00:23:09,679 Speaker 3: picture of the topography and weather patterns of Kenya. 392 00:23:09,800 --> 00:23:13,359 Speaker 3: It's just giving you a set of useful predictions about 393 00:23:13,400 --> 00:23:15,360 Speaker 3: how you should shape your response. 394 00:23:16,320 --> 00:23:18,800 Speaker 4: Yes, and what we did remember is that, as I said, 395 00:23:18,840 --> 00:23:22,760 Speaker 4: it was a full multi-layered capability. What IBM gave 396 00:23:22,840 --> 00:23:25,560 Speaker 4: us was a base map, we didn't have that before, 397 00:23:25,920 --> 00:23:28,480 Speaker 4: and a base model. So you can now have these layers 398 00:23:28,520 --> 00:23:29,919 Speaker 4: upon layers, upon layers, to be able to 399 00:23:29,920 --> 00:23:31,960 Speaker 4: make intelligent decisions. 400 00:23:35,720 --> 00:23:38,680 Speaker 3: Throughout my reporting on this episode, I've been really impressed 401 00:23:38,680 --> 00:23:41,720 Speaker 3: by what Prithvi can do. But it doesn't stop at 402 00:23:41,760 --> 00:23:45,560 Speaker 3: floods and reforestation. Prithvi has also been used to look 403 00:23:45,560 --> 00:23:49,800 Speaker 3: at wildfires and floods in the UK, and Kevin told 404 00:23:49,800 --> 00:23:53,040 Speaker 3: me that researchers in Africa have even used Prithvi to 405 00:23:53,240 --> 00:23:57,160 Speaker 3: identify locust breeding grounds, which could help them prevent swarms 406 00:23:57,160 --> 00:24:02,480 Speaker 3: that destroy crops. All these are issues on land. 407 00:24:02,920 --> 00:24:05,119 Speaker 8: I mean, I always say to people, seventy percent of 408 00:24:05,160 --> 00:24:07,000 Speaker 8: our planet is ocean.
409 00:24:07,560 --> 00:24:10,280 Speaker 3: Kate Royce is the director of the Hartree Centre, 410 00:24:10,600 --> 00:24:15,440 Speaker 3: which focuses on adopting AI into the UK's public and private sectors, 411 00:24:15,840 --> 00:24:21,360 Speaker 3: and one of those sectors is the blue economy: oceans, fish, shellfish. 412 00:24:22,000 --> 00:24:26,440 Speaker 3: But oceans are huge, and getting data for oceans is difficult. 413 00:24:26,640 --> 00:24:29,240 Speaker 8: So you're dealing with something where there's not a lot 414 00:24:29,280 --> 00:24:35,000 Speaker 8: of people walking around collecting data. So the real difficulty 415 00:24:35,200 --> 00:24:39,720 Speaker 8: is understanding that, collecting enough data to make anything make sense. 416 00:24:40,320 --> 00:24:46,200 Speaker 8: And oceans are very complex in terms of their interaction 417 00:24:46,760 --> 00:24:50,000 Speaker 8: with our climate and how they interact with the climate, 418 00:24:50,480 --> 00:24:54,520 Speaker 8: so understanding the physics-based models is pretty challenging too. 419 00:24:55,200 --> 00:25:01,000 Speaker 3: Once again, enter IBM. IBM created a new geospatial model to 420 00:25:01,040 --> 00:25:05,399 Speaker 3: help us better understand our oceans. Hartree and IBM, along 421 00:25:05,440 --> 00:25:09,000 Speaker 3: with the Plymouth Marine Laboratory, the UK Science and Technology 422 00:25:09,080 --> 00:25:12,720 Speaker 3: Facilities Council, and the University of Exeter have all partnered 423 00:25:12,760 --> 00:25:16,159 Speaker 3: to focus the model's power on the waters around the 424 00:25:16,240 --> 00:25:21,399 Speaker 3: United Kingdom, which ultimately will help the UK's blue economy. 425 00:25:21,760 --> 00:25:24,960 Speaker 8: You get these major blooms in algae, so the ocean 426 00:25:25,000 --> 00:25:28,480 Speaker 8: goes green, and you might see it in lakes as well.
Now, 427 00:25:28,720 --> 00:25:32,960 Speaker 8: if you are shellfishing, and that's what you're harvesting, 428 00:25:34,160 --> 00:25:40,880 Speaker 8: you can't harvest cockles, mussels, to be very colloquial, when 429 00:25:40,920 --> 00:25:44,479 Speaker 8: you have algal blooms, because they're poisonous. So there are 430 00:25:44,520 --> 00:25:46,440 Speaker 8: certain times of the year where you can harvest, and 431 00:25:46,520 --> 00:25:49,439 Speaker 8: certain times of year you can't, if you keep having 432 00:25:49,440 --> 00:25:53,040 Speaker 8: the algal blooms. Just to put it in economic terms, 433 00:25:53,480 --> 00:25:56,960 Speaker 8: that's a problem. So if we look at it that way, 434 00:25:57,800 --> 00:26:01,439 Speaker 8: that's an issue. We really do need to try and 435 00:26:01,840 --> 00:26:06,359 Speaker 8: understand where these algal blooms will happen, when they will happen, 436 00:26:06,800 --> 00:26:09,879 Speaker 8: and how to limit them, because obviously, if you're shell 437 00:26:09,960 --> 00:26:13,320 Speaker 8: fishing as your livelihood, that's going to really impact you. 438 00:26:14,240 --> 00:26:18,480 Speaker 3: Kate told me that understanding these algal blooms, how they form, 439 00:26:18,720 --> 00:26:22,040 Speaker 3: why they form, and how they move would allow people 440 00:26:22,160 --> 00:26:23,280 Speaker 3: to better manage them. 441 00:26:24,359 --> 00:26:26,720 Speaker 8: What is it you're putting in the water? Are you 442 00:26:26,800 --> 00:26:30,960 Speaker 8: putting fertilizers in the water in the near-shore environment 443 00:26:31,040 --> 00:26:34,639 Speaker 8: that is causing those algal blooms? Is it because we 444 00:26:34,720 --> 00:26:39,480 Speaker 8: are heating up the oceans, and particularly our near-shore environments, 445 00:26:39,560 --> 00:26:43,000 Speaker 8: that is causing that? I don't know.
I'm not a specialist, 446 00:26:43,640 --> 00:26:47,360 Speaker 8: but that's what you're trying to figure out. Is there 447 00:26:47,440 --> 00:26:50,919 Speaker 8: something we are doing that is creating those environments that 448 00:26:51,080 --> 00:26:56,040 Speaker 8: is causing those algal blooms, or is it natural? And 449 00:26:56,160 --> 00:26:58,320 Speaker 8: natural is always a difficult one, because I would say 450 00:26:58,320 --> 00:27:01,520 Speaker 8: we live in a very managed environment, and particularly in 451 00:27:01,560 --> 00:27:05,879 Speaker 8: the UK, very few landscapes are natural. Most of it 452 00:27:05,920 --> 00:27:09,800 Speaker 8: is managed in some way. Are we managing it in 453 00:27:09,800 --> 00:27:12,560 Speaker 8: an appropriate way? Are there changes in how we behave 454 00:27:12,680 --> 00:27:13,960 Speaker 8: that could make things better? 455 00:27:14,920 --> 00:27:17,000 Speaker 3: Not that I needed more examples to sell me on 456 00:27:17,040 --> 00:27:19,920 Speaker 3: how useful the Prithvi models are, but Kate gave me 457 00:27:20,200 --> 00:27:23,840 Speaker 3: a few more use cases that reinforced just how exciting 458 00:27:23,920 --> 00:27:26,840 Speaker 3: foundation models are for our oceans. 459 00:27:27,680 --> 00:27:32,280 Speaker 8: These big brown seaweeds can really help with carbon sequestration. 460 00:27:32,880 --> 00:27:36,520 Speaker 8: Imagine if we could improve the environment enough so that 461 00:27:36,560 --> 00:27:38,880 Speaker 8: we could have more of that, so that we could 462 00:27:38,880 --> 00:27:42,520 Speaker 8: sequester more carbon. The other thing is wind power. In 463 00:27:42,560 --> 00:27:45,280 Speaker 8: the UK, we have a lot of offshore wind farms 464 00:27:45,680 --> 00:27:48,840 Speaker 8: and we're doing really well with our renewable energy resources.
465 00:27:49,119 --> 00:27:51,000 Speaker 8: So where do we put that and how does that 466 00:27:51,160 --> 00:27:56,560 Speaker 8: impact sand movements? So these sandbars and things aren't static, 467 00:27:56,680 --> 00:28:00,840 Speaker 8: they move, So understanding that is really important for where 468 00:28:00,840 --> 00:28:04,920 Speaker 8: you're going to put your suboceanic infrastructure. So you've got 469 00:28:04,960 --> 00:28:08,520 Speaker 8: cables going across the oceans. If we're going to use 470 00:28:08,560 --> 00:28:13,000 Speaker 8: our oceans more, we need to understand what that environmental 471 00:28:13,040 --> 00:28:15,040 Speaker 8: impact is going to be long term. 472 00:28:15,920 --> 00:28:18,639 Speaker 3: The Ocean Model launched at the end of September twenty 473 00:28:18,680 --> 00:28:28,800 Speaker 3: twenty five. The research is only beginning. When I sat 474 00:28:28,840 --> 00:28:31,840 Speaker 3: down with Kevin Murphy at NASA, I wanted to understand 475 00:28:32,040 --> 00:28:35,520 Speaker 3: where all of this impressive work was going. And one 476 00:28:35,560 --> 00:28:38,280 Speaker 3: of the signature aspects of this work is that it's 477 00:28:38,320 --> 00:28:42,360 Speaker 3: not just for IBM and NASA researchers. Anyone can use 478 00:28:42,400 --> 00:28:43,320 Speaker 3: these models. 
479 00:28:44,120 --> 00:28:46,760 Speaker 6: So before, if you were a researcher, or let's say 480 00:28:46,920 --> 00:28:51,240 Speaker 6: you were a farmer, or maybe a technology-informed person 481 00:28:51,280 --> 00:28:53,760 Speaker 6: that was interested in something like this, you would have 482 00:28:53,800 --> 00:28:57,160 Speaker 6: to learn about how to do remote sensing, how to 483 00:28:57,320 --> 00:29:01,000 Speaker 6: calibrate the imagery, how to stitch it together, because you 484 00:29:01,080 --> 00:29:03,000 Speaker 6: know they come in kind of postage stamps that you 485 00:29:03,040 --> 00:29:06,360 Speaker 6: have to squash together, and then you'd have to learn 486 00:29:06,440 --> 00:29:09,400 Speaker 6: about the algorithms necessary to do all the processing, right? 487 00:29:09,480 --> 00:29:12,360 Speaker 6: So a lot of work, and then you could actually 488 00:29:12,880 --> 00:29:16,000 Speaker 6: do the mapping that you were interested in. Today, what 489 00:29:16,040 --> 00:29:18,160 Speaker 6: you can do is you can go to Hugging Face, 490 00:29:18,400 --> 00:29:22,320 Speaker 6: which is where this model exists in the open, using 491 00:29:22,360 --> 00:29:25,080 Speaker 6: kind of our open science principles, and you can apply 492 00:29:25,160 --> 00:29:30,200 Speaker 6: it to future or historical observations without having all of 493 00:29:30,200 --> 00:29:31,520 Speaker 6: that background information. 494 00:29:32,040 --> 00:29:35,560 Speaker 3: And with the partnership between NASA and IBM, these foundation 495 00:29:35,680 --> 00:29:39,400 Speaker 3: models are multiplying. The new version of Prithvi I mentioned 496 00:29:39,480 --> 00:29:42,920 Speaker 3: launched in September twenty twenty four. Then in August twenty 497 00:29:43,000 --> 00:29:48,120 Speaker 3: twenty five, NASA and IBM launched another foundation model called Surya, 498 00:29:48,200 --> 00:29:51,320 Speaker 3: based on data from the Sun.
Surya can help predict 499 00:29:51,520 --> 00:29:56,360 Speaker 3: solar flares, which can disrupt communications and increase radiation for 500 00:29:56,480 --> 00:29:59,760 Speaker 3: high-altitude flights. And then there's the Ocean model I 501 00:29:59,760 --> 00:30:03,360 Speaker 3: talked about with Kate Royce. So what does the future 502 00:30:03,400 --> 00:30:07,080 Speaker 3: look like for all these foundation models built from NASA data? 503 00:30:07,720 --> 00:30:10,000 Speaker 3: If I wanted to look five or ten years out 504 00:30:10,040 --> 00:30:14,000 Speaker 3: to understand erosion patterns in a coastal town, could you 505 00:30:14,000 --> 00:30:14,280 Speaker 3: give me that? 506 00:30:14,320 --> 00:30:16,880 Speaker 6: Eventually, I think we'll get there. Yeah, you know, we've 507 00:30:16,920 --> 00:30:20,479 Speaker 6: really only been doing this for the past few years. 508 00:30:20,920 --> 00:30:25,200 Speaker 6: There are a lot of, I think, capabilities to still 509 00:30:25,240 --> 00:30:30,360 Speaker 6: discover and uncover with how we use these models for, 510 00:30:30,640 --> 00:30:33,400 Speaker 6: like, especially long-term predictions like you're talking about. 511 00:30:34,120 --> 00:30:36,600 Speaker 3: What do you think you can't do that you'd 512 00:30:36,760 --> 00:30:39,440 Speaker 3: really love to do? What's the kind of, like, great 513 00:30:39,440 --> 00:30:40,440 Speaker 3: white whale problem? 514 00:30:40,760 --> 00:30:42,440 Speaker 6: We can't do this today, but I'd like to be 515 00:30:42,520 --> 00:30:44,360 Speaker 6: able to do it in the future, which is really 516 00:30:44,400 --> 00:30:47,680 Speaker 6: the linking of the models together. Right. So right now 517 00:30:47,680 --> 00:30:52,200 Speaker 6: we have these isolated areas where, you know, we have 518 00:30:52,240 --> 00:30:57,360 Speaker 6: the Harmonized Landsat Sentinel geospatial model.
We have the 519 00:30:57,360 --> 00:31:01,000 Speaker 6: weather model, which can look at the short term. We're building 520 00:31:01,040 --> 00:31:05,240 Speaker 6: out the heliophysics model to look at the Sun's dynamics. 521 00:31:05,520 --> 00:31:08,360 Speaker 6: But there are probably going to have to be additional models 522 00:31:08,360 --> 00:31:10,719 Speaker 6: built so that we can understand how they interact with 523 00:31:10,760 --> 00:31:16,560 Speaker 6: one another, right? And that is, you know, kind of 524 00:31:16,600 --> 00:31:20,440 Speaker 6: towards a digital twin of kind of the solar system 525 00:31:20,480 --> 00:31:23,240 Speaker 6: or Earth systems, which I think is a big 526 00:31:23,280 --> 00:31:26,240 Speaker 6: hairy problem, but if we understand it, we might be 527 00:31:26,280 --> 00:31:28,040 Speaker 6: able to address some of the questions that you just 528 00:31:28,080 --> 00:31:29,000 Speaker 6: asked about prediction. 529 00:31:29,560 --> 00:31:32,800 Speaker 3: So if you linked all of those models together, basically 530 00:31:32,800 --> 00:31:35,240 Speaker 3: what you're saying is, you said a digital twin, 531 00:31:35,680 --> 00:31:43,120 Speaker 3: you're essentially replicating, holistically, how our world works. And do 532 00:31:43,200 --> 00:31:44,680 Speaker 3: you think that is achievable? 533 00:31:45,600 --> 00:31:48,760 Speaker 6: I don't think it's immediately achievable, but based on kind 534 00:31:48,760 --> 00:31:50,680 Speaker 6: of the progress that we've seen in the last three 535 00:31:50,760 --> 00:31:54,560 Speaker 6: or four years, I think it's more achievable today than 536 00:31:54,600 --> 00:31:55,200 Speaker 6: it was then. 537 00:31:55,800 --> 00:32:00,840 Speaker 3: You think you'll see it in your lifetime? Yeah, and I've 538 00:32:00,840 --> 00:32:17,720 Speaker 3: got a couple of years left.
Smart Talks with IBM 539 00:32:17,840 --> 00:32:21,760 Speaker 3: is produced by Matt Ramano, Amy Gains McQuaid, Trina Menino, 540 00:32:22,000 --> 00:32:26,280 Speaker 3: and Jay Harper. We're edited by Lacy Roberts. Engineering by 541 00:32:26,360 --> 00:32:30,840 Speaker 3: Nina Bird Lawrence, mastering by Sarah Buguer, music by Gramoscope, 542 00:32:31,040 --> 00:32:36,880 Speaker 3: strategy by Tatiana Lieberman, Cassidy Meyer and Sophia Derlin. Special 543 00:32:36,920 --> 00:32:42,600 Speaker 3: thanks to the team at NASA's Science Mission Directorate. Smart 544 00:32:42,600 --> 00:32:45,760 Speaker 3: Talks with IBM is a production of Pushkin Industries and 545 00:32:45,920 --> 00:32:50,840 Speaker 3: Ruby Studio at iHeartMedia. To find more Pushkin podcasts, listen 546 00:32:50,880 --> 00:32:54,880 Speaker 3: on the iHeartRadio app, Apple Podcasts, or wherever you listen 547 00:32:54,920 --> 00:32:59,400 Speaker 3: to podcasts. I'm Malcolm Gladwell. This is a paid advertisement 548 00:32:59,640 --> 00:33:04,240 Speaker 3: from IBM. The conversations on this podcast don't necessarily represent 549 00:33:04,320 --> 00:33:24,720 Speaker 3: IBM's positions, strategies or opinions. Since we recorded this episode, 550 00:33:25,040 --> 00:33:29,800 Speaker 3: IBM and NASA released Surya, their solar weather model. In 551 00:33:29,880 --> 00:33:33,680 Speaker 3: early testing, it showed a sixteen percent improvement in solar 552 00:33:33,720 --> 00:33:37,680 Speaker 3: flare prediction accuracy. This is the kind of improvement that 553 00:33:37,800 --> 00:33:41,840 Speaker 3: helps protect our satellites, our power grids, and our GPS 554 00:33:41,880 --> 00:33:46,240 Speaker 3: systems from the Sun's unpredictable nature. And the next step 555 00:33:46,320 --> 00:33:50,080 Speaker 3: in this partnership: another model coming in twenty twenty six.
556 00:33:50,400 --> 00:33:53,840 Speaker 3: Looking beyond the Earth and the Sun, the universe of 557 00:33:53,920 --> 00:33:56,440 Speaker 3: possibilities just keeps expanding.