1 00:00:00,120 --> 00:00:04,240 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. This season 2 00:00:04,320 --> 00:00:07,880 Speaker 1: on Smart Talks with IBM, Malcolm Gladwell is back, and 3 00:00:07,920 --> 00:00:10,600 Speaker 1: this time he's taking the show on the road. Malcolm 4 00:00:10,640 --> 00:00:14,720 Speaker 1: is stepping outside the studio to explore how IBM clients 5 00:00:14,760 --> 00:00:18,759 Speaker 1: are using artificial intelligence to solve real world challenges and 6 00:00:18,840 --> 00:00:23,000 Speaker 1: transform the way they do business, from accelerating scientific breakthroughs 7 00:00:23,079 --> 00:00:27,520 Speaker 1: to reimagining education. It's a fresh look at innovation in action, 8 00:00:27,960 --> 00:00:31,720 Speaker 1: where big ideas meet cutting edge solutions. You'll hear from 9 00:00:31,720 --> 00:00:36,000 Speaker 1: industry leaders, creative thinkers, and of course Malcolm Gladwell himself 10 00:00:36,240 --> 00:00:39,879 Speaker 1: as he guides you through each story. New episodes of 11 00:00:39,920 --> 00:00:43,280 Speaker 1: Smart Talks with IBM drop every month on the iHeartRadio app, 12 00:00:43,440 --> 00:00:47,240 Speaker 1: Apple Podcasts, or wherever you get your podcasts. Learn more 13 00:00:47,280 --> 00:00:49,920 Speaker 1: at IBM dot com slash Smart Talks. 14 00:00:52,520 --> 00:00:55,440 Speaker 2: If I were to go back, I don't know, thirty 15 00:00:55,520 --> 00:00:59,600 Speaker 2: years in Kenya, what's the difference between then and now? 16 00:00:59,640 --> 00:01:04,360 Speaker 2: In terms of tree cover. I'm talking to Philip Thigo, Special 17 00:01:04,400 --> 00:01:06,640 Speaker 2: Technology Envoy to the Kenyan President. 18 00:01:07,240 --> 00:01:10,160 Speaker 3: Let's put it this way: if you think about it, we are now at eleven, twelve 19 00:01:10,200 --> 00:01:13,200 Speaker 3: percent, and previously we were more than twenty percent.
20 00:01:13,600 --> 00:01:17,000 Speaker 3: So we are cutting trees more than we're planting them. 21 00:01:17,280 --> 00:01:21,000 Speaker 2: In thirty years, Kenya lost half its tree cover. Half. 22 00:01:21,720 --> 00:01:25,200 Speaker 2: And here's why that matters. Kenya is a mountainous country. 23 00:01:25,720 --> 00:01:28,560 Speaker 2: Dotted throughout the highlands are dozens of what Kenyans call 24 00:01:28,720 --> 00:01:34,919 Speaker 2: water towers, natural reservoirs, densely forested areas capable of absorbing 25 00:01:34,959 --> 00:01:37,200 Speaker 2: the enormous amount of water that falls on the country 26 00:01:37,280 --> 00:01:41,880 Speaker 2: during the rainy seasons. The tree roots and undergrowth secure 27 00:01:42,000 --> 00:01:45,640 Speaker 2: and capture moisture, then slowly release it into the rivers 28 00:01:45,640 --> 00:01:49,000 Speaker 2: that flow down into the country's low lying coastal areas. 29 00:01:49,640 --> 00:01:54,520 Speaker 2: But in recent years the water towers have depleted, settlements 30 00:01:54,680 --> 00:01:58,480 Speaker 2: have encroached on them, trees have been chopped down, thousands 31 00:01:58,560 --> 00:02:02,280 Speaker 2: of acres cleared. The natural reservoirs cease to hold nearly 32 00:02:02,280 --> 00:02:05,800 Speaker 2: as much water, so now Kenya is prone to extremes. 33 00:02:06,240 --> 00:02:08,760 Speaker 2: Too much water flowing down from the highlands in the 34 00:02:08,840 --> 00:02:12,360 Speaker 2: rainy season and too little water left during the dry season. 35 00:02:13,080 --> 00:02:15,120 Speaker 3: So you have a couple of hours of water, then 36 00:02:15,120 --> 00:02:17,440 Speaker 3: you have a couple of hours with no water, the 37 00:02:17,639 --> 00:02:20,560 Speaker 3: taps turned off by the city authority. So 38 00:02:20,639 --> 00:02:23,840 Speaker 3: that's the significance of the water towers
we have, when 39 00:02:23,880 --> 00:02:24,800 Speaker 3: they cannot hold water. 40 00:02:25,240 --> 00:02:29,440 Speaker 2: Kenya desperately needed to restore its water towers by planting 41 00:02:29,760 --> 00:02:33,239 Speaker 2: as many trees as humanly possible. So in the fall 42 00:02:33,280 --> 00:02:37,120 Speaker 2: of twenty twenty three, the Kenyan government took action. It 43 00:02:37,240 --> 00:02:41,799 Speaker 2: started a national holiday, National Tree Growing Day, a day 44 00:02:41,840 --> 00:02:44,560 Speaker 2: to allow the citizens of Kenya to go out into 45 00:02:44,600 --> 00:02:47,919 Speaker 2: the forests that dominate the Kenyan countryside and plant as 46 00:02:48,000 --> 00:02:52,080 Speaker 2: many trees as they can. And the government decided on 47 00:02:52,120 --> 00:02:52,640 Speaker 2: a number. 48 00:02:53,520 --> 00:02:58,480 Speaker 3: The President's real focus around how to ensure that 49 00:02:58,520 --> 00:03:00,880 Speaker 3: we don't lose more forests was in this very 50 00:03:00,919 --> 00:03:03,240 Speaker 3: ambitious campaign around fifteen billion trees. 51 00:03:03,760 --> 00:03:07,280 Speaker 2: That's right, fifteen billion, with a B. 52 00:03:07,840 --> 00:03:10,120 Speaker 3: So imagine, that number will tell you the ambition, but 53 00:03:10,240 --> 00:03:12,840 Speaker 3: it also tells you the deficit. It has to be 54 00:03:12,880 --> 00:03:15,160 Speaker 3: fifteen billion in the next eight years. 55 00:03:15,520 --> 00:03:18,799 Speaker 2: Fifteen billion trees over eight years averages out to more 56 00:03:18,800 --> 00:03:22,880 Speaker 2: than five million trees per day. That's a lot of trees. 57 00:03:23,440 --> 00:03:26,040 Speaker 2: But with such a massive goal, how can you track 58 00:03:26,080 --> 00:03:28,760 Speaker 2: your progress? How do you know where to plant those 59 00:03:28,800 --> 00:03:31,920 Speaker 2: trees so they'll have the most impact?
How do you 60 00:03:32,040 --> 00:03:35,920 Speaker 2: monitor where older trees are still being cut down? Well, 61 00:03:35,960 --> 00:03:39,640 Speaker 2: the answer to those questions came from IBM and a 62 00:03:39,680 --> 00:03:45,560 Speaker 2: little space agency called NASA. That's right, folks, Smart Talks 63 00:03:46,280 --> 00:03:50,400 Speaker 2: is going to space. My name is Malcolm Gladwell. You're 64 00:03:50,400 --> 00:03:53,920 Speaker 2: listening to the latest episode of Smart Talks with IBM, 65 00:03:54,280 --> 00:03:57,040 Speaker 2: where we offer our listeners a glimpse behind the curtain 66 00:03:57,160 --> 00:04:00,960 Speaker 2: of the world of technology. In this season, IBM has 67 00:04:01,000 --> 00:04:05,880 Speaker 2: gone inside elementary school classrooms, toured formulation labs at L'Oréal, 68 00:04:06,200 --> 00:04:10,080 Speaker 2: and spoken with the fan development team at Scuderia Ferrari HP. 69 00:04:11,080 --> 00:04:15,600 Speaker 2: In this episode, how IBM is partnering with NASA to 70 00:04:15,680 --> 00:04:20,200 Speaker 2: build geospatial models using data from satellites to better understand 71 00:04:20,200 --> 00:04:26,080 Speaker 2: our Earth and solar system. 72 00:04:26,160 --> 00:04:34,560 Speaker 4: Five, four, three, two, one, zero. All engines running. Liftoff. 73 00:04:34,600 --> 00:04:37,719 Speaker 4: We have a liftoff, thirty two minutes past the hour, 74 00:04:38,080 --> 00:04:39,440 Speaker 4: liftoff on Apollo eleven. 75 00:04:40,120 --> 00:04:43,800 Speaker 2: IBM has worked on space related projects since before I 76 00:04:43,839 --> 00:04:48,400 Speaker 2: was even born. "One small step for man." A team of 77 00:04:48,440 --> 00:04:52,320 Speaker 2: four thousand IBM engineers helped create the Saturn V rocket 78 00:04:52,320 --> 00:04:58,640 Speaker 2: that took Neil Armstrong to the Moon.
And when 79 00:04:58,640 --> 00:05:01,640 Speaker 2: I think of NASA, I tend to picture the moon landing, 80 00:05:02,160 --> 00:05:04,560 Speaker 2: or the team of people back in Houston guiding the 81 00:05:04,600 --> 00:05:08,520 Speaker 2: Apollo mission, or the Hubble telescope or astronauts aboard the 82 00:05:08,560 --> 00:05:13,080 Speaker 2: International Space Station. What I didn't think about until now 83 00:05:13,600 --> 00:05:15,320 Speaker 2: are NASA's geographers. 84 00:05:16,120 --> 00:05:18,039 Speaker 5: In order to go places, you need to map things. 85 00:05:18,520 --> 00:05:22,560 Speaker 2: This is Kevin Murphy, Chief Science Data Officer at NASA's 86 00:05:22,680 --> 00:05:24,039 Speaker 2: Science Mission Directorate. 87 00:05:24,760 --> 00:05:27,760 Speaker 5: But I think that there's an assumption that NASA's all 88 00:05:27,800 --> 00:05:31,560 Speaker 5: about rockets and astronauts, and certainly that's a really large 89 00:05:31,600 --> 00:05:32,920 Speaker 5: part and important part of NASA. 90 00:05:33,720 --> 00:05:37,080 Speaker 2: NASA sends people to space and looks out at the stars, 91 00:05:37,560 --> 00:05:41,160 Speaker 2: but NASA also looks down at the Earth. The agency 92 00:05:41,160 --> 00:05:46,960 Speaker 2: has about one hundred and fifty satellites that use radar, lidar, Landsat, Aqua, Terra, 93 00:05:47,120 --> 00:05:52,599 Speaker 2: CloudSat, Aura, low Earth orbit, medium Earth orbit, geostationary orbit, 94 00:05:52,800 --> 00:05:57,200 Speaker 2: on and on. In one sense, NASA makes hardware. They 95 00:05:57,200 --> 00:06:00,800 Speaker 2: build rockets and spacecraft and all those satellites that circle 96 00:06:00,839 --> 00:06:06,720 Speaker 2: the Earth. But fundamentally NASA also collects data.
Its scientists 97 00:06:06,760 --> 00:06:09,600 Speaker 2: and engineers, people like Kevin, want to make the 98 00:06:09,640 --> 00:06:13,560 Speaker 2: best use possible of all the information gathered by all 99 00:06:13,680 --> 00:06:15,800 Speaker 2: those many dozens of instruments. 100 00:06:16,360 --> 00:06:20,080 Speaker 5: Right now, we gather around twenty five petabytes of new 101 00:06:20,240 --> 00:06:23,200 Speaker 5: observational data per year. In the next couple of months, 102 00:06:23,200 --> 00:06:28,720 Speaker 5: we're about to launch a high resolution global radar. When 103 00:06:28,760 --> 00:06:32,320 Speaker 5: that launches, it will double how much we collect every year 104 00:06:32,400 --> 00:06:34,440 Speaker 5: to about fifty petabytes of information. 105 00:06:35,120 --> 00:06:39,160 Speaker 2: Actually, since we recorded this conversation, NASA launched that global 106 00:06:39,240 --> 00:06:43,920 Speaker 2: radar, what they call NISAR. So NASA is already generating 107 00:06:43,960 --> 00:06:46,880 Speaker 2: new data at the rate of fifty petabytes each year. 108 00:06:47,520 --> 00:06:50,800 Speaker 2: To put that in perspective, a single petabyte could hold 109 00:06:50,839 --> 00:06:55,560 Speaker 2: about five hundred billion pages of standard printed text. Now, 110 00:06:55,600 --> 00:06:58,480 Speaker 2: can anyone sort of apply to use this data? 111 00:06:58,760 --> 00:07:01,480 Speaker 5: They don't even have to apply. It's free and open data. 112 00:07:01,600 --> 00:07:05,400 Speaker 5: It advances how we understand what we do on Earth 113 00:07:05,480 --> 00:07:09,280 Speaker 5: and how we see ourselves within the universe. People can 114 00:07:09,360 --> 00:07:12,120 Speaker 5: take it for so many different downstream applications.
So you 115 00:07:12,160 --> 00:07:15,000 Speaker 5: can go to our websites today, you can search through 116 00:07:15,040 --> 00:07:19,480 Speaker 5: our tools, and you can download information from the Mars rovers, 117 00:07:19,480 --> 00:07:23,160 Speaker 5: you can download information from the Lunar Reconnaissance Orbiter or 118 00:07:23,200 --> 00:07:24,920 Speaker 5: any of the Earth Science Data satellites. 119 00:07:25,200 --> 00:07:28,560 Speaker 2: And give me an example of a really cool application, 120 00:07:29,320 --> 00:07:31,560 Speaker 2: a really cool use that someone, I don't know, in 121 00:07:31,640 --> 00:07:34,560 Speaker 2: academia or whatever has used your data for. Is there one? 122 00:07:34,560 --> 00:07:37,800 Speaker 5: Okay. So one of the really kind of cool 123 00:07:37,920 --> 00:07:41,760 Speaker 5: but unexpected observations that we had is that we launched 124 00:07:41,800 --> 00:07:46,160 Speaker 5: a pair of satellites in the early two thousands called GRACE, 125 00:07:46,560 --> 00:07:49,440 Speaker 5: and these satellites orbit the Earth and they can measure 126 00:07:49,560 --> 00:07:52,280 Speaker 5: very precisely the distance that they're away from each other 127 00:07:52,280 --> 00:07:55,000 Speaker 5: as they orbit the Earth. And as you go into 128 00:07:55,000 --> 00:07:58,440 Speaker 5: gravity wells, you can actually see a satellite accelerate and 129 00:07:58,480 --> 00:08:02,240 Speaker 5: the other one accelerate after it, right? And using that information, 130 00:08:02,600 --> 00:08:05,280 Speaker 5: we were trying to map kind of the gravity fields 131 00:08:05,600 --> 00:08:07,920 Speaker 5: of Earth. What they found is that they can 132 00:08:07,920 --> 00:08:11,360 Speaker 5: actually map, below kind of the mass of Earth, to 133 00:08:11,800 --> 00:08:15,320 Speaker 5: where water storage is.
For instance, so aquifers, right, so 134 00:08:15,600 --> 00:08:20,320 Speaker 5: you can monitor through gravity how much water is being 135 00:08:20,360 --> 00:08:24,320 Speaker 5: depleted or added to an aquifer, or the density of glaciers. 136 00:08:24,880 --> 00:08:28,240 Speaker 2: So, just to back up for a moment, the presence 137 00:08:28,400 --> 00:08:32,720 Speaker 2: and density of water deposits below the Earth's surface have 138 00:08:32,800 --> 00:08:38,079 Speaker 2: an effect on gravitational fields that are being measured in space. 139 00:08:38,120 --> 00:08:38,480 Speaker 5: Correct. 140 00:08:39,040 --> 00:08:42,559 Speaker 2: Yeah. And so does that tell you, presumably you learn 141 00:08:42,679 --> 00:08:44,600 Speaker 2: things like where there's an aquifer where you didn't think 142 00:08:44,600 --> 00:08:46,040 Speaker 2: there was an aquifer. 143 00:08:46,080 --> 00:08:48,920 Speaker 5: Or if it's being depleted faster? Yeah. 144 00:08:49,080 --> 00:08:51,480 Speaker 2: Yeah. So who's using that kind 145 00:08:51,360 --> 00:08:55,000 Speaker 5: of data? All sorts of different organizations, whether they're, you know, 146 00:08:55,280 --> 00:08:59,120 Speaker 5: NGOs or government agencies or people that are planning a 147 00:08:59,160 --> 00:09:00,439 Speaker 5: large agricultural project. 148 00:09:00,480 --> 00:09:02,600 Speaker 2: How did you... Was that an intentional decision? 149 00:09:02,640 --> 00:09:03,959 Speaker 5: It wasn't. It was accidental. 150 00:09:04,679 --> 00:09:11,240 Speaker 2: It was accidental. NASA has assembled a historically unprecedented mountain 151 00:09:11,240 --> 00:09:15,079 Speaker 2: of data about the physical world, free and open to anyone, 152 00:09:15,480 --> 00:09:18,240 Speaker 2: and the possibilities for how that information can be used 153 00:09:18,320 --> 00:09:24,120 Speaker 2: are so vast that even NASA is still uncovering them. 154 00:09:24,520 --> 00:09:27,200 Speaker 2: When I was a kid, I loved Legos.
I had 155 00:09:27,240 --> 00:09:30,320 Speaker 2: a huge bin full of them. At the time, Legos 156 00:09:30,400 --> 00:09:33,920 Speaker 2: were really just colored bricks of various sizes. They weren't 157 00:09:33,960 --> 00:09:36,880 Speaker 2: as complicated as they are today. And what I realized 158 00:09:36,920 --> 00:09:39,480 Speaker 2: even then was that there were more possibilities in a 159 00:09:39,520 --> 00:09:42,400 Speaker 2: box of Legos than I could ever imagine on my own. 160 00:09:43,240 --> 00:09:44,880 Speaker 2: I played with my brother and he would show me 161 00:09:44,920 --> 00:09:47,200 Speaker 2: something that hadn't occurred to me. And I'd go to 162 00:09:47,200 --> 00:09:49,520 Speaker 2: my friend Bruce's and see that he was off on 163 00:09:49,559 --> 00:09:53,040 Speaker 2: some Lego tangent that I'd never even thought of, like 164 00:09:53,080 --> 00:09:55,840 Speaker 2: a cool bridge or a castle or a truck. I 165 00:09:56,000 --> 00:09:59,480 Speaker 2: used Legos one way, Bruce used his Legos in a 166 00:09:59,559 --> 00:10:05,080 Speaker 2: completely different way. NASA's data treasure trove is like a 167 00:10:05,240 --> 00:10:08,520 Speaker 2: very, very big box of Legos. And here's the question: 168 00:10:09,080 --> 00:10:14,240 Speaker 2: with so much data, containing so many possible connections, could IBM, 169 00:10:14,520 --> 00:10:20,400 Speaker 2: and specifically IBM's artificial intelligence, help NASA scientists uncover patterns 170 00:10:20,400 --> 00:10:23,840 Speaker 2: and connect systems in a way they've never done before? 171 00:10:26,320 --> 00:10:28,800 Speaker 6: Everything started with a question, right. 172 00:10:28,840 --> 00:10:32,960 Speaker 2: I'm talking to Juan Bernabe-Moreno, director of IBM Research 173 00:10:33,000 --> 00:10:33,480 Speaker 2: in Europe.
174 00:10:34,280 --> 00:10:39,000 Speaker 6: As we advance AI, we have new tools to understand the 175 00:10:39,160 --> 00:10:42,600 Speaker 6: world around us, understand language, and understand 176 00:10:42,600 --> 00:10:45,720 Speaker 6: our planet. And the question that we were asking ourselves 177 00:10:45,840 --> 00:10:48,360 Speaker 6: was, with all these new advances that we see in language, 178 00:10:48,400 --> 00:10:51,720 Speaker 6: it was a post GPT moment, could we apply the 179 00:10:51,800 --> 00:10:55,320 Speaker 6: same idea and the same architecture and technology to 180 00:10:55,480 --> 00:10:56,560 Speaker 6: data about our planet? 181 00:10:57,200 --> 00:11:00,560 Speaker 2: The advent of AI created a new opportunity. What if 182 00:11:00,559 --> 00:11:04,160 Speaker 2: all of NASA's mountain of data could be organized, analyzed, 183 00:11:04,440 --> 00:11:09,760 Speaker 2: and understood by artificial intelligence? The original idea was to create 184 00:11:09,800 --> 00:11:13,440 Speaker 2: a geospatial foundation model for the Earth, and from there 185 00:11:13,920 --> 00:11:18,640 Speaker 2: create additional specialized models for other scientific priorities of NASA, 186 00:11:19,360 --> 00:11:23,079 Speaker 2: and finally create an AI system that can understand all 187 00:11:23,160 --> 00:11:26,920 Speaker 2: the data across those specialized models in order to uncover 188 00:11:27,040 --> 00:11:32,240 Speaker 2: hidden insights and relationships. Together, these models could unlock an 189 00:11:32,360 --> 00:11:36,960 Speaker 2: infinite number of potential applications. I asked Kevin Murphy at 190 00:11:37,040 --> 00:11:40,280 Speaker 2: NASA about the beginning of these Earth models.
191 00:11:40,880 --> 00:11:43,559 Speaker 5: I had some colleagues, and we were investigating a number 192 00:11:43,559 --> 00:11:48,560 Speaker 5: of different avenues of using AI with our data, but 193 00:11:48,679 --> 00:11:51,720 Speaker 5: also kind of the management and stewardship of the data, 194 00:11:51,760 --> 00:11:53,960 Speaker 5: so not only like the observations, but how we make 195 00:11:54,000 --> 00:11:57,760 Speaker 5: it available to people, make it discoverable. And they said, hey, 196 00:11:58,440 --> 00:12:01,000 Speaker 5: we see these transformer architectures. We think that they can 197 00:12:01,040 --> 00:12:05,199 Speaker 5: be applicable to some of the sequential observations that we make. 198 00:12:05,760 --> 00:12:07,840 Speaker 5: We'd really like to work with IBM on that. And 199 00:12:07,880 --> 00:12:12,120 Speaker 5: I was like, I'm really skeptical, because I hadn't 200 00:12:12,200 --> 00:12:18,240 Speaker 5: seen those types of tools really produce results that were 201 00:12:18,960 --> 00:12:22,000 Speaker 5: commensurate with the amount of effort you put into them, right? 202 00:12:22,040 --> 00:12:24,360 Speaker 5: So we were getting some really good results with deep 203 00:12:24,480 --> 00:12:27,720 Speaker 5: learning approaches, but they took a lot of effort. 204 00:12:28,000 --> 00:12:29,600 Speaker 2: But Kevin came around quickly. 205 00:12:30,440 --> 00:12:35,120 Speaker 5: When we typically develop a new data product or an algorithm, 206 00:12:35,600 --> 00:12:39,320 Speaker 5: it takes anywhere from, you know, twelve months, eighteen months, 207 00:12:39,320 --> 00:12:44,719 Speaker 5: twenty four months to go from data and hypothesis to 208 00:12:44,960 --> 00:12:48,880 Speaker 5: results which are validated.
We were able to get approximately 209 00:12:48,920 --> 00:12:54,000 Speaker 5: the same precision for some well known types of benchmarks 210 00:12:54,400 --> 00:12:56,800 Speaker 5: within, I think it was, about four months of 211 00:12:56,880 --> 00:12:57,560 Speaker 5: starting the work. 212 00:12:57,679 --> 00:13:01,840 Speaker 2: Yeah, yeah, so it happened faster than you thought. Much faster. 213 00:13:02,920 --> 00:13:06,560 Speaker 2: In twenty twenty three, IBM and NASA launched a foundation 214 00:13:06,760 --> 00:13:11,560 Speaker 2: model trained on NASA's Harmonized Landsat Sentinel-2 satellite data 215 00:13:11,960 --> 00:13:15,960 Speaker 2: across the continental United States. They named the model Prithvi, 216 00:13:16,480 --> 00:13:20,439 Speaker 2: the Sanskrit word for Earth. The first version of Prithvi 217 00:13:20,920 --> 00:13:24,960 Speaker 2: used only Earth observation images, and just that was enough 218 00:13:25,000 --> 00:13:29,000 Speaker 2: to totally change Kevin's idea of what foundation models could do. 219 00:13:30,040 --> 00:13:33,920 Speaker 2: But they didn't stop there. IBM and NASA were encouraged 220 00:13:34,080 --> 00:13:38,400 Speaker 2: at how well Prithvi worked for Earth observation tasks, so 221 00:13:38,720 --> 00:13:41,520 Speaker 2: they decided to create a more complex version of Prithvi 222 00:13:41,920 --> 00:13:46,040 Speaker 2: that could understand weather and climate data. They hoped this 223 00:13:46,080 --> 00:13:49,200 Speaker 2: new version of Prithvi would allow researchers to answer new 224 00:13:49,280 --> 00:13:52,920 Speaker 2: questions about the Earth, from short term weather forecasting to 225 00:13:53,000 --> 00:13:56,640 Speaker 2: longer term climate effects. Imagine you have a map of 226 00:13:56,800 --> 00:14:01,800 Speaker 2: all the different temperatures, pressures, clouds, rainfall and more from 227 00:14:01,840 --> 00:14:06,120 Speaker 2: around the globe.
With this map, IBM and NASA could 228 00:14:06,160 --> 00:14:10,560 Speaker 2: tackle advanced tasks. They could track the formation of El Niño, 229 00:14:11,040 --> 00:14:13,280 Speaker 2: or predict how the path of a hurricane would change 230 00:14:13,600 --> 00:14:17,120 Speaker 2: if the ocean temperature went up by half a degree. 231 00:14:17,400 --> 00:14:20,520 Speaker 6: I will always remember this moment. It was when we created 232 00:14:20,520 --> 00:14:25,640 Speaker 6: the Weather and Climate Foundation Model. The senior meteorologist of NASA, 233 00:14:26,080 --> 00:14:29,200 Speaker 6: he was like, I cannot believe it. It has changed the 234 00:14:29,280 --> 00:14:31,440 Speaker 6: way I think about AI. And ever since, he's 235 00:14:31,480 --> 00:14:34,000 Speaker 6: been kind of preaching with this example. 236 00:14:34,400 --> 00:14:36,880 Speaker 2: Juan and his team then took the model and decided 237 00:14:36,920 --> 00:14:41,000 Speaker 2: to test it, really test it. They took away ninety nine 238 00:14:41,040 --> 00:14:44,320 Speaker 2: percent of the data points and ran the experiment again. 239 00:14:44,960 --> 00:14:47,040 Speaker 2: What they were trying to figure out is if the 240 00:14:47,080 --> 00:14:50,120 Speaker 2: model had learned enough about the basic principles of the Earth, 241 00:14:50,520 --> 00:14:53,480 Speaker 2: the underlying physics of the way the planet works, to 242 00:14:53,600 --> 00:14:56,920 Speaker 2: fill in the blanks on its own. With just one 243 00:14:57,000 --> 00:15:00,600 Speaker 2: percent of the original data, would it still be accurate 244 00:15:00,640 --> 00:15:07,200 Speaker 2: in its predictions? What happened? The model crushed it. So 245 00:15:07,440 --> 00:15:09,840 Speaker 2: it was able to extrapolate on the basis of one 246 00:15:09,920 --> 00:15:13,320 Speaker 2: percent of the data what the entire picture looked like?
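The masking test described above can be illustrated with a toy sketch. To be clear, this is not IBM's evaluation code: the smooth synthetic "planet" field, the inverse-distance interpolation, and all the numbers here are invented for illustration. A foundation model learns far richer physics than this; the sketch only shows the shape of the task — hide 99 percent of a field, then try to recover the whole picture from the remaining 1 percent.

```python
import numpy as np

rng = np.random.default_rng(0)

# A smooth synthetic field standing in for, say, a global temperature map.
n = 64
y, x = np.mgrid[0:n, 0:n] / n
field = np.sin(2 * np.pi * x) * np.cos(2 * np.pi * y) + 0.5 * np.sin(4 * np.pi * (x + y))

# Keep only ~1% of the pixels, as in the experiment described in the episode.
mask = rng.random(field.shape) < 0.01
known_idx = np.argwhere(mask)          # coordinates of surviving pixels
known_val = field[mask]                # their values

# Naive inverse-distance-weighted reconstruction from the surviving 1%.
all_idx = np.argwhere(np.ones_like(field, dtype=bool))
d = np.linalg.norm(all_idx[:, None, :] - known_idx[None, :, :], axis=2)
w = 1.0 / (d + 1e-9) ** 2              # closer known pixels weigh more
recon = ((w @ known_val) / w.sum(axis=1)).reshape(field.shape)

# How far is the reconstruction from the hidden ground truth, on average?
err = np.abs(recon - field).mean() / np.abs(field).mean()
print(f"kept {mask.sum()} of {field.size} pixels, mean relative error {err:.2f}")
```

The point of the comparison is that even this crude interpolator recovers a blurry version of the field from 1 percent of the pixels; a model that has internalized the underlying dynamics can do dramatically better, which is what impressed the NASA team.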
247 00:15:13,440 --> 00:15:17,480 Speaker 6: Yes, because Prithvi learned everything, right. 248 00:15:17,400 --> 00:15:20,480 Speaker 2: Yeah, it learned the kind of principles of... Exactly. Yeah. 249 00:15:20,840 --> 00:15:24,280 Speaker 2: Oh wow, that's very, very impressive. So at that moment, 250 00:15:24,760 --> 00:15:28,840 Speaker 2: when you realized you could do that, and just curious 251 00:15:28,840 --> 00:15:31,720 Speaker 2: about your emotions, I mean, did you jump up and down? 252 00:15:31,760 --> 00:15:32,280 Speaker 2: What did you do? 253 00:15:32,760 --> 00:15:36,200 Speaker 6: So it's like, wow. It was a very emotional meeting, 254 00:15:36,240 --> 00:15:41,600 Speaker 6: because, you know, having this person say now I'm convinced, right? Yeah, 255 00:15:41,720 --> 00:15:44,600 Speaker 6: it was kind of quite a special moment. These 256 00:15:44,600 --> 00:15:46,280 Speaker 6: moments make your life as a researcher. 257 00:15:47,720 --> 00:15:51,400 Speaker 2: IBM and NASA launched Prithvi for Weather and Climate 258 00:15:51,480 --> 00:15:54,520 Speaker 2: in twenty twenty four, and while IBM and NASA 259 00:15:54,520 --> 00:15:58,520 Speaker 2: scientists could use Prithvi to run interesting experiments, they were 260 00:15:58,520 --> 00:16:02,080 Speaker 2: even more excited about how Prithvi could help people in 261 00:16:02,120 --> 00:16:10,400 Speaker 2: the real world. So let's go back to Kenya, Ambassador 262 00:16:10,480 --> 00:16:14,080 Speaker 2: Philip Thigo, and the country's great tree planting project. 263 00:16:14,920 --> 00:16:17,880 Speaker 3: So in those initial months, there was a massive effort, 264 00:16:17,960 --> 00:16:23,080 Speaker 3: including a couple of national holidays for tree planting. Yes, 265 00:16:24,000 --> 00:16:26,200 Speaker 3: where the entire cabinet was sent. 266 00:16:26,520 --> 00:16:28,560 Speaker 2: Ah, did you plant trees? 267 00:16:28,440 --> 00:16:29,080 Speaker 3: Ah, I did.
268 00:16:29,160 --> 00:16:31,040 Speaker 3: Oh my god, as I said, the entire cabinet. Plus, 269 00:16:31,240 --> 00:16:32,040 Speaker 3: we have to be seen. 270 00:16:32,480 --> 00:16:34,280 Speaker 2: Are you good at planting? Two weeks ago? 271 00:16:34,640 --> 00:16:36,520 Speaker 3: Well, it's very easy to dig a hole, put a tree 272 00:16:36,600 --> 00:16:38,720 Speaker 3: in the ground. 273 00:16:38,240 --> 00:16:42,040 Speaker 2: Well, wow. Planting a tree is easy. But remember, 274 00:16:42,480 --> 00:16:47,400 Speaker 2: it has to happen fifteen billion times. IBM Research has 275 00:16:47,440 --> 00:16:51,200 Speaker 2: been operating in Nairobi since twenty thirteen, and what Kenya 276 00:16:51,240 --> 00:16:55,200 Speaker 2: wanted, at least in the beginning, was straightforward. The 277 00:16:55,280 --> 00:16:58,280 Speaker 2: Prithvi model that IBM and NASA built could be 278 00:16:58,400 --> 00:17:02,480 Speaker 2: used to essentially make the world's greatest map, and Kenya, 279 00:17:02,600 --> 00:17:05,480 Speaker 2: with IBM's help, could use that model to make the 280 00:17:05,520 --> 00:17:09,440 Speaker 2: world's greatest map of Kenya. The first step was to 281 00:17:09,520 --> 00:17:12,560 Speaker 2: lay a grid across the topography of the country, break 282 00:17:12,600 --> 00:17:16,480 Speaker 2: the forest into manageable, bite sized pieces, each of which 283 00:17:16,480 --> 00:17:18,000 Speaker 2: could be analyzed separately. 284 00:17:18,840 --> 00:17:21,119 Speaker 3: So because our forest is massive when you look at 285 00:17:21,160 --> 00:17:23,800 Speaker 3: it in terms of green, but once you lay the grid, 286 00:17:24,040 --> 00:17:26,720 Speaker 3: you're able to break it into pieces, like into boxes.
287 00:17:26,920 --> 00:17:30,560 Speaker 3: And for us that was important, because then it's easier 288 00:17:30,640 --> 00:17:33,400 Speaker 3: to tackle it when it's in a grid system than 289 00:17:33,560 --> 00:17:36,040 Speaker 3: just as a massive forest. So that was also what 290 00:17:36,720 --> 00:17:38,120 Speaker 3: the model was able to do. 291 00:17:38,560 --> 00:17:42,119 Speaker 2: Then the model painstakingly sorted through each of those boxes 292 00:17:42,520 --> 00:17:45,720 Speaker 2: and looked for what Philip calls hotspots. 293 00:17:45,640 --> 00:17:48,320 Speaker 3: You can see, for example, very quickly which are the areas 294 00:17:48,320 --> 00:17:51,520 Speaker 3: being eroded very fast, that you need to 295 00:17:51,600 --> 00:17:54,800 Speaker 3: quickly protect. Yeah, because sometimes that's where you 296 00:17:54,800 --> 00:17:56,600 Speaker 3: want to target, right? I mean, it's not possible to 297 00:17:56,600 --> 00:17:58,240 Speaker 3: do everything at the same time. 298 00:17:58,400 --> 00:18:00,199 Speaker 2: Do you have a definition of a hotspot, and how 299 00:18:00,240 --> 00:18:02,440 Speaker 2: many hotspots are there according to that definition? 300 00:18:03,320 --> 00:18:03,920 Speaker 3: Oh, there are a lot. 301 00:18:04,040 --> 00:18:07,320 Speaker 3: So we have more than forty water towers, and I'll 302 00:18:07,320 --> 00:18:10,320 Speaker 3: tell you, all of them have hotspots. And the 303 00:18:10,320 --> 00:18:14,479 Speaker 3: hotspots, in my definition, are areas that are being degraded faster 304 00:18:14,640 --> 00:18:17,600 Speaker 3: and in a very unusual way. Right? You can literally 305 00:18:17,640 --> 00:18:21,040 Speaker 3: see how human activity is seriously degrading that particular area, 306 00:18:21,400 --> 00:18:23,600 Speaker 3: that if you do not have a direct intervention, you'll 307 00:18:23,640 --> 00:18:27,000 Speaker 3: lose the entire forest.
So that's the hotspot for us, 308 00:18:27,560 --> 00:18:29,520 Speaker 3: because think about cutting one hundred trees a day 309 00:18:29,600 --> 00:18:31,560 Speaker 3: versus cutting a million trees a day. So that's a 310 00:18:31,640 --> 00:18:34,680 Speaker 3: hotspot. You want to look at places where there's 311 00:18:34,920 --> 00:18:39,600 Speaker 3: just unusually high activity of deforestation in a hotspot. 312 00:18:39,880 --> 00:18:42,120 Speaker 2: The size of each box in the grid was ten 313 00:18:42,160 --> 00:18:45,880 Speaker 2: by ten meters, about half a tennis court. That's how 314 00:18:46,000 --> 00:18:50,119 Speaker 2: closely they were examining the forest. So, very crudely, the 315 00:18:50,160 --> 00:18:55,240 Speaker 2: model ingests all of this satellite data and it helps 316 00:18:55,240 --> 00:18:58,560 Speaker 2: you answer some very specific questions, like: where should we 317 00:18:58,600 --> 00:19:03,800 Speaker 2: prioritize our tree planting efforts? Which areas, down to an 318 00:19:03,800 --> 00:19:08,639 Speaker 2: extraordinary level of specificity, are eroding most quickly? You know, 319 00:19:08,680 --> 00:19:11,879 Speaker 2: all those kinds of practical questions about how to direct 320 00:19:11,920 --> 00:19:12,600 Speaker 2: your strategy. 321 00:19:12,920 --> 00:19:15,000 Speaker 3: So if you think about a smart forest, right, and 322 00:19:15,040 --> 00:19:17,679 Speaker 3: that's really for us, we're calling it smart fencing, smart forests, 323 00:19:17,680 --> 00:19:21,400 Speaker 3: everything that's smart because of AI. If you think about 324 00:19:22,119 --> 00:19:24,800 Speaker 3: your usual view, what you can see with your eyes, and 325 00:19:24,840 --> 00:19:27,879 Speaker 3: then the satellite layer, which just zooms in and you 326 00:19:27,920 --> 00:19:30,760 Speaker 3: see green.
So what the model has been able to 327 00:19:30,760 --> 00:19:33,080 Speaker 3: do is to create a smart layer, right, and in 328 00:19:33,200 --> 00:19:37,440 Speaker 3: that smart layer you can actually see many things, from analytics, 329 00:19:37,440 --> 00:19:40,320 Speaker 3: to the grids, to a dashboard, and a lot more. So 330 00:19:40,400 --> 00:19:44,600 Speaker 3: with that layer, with those blocks, you can quantify degradation 331 00:19:44,720 --> 00:19:48,400 Speaker 3: by blocks. You can map interventions, you can map reforestation. 332 00:19:48,880 --> 00:19:50,920 Speaker 2: I asked Philip to imagine what it would have been 333 00:19:51,000 --> 00:19:53,880 Speaker 2: like to attempt the tree planting project in an era 334 00:19:54,080 --> 00:20:00,399 Speaker 2: before AI. His answer was, plant fifteen billion trees, restore 335 00:20:00,400 --> 00:20:05,160 Speaker 2: the water towers? Impossible. With Prithvi on Kenya's side, though, 336 00:20:05,600 --> 00:20:08,960 Speaker 2: it's really happening. What should be clear by now is 337 00:20:09,000 --> 00:20:11,600 Speaker 2: how versatile Prithvi can be. I want to know how 338 00:20:11,640 --> 00:20:15,280 Speaker 2: to combat deforestation? Prithvi can model that. I want 339 00:20:15,280 --> 00:20:16,840 Speaker 2: to know when the best time in the year to 340 00:20:16,840 --> 00:20:20,120 Speaker 2: plant your crops is? Prithvi can help predict that too. 341 00:20:21,280 --> 00:20:25,600 Speaker 2: Last year, six months after IBM started helping Kenya with reforestation, 342 00:20:26,119 --> 00:20:29,320 Speaker 2: Kenya needed Prithvi's help on something else, and it was 343 00:20:29,359 --> 00:20:30,160 Speaker 2: an emergency. 344 00:20:31,000 --> 00:20:33,760 Speaker 3: So something was happening in the world, we sort 345 00:20:33,760 --> 00:20:36,080 Speaker 3: of had these floods that we didn't expect.
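The grid-and-hotspot idea Philip describes (divide the forest into 10×10 m cells, quantify degradation per cell, flag cells degrading unusually fast) can be sketched in a few lines. This is a hypothetical illustration of the concept only, not the actual Prithvi pipeline, which learns such patterns directly from satellite imagery; the threshold rule and all data values here are invented for the example.

```python
# Hypothetical sketch: flag "hotspot" grid cells whose tree-cover loss is
# unusually high compared with the rest of the grid. Not the real Prithvi
# model -- just the grid/hotspot concept from the conversation above.
from statistics import mean, stdev

def find_hotspots(cover_before, cover_after, z_threshold=2.0):
    """Return indices of cells whose tree-cover loss is unusually high.

    cover_before / cover_after: fraction of each 10x10 m cell covered by
    trees (0.0-1.0) at two points in time, one value per cell.
    """
    losses = [b - a for b, a in zip(cover_before, cover_after)]
    mu, sigma = mean(losses), stdev(losses)
    # A "hotspot" here is a cell degrading far faster than the grid average.
    return [i for i, loss in enumerate(losses) if loss > mu + z_threshold * sigma]

# Toy data: eight cells, one of which (index 2) is being rapidly cleared.
before = [0.90, 0.88, 0.91, 0.89, 0.90, 0.87, 0.92, 0.88]
after  = [0.88, 0.86, 0.45, 0.87, 0.89, 0.85, 0.90, 0.86]
print(find_hotspots(before, after))  # -> [2]
```

A real system would compute per-cell cover from classified satellite pixels and use a more robust anomaly test, but the targeting logic, ranking cells so limited protection effort goes where loss is fastest, is the same.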
346 00:20:36,880 --> 00:20:39,520 Speaker 2: In the spring of twenty twenty four, Kenya was hit 347 00:20:39,560 --> 00:20:43,840 Speaker 2: with thunderstorms and torrential rain, days and days of it. 348 00:20:44,640 --> 00:20:46,879 Speaker 3: And so I got a call from the Red Cross, 349 00:20:47,040 --> 00:20:50,439 Speaker 3: from one of my friends, and they're like, Ambassador, we 350 00:20:50,520 --> 00:20:53,639 Speaker 3: need a little bit of help on how we deal 351 00:20:53,720 --> 00:20:56,400 Speaker 3: with response, because what we are seeing is unusual, right? 352 00:20:56,440 --> 00:20:59,080 Speaker 3: Because normally you would only have one area. All 353 00:20:59,119 --> 00:21:02,680 Speaker 3: of a sudden, we had an entire country flooding. In April, 354 00:21:02,720 --> 00:21:07,280 Speaker 3: we had about three hundred square kilometers of total 355 00:21:07,440 --> 00:21:11,760 Speaker 3: land flooded, which is unusual for Kenya. And so when 356 00:21:11,760 --> 00:21:14,080 Speaker 3: I got this call, we were like, Okay, there's something 357 00:21:14,080 --> 00:21:16,760 Speaker 3: we did with IBM. We only did one function for 358 00:21:16,800 --> 00:21:19,600 Speaker 3: the trees. It was actually a climate model, and we said, 359 00:21:19,680 --> 00:21:23,720 Speaker 3: can we use this to help us better respond to 360 00:21:24,400 --> 00:21:28,000 Speaker 3: floods? And so that was how we started having this 361 00:21:28,440 --> 00:21:32,520 Speaker 3: discussion with IBM in terms of repurposing the model to 362 00:21:32,640 --> 00:21:36,760 Speaker 3: help us deal with this new challenge around floods. 363 00:21:37,320 --> 00:21:43,320 Speaker 2: Again, Prithvi is versatile. Prithvi could use everything it knew 364 00:21:43,320 --> 00:21:47,680 Speaker 2: about the land, the forests, and infrastructure to analyze how 365 00:21:47,800 --> 00:21:52,000 Speaker 2: and where and when floods would occur.
The Kenyan government 366 00:21:52,040 --> 00:21:54,439 Speaker 2: could then use the model to help the Red Cross 367 00:21:54,560 --> 00:21:58,920 Speaker 2: organize its response, show areas that needed to be evacuated 368 00:21:59,240 --> 00:22:01,640 Speaker 2: or safe places where the Red Cross could set 369 00:22:01,720 --> 00:22:05,560 Speaker 2: up camps. That information was invaluable. 370 00:22:06,520 --> 00:22:09,080 Speaker 3: Historically, what has happened is that they would set up 371 00:22:09,160 --> 00:22:14,359 Speaker 3: camp based on population congregation, right? Where people assemble is 372 00:22:14,359 --> 00:22:16,800 Speaker 3: where they set up a camp, not based on any data, 373 00:22:16,880 --> 00:22:20,440 Speaker 3: right? Simply because people are there, they will come there 374 00:22:20,440 --> 00:22:24,000 Speaker 3: to provide services and emergency response. What we realize is 375 00:22:24,040 --> 00:22:26,560 Speaker 3: that that model doesn't work. So what we've been able 376 00:22:26,560 --> 00:22:29,000 Speaker 3: to do with IBM is to sort 377 00:22:29,000 --> 00:22:32,560 Speaker 3: of give the Red Cross very specific locations or options where 378 00:22:32,600 --> 00:22:34,960 Speaker 3: to set up camps. So if people come here, just 379 00:22:34,960 --> 00:22:38,600 Speaker 3: tell them no, move here, that's the safe place you 380 00:22:38,680 --> 00:22:40,320 Speaker 3: really want to go. So I think for me that 381 00:22:40,440 --> 00:22:42,680 Speaker 3: was really amazing. We have a very funny 382 00:22:42,680 --> 00:22:44,920 Speaker 3: word for it, flood assembly points. We always have 383 00:22:45,400 --> 00:22:47,359 Speaker 3: fire assembly points, but now we can say we have 384 00:22:48,000 --> 00:22:52,720 Speaker 3: literally flood assembly points that are safe for citizens. 385 00:22:53,119 --> 00:22:57,679 Speaker 2: That's fascinating.
So the model has ingested this incredibly granular 386 00:22:58,520 --> 00:23:05,119 Speaker 2: picture of the topography and weather patterns of Kenya. 387 00:23:05,240 --> 00:23:08,840 Speaker 2: It's just giving you a set of useful predictions about 388 00:23:08,840 --> 00:23:10,800 Speaker 2: how you should shape your response. 389 00:23:11,760 --> 00:23:14,240 Speaker 3: Yes, and what we did, remember, is that, as I said, 390 00:23:14,280 --> 00:23:18,200 Speaker 3: it was a full multi-stack kind of capability. What IBM gave 391 00:23:18,280 --> 00:23:21,000 Speaker 3: us was a base map, we didn't have that before, 392 00:23:21,359 --> 00:23:23,920 Speaker 3: and a base model. So you can now have these layers 393 00:23:23,960 --> 00:23:25,359 Speaker 3: upon layers, upon layers, to be able to 394 00:23:25,359 --> 00:23:29,719 Speaker 3: make intelligent decisions. 395 00:23:31,160 --> 00:23:34,120 Speaker 2: Throughout my reporting on this episode, I've been really impressed 396 00:23:34,119 --> 00:23:37,160 Speaker 2: by what Prithvi can do. But it doesn't stop at 397 00:23:37,200 --> 00:23:41,000 Speaker 2: floods and reforestation. Prithvi has also been used to look 398 00:23:41,000 --> 00:23:45,200 Speaker 2: at wildfires and floods in the UK, and Kevin told 399 00:23:45,240 --> 00:23:48,480 Speaker 2: me that researchers in Africa have even used Prithvi to 400 00:23:48,680 --> 00:23:52,600 Speaker 2: identify locust breeding grounds, which could help them prevent swarms 401 00:23:52,600 --> 00:23:57,919 Speaker 2: that destroy crops. But all these are issues on land. 402 00:23:58,359 --> 00:24:00,560 Speaker 7: I mean, I always say to people, seventy percent of 403 00:24:00,600 --> 00:24:02,440 Speaker 7: our landmass is ocean.
404 00:24:03,000 --> 00:24:05,719 Speaker 2: Kate Royse is the director of the Hartree Centre, 405 00:24:06,040 --> 00:24:10,880 Speaker 2: which focuses on bringing AI into the UK's public and private sectors, 406 00:24:11,280 --> 00:24:16,480 Speaker 2: and one of those sectors is the blue economy: oceans, fish, shellfish. 407 00:24:17,440 --> 00:24:22,200 Speaker 2: But oceans are huge, and getting data from oceans is difficult. 408 00:24:22,080 --> 00:24:24,680 Speaker 7: So you're dealing with something where there's not a lot 409 00:24:24,720 --> 00:24:30,440 Speaker 7: of people walking around collecting data. So the real difficulty 410 00:24:30,640 --> 00:24:35,159 Speaker 7: is collecting enough data to make anything make sense. 411 00:24:35,760 --> 00:24:41,639 Speaker 7: And oceans are very complex in terms of their interaction 412 00:24:42,200 --> 00:24:45,440 Speaker 7: with our climate and how they interact with the climate, 413 00:24:45,960 --> 00:24:50,000 Speaker 7: so understanding the physics-based models is pretty challenging too. 414 00:24:50,640 --> 00:24:56,119 Speaker 2: Once again, enter IBM. IBM created a new geospatial model 415 00:24:56,320 --> 00:24:59,800 Speaker 2: to help us better understand our oceans. Hartree, 416 00:25:00,520 --> 00:25:03,840 Speaker 2: along with the Plymouth Marine Laboratory, the UK Science and 417 00:25:03,880 --> 00:25:07,679 Speaker 2: Technology Facilities Council, and the University of Exeter, have all 418 00:25:07,760 --> 00:25:11,480 Speaker 2: partnered to focus the model's power on the waters around 419 00:25:11,520 --> 00:25:16,280 Speaker 2: the United Kingdom, which ultimately will help the UK's blue economy. 420 00:25:17,200 --> 00:25:20,399 Speaker 7: You get these major blooms of algae, so the ocean 421 00:25:20,440 --> 00:25:23,359 Speaker 7: goes green, and you might see it in lakes as well.
422 00:25:23,640 --> 00:25:28,359 Speaker 7: Now if you are shell fishing and that's what you're harvesting, 423 00:25:29,600 --> 00:25:36,320 Speaker 7: you can't harvest cockles, mussels, to be very colloquial, when 424 00:25:36,359 --> 00:25:38,920 Speaker 7: you have algal blooms, because they're poisonous. 425 00:25:39,480 --> 00:25:40,040 Speaker 1: So there are. 426 00:25:39,960 --> 00:25:41,879 Speaker 7: Certain times of the year where you can harvest, and there 427 00:25:41,960 --> 00:25:44,879 Speaker 7: are certain times of year you can't. If you keep having 428 00:25:44,880 --> 00:25:48,479 Speaker 7: the algal blooms, just to put it in economic terms, 429 00:25:48,920 --> 00:25:52,400 Speaker 7: that's a problem. So if we look at it that way, 430 00:25:53,240 --> 00:25:56,800 Speaker 7: that's an issue. So we really do need to try 431 00:25:56,840 --> 00:26:01,119 Speaker 7: and understand where these algal blooms will happen, when they 432 00:26:01,200 --> 00:26:04,640 Speaker 7: will happen, and how to limit them, because obviously, if 433 00:26:04,680 --> 00:26:08,200 Speaker 7: you're shell fishing as your livelihood, that's going to really 434 00:26:08,240 --> 00:26:08,760 Speaker 7: impact you. 435 00:26:09,680 --> 00:26:13,919 Speaker 2: Kate told me that understanding these algal blooms, how they form, 436 00:26:14,160 --> 00:26:17,520 Speaker 2: why they form, and how they move, would allow people 437 00:26:17,600 --> 00:26:18,720 Speaker 2: to better manage them. 438 00:26:19,800 --> 00:26:22,159 Speaker 7: What is it you're putting in the water? Are you 439 00:26:22,240 --> 00:26:26,400 Speaker 7: putting fertilizers in the water in the near-shore environment 440 00:26:26,480 --> 00:26:30,080 Speaker 7: that is causing those algal blooms? Is it because we 441 00:26:30,160 --> 00:26:34,280 Speaker 7: are heating up the oceans, and particularly our near-shore 442 00:26:34,359 --> 00:26:37,520 Speaker 7: environments, that is causing that? I don't know.
I'm not 443 00:26:37,640 --> 00:26:41,560 Speaker 7: a specialist, but that's what you're trying to figure out. 444 00:26:42,440 --> 00:26:45,120 Speaker 7: Is there something we are doing that is creating those 445 00:26:45,240 --> 00:26:50,400 Speaker 7: environments that is causing those algal blooms, or is it natural? 446 00:26:51,400 --> 00:26:53,600 Speaker 7: And natural is always a difficult one, because I would 447 00:26:53,600 --> 00:26:56,960 Speaker 7: say we live in a very managed environment, particularly in 448 00:26:57,000 --> 00:27:01,280 Speaker 7: the UK. Very few landscapes are natural. Most of it 449 00:27:01,359 --> 00:27:05,240 Speaker 7: is managed in some way. Are we managing it in 450 00:27:05,240 --> 00:27:08,000 Speaker 7: an appropriate way? Are there changes in how we behave 451 00:27:08,119 --> 00:27:09,399 Speaker 7: that could make things better? 452 00:27:10,359 --> 00:27:12,440 Speaker 2: Not that I needed more examples to sell me on 453 00:27:12,480 --> 00:27:15,359 Speaker 2: how useful the Prithvi models are, but Kate gave me 454 00:27:15,640 --> 00:27:19,280 Speaker 2: a few more use cases that reinforced just how exciting 455 00:27:19,359 --> 00:27:22,320 Speaker 2: foundation models are for our oceans. 456 00:27:23,119 --> 00:27:27,720 Speaker 7: These big brown seaweeds can really help with carbon sequestration. 457 00:27:28,320 --> 00:27:31,960 Speaker 7: Imagine if we could improve the environment enough so that 458 00:27:32,000 --> 00:27:34,320 Speaker 7: we could have more of that, so that we could 459 00:27:34,359 --> 00:27:37,960 Speaker 7: sequester more carbon. The other thing is wind power. In 460 00:27:38,000 --> 00:27:40,720 Speaker 7: the UK, we have a lot of offshore wind farms 461 00:27:41,119 --> 00:27:44,280 Speaker 7: and we're doing really well with our renewable energy resources.
462 00:27:44,560 --> 00:27:46,440 Speaker 7: So where do we put that and how does that 463 00:27:46,600 --> 00:27:52,000 Speaker 7: impact sand movements? So these sandbars and things aren't static, 464 00:27:52,119 --> 00:27:56,280 Speaker 7: they move, so understanding that is really important for where 465 00:27:56,280 --> 00:28:00,880 Speaker 7: you're going to put your suboceanic infrastructure. You've got cables 466 00:28:00,920 --> 00:28:04,160 Speaker 7: going across the oceans. If we're going to use our 467 00:28:04,200 --> 00:28:08,920 Speaker 7: oceans more, we need to understand what that environmental impact 468 00:28:09,000 --> 00:28:10,520 Speaker 7: is going to be long term. 469 00:28:11,359 --> 00:28:14,080 Speaker 2: The Ocean Model launched at the end of September twenty 470 00:28:14,119 --> 00:28:24,240 Speaker 2: twenty five. The research is only beginning. When I sat 471 00:28:24,280 --> 00:28:27,320 Speaker 2: down with Kevin Murphy at NASA, I wanted to understand 472 00:28:27,480 --> 00:28:30,959 Speaker 2: where all of this impressive work was going. And one 473 00:28:31,000 --> 00:28:33,720 Speaker 2: of the signature aspects of this work is that it's 474 00:28:33,760 --> 00:28:37,800 Speaker 2: not just for IBM and NASA researchers. Anyone can use 475 00:28:37,840 --> 00:28:38,760 Speaker 2: these models. 
476 00:28:39,560 --> 00:28:42,200 Speaker 5: So before, if you were a researcher, or let's say 477 00:28:42,360 --> 00:28:46,680 Speaker 5: you were a farmer, or maybe a technology-informed person 478 00:28:46,720 --> 00:28:49,200 Speaker 5: that was interested in something like this, you would have 479 00:28:49,240 --> 00:28:52,600 Speaker 5: to learn about how to do remote sensing, how to 480 00:28:52,760 --> 00:28:56,480 Speaker 5: calibrate the imagery, how to stitch it together, because you 481 00:28:56,520 --> 00:28:58,440 Speaker 5: know they come in kind of postage stamps that you 482 00:28:58,480 --> 00:29:02,200 Speaker 5: have to squash together, and then you'd have to learn about 483 00:29:02,200 --> 00:29:05,000 Speaker 5: the algorithms necessary to do all the processing, right? So, 484 00:29:05,160 --> 00:29:08,520 Speaker 5: a lot of work, and then you could actually do 485 00:29:08,800 --> 00:29:11,560 Speaker 5: the mapping that you were interested in. Today, what you 486 00:29:11,600 --> 00:29:14,160 Speaker 5: can do is you can go to Hugging Face, which 487 00:29:14,200 --> 00:29:17,959 Speaker 5: is where this model exists in the open, using kind 488 00:29:17,960 --> 00:29:20,719 Speaker 5: of our open science principles, and you can apply it 489 00:29:21,160 --> 00:29:25,800 Speaker 5: to future or historical observations without having all of that 490 00:29:25,960 --> 00:29:26,960 Speaker 5: background information. 491 00:29:27,480 --> 00:29:31,000 Speaker 2: And with the partnership between NASA and IBM, these foundation 492 00:29:31,120 --> 00:29:34,880 Speaker 2: models are multiplying. The new version of Prithvi I mentioned 493 00:29:34,920 --> 00:29:38,400 Speaker 2: launched in September twenty twenty four. Then in August 494 00:29:38,440 --> 00:29:43,560 Speaker 2: twenty twenty five, NASA and IBM launched another foundation model called Surya, 495 00:29:43,640 --> 00:29:46,760 Speaker 2: based on data from the Sun.
Surya can help predict 496 00:29:46,960 --> 00:29:51,800 Speaker 2: solar flares, which can disrupt communications and increase radiation for 497 00:29:51,920 --> 00:29:55,280 Speaker 2: high-altitude flights. And then there's the ocean model I 498 00:29:55,280 --> 00:29:58,800 Speaker 2: talked about with Kate Royse. So what does the future 499 00:29:58,840 --> 00:30:02,520 Speaker 2: look like for all the foundation models built from NASA data? 500 00:30:03,160 --> 00:30:05,440 Speaker 2: If I wanted to look five or ten years out 501 00:30:05,480 --> 00:30:09,280 Speaker 2: to understand erosion patterns in a coastal town, could you 502 00:30:09,240 --> 00:30:11,400 Speaker 5: give me that? Eventually, I think we'll get there. Yeah, 503 00:30:11,680 --> 00:30:15,000 Speaker 5: you know, we've really only been doing this for the 504 00:30:15,040 --> 00:30:19,280 Speaker 5: past few years. There are a lot of, I think, 505 00:30:19,520 --> 00:30:24,240 Speaker 5: capabilities to still discover and uncover with how we use 506 00:30:24,680 --> 00:30:28,560 Speaker 5: these models, especially for long-term predictions like you're talking about. 507 00:30:28,360 --> 00:30:31,880 Speaker 2: What do you think you can't do that 508 00:30:32,000 --> 00:30:34,400 Speaker 2: you'd really love to do? What's the kind of, like, 509 00:30:34,640 --> 00:30:35,880 Speaker 2: great white whale problem? 510 00:30:36,200 --> 00:30:37,920 Speaker 5: We can't do this today, but I'd like to be 511 00:30:37,960 --> 00:30:39,800 Speaker 5: able to do it in the future, which is really 512 00:30:39,840 --> 00:30:43,120 Speaker 5: the linking of the models together. Right. So, right now 513 00:30:43,120 --> 00:30:47,640 Speaker 5: we have these isolated areas where, you know, we have 514 00:30:47,680 --> 00:30:52,800 Speaker 5: the harmonized Landsat Sentinel, our geospatial model. We have the 515 00:30:52,800 --> 00:30:56,040 Speaker 5: weather model, which can look at short-term predictions.
We're 516 00:30:56,040 --> 00:31:00,720 Speaker 5: building out the heliophysics model to look at the dynamics. 517 00:31:00,960 --> 00:31:03,800 Speaker 5: But there are probably going to have to be additional models 518 00:31:03,800 --> 00:31:06,160 Speaker 5: built so that we can understand how they interact with 519 00:31:06,200 --> 00:31:12,000 Speaker 5: one another, right? And that is, you know, kind of 520 00:31:12,040 --> 00:31:15,880 Speaker 5: towards a digital twin of kind of the solar system 521 00:31:15,960 --> 00:31:18,720 Speaker 5: or Earth systems, which I think is a big 522 00:31:18,720 --> 00:31:21,680 Speaker 5: hairy problem, but if we understand it, we might be 523 00:31:21,720 --> 00:31:23,480 Speaker 5: able to address some of the questions that you just 524 00:31:23,520 --> 00:31:24,440 Speaker 5: asked about prediction. 525 00:31:25,000 --> 00:31:28,240 Speaker 2: So if you linked all of those models together, basically 526 00:31:28,240 --> 00:31:30,680 Speaker 2: what you're saying is, you say a digital twin, 527 00:31:31,120 --> 00:31:37,280 Speaker 2: you're essentially replicating holistically how our world works. 528 00:31:37,800 --> 00:31:38,000 Speaker 5: Yep. 529 00:31:38,200 --> 00:31:40,120 Speaker 2: And do you think that is achievable? 530 00:31:41,080 --> 00:31:44,200 Speaker 5: I don't think it's immediately achievable, but based on kind 531 00:31:44,200 --> 00:31:46,120 Speaker 5: of the progress that we've seen in the last three 532 00:31:46,200 --> 00:31:50,040 Speaker 5: or four years, I think it's more achievable today than 533 00:31:50,040 --> 00:31:50,640 Speaker 5: it was then. 534 00:31:51,200 --> 00:31:55,520 Speaker 2: Do you think you'll see it in your lifetime? 535 00:31:55,520 --> 00:31:56,800 Speaker 5: Yeah, sure, I'm hopeful, and I've got a couple 536 00:31:56,680 --> 00:32:04,600 Speaker 5: of years left.
537 00:32:12,120 --> 00:32:15,920 Speaker 2: Smart Talks with IBM is produced by Matt Romano, Amy Gaines McQuaid, 538 00:32:16,400 --> 00:32:20,680 Speaker 2: Trina Menino, and Jake Harper. We're edited by Lacy Roberts. 539 00:32:21,160 --> 00:32:25,320 Speaker 2: Engineering by Nina Bird Lawrence, mastering by Sarah Bruguiere, music 540 00:32:25,400 --> 00:32:31,120 Speaker 2: by Gramoscope, strategy by Tatiana Lieberman, Cassidy Meyer, and Sophia Derlin. 541 00:32:31,920 --> 00:32:36,560 Speaker 2: Special thanks to the team at NASA's Science Mission Directorate. 542 00:32:37,720 --> 00:32:40,880 Speaker 2: Smart Talks with IBM is a production of Pushkin Industries 543 00:32:41,120 --> 00:32:45,719 Speaker 2: and Ruby Studio at iHeartMedia. To find more Pushkin podcasts, 544 00:32:46,000 --> 00:32:49,960 Speaker 2: listen on the iHeartRadio app, Apple Podcasts, or wherever you 545 00:32:50,040 --> 00:32:54,120 Speaker 2: listen to podcasts. I'm Malcolm Gladwell. This is a paid 546 00:32:54,160 --> 00:32:59,240 Speaker 2: advertisement from IBM. The conversations on this podcast don't necessarily 547 00:32:59,280 --> 00:33:19,440 Speaker 2: represent IBM's positions, strategies, or opinions. Since we recorded 548 00:33:19,440 --> 00:33:24,560 Speaker 2: this episode, IBM and NASA released Surya, their solar weather model. 549 00:33:25,160 --> 00:33:28,600 Speaker 2: In early testing, it showed a sixteen percent improvement in 550 00:33:28,760 --> 00:33:32,840 Speaker 2: solar flare prediction accuracy. This is the kind of improvement 551 00:33:33,040 --> 00:33:36,760 Speaker 2: that helps protect our satellites, our power grids, and our 552 00:33:36,800 --> 00:33:41,360 Speaker 2: GPS systems from the Sun's unpredictable nature. And the next 553 00:33:41,400 --> 00:33:45,520 Speaker 2: step in this partnership: another model coming in twenty twenty six, 554 00:33:45,840 --> 00:33:49,280 Speaker 2: looking beyond the Earth and the Sun.
The universe of 555 00:33:49,360 --> 00:33:51,880 Speaker 2: possibilities just keeps expanding.