Speaker 1: This is a big week for you.
Speaker 2: I mean, just yesterday you announced the largest budget ever to save and improve lives globally. And yet you're a little bit frustrated.
Speaker 1: That not many people think about health. Why?
Speaker 3: Health is pretty central to the human condition.
Speaker 4: If you go back over one hundred years, about a third of all kids died before the age of five, and in rich countries those numbers went down. By the turn of the century, we had gotten to where about ten million kids died every year under the age of five, mostly in poor countries, partly because the death rates were twenty times higher there and because a lot of the births were there. And this is one of the great stories ever: how we used things like Gavi, which was introduced here at Davos, to get vaccines out to all the world's children and cut that down below five million a year. So it had visibility. It was a priority in funding. It's a priority in terms of innovation, and that's the Gates Foundation's primary area today. You know, we do face some headwinds, though not on the innovation side.
Speaker 4: On the innovation side, we've got AI; we have many new tools coming. The challenge is actually funding for delivery. You know, aid budgets are flat overall, so the amount of money for health in Africa is going down quite a bit as other topics come up. You'd hope that as people say, hey, here's new money for climate or some other cause, they'd maintain their commitment to the high-impact interventions. We'll see as we raise money for Gavi: can we get an increase, can we stay flat? Or is the most impactful aid in the world squeezed, pushing that five million back up, instead of, if we really are vigilant, getting it down to two and a half million sometime in the twenty-thirties?
Speaker 1: But why is that so hard?
Speaker 2: And again, is it because people's perception is that there are other crises they need to fight, and so this goes down the priority list?
Speaker 4: Well, you know, the first thing that would be ideal is if countries would get to point seven percent in their aid budgets. Then you would have room for Ukraine and climate and education and vaccines; you know, we'd be in great shape if people were at that point seven percent. And so the Gates Foundation is the biggest advocate for overall generosity there, and then for making sure the health portion, which is very high impact, is well spent, and that the voters who get behind it feel good about it: okay, you're responsible for this result with only a tiny, tiny portion of your budget. And the benefits are not only humanitarian; stability, the economy, the environment all benefit hugely, because as we make the world healthy, that's when population growth actually gets to a steady state.
Speaker 1: So what do you want to focus on in health?
Speaker 2: And we have some actually pretty incredible gadgets that use AI, for example to scan, when women are pregnant, whether the baby is okay. But in the next two to three years, what would your priorities be?
Speaker 4: Well, I've been amazed at the speed of innovation, and that's even before now. We'll use AI tools to design new vaccines and get at the very toughest problems, like getting a great HIV vaccine. This is a device that will cost one thousand dollars or less, and we've connected it to AI software, so the cost per delivery is less than a dollar. You scan and you see: is there anything about the cord or the amniotic sac such that that woman is likely going to need a C-section? We funded looking at tens of thousands of pregnancies and showed the AI: here, this one came out normal, this one came out abnormal. And now it's even better than the best human at doing that prediction. So, you know, even in rural Africa, you connect this to a cell phone, you do the scan, it tells you what you need to know. That would cut maternal deaths in half if we made it broadly available.
Speaker 2: And so what's the challenge in making... And I've seen pictures; I mean, it's incredible.
Speaker 2: You can really hook it up to an iPad, and you can kind of see, even when there's, you know, a nurse in a village of maybe ten women. What's the challenge in getting this to the right place at scale?
Speaker 4: Well, we had to spend hundreds of millions of dollars to invent this, and now it's going through the regulatory process, which is also very expensive. Sadly, there's no natural market for this. You know, the people who benefit from it have very little money, so the normal market mechanism never makes it a priority for the innovation agenda. Without governments being generous, and philanthropy, it just never happens. But it's coming together. I was in Kenya seeing the work, and, you know, we'll get it out there. We have partnerships with lots of the ultrasound companies, including Philips, who, if we fund it, work with us very well.
Speaker 2: And there's an HPV single dose, so you basically have a backpack of goodies that saves lives.
Speaker 3: Yeah, there's a ton of this stuff.
Speaker 4: And here with HPV we did something pretty amazing.
Speaker 4: We brought the cost of the vaccine down by bringing a lot of manufacturers in, people like the Serum Institute and some Chinese manufacturers. But we also did something that was very expensive and risky: we went out and really studied women who'd only gotten one dose of the vaccine, and showed that they were protected as well as people who'd had two or three doses. Given that, you know, the vaccine budget is super limited, that means that now we're at a single dose, which over twenty countries have adopted, and we can protect twice as many women with our very limited resources as we could when the view was that you had to get out there with two doses. And, you know, who's motivated to spend tens of millions of dollars to prove something like that? There's just no market mechanism that ever proves you need fewer doses.
Speaker 2: But politicians, are they listening? We have multiple crises: wars, climate change, inflation. I mean, has healthcare worldwide really dropped off the list when you speak to world leaders?
Speaker 4: Well, when we raised money for the Global Fund a year ago, we actually ended up with a little bit less money, which is tragic. The Global Fund works on TB, malaria, and HIV. As we raise money for Gavi at the end of this year, we'll see: does the US maintain its commitment to its HIV work, which is both its PEPFAR bilateral work and being by far the largest donor, twenty-five percent, to the Global Fund? We'll see whether these, you know, very high-impact things, saving lives for less than one thousand dollars per life saved, hold up in the face of things that are not as dramatic but are getting more attention. You know, if we had held a meeting on global health, I don't think we'd get seventy thousand some people to come; we work hard to get seven hundred people to come. And yet, if you want people to be dealing with climate change, making sure a kid has good nutrition is by far more impactful, because then, even if there's a little bit more malaria or a little bit more heat, that's what makes them incredibly resilient.
Speaker 4: And yet that nutrition research and those investments aren't counted in this quota of, okay, let's shift the money. Global health is a little bit off the radar screen right now.
Speaker 2: So you've really made healthcare your life's work. And I think this morning you said, look, you're a little bit jealous of the attention on COP.
Speaker 3: Well, I'm involved in climate change as well.
Speaker 4: I'm actually the... you know, I do more philanthropy for climate change than anyone else does; I fund more private companies. It's a very important issue over the long run. You know, as fields mature, you figure out how to spend money well. In the year two thousand in global health, we weren't doing autopsies; that's a thing the Gates Foundation funds, to really understand the burden of disease. We weren't gathering the data; we fund the Institute for Health Metrics and Evaluation to do that. And the donors weren't engaged and working together. So that's been super successful. Climate is at a very early stage, and so a lot of these grants are not high impact, and we're not very good at measuring it.
Speaker 4: The innovation agenda there, which will also be AI-supercharged, is very exciting, so I'm actually optimistic about climate. But for the next ten years, when money is going to be so limited, we need... you know, if you care about the health impacts of climate, the health spending should go up, not down.
Speaker 2: Are you worried about elections around the world, and the fact that they could also change, I guess, dedication or spending in general towards...
Speaker 4: Yeah, the challenge I talked about is before you factor in the potential of having governments that don't really care about their role in the world and, you know, don't want to go forward with that generosity of spirit, working with others. So take, for example, the US commitment to HIV. If that goes away, you know, that's tens of millions of deaths. That's really big; I mean, that's way more than all the wars put together.
Speaker 2: Can private companies step in to try and fill a potential void?
Speaker 4: Basically, no. Companies can help.
Speaker 4: They can help with the visibility, they can help with the innovation; there's a lot they can do. But against the, you know, one hundred and thirty billion dollars of aid money that's given, we're not going to get the private sector, or even philanthropy, to make up any shortfalls. The numbers just don't work.
Speaker 2: Malaria. I mean, this is also a huge topic that needs to be addressed.
Speaker 4: Well, malaria, you know... when we first got started, of those ten million deaths, a million were malaria deaths. That's down to a half million. And we have incredible tools. We work, for example, with the chemical companies on bed nets; you know, we fund the trials, and we have a next generation of bed nets that deals with the resistance. We have some very incredible tools that will take five years to deploy, called gene drive, that are for decimating mosquito populations, and used in combination with prophylaxis and treatment, we have a chance of reducing the malaria map and eventually getting to full eradication. That's probably a twenty-year thing.
Speaker 4: You know, the lifetime of our foundation is about that long. So, you know, we're working on polio as our bigger eradication right now, along with some of the neglected diseases like guinea worm. But as we succeed in that, then, you know, with malaria, with measles, we need to get rid of them altogether.
Speaker 2: The US... and of course it's a big election year. If, you know, Donald Trump comes into the White House, but even under a second Biden administration, what does that mean for all of these? What does it mean exactly for how much they're going to spend on health?
Speaker 1: How will they deal with climate change? And even AI?
Speaker 4: Well, the US, in terms of fostering climate innovation... the IRA is the biggest thing that ever happened in the world in terms of accelerating climate innovation. And so a lot of the companies... I'm in over one hundred and twenty companies that work on the different areas of emissions, and they are moving much faster.
Speaker 4: Some of them couldn't even exist without those tax credits, which over time will allow them to get to green technologies that don't bear what I call a green premium, so you can make cement around the world with no emissions and not pay more. At first it will cost more, and that's why the learning-curve tax credits in the IRA are so critical to solving this problem not just for the US but for the entire world. At scale, it matters whether those green premiums are eliminated or brought down to be very low. So certainly on climate, on caring about the world, voters will, you know, be somewhat expressing their moral views in terms of who they pick going forward. You know, should the WHO and its work continue? Should, you know, PEPFAR continue?
Speaker 3: It's a question.
Speaker 2: If the US doesn't fulfill that role, are there other countries that can step up? Not a chance? Not even if they come together?
Speaker 4: Oh, I mean, the US is the world's biggest economy.
233 00:14:06,840 --> 00:14:10,400 Speaker 4: The percentage of innovation that it says you and the 234 00:14:10,840 --> 00:14:13,080 Speaker 4: and I. You know, I go to every government and 235 00:14:13,080 --> 00:14:15,520 Speaker 4: I say, please be more generous. But I've never been 236 00:14:15,559 --> 00:14:19,800 Speaker 4: able to say, hey, the US is being selfish. Why 237 00:14:19,800 --> 00:14:22,880 Speaker 4: don't you go to your taxpayers and say, hey, let's 238 00:14:22,880 --> 00:14:26,320 Speaker 4: make up for us selfishness. That pitch has never really 239 00:14:26,960 --> 00:14:29,400 Speaker 4: gotten that much traction. 240 00:14:30,800 --> 00:14:31,560 Speaker 3: So no, the. 241 00:14:33,120 --> 00:14:37,000 Speaker 4: Figure of merit is percentage of DDP point seven percent 242 00:14:37,440 --> 00:14:41,280 Speaker 4: is considered very generous. Only Norway and Sweden or above that. 243 00:14:41,840 --> 00:14:44,360 Speaker 4: You know, Germany's at that level, The UK was at 244 00:14:44,360 --> 00:14:47,520 Speaker 4: that level, went back down to point five. The US 245 00:14:47,640 --> 00:14:51,200 Speaker 4: is at point twenty four, which you know, when Bush 246 00:14:51,240 --> 00:14:54,080 Speaker 4: came into office, we were at point one. The biggest 247 00:14:54,200 --> 00:14:59,560 Speaker 4: increase in my lifetime was during Presidence Bush administration. He 248 00:14:59,680 --> 00:15:04,000 Speaker 4: did a malaria program. Both of the HIV program is 249 00:15:04,040 --> 00:15:08,200 Speaker 4: pretty incredible, and that's been maintained on a bipartisan basis. 250 00:15:08,240 --> 00:15:12,920 Speaker 4: It's only now that we see at being attacked. 251 00:15:13,640 --> 00:15:15,600 Speaker 3: You know, so it hangs in the balance. 
Speaker 4: We had a great twenty-year celebration last year of the over twenty million lives that have been saved with that program.
Speaker 2: But I mean, a lot of the time you spend is also on making sure that healthcare is top of the agenda, right? It's almost...
Speaker 4: Care about humans, you know: childhood death, women bleeding to death.
Speaker 3: And we're not talking about small numbers.
Speaker 4: We're talking about millions and millions. And people decide: is that, you know, a priority for a tiny part of their budgets? Should there be more giving? All these things can help, and it's a very positive thing, because, you know, I have dozens of innovations that have been funded over these last ten years that are very low cost and ready to roll out. And that's why the two and a half million is within reach if we stay focused on these things, and that would get you to two percent of kids dying before the age of five.
Speaker 4: Rich countries are at one percent, so you'd be within a factor of two of, you know, sort of the basic statement of the Gates Foundation, which is that all lives have equal value. We won't get there in my lifetime, but we will get close if we stay involved and committed.
Speaker 2: I mean, COVID has also changed our relationship to healthcare. What does that mean for vaccine hesitancy, in developing countries and developed nations?
Speaker 4: Well, you know, the misinformation about vaccines, and the associating of certain people like myself or Fauci with having malign intent with vaccines, that was most acute in the United States. But the pandemic, which you would have thought... wow, global health research, talking about health, being ready for the next pandemic... you know, when you've got millions of deaths, it's sad, it's tragic, but isn't there at least the benefit that health is on the agenda? Sadly, it's a topic nobody wants to talk about, because it was painful. You know, it's over, let's move on from that. And even the idea of, okay, the importance of the WHO, the importance of vaccines, the importance of vaccine research, of telling people, hey, vaccines really are very well tested, that's in question. We've had vaccine hesitancy historically. For example, in Nigeria in two thousand and one, the polio campaign faced disaster when some politicians said that the oral polio vaccine was sterilizing Muslim women. It took us five years, working with the religious leaders, where they would visibly vaccinate their children and talk about where the vaccine came from and how it kept children from being paralyzed or dying, and we were able to get rid of polio in Nigeria. We have two countries left, Afghanistan and Pakistan, where we've never gotten to zero. It spread back to Africa, so we have to also clean that up, but the hardest part is those two, and we're very close. With a little bit of luck, in a few years we should be able to get to zero in Pakistan and Afghanistan.
Speaker 1: Well, I guess we have two minutes left.
Speaker 2: How do you harness AI to be good, actually, for humankind?
Speaker 4: Well, AI, I'd say, is going to raise productivity generally, and normal capital markets, with great competition from Microsoft and Google, will drive that. And you should all pay attention, because it is so dramatic how it improves white-collar productivity, and later, with robotics, not yet but eventually, blue-collar productivity. That is phenomenal for the world: the world will be richer and, you know, can work less and have more. The place where it's about equity, where we have a huge commitment, is making sure there's not the normal twenty-year lag between the benefit reaching the rich versus the developing countries. There's a bigger teacher shortage in Africa than elsewhere, a bigger doctor shortage. So not only will we invent new tools using AI, like the ultrasound; we will provide health advice directly, you know, in people's local African language, fully tailored to the conditions in those countries.
Speaker 4: We will provide an AI doctor, we will provide an AI tutor, and we've already funded lots of Africans to do pilot studies and to take the very best technology and get it out at about the same time as it will happen in the rich world. In fact, in a few cases, rich-world regulations may make it roll out slower there than in countries like India or in Africa. So it's a race, but it's a race for good, and, you know, I couldn't be more thrilled. It's a case where my ongoing work with Microsoft helps me understand how we take this into the developing countries.
Speaker 2: But overall you're net positive on AI in general, globally?
Speaker 4: AI is a brilliant tool for people to be more productive. Now, it means the bad guys will be more productive too: they can do more cyberattacks, they can design weapons. The Internet, the microprocessor, all these things helped everybody be more efficient, including the bad guys. So you've got to make sure the best AI for cyber defense, or, you know, measures to defend against bioterrorism, is in the hands of the good guys.
Speaker 4: And, you know, it's a challenge, but people sometimes lose sight of the fact that this is the biggest productivity advance in our lifetimes,