Speaker 1: Hi everyone. I'm Katie Couric, and this is Next Question.

Speaker 1: Hi everyone. So happy to be here with Mr. Gates and all of you. Very exciting. He's made the rounds already; he's pretty much been everywhere, so we're very excited that we have sort of a private-ish audience with him. And as you know, he has autographed a copy of his memoir for everybody here, so you guys will have fun reading it and talking about it on your various platforms, you book influencers. I know myself, from writing a memoir, what an intense experience it is looking back on your childhood, looking back on everything that has influenced you as you grew. And this is really about the first part of Bill's life. So I'm really excited, Bill. I really enjoyed reading it, and it's the first of three volumes. Holy cow, you have a lot to say, don't you?

Speaker 2: Yeah. At first we were thinking I'd just write one, but then I wasn't covering things in a good way, so that really just didn't come together.
And so it was eighteen months ago I thought, maybe I'll just do my first twenty-five years, where I can really focus on my parents and the great luck I had with friends, and when I was born.

Speaker 1: So it starts really with your childhood growing up in Seattle and goes all the way to your decision to leave Harvard to commit yourself full time to building Microsoft into one of the country's most consequential companies. I'm curious: what did you learn going back in time? What did you learn about yourself that you really didn't appreciate before?

Speaker 2: Well, I don't look back much, because the future, you know, has lots of innovation and things that I'm kind of rushing and working hard to try and achieve nowadays, mostly through my foundation work. And so it was only as I was about to turn seventy later this year that I thought, no, it would be great to look back. And you know, I always thought of my parents as kind of amazing, each in kind of a different way.
But that really grew for me as I went through this and talked to my sisters, made sure they had the same views about things. And there are various points where my parents really made great decisions, about sending me to a therapist or a private school, or giving me an immense amount of freedom, more than I'd say any kid gets today.

Speaker 1: I loved reading about your family tree, but I got an especially big kick out of your Gami, as you called her, your maternal grandmother. She was kind of a card shark in a family that was apparently obsessed with card games. What kind of life lessons did you learn? I love the fact that when you finally beat her, there wasn't a big celebration. I would have been so obnoxious; I would have been jumping around and dancing and, you know, making her feel really bad. But you kind of said it just sort of went by, though you were still very proud of that moment, right?
Speaker 2: Yeah. I mean, she won a high percentage of the time, but she didn't really explain what her state machine was as the cards went by, in, you know, rummy or gin, or bridge, which is the most complicated, but even hearts; there's quite a bit of strategy. And so it took a number of years before I went from being mostly a loser to mostly a winner. And I think she was a little chagrined that I caught on, because she never really explained her secret.

Speaker 1: But wasn't it that she really understood her opponents' hands? What was her secret? Because I love what you wrote: card playing taught me that no matter how complex or mysterious something seems, you often can figure it out. The world can be understood.

Speaker 2: Well, in some of these games, simply having an exact memory for what cards have gone by and, you know, what they've picked up. I actually wrote down the mathematically optimal strategy for a number of these things.

Speaker 1: You did?
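The card memory Gates describes, keeping an exact record of which cards have gone by so you can infer what could still be in other hands, can be sketched in a few lines. This is purely an illustrative toy (the `CardTracker` class and its card model are our own invention, not anything from the book or from Gates's actual notes):

```python
from itertools import product

RANKS = ["2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K", "A"]
SUITS = ["clubs", "diamonds", "hearts", "spades"]

class CardTracker:
    """Remember every card that has 'gone by' and infer what is left."""

    def __init__(self):
        self.seen = set()

    def observe(self, rank, suit):
        # Record a card as it is played or picked up.
        self.seen.add((rank, suit))

    def remaining(self, suit=None):
        # Everything in the full deck that has not yet been seen,
        # optionally restricted to a single suit.
        deck = set(product(RANKS, SUITS))
        left = deck - self.seen
        if suit is not None:
            left = {card for card in left if card[1] == suit}
        return left

tracker = CardTracker()
for rank in ["A", "K", "Q"]:   # three high hearts have already been played
    tracker.observe(rank, "hearts")

# 13 hearts minus the 3 we saw leaves 10 unaccounted for
print(len(tracker.remaining("hearts")))   # → 10
```

Knowing the complement of the seen set is exactly what lets a player reason about which cards an opponent can still be holding, which is the edge being described here.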
Speaker 2: And, you know, people weren't that interested in it, but if you thought hard enough, there was a best play, and there's subtlety to it. And so for the person who's really spent the time, you know, when you walk into the game you have an incredible edge.

Speaker 1: Your dad, Bill Junior, was a lawyer. Your mom, Mary, served on the boards of several corporations and nonprofit organizations. You were the middle kid; you had an older sister and a younger sister, coincidentally the same as your kids, right? But looking back on your childhood, what quality do you think you got from each of your parents?

Speaker 2: Well, my dad was mostly just setting an amazing example. He was always calm, he always knew the answer, he never acted out of emotion. The world was a predictable place where you could always think things through. My mom I spent way more time with, because my dad would go off to work in the morning, and, you know, when he came back we'd have dinner and talk about mostly what he was working on.
But, you know, all day long my mom is saying, you know, get up, get dressed, keep your elbows off the table.

Speaker 1: She was kind of a tiger mom, wasn't she?

Speaker 2: A little bit. You know, she did it kind of in an implicit way, where when she'd speak about other parents she'd say, oh, they must be so disappointed, you know, their son didn't go to college, or, you know, he never learned his table manners. And I said to my mom, you know, you told me to go to Harvard, and she said, no, I never said that. But then I explained, no, you really did say it; you didn't have to, in this indirect way of, you know, showing what your values were. So that was an intense relationship. You know, I confused her a bit, because I would, you know, stay in my room and read a lot. And there was a period of years where I was a bit rebellious and not really thinking.

Speaker 1: You write a lot about the tension between the two of you. You say: I could go days without speaking, emerging from my room only for meals and school. Call me to dinner? I ignored it.
Tell me to pick up my clothes? Nope. Clear the table? Nothing. Bill, no offense, but you sound like you were a real asshole.

Speaker 2: There were, there were about two years where a lot of...

Speaker 1: I thought he would crack a smile at least at that.

Speaker 2: I don't think I did any permanent damage. But eventually they sent me to a therapist, and he, you know, really got my trust, and gave me a bunch of books to read, and, you know, just kind of explained that putting my energy against my parents, mostly my mom, you know, that was kind of a waste of my time, and that there wasn't any, you know, grand victory to be had there, because they were on my side. And that really was super helpful, and so I gave up all that, you know, noncompliance.

Speaker 1: Dr. Cressey, that was his name. He also gave your parents some good advice, and that was, basically, a little to their dismay,
as you talk about in the book, to give you more freedom, right? To kind of let you be yourself. And you found that freedom through hikes with a group of, how shall I say it, fellow nerds. And I was amazed that you got through one; I loved reading about this one miserably cold hike where, you know, you guys were up to your knees, or maybe even higher, in snow. It was just horrible conditions, and you got through it by writing code in your head for hours. Tell us about that, and how you kind of had this alternative reality, in a way, that helped you cope.

Speaker 2: Yeah, the Pacific Northwest, where I grew up, there were a lot of great hikes, and through Boy Scouts I got in with a group where we took a lot of hikes. It wasn't the hiking so much that I loved, but just having this different group of friends and kind of challenging ourselves.

Speaker 1: You all took on different roles. Like, you were the fire builder, right?

Speaker 2: Right. I wasn't a great hiker.
And in fact, the day where I did this software work, they had all outvoted me to take the long route, because they thought it was more scenic or something. And we carried these very heavy packs, and I was pretty scrawny. So, you know, in order to just not dwell on the rain or the switchbacks, I started thinking through a very complex piece of what's called a BASIC interpreter, and almost three years later I drew on that knowledge for Microsoft's very first product. But the idea of really good software, it's kind of elegant how small and fast it can be. So I really put my mind to that and, you know, spent about five hours really making it one of the best pieces of software I ever wrote.

Speaker 1: And that took your mind off the miserable conditions.

Speaker 2: Yeah, then I didn't notice all those switchbacks that we were going up and down.

Speaker 1: You know,
you also write that if you were growing up today, you'd likely be diagnosed as autistic. And I thought it was interesting that you said you often rocked when you were sort of concentrating, and it's something you still do today when you're deep in thought or focused on something. When did you realize that this might be the case, that you might be on the spectrum in some way?

Speaker 2: Well, when I was growing up, that term just didn't exist. I mean, autism was a pretty narrow definition, in terms of not ever developing full social skills. And it wasn't until I was an adult, in a Q&A session, and somebody said that, you know, maybe I was rocking at the time, and I thought, well, that's interesting. You know, I don't want to say no, because, you know, what am I doing saying that I'm better, or that it would be awful. And then, you know, some of the things do fit: my social skills were slow to develop; that ability to concentrate for, you know, hours on end, to, you know, read more books than the other kids, which ended up being a strength, is very characteristic of that type of mind.
So, you know, it wasn't like some deficit for me, even though some things I had to work hard at to get decent.

Speaker 1: And the fact that you acknowledge that: you talk about your neurodiversity, and you say it was also a superpower, right? I mean, talk about that. Some of these qualities that might be seen as, you know, as I said, on the spectrum, or differently abled, or whatever, actually worked to your advantage.

Speaker 2: Yeah. Well, that term "nerd," you know, you never know whether it's being used as a positive or a negative. When I was growing up, it was kind of a negative about, you know, young boys like me who would just go off and obsess about programming or any kind of technical thing, and that, you know, we were boring and not important. You know, then eventually it became, oh, you're a nerd, wow.

Speaker 1: I think it's a compliment. I think it means like a brainiac, but probably not a player, you know what I'm saying?

Speaker 2: Okay, well, I'm ready.

Speaker 1: You'll accept that?

Speaker 2: Definitely.
It's definitely specializing, and wanting to read a lot about something, which in my case included learning how to write software, at a time when computers were very expensive and it was very rare for people to have time on them. Through a series of experiences, each of which kind of built on the last, I got feedback about, okay, here's how you write even better software. So even by the time I graduated from high school, I'd had thousands of hours, and with my intense focus, you know, I knew a lot about software, and simultaneously the magic of these chips, microprocessors, making computing go from being very expensive to almost free. You know, with the help of Paul Allen, we could see, hey, the software stuff is going to be the key ingredient. He wanted to do a company where we actually did the hardware, but I insisted we just do the software. And, you know, we were in the right place at the right time, and, you know, built because of that. Other people just didn't see what we saw.
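Gates twice mentions the BASIC interpreter he worked out in his head, and the elegance of small, fast software. A real BASIC interpreter is far beyond a short example, but the core parse-and-evaluate idea can be illustrated with a toy recursive-descent evaluator for arithmetic expressions (a hypothetical sketch of the general technique, not Microsoft's code):

```python
import re

# Toy interpreter for integer expressions with + - * / and parentheses:
# a vastly simplified illustration of the kind of parse-and-evaluate
# loop a BASIC interpreter performs.

def tokenize(src):
    # Numbers and single-character operators; whitespace is skipped.
    return re.findall(r"\d+|[()+\-*/]", src)

def evaluate(src):
    tokens = tokenize(src)
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def take():
        nonlocal pos
        tok = tokens[pos]
        pos += 1
        return tok

    def expr():          # expr := term (('+'|'-') term)*
        value = term()
        while peek() in ("+", "-"):
            op = take()
            value = value + term() if op == "+" else value - term()
        return value

    def term():          # term := atom (('*'|'/') atom)*
        value = atom()
        while peek() in ("*", "/"):
            op = take()
            value = value * atom() if op == "*" else value // atom()
        return value

    def atom():          # atom := NUMBER | '(' expr ')'
        if peek() == "(":
            take()
            value = expr()
            take()       # consume the closing ')'
            return value
        return int(take())

    return expr()

print(evaluate("2 + 3 * (10 - 6)"))   # → 14
```

The grammar comments show how operator precedence falls out of the call structure: `term` binds tighter than `expr`, so multiplication happens before addition without any explicit precedence table.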
Speaker 1: If you want to get smarter every morning with a breakdown of the news and fascinating takes on health and wellness and pop culture, sign up for our daily newsletter, Wake Up Call, by going to katiecouric.com.

Speaker 1: Before you went to Harvard, you had a defining moment in your life. You say, unlike a lot of successful people, you had a relatively easy childhood, I mean, taking out the part about giving your mom a hard time. But, you know, you suffered the loss of your very, very good friend, and you write about it quite movingly: Kent Evans. He was killed in a mountain climbing accident when the two of you were sixteen years old, and I'm curious if you could talk about the impact that had and how it changed your perspective.

Speaker 2: Yeah. So when the computer arrives at this school, Lakeside School, there's four of us: my best friend Kent, who I'd talk to every night, and these two older boys, one of whom was Paul. We stayed there even after it wasn't exciting anymore; we were just really obsessed.

Speaker 1: You would sneak out of your house to go to the computer lab.
Speaker 2: Yeah, at night, yeah, yeah. And so Kent, you know, was very outward-looking. You know, he had looked up what do generals make, what do ambassadors make, what do professors make. You know, he got me reading Fortune magazine, and so he had this sense of, okay, we need to think now about what we're going to be doing later. And that was a revelation to me. I'd been a bit of a goof-off, and he encouraged me not to be so lackadaisical, because he was very diligent in everything he did. You know, so we were thinking, okay, we're going to go out there and solve big problems together. And he signed up for a mountain climbing class in our junior year of high school. He was less coordinated than I was, so nobody expected that, but his parents kind of went along, because Kent had strong views about things. And, you know, we talked on a Friday night, like we always did, and he was going to be gone for the weekend.
And then that Sunday, the headmaster of the school called me and said that not only had Kent been injured, he'd actually died. Which, you know, when you're at that age, okay, grandparents, maybe, but, you know, people your own age are kind of invulnerable. And so that was quite a shock. You know, the only thing that got me, you know, thinking forward again was, you know, when I would spend time with Kent's parents; you know, it was kind of an irreparable loss for them. He was amazing, to them and to me. And so, you know, I was lucky that all those things Kent had taught me, you know, I could go back and do. In fact, that summer I had committed to do a very complicated piece of software for school scheduling. So I called Paul, who'd gone off for his first year of college, and he came back, and that's sort of where he and I spent a lot of time together, and, you know, eventually that led to us founding Microsoft about three years later.
Speaker 1: In fact, there's a scene in the book where Paul Allen bursts into your dorm room at Harvard holding the January nineteen seventy-five issue of Popular Electronics, with this headline: "Project Breakthrough! World's First Minicomputer Kit to Rival Commercial Models." And he said to you, it's happening without us. That was a real kick in the pants for both of you, wasn't it?

Speaker 2: Yeah. I'd helped Paul get a job back in Boston, so he was on leave, and we were just brainstorming all the time about this insight, about how magical these chips were and what they would do, and, you know, different ideas, and just stunned that other people didn't see how big this would be. But still thinking through, okay, what was our role in all of it? And so yeah, when he bought that magazine at Harvard Square, and, you know, it's a cold Boston winter, we said, okay, we were going to be in on the ground floor, and now it's happening. And so even though I loved my time at Harvard, you know, that's when I had to make the plan: okay, I'm about to go on leave.
Speaker 1: Wasn't your mom bummed out that you quit Harvard?

Speaker 2: Well, I didn't quit. I mean, you go on leave. You know, so if your company fails, you know, they don't, like, shun you: oh, you had a failed company, or, oh, you were gone for a year. It would have been fine. You know, I didn't have, you know, kids to support or anything like that. And, you know, my computer skills were quite valuable, so I didn't think of it as risky at all. She might have been a bit worried about it, because she always wondered if I was taking a wrong turn that she needed to advise me on. But, you know, we started small, and then a few years later, at the end of the book, it's nineteen seventy-nine and I've decided to move Microsoft back to Seattle. And, you know, we only have about eighteen people, but we're on our way to being sort of the software...

Speaker 1: And the book ends when you're around twenty-three. You get into your Porsche. Nice ride for a college kid, by the way.
326 00:20:40,359 --> 00:20:42,800 Speaker 1: By the way, oh, I was a used nine eleven, 327 00:20:44,720 --> 00:20:46,679 Speaker 1: and you headed back to Seattle, as you said, to 328 00:20:46,680 --> 00:20:50,199 Speaker 1: build Microsoft. But now fifty years later you've said you're 329 00:20:50,320 --> 00:20:56,840 Speaker 1: thinking of quote digital empowerment as an unadulterated good has changed, 330 00:20:57,400 --> 00:21:01,920 Speaker 1: And I'm curious back then, Bill, if you ever envisioned 331 00:21:02,640 --> 00:21:08,520 Speaker 1: technology being as transformative and as negative a force in 332 00:21:08,680 --> 00:21:11,840 Speaker 1: society in some cases that it's become. 333 00:21:15,240 --> 00:21:20,240 Speaker 2: In the early days, it felt like a pure, unadult, 334 00:21:20,240 --> 00:21:25,560 Speaker 2: traded good thing that you could type documents and do 335 00:21:25,680 --> 00:21:29,760 Speaker 2: spreadsheets and communicate with people all over the world, and 336 00:21:30,000 --> 00:21:32,800 Speaker 2: it was just Okay, humans do good things, and now 337 00:21:32,800 --> 00:21:36,919 Speaker 2: we're going to do them faster and better. And so 338 00:21:37,000 --> 00:21:43,720 Speaker 2: that naivete I keep that literally until social networking comes along. 339 00:21:44,320 --> 00:21:47,239 Speaker 2: In fact, we thought the biggest problem with computers was 340 00:21:47,840 --> 00:21:50,960 Speaker 2: that not everybody had access, So we need to make 341 00:21:51,000 --> 00:21:54,280 Speaker 2: them even cheaper and get them into the inner city 342 00:21:54,359 --> 00:21:57,440 Speaker 2: and get them into developing countries. So this digital divide 343 00:21:58,640 --> 00:22:02,280 Speaker 2: was something that I worked on, you know, putting computers 344 00:22:02,280 --> 00:22:06,960 Speaker 2: in libraries, and Microsoft also did a lot on that. 
345 00:22:07,600 --> 00:22:11,480 Speaker 2: Only later did the idea come that some of these technologies 346 00:22:11,600 --> 00:22:17,040 Speaker 2: can actually accentuate human weakness, and you can get people 347 00:22:17,040 --> 00:22:20,400 Speaker 2: who believe crazy things that normally would have to kind 348 00:22:20,400 --> 00:22:23,400 Speaker 2: of give it up because they can't find each other. 349 00:22:23,440 --> 00:22:26,040 Speaker 2: But now, boy, they have found each other. You know, 350 00:22:26,200 --> 00:22:29,280 Speaker 2: for crazy idea A, they've got a quorum. For crazy 351 00:22:29,320 --> 00:22:32,040 Speaker 2: idea B, they've got a quorum, and all day long 352 00:22:32,119 --> 00:22:36,679 Speaker 2: they're enjoying being off in that, you know, sort of 353 00:22:36,760 --> 00:22:43,440 Speaker 2: nonfactual but self reinforcing group. You know. So yes, 354 00:22:43,520 --> 00:22:48,520 Speaker 2: it's only maybe ten years ago that that became clear. 355 00:22:48,600 --> 00:22:53,040 Speaker 1: So is disinformation the biggest peril, do you think, to 356 00:22:53,640 --> 00:22:56,920 Speaker 1: modern technology? Or what about, sort of, I'm sure you've 357 00:22:56,960 --> 00:23:01,280 Speaker 1: read Jonathan Haidt. I just forgot his name. Wait, thank you. 358 00:23:01,359 --> 00:23:05,679 Speaker 1: The Anxious Generation. What most concerns you about technology? And 359 00:23:05,720 --> 00:23:08,000 Speaker 1: while we're on the subject, how do you feel about AI? 360 00:23:08,800 --> 00:23:12,919 Speaker 1: Is that something you're excited about or more concerned about? 361 00:23:14,160 --> 00:23:19,520 Speaker 2: Well, I'm extremely excited about it, but it's a little 362 00:23:19,520 --> 00:23:22,280 Speaker 2: bit scary in terms of how we're going to end 363 00:23:22,359 --> 00:23:27,479 Speaker 2: up using it. It is way beyond anything that's come before.
364 00:23:27,840 --> 00:23:31,720 Speaker 2: You know, the revolution I was part of is about 365 00:23:31,720 --> 00:23:35,800 Speaker 2: computing going from being expensive to basically being free. Okay, 366 00:23:35,800 --> 00:23:40,320 Speaker 2: what do you do with it? Now it's about intelligence 367 00:23:40,840 --> 00:23:43,879 Speaker 2: going from being scarce, and you have to go to 368 00:23:43,920 --> 00:23:46,960 Speaker 2: college and learn all these things because we have a 369 00:23:47,000 --> 00:23:54,840 Speaker 2: shortage of engineers or doctors or whatever, and now intelligence 370 00:23:54,880 --> 00:23:58,040 Speaker 2: will be essentially free. We're not there yet. We don't 371 00:23:58,040 --> 00:24:03,720 Speaker 2: have robots. The accuracy of these machines still kind 372 00:24:03,760 --> 00:24:06,320 Speaker 2: of surprises us, in that they make mistakes that are very 373 00:24:06,320 --> 00:24:10,399 Speaker 2: different than the kind of mistakes humans make. So there's refinement, 374 00:24:11,480 --> 00:24:15,880 Speaker 2: but it's happening quite quickly. So when you think about 375 00:24:16,040 --> 00:24:24,240 Speaker 2: today's problems, you know, climate, Alzheimer's, an HIV vaccine, AI is 376 00:24:24,640 --> 00:24:28,320 Speaker 2: phenomenal in that it will accelerate those advances. You know, 377 00:24:28,359 --> 00:24:33,280 Speaker 2: we will have medical advice for people in poor countries, 378 00:24:33,359 --> 00:24:37,040 Speaker 2: we'll have great advice for farmers about when to plant, 379 00:24:37,040 --> 00:24:41,320 Speaker 2: what to plant, better than even rich farmers have today. 380 00:24:42,000 --> 00:24:46,400 Speaker 2: But overall for society, as it, you know, changes job 381 00:24:46,480 --> 00:24:51,919 Speaker 2: markets and hopefully frees up time.
Exactly who benefits, or 382 00:24:51,960 --> 00:24:55,800 Speaker 2: how government adjusts to it, and how, you know, people 383 00:24:55,800 --> 00:25:00,600 Speaker 2: with negative intent use these new tools, I'd say it's 384 00:25:00,600 --> 00:25:07,760 Speaker 2: a cloudier future. And so AI brings wonderful advances 385 00:25:07,840 --> 00:25:14,119 Speaker 2: and some challenge for humanity. You know, Harari in Nexus 386 00:25:14,160 --> 00:25:19,240 Speaker 2: says that we need to put AI aside and learn 387 00:25:19,240 --> 00:25:22,080 Speaker 2: how to get along with each other, and then we 388 00:25:22,119 --> 00:25:25,919 Speaker 2: should turn to this wonderful magic box, since we'll be 389 00:25:26,520 --> 00:25:29,840 Speaker 2: mature enough to use it the right way. You know, 390 00:25:30,400 --> 00:25:34,399 Speaker 2: that's not likely to happen. So we've got, you know, 391 00:25:34,400 --> 00:25:38,000 Speaker 2: a need to get everybody involved in thinking through how 392 00:25:38,040 --> 00:25:39,159 Speaker 2: do we want to shape it. 393 00:25:39,040 --> 00:25:41,600 Speaker 1: Well, that's a great segue to my next question. I 394 00:25:41,600 --> 00:25:44,399 Speaker 1: thought we could talk about politics, because there's nothing really 395 00:25:44,400 --> 00:25:47,760 Speaker 1: going on in that department. What did you think when 396 00:25:47,760 --> 00:25:49,879 Speaker 1: you saw this whole row of tech bros at the 397 00:25:49,880 --> 00:25:52,800 Speaker 1: inauguration, front and center? What did you say to yourself 398 00:25:52,840 --> 00:25:53,680 Speaker 1: when you witnessed that? 399 00:26:00,200 --> 00:26:04,280 Speaker 2: You know, they make their own decisions. 400 00:26:04,800 --> 00:26:07,080 Speaker 1: You must have thought something, though, when you saw that. 401 00:26:09,440 --> 00:26:15,000 Speaker 2: No, I wouldn't have predicted it.
And yet, you know, 402 00:26:15,040 --> 00:26:20,000 Speaker 2: when you have a new administration, you know they're gonna 403 00:26:20,000 --> 00:26:25,080 Speaker 2: wield a lot of authority, including over those companies. You know, 404 00:26:25,119 --> 00:26:28,520 Speaker 2: I'm not really in their game anymore. My focus 405 00:26:28,600 --> 00:26:33,200 Speaker 2: is the foundation work. So you know, my mindset isn't 406 00:26:33,280 --> 00:26:35,480 Speaker 2: quite, okay, let's defend. 407 00:26:37,119 --> 00:26:38,840 Speaker 1: Were you invited to the inauguration? 408 00:26:40,040 --> 00:26:47,480 Speaker 2: I bet I could have been there, I mean, I 409 00:26:47,520 --> 00:26:54,200 Speaker 2: think so, but it was fine. I had been there 410 00:26:54,240 --> 00:26:56,120 Speaker 2: recently for Jimmy Carter's funeral. 411 00:26:59,400 --> 00:27:11,399 Speaker 1: That's it. Yeah, okay. These same tech billionaires made major 412 00:27:11,440 --> 00:27:16,800 Speaker 1: contributions to Trump's inauguration fund. Meta, Amazon, Google, your very 413 00:27:16,800 --> 00:27:20,760 Speaker 1: own Microsoft all donated one million dollars. Sam Altman and 414 00:27:20,840 --> 00:27:26,280 Speaker 1: Tim Cook both reportedly also made one million dollar donations personally. 415 00:27:26,760 --> 00:27:29,480 Speaker 1: So did you donate anything to Donald Trump? 416 00:27:31,760 --> 00:27:35,480 Speaker 2: I didn't. I thought it looked like a well funded inauguration. 417 00:27:36,680 --> 00:27:37,639 Speaker 1: They didn't need your money? 418 00:27:38,640 --> 00:27:44,600 Speaker 2: No. I saved it for HIV and malaria eradication. 419 00:28:04,880 --> 00:28:08,000 Speaker 1: You did, though, have a three hour dinner with Donald Trump. 420 00:28:08,200 --> 00:28:10,800 Speaker 1: I know, I suddenly feel like Mike Wallace here. I 421 00:28:10,840 --> 00:28:14,600 Speaker 1: don't mean to. I don't mean to anyway. 422 00:28:15,000 --> 00:28:15,240 Speaker 1: Bill.
423 00:28:15,720 --> 00:28:18,040 Speaker 1: I know you had dinner with Donald Trump. It was 424 00:28:18,080 --> 00:28:20,480 Speaker 1: three hours long. It was at the end of December. 425 00:28:20,640 --> 00:28:24,280 Speaker 1: You described that dinner as quite intriguing and wide ranging, 426 00:28:24,840 --> 00:28:26,919 Speaker 1: and that he showed a lot of interest in the 427 00:28:26,960 --> 00:28:30,560 Speaker 1: issues that you brought up. He's been president for just 428 00:28:30,600 --> 00:28:33,320 Speaker 1: over two weeks, and as you well know, there's a 429 00:28:33,400 --> 00:28:36,400 Speaker 1: lot of concern about a whole slew of things he's doing, 430 00:28:36,480 --> 00:28:40,720 Speaker 1: from firing inspectors general, whose job is to stop corruption, at a variety of agencies, 431 00:28:40,760 --> 00:28:45,320 Speaker 1: etc., to the mass deportation of immigrants, to 432 00:28:45,400 --> 00:28:51,480 Speaker 1: blaming the tragic airplane crash in DC on DEI. Given 433 00:28:51,520 --> 00:28:55,720 Speaker 1: the conversation and given your experience having this dinner with him, 434 00:28:56,160 --> 00:28:58,800 Speaker 1: are you surprised at some of the actions he's taken 435 00:28:58,960 --> 00:28:59,640 Speaker 1: so quickly? 436 00:29:01,960 --> 00:29:06,840 Speaker 2: Well, things are moving fairly quickly. And you know, the 437 00:29:06,920 --> 00:29:14,520 Speaker 2: area that I have real expertise in includes the foreign assistance, 438 00:29:14,600 --> 00:29:19,440 Speaker 2: the USAID work, and things like PEPFAR, where the US 439 00:29:19,520 --> 00:29:24,840 Speaker 2: government has been incredibly generous and saved tens of millions 440 00:29:24,840 --> 00:29:28,640 Speaker 2: of lives. And so when I saw President Trump, you know, 441 00:29:28,640 --> 00:29:33,680 Speaker 2: I said, let's maintain this generosity and let's accelerate innovation.
442 00:29:35,000 --> 00:29:37,640 Speaker 2: And we don't have an HIV vaccine, we don't have 443 00:29:37,680 --> 00:29:41,640 Speaker 2: an HIV cure. If some of the same things that 444 00:29:41,640 --> 00:29:45,440 Speaker 2: were done in Operation Warp Speed during the pandemic could 445 00:29:45,440 --> 00:29:49,400 Speaker 2: be applied in those areas, the chance of getting great 446 00:29:49,440 --> 00:29:53,440 Speaker 2: new tools is pretty exciting. And so I asked for 447 00:29:53,480 --> 00:29:56,400 Speaker 2: his support in that and said that, you know, in 448 00:29:56,440 --> 00:30:00,960 Speaker 2: the meantime, we should continue to provide those medicines. What's 449 00:30:01,000 --> 00:30:06,320 Speaker 2: happened so far in terms of personnel and grants may 450 00:30:06,440 --> 00:30:10,720 Speaker 2: disrupt that. And so, you know, I'll be a strong 451 00:30:10,800 --> 00:30:14,240 Speaker 2: voice saying, you know, even if you have to, you know, 452 00:30:14,360 --> 00:30:17,480 Speaker 2: change the name of the organization, you know, maybe there's 453 00:30:17,520 --> 00:30:21,280 Speaker 2: ten percent of these programs you don't support, let's not 454 00:30:21,520 --> 00:30:27,320 Speaker 2: take the incredible talent and impact and lose it through very abrupt 455 00:30:27,400 --> 00:30:33,600 Speaker 2: actions, largely by people who don't know this agency and 456 00:30:33,680 --> 00:30:37,560 Speaker 2: may have just heard about, you know, one percent here 457 00:30:37,720 --> 00:30:41,720 Speaker 2: that, you know, went to something they don't like. So in the 458 00:30:41,760 --> 00:30:45,160 Speaker 2: next few months, you know, I expect to provide the 459 00:30:45,200 --> 00:30:49,200 Speaker 2: best advice I can, because I think the US should 460 00:30:49,280 --> 00:30:53,480 Speaker 2: be proud of what it's done, starting with President Bush 461 00:30:53,560 --> 00:30:57,560 Speaker 2: but on a bipartisan basis up until now.
462 00:30:57,880 --> 00:31:00,760 Speaker 1: Will you call Elon Musk? Because as the head of 463 00:31:00,960 --> 00:31:04,920 Speaker 1: DOGE, or whatever that agency is, he's made moves, as 464 00:31:04,960 --> 00:31:07,719 Speaker 1: you know, to shut down USAID. He says, quote, 465 00:31:07,720 --> 00:31:11,000 Speaker 1: with the blessing of President Trump, and on his platform 466 00:31:11,280 --> 00:31:14,160 Speaker 1: X he called the government agency evil and a viper's 467 00:31:14,200 --> 00:31:18,719 Speaker 1: nest of radical left Marxists who hate America, adding 468 00:31:18,760 --> 00:31:21,400 Speaker 1: that it was time for it to die. We got 469 00:31:21,440 --> 00:31:24,840 Speaker 1: so many questions for you, Bill, on Threads from my followers. 470 00:31:25,360 --> 00:31:30,240 Speaker 1: And do you think that Elon Musk, or for that matter, 471 00:31:30,320 --> 00:31:32,960 Speaker 1: Donald Trump, after you all had that three hour dinner 472 00:31:33,000 --> 00:31:38,800 Speaker 1: and talked about the important work that organizations like this do, 473 00:31:38,800 --> 00:31:40,360 Speaker 1: do you think anybody's going to listen? 474 00:31:42,200 --> 00:31:47,200 Speaker 2: I'm very hopeful that the Secretary of State, Marco Rubio, 475 00:31:47,280 --> 00:31:53,320 Speaker 2: who was in Africa and saw this great work, and 476 00:31:53,400 --> 00:31:57,480 Speaker 2: President Trump will work to preserve the bulk of what's there, 477 00:31:58,280 --> 00:32:02,440 Speaker 2: whether that named agency stays in place, you know, whether 478 00:32:02,480 --> 00:32:05,960 Speaker 2: every program does. But you know, an abrupt ending of 479 00:32:06,000 --> 00:32:09,120 Speaker 2: that work would really put to the test.
You know, 480 00:32:09,360 --> 00:32:12,800 Speaker 2: is it in the values of Americans to take half 481 00:32:12,840 --> 00:32:15,680 Speaker 2: a percent of the budget and keep tens of millions of 482 00:32:15,720 --> 00:32:21,200 Speaker 2: Africans alive, or have we sort of overnight decided that 483 00:32:21,200 --> 00:32:23,960 Speaker 2: that half a percent shouldn't be spent that way? And 484 00:32:24,400 --> 00:32:26,760 Speaker 2: it is a political question. You know, I have a 485 00:32:26,800 --> 00:32:29,640 Speaker 2: clear point of view. I know great people in that area. 486 00:32:29,680 --> 00:32:31,960 Speaker 1: It's not just Africa, by the way, it's all over 487 00:32:31,960 --> 00:32:32,400 Speaker 1: the world. 488 00:32:32,960 --> 00:32:36,320 Speaker 2: It is, it is. Mostly, the benefits of that 489 00:32:36,400 --> 00:32:41,600 Speaker 2: work of USAID broadly are global. The HIV work, 490 00:32:41,880 --> 00:32:44,200 Speaker 2: because of the nature of the epidemic, is about eighty 491 00:32:44,240 --> 00:32:49,760 Speaker 2: percent in Africa. And the things USAID funds I fund 492 00:32:50,640 --> 00:32:53,920 Speaker 2: with billions of dollars, and you know, I'm very careful 493 00:32:53,960 --> 00:32:57,600 Speaker 2: to make sure that money is well spent. And so 494 00:32:57,680 --> 00:33:03,080 Speaker 2: I think naively people hear the most random things 495 00:33:03,080 --> 00:33:05,320 Speaker 2: and think, okay, it's all like that. So I have 496 00:33:05,360 --> 00:33:08,840 Speaker 2: a challenge to say to Americans, is this in your 497 00:33:08,960 --> 00:33:15,160 Speaker 2: value system? And you know, does it benefit our security 498 00:33:15,320 --> 00:33:21,640 Speaker 2: or our moral example to keep these programs going?
And, 499 00:33:22,480 --> 00:33:26,800 Speaker 2: you know, I think people in both parties will find 500 00:33:26,840 --> 00:33:32,440 Speaker 2: this a deeply moral and important thing to keep strong. 501 00:33:32,520 --> 00:33:35,000 Speaker 1: Now, comment on some of the other moves that Trump 502 00:33:35,080 --> 00:33:40,640 Speaker 1: has made, with inspectors general, kind of weaponizing DEI. 503 00:33:40,840 --> 00:33:43,920 Speaker 1: I'm just curious how you're feeling as you're watching all 504 00:33:43,960 --> 00:33:44,840 Speaker 1: of this unfold. 505 00:33:46,320 --> 00:33:50,040 Speaker 2: Well, in some ways, you know, there's not that much 506 00:33:50,840 --> 00:33:55,960 Speaker 2: that wasn't predicted or mentioned during the campaign. And although. 507 00:33:55,680 --> 00:33:58,520 Speaker 1: He distanced himself from Project twenty twenty five, and now, 508 00:33:58,560 --> 00:34:01,320 Speaker 1: according to Time magazine, two thirds of the things he's 509 00:34:01,440 --> 00:34:05,600 Speaker 1: done are actually in adherence with Project twenty twenty five. 510 00:34:06,720 --> 00:34:09,880 Speaker 2: You know, there are people in the 511 00:34:09,920 --> 00:34:14,800 Speaker 2: Democratic Party who will speak out on these things. I'm, 512 00:34:15,560 --> 00:34:18,360 Speaker 2: you know, I'm taking my fortune and trying to partner 513 00:34:18,480 --> 00:34:21,719 Speaker 2: with governments, the US government being very key, so 514 00:34:21,800 --> 00:34:24,880 Speaker 2: I'll mostly speak out about those areas where I have, 515 00:34:26,320 --> 00:34:29,120 Speaker 2: you know, real experience. I've gotten to visit the projects, 516 00:34:29,160 --> 00:34:34,640 Speaker 2: I've gotten to meet the heroes, and I'm still, you know, 517 00:34:35,000 --> 00:34:40,320 Speaker 2: hoping we can strengthen rather than weaken those 518 00:34:40,640 --> 00:34:41,240 Speaker 2: projects.
519 00:34:41,320 --> 00:34:44,560 Speaker 1: I want to ask you about RFK Junior, because obviously 520 00:34:44,680 --> 00:34:48,960 Speaker 1: that's kind of in your lane. His nomination passed the 521 00:34:49,000 --> 00:34:52,719 Speaker 1: Senate Finance Committee to advance to the floor. Bill, do 522 00:34:52,800 --> 00:34:55,840 Speaker 1: you think he should be Secretary of Health and Human Services, 523 00:34:55,920 --> 00:35:02,200 Speaker 1: especially given your incredible work with vaccines and given some 524 00:35:02,360 --> 00:35:08,160 Speaker 1: of his, uh, you know, past statements about not only 525 00:35:08,200 --> 00:35:10,240 Speaker 1: their efficacy but about their safety? 526 00:35:12,680 --> 00:35:15,879 Speaker 2: You know, I've only had one meeting with him, 527 00:35:16,480 --> 00:35:21,120 Speaker 2: which actually goes back to the first Trump administration, and, 528 00:35:22,440 --> 00:35:25,960 Speaker 2: you know, it looks like he will get confirmed. Uh, 529 00:35:26,000 --> 00:35:29,640 Speaker 2: he'll be in charge of some other parts of the 530 00:35:29,640 --> 00:35:32,920 Speaker 2: government that do amazing work. You know, we have the 531 00:35:32,920 --> 00:35:36,960 Speaker 2: best drug regulator, the FDA; we have the best health 532 00:35:37,000 --> 00:35:41,200 Speaker 2: research, the NIH. He'll be in charge of those things. 533 00:35:41,520 --> 00:35:45,960 Speaker 2: And, uh, you know, so I hope to find common cause, 534 00:35:46,600 --> 00:35:51,279 Speaker 2: including, you know, working to have more vaccines. You know, 535 00:35:51,320 --> 00:35:54,840 Speaker 2: we don't have a vaccine for TB or HIV 536 00:35:54,960 --> 00:36:01,960 Speaker 2: or malaria, and there's some incredible work where, even in 537 00:36:02,040 --> 00:36:05,759 Speaker 2: the next four years, with luck, we'll get a number 538 00:36:05,800 --> 00:36:06,480 Speaker 2: of those tools.
539 00:36:06,560 --> 00:36:09,480 Speaker 1: You have predicted that the chance of another pandemic in 540 00:36:09,520 --> 00:36:13,120 Speaker 1: the next four years is between ten and fifteen percent, 541 00:36:13,239 --> 00:36:16,960 Speaker 1: and that we're absolutely not prepared for it. I'm curious 542 00:36:17,040 --> 00:36:21,240 Speaker 1: why you think global leaders are not taking this threat 543 00:36:21,400 --> 00:36:24,120 Speaker 1: seriously enough. This has sort of been the case for 544 00:36:24,280 --> 00:36:29,759 Speaker 1: pandemics for a while, even after COVID. Seems like we 545 00:36:29,960 --> 00:36:33,800 Speaker 1: haven't put the infrastructure in place to deal with something 546 00:36:33,880 --> 00:36:36,040 Speaker 1: like that happening in the future. And I'm curious why 547 00:36:36,120 --> 00:36:37,200 Speaker 1: you think that's the case. 548 00:36:38,160 --> 00:36:40,480 Speaker 2: Well, there are countries like China and India that are 549 00:36:40,520 --> 00:36:42,960 Speaker 2: actually doing some great things. 550 00:36:42,840 --> 00:36:45,000 Speaker 1: But I guess maybe not in this country. 551 00:36:44,680 --> 00:36:47,520 Speaker 2: That would make them more ready for the next pandemic. 552 00:36:47,640 --> 00:36:50,680 Speaker 2: The US is kind of strange because we're still kind of arguing. 553 00:36:50,360 --> 00:36:52,680 Speaker 1: About where the virus came from. 554 00:36:52,719 --> 00:36:57,680 Speaker 2: You know, no, well that too, but even, you know, okay, 555 00:36:58,280 --> 00:37:01,640 Speaker 2: in the end, at the end of any emergency action 556 00:37:01,840 --> 00:37:04,200 Speaker 2: like that, you realize, okay, now we've gone too far 557 00:37:04,840 --> 00:37:08,719 Speaker 2: and the remedy is greater than the benefit. And so, 558 00:37:09,280 --> 00:37:11,360 Speaker 2: you know, when should we have known that? And whose 559 00:37:11,400 --> 00:37:11,880 Speaker 2: fault was?
560 00:37:11,920 --> 00:37:12,040 Speaker 1: That? 561 00:37:13,760 --> 00:37:18,640 Speaker 2: The vaccine saved millions of lives, and President Trump was 562 00:37:19,080 --> 00:37:24,520 Speaker 2: involved in accelerating the availability of that vaccine. Similar things 563 00:37:24,920 --> 00:37:30,080 Speaker 2: should be standing by for the next pandemic. And so 564 00:37:30,120 --> 00:37:33,880 Speaker 2: I'm hopeful that we'll sort of put that one behind 565 00:37:33,960 --> 00:37:38,279 Speaker 2: us and come together around what preparation looks like. The 566 00:37:38,320 --> 00:37:40,520 Speaker 2: world is more dependent on the US doing a good 567 00:37:40,600 --> 00:37:43,319 Speaker 2: job on this than it should be, you know. It 568 00:37:43,640 --> 00:37:47,759 Speaker 2: really is the depth of medical experience, the size of the 569 00:37:47,800 --> 00:37:52,560 Speaker 2: American budget, you know, and we need to do it 570 00:37:52,600 --> 00:37:58,640 Speaker 2: on a cooperative basis. So with luck, we'll get our 571 00:37:58,640 --> 00:38:01,840 Speaker 2: act together before the next one comes, because it could 572 00:38:01,880 --> 00:38:06,839 Speaker 2: be far worse. That pandemic, you know, killed less than 573 00:38:06,880 --> 00:38:10,120 Speaker 2: one percent. Now that's millions of people, but you could 574 00:38:10,120 --> 00:38:13,160 Speaker 2: have one that would be, you know, greater than ten percent. 575 00:38:13,239 --> 00:38:16,880 Speaker 1: Could you see yourself getting more involved in pandemic preparedness, 576 00:38:17,400 --> 00:38:21,640 Speaker 1: and kind of trying to galvanize all these people who 577 00:38:22,360 --> 00:38:25,160 Speaker 1: have such a depth of knowledge but don't seem to 578 00:38:25,200 --> 00:38:26,879 Speaker 1: be particularly coordinated? 579 00:38:27,960 --> 00:38:31,360 Speaker 2: Well, I wrote a book about how to avoid the 580 00:38:31,400 --> 00:38:34,160 Speaker 2: next pandemic, but.
581 00:38:34,120 --> 00:38:37,280 Speaker 1: I mean actually instituting sort of more of a plan, 582 00:38:37,520 --> 00:38:39,200 Speaker 1: or is that not really that interesting? 583 00:38:39,280 --> 00:38:42,239 Speaker 2: Well, if the government was putting together a group of 584 00:38:42,280 --> 00:38:48,200 Speaker 2: people on that, either I or, you know, the 585 00:38:48,520 --> 00:38:52,560 Speaker 2: deep experts at the Gates Foundation would love to be 586 00:38:52,640 --> 00:38:55,960 Speaker 2: part of that. You know, this is a multi country 587 00:38:56,000 --> 00:38:59,839 Speaker 2: thing, because the pandemic is likely to start in either 588 00:38:59,840 --> 00:39:02,640 Speaker 2: Asia or Africa. And the best thing with a pandemic 589 00:39:02,680 --> 00:39:06,040 Speaker 2: is if you stop it. You have functional health systems in 590 00:39:06,080 --> 00:39:10,920 Speaker 2: Africa that see it, detect it, and don't let it 591 00:39:10,960 --> 00:39:17,319 Speaker 2: go global. And that's partly an additional benefit besides what 592 00:39:17,440 --> 00:39:21,000 Speaker 2: I think should justify it by itself, which is the 593 00:39:21,040 --> 00:39:28,160 Speaker 2: moral idea of saving lives. And so yes, if the 594 00:39:28,160 --> 00:39:33,359 Speaker 2: world gets serious about this, you know, I think about it. 595 00:39:33,880 --> 00:39:38,319 Speaker 2: I have great people. Outside of the pharma companies, we 596 00:39:38,440 --> 00:39:42,400 Speaker 2: have the greatest depth of vaccine expertise, and so, you know, 597 00:39:42,400 --> 00:39:45,120 Speaker 2: whenever you want to think not just about the market 598 00:39:45,160 --> 00:39:49,879 Speaker 2: incentives but societal benefit, that's where we have a team 599 00:39:49,920 --> 00:39:51,600 Speaker 2: that can make a contribution.
600 00:39:51,840 --> 00:39:54,960 Speaker 1: As you mentioned, I have so many questions, but we're 601 00:39:55,080 --> 00:39:57,040 Speaker 1: running out of time. But you mentioned you're going to 602 00:39:57,080 --> 00:40:01,880 Speaker 1: be turning seventy, I guess in October, right. And my 603 00:40:01,960 --> 00:40:04,279 Speaker 1: husband always says, I'm not even on the back nine, 604 00:40:04,280 --> 00:40:07,360 Speaker 1: I'm on the back three, which is so so sweet 605 00:40:07,360 --> 00:40:11,720 Speaker 1: of him. But as you approach the big 606 00:40:11,880 --> 00:40:14,520 Speaker 1: seven-oh, I mean, what are you thinking about? 607 00:40:14,600 --> 00:40:17,480 Speaker 1: What do you want to accomplish, Bill, that you have 608 00:40:17,680 --> 00:40:22,080 Speaker 1: yet to accomplish? Because you're always, I think, striving to 609 00:40:22,120 --> 00:40:22,640 Speaker 1: do more. 610 00:40:24,520 --> 00:40:30,040 Speaker 2: Yeah, most of my time is being smart about giving 611 00:40:30,800 --> 00:40:33,759 Speaker 2: away the money that I'm lucky enough to have and 612 00:40:33,800 --> 00:40:38,520 Speaker 2: building a phenomenal team of people at the Gates Foundation. 613 00:40:39,360 --> 00:40:43,200 Speaker 2: You know, global health is the thing we've picked, and 614 00:40:43,239 --> 00:40:46,320 Speaker 2: so far it's gone very well. You know, because of 615 00:40:46,360 --> 00:40:51,360 Speaker 2: our partners' generosity, with the US government being top of 616 00:40:51,400 --> 00:40:54,759 Speaker 2: that list, we've been able to get childhood under-five 617 00:40:54,840 --> 00:40:57,040 Speaker 2: deaths from ten million a year at the turn of 618 00:40:57,080 --> 00:41:00,279 Speaker 2: the century now down to below five million. If we 619 00:41:00,400 --> 00:41:04,799 Speaker 2: stay the course, we will be able to cut that 620 00:41:04,840 --> 00:41:09,000 Speaker 2: in half again. You know.
So for me, eradicating polio, 621 00:41:09,480 --> 00:41:13,919 Speaker 2: where we're close but, uh, not there yet; uh, then 622 00:41:14,000 --> 00:41:17,839 Speaker 2: moving on to eradicate measles and malaria, and to get 623 00:41:17,920 --> 00:41:21,160 Speaker 2: kind of an equity where a child's life in these poor 624 00:41:21,239 --> 00:41:28,200 Speaker 2: countries is also valued, where they get the nourishment to thrive. Uh, 625 00:41:28,520 --> 00:41:31,640 Speaker 2: you know, it's very fulfilling work. And the innovation pipeline, 626 00:41:32,080 --> 00:41:37,000 Speaker 2: that's a very positive story. The delivery pipeline, where the 627 00:41:37,200 --> 00:41:41,040 Speaker 2: world is getting distracted, uh, and we need to renew 628 00:41:41,120 --> 00:41:44,919 Speaker 2: our values and commitment to these things, that, look, that's 629 00:41:44,960 --> 00:41:49,600 Speaker 2: looking tough at least in the near term. But, uh, 630 00:41:49,719 --> 00:41:53,200 Speaker 2: you know, the innovations are going to come, and you know, 631 00:41:53,239 --> 00:41:58,279 Speaker 2: eventually I think, uh, people will come back, uh, 632 00:41:58,600 --> 00:42:00,880 Speaker 2: and do this just on a values basis. 633 00:42:00,719 --> 00:42:02,640 Speaker 1: You're never going to retire. You're never going to sit 634 00:42:02,680 --> 00:42:04,960 Speaker 1: on a beach and drink pina coladas. 635 00:42:05,960 --> 00:42:12,560 Speaker 2: Well, as long as I think my organizing teams and 636 00:42:13,239 --> 00:42:17,000 Speaker 2: challenging teams really can help drive these things forward, that 637 00:42:17,200 --> 00:42:21,160 Speaker 2: is the most fun thing for me to do. And, 638 00:42:21,640 --> 00:42:24,320 Speaker 2: you know, do I have ten more years or twenty 639 00:42:24,320 --> 00:42:26,880 Speaker 2: more years of doing that, hard to say, but for 640 00:42:27,000 --> 00:42:28,640 Speaker 2: now I'm full speed ahead.
641 00:42:28,480 --> 00:42:32,000 Speaker 1: Well, Bill Gates. The book is called Source Code: My Beginnings. 642 00:42:32,280 --> 00:42:35,680 Speaker 1: Thanks so much, Bill, fun talking to you. Thank you, 643 00:42:35,960 --> 00:42:52,600 Speaker 1: thank you. Thanks for listening everyone. If you have a 644 00:42:52,680 --> 00:42:55,439 Speaker 1: question for me, a subject you want us to cover, 645 00:42:55,840 --> 00:42:58,120 Speaker 1: or you want to share your thoughts about how you 646 00:42:58,200 --> 00:43:01,840 Speaker 1: navigate this crazy world, reach out, send me a DM 647 00:43:01,880 --> 00:43:05,160 Speaker 1: on Instagram. I would love to hear from you. Next 648 00:43:05,280 --> 00:43:08,960 Speaker 1: Question is a production of iHeartMedia and Katie Couric Media. 649 00:43:09,440 --> 00:43:13,040 Speaker 1: The executive producers are me, Katie Couric, and Courtney Litz. 650 00:43:13,360 --> 00:43:17,600 Speaker 1: Our supervising producer is Ryan Martz, and our producers are 651 00:43:17,640 --> 00:43:23,360 Speaker 1: Adriana Fazio and Meredith Barnes. Julian Weller composed our theme music. 652 00:43:24,239 --> 00:43:27,359 Speaker 1: For more information about today's episode, or to sign up 653 00:43:27,360 --> 00:43:30,480 Speaker 1: for my newsletter, Wake Up Call, go to the description 654 00:43:30,560 --> 00:43:34,440 Speaker 1: in the podcast app, or visit us at katiecouric dot com. 655 00:43:34,719 --> 00:43:37,120 Speaker 1: You can also find me on Instagram and all my 656 00:43:37,239 --> 00:43:41,880 Speaker 1: social media channels. For more podcasts from iHeartRadio, visit the 657 00:43:42,000 --> 00:43:46,160 Speaker 1: iHeartRadio app, Apple Podcasts, or wherever you listen to your 658 00:43:46,200 --> 00:43:47,000 Speaker 1: favorite shows.