Speaker 1: This is Wall Street Week. I'm David Westin. Welcome to our very special contributor here on Wall Street Week. He is Larry Summers of Harvard. Larry, thank you so much for being with us here in New York. It has been quite a week: we lost Charlie Munger, we lost Henry Kissinger, and then on Friday we actually lost Sandra Day O'Connor as well. I know you knew Henry well; you knew Charlie Munger. Talk about Henry Kissinger in your experience.

Speaker 2: He was an extraordinary man. It was extraordinary to see somebody in their eighties, and then in their nineties, and then one hundred years old, so incredibly intellectually vital, looking to learn about AI, speaking to most of the world's leading statesmen every several months. He was fully engaged and active in a way that many people half his age wouldn't have had the energy and the drive to be. He was somebody who always thought in the large: first understanding the broad historical forces shaping relationships between countries; from that, forming a conception of strategy; from that conception of strategy, working to tactics. It was anything but the day-to-day political thrust of statements that so often seems to preoccupy leaders these days. He surely didn't get everything right, but he certainly was always trying to reason from the large down, and I learned from his deeply, in a way tragic, sensibility. He was the idealist as realist, and the realist as idealist. Henry, maybe because of how and where he grew up in Germany in the nineteen thirties, feared disorder, feared chaos, feared the complete triumph of passion over reason, and his determination was to bring about stability. Not because he somehow thought that the pride of nations and strength was what was most important, but because when there was the containment of evil, that was when people had an opportunity to live and flourish. Those were powerful, powerful ideas.
He was a figure who was also very generous to people who he had no compelling reason to be generous to. He was as impressive and extraordinary a person as I've known. He was also, in many ways, as complex a person as I have known.

Speaker 1: I was one of the beneficiaries of that, because, for reasons that are unknown to me, he sort of adopted me a little bit when I came to New York and invited me to various events where we learned about the economic world and the world of geopolitics, where I otherwise wouldn't have. Charlie Munger, the partner of Warren Buffett, was important. What effect did he have on investing writ large?

Speaker 2: You know, I think the first thing to say is that Charlie and Henry had something very important in common. They had different interests, they had different styles, they had different ways of speaking, but they both had a deep commitment to seeing the world as it was, not as they wanted it to be. And they began by trying to see things as realistically, as clinically accurately, as possible. And that was the basis for decision, whether it was political and diplomatic decision in Henry's case, or financial and investment decision in Charlie's case. What I got from both of them was this commitment to detached observation as a prelude to taking action. You know, by Warren's testimony, Charlie provided a really extraordinary insight, and it's different than the old-fashioned and traditional value investing credo. Charlie rejected the idea of buying fair companies at great prices. He thought you'd ultimately do better buying great companies at fair prices. And that philosophy of finding the best, making sure you weren't overpaying, and then sticking with it is one that certainly served him, and certainly served Warren, well. But I saw it. Charlie was an alum of Harvard Law School, and while I was the president of Harvard, he had a view. It was clear; there was a logic.
He was prepared to argue and defend it. He, too, was an extraordinary figure. And you have to also look at both Henry and Charlie and see the power of staying curious. They stayed with it to the very end. And I think their curiosity, their love of reading, their love of discussion, their love of argument, was part of what caused them to flourish for so very long.

Speaker 1: Well, maybe it's a reach, but I think you're really emulating that and staying curious, as you've now taken this role with OpenAI. You talked about Henry Kissinger and AI; now you're taking it on. Let me ask you, frankly: why did you take the job, and what do you hope to do with it?

Speaker 2: I thought, as I said on your show, that this was something that was extraordinarily important. You know, no one could be certain whether this is a once-a-decade technology, a once-a-half-century technology, a once-a-century technology, a once-a-millennium technology. No one knows that for sure, but it sure looks like it's awfully important to develop rapidly and safely and to disseminate effectively and well. So when I was offered an opportunity to be part of contributing to that, of overseeing to make sure that that was effectively done, and to do it working with some very great people, I thought it was a real opportunity and I was glad to do it.

Speaker 1: I don't want to take anything away from your technological expertise, but what you just said, "safely," strikes me as probably part of what you're going to be focused on, and that gets to questions of governance: how you handle this technology, wherever it's going and however powerful it may be. Do you have an overall sense of what you need to do to govern it, to get this "safely" part right?

Speaker 2: You know, David, I've been on the job two days, and they're going to send me the onboarding packet for the board on Sunday.
So I shouldn't be saying too much at all, because I don't know enough. Here are some things I think I know. I think I know that a company like this has to be prepared to cooperate with, which doesn't mean always agree with, key government officials on regulatory issues, on national security issues, on development-of-technology issues. I think I know also, and this is integral to the structure of OpenAI, where the for-profit entity is itself a creature of a not-for-profit entity, that this needs to be a corporation with a conscience, and that we need to be always thinking about the multiple stakeholders in the development of this technology. And as a board member, that will be part of my responsibility, working with other board members, to make certain of that. You know, my late colleague at Harvard, Ken Galbraith, said that conscience is the knowledge that someone is watching. And I think it's the responsibility of everybody involved in this to be thinking very carefully, always, about both opportunities and uncertainties, and to make sure that those are balanced in the best way that's possible.

Speaker 1: It's fascinating. Well, I wish you luck, as we all do, because, as you've said on this program, it's a powerful, really powerful engine, and it could do an awful lot of good. At the same time, we have to be careful about it. Do you think it's still coming for the cognitive class? You said that once.

Speaker 2: I think, yeah. I think one thing that people should keep in mind as they read all the press about this is that this is a technology that does what reporters and journalists do. Now you're hearing from journalists; they view it slightly differently than when it's technologies that are potentially affecting what some other people do.

Speaker 1: And by the way, a few lawyers too, so I'm doubly vulnerable here. Larry, it's really great to have you with us, as it always is.
That's our very special contributor, Larry Summers of Harvard, and this is Wall Street Week on Bloomberg.