00:00:02 Speaker 1: Bloomberg Audio Studios: podcasts, radio, news.
00:00:06 Speaker 2: All right, well, the biggest headline, of course, out of WWDC was Tim Cook talking about artificial intelligence.
00:00:15 Speaker 3: Recent developments in generative intelligence and large language models offer powerful capabilities that provide the opportunity to take the experience of using Apple products to new heights. Introducing Apple Intelligence, the new personal intelligence system that makes your most personal products even more useful and delightful.
00:00:37 Speaker 2: Here in the studio with more is Sam Palmisano. He's the former CEO of IBM. He is also currently the board chair of America's Frontier Fund, a nonprofit developed to support critical tech to ensure US competitiveness. Sam, it's so great to get you on set; you're our first guest on set for the new show.
00:00:52 Speaker 4: I'm honored, thank you so much.
00:00:54 Speaker 2: We're going to get you a hat or a jacket or something. Okay, perfect. We're going to work on my hair a little, too. [Crosstalk]
00:01:01 Speaker 1: Okay.
00:01:01 Speaker 2: So let's take a broader view, then. When it comes to AI, how does the US take the appropriate amount of risk to make sure that it can stay ahead?
00:01:10 Speaker 5: That's a great question, quite honestly. And I'll step back and say, look at the models today and why we created America's Frontier Fund. The models today are heavily oriented around software; you heard that today in the Apple announcements. To stay ahead in tech, it has to be much more than just software, so you need long-term investments in science and research. The US is far ahead in science and research, but we're losing ground on commercialization, the work that builds the new technologies that change the future, and not just consumer convenience. I mean how healthcare is done, how banking and our financial systems are run, how national security and national defense are handled. All those types of things require much longer cycles of investment, and that's why we created this thing called America's Frontier Fund.
00:01:53 Speaker 4: We want to keep the West ahead of the East.
00:01:56 Speaker 2: So we have the Chips Act, right? Yes, we clearly have big tech innovating. Where are we falling short? Where do you think the money, the talent, and the time need to go?
00:02:05 Speaker 5: That's a great question. Because obviously... well, okay, then I'll tell you that's an okay question.
00:02:11 Speaker 4: I asked Romaine for a good question.
00:02:13 Speaker 5: But seriously, the point is that the cycles we're talking about for these very advanced applications require, let's say, ten to fifteen years. We call it the valley of death: to take it from science and research through the valley of death to get it to commercialization takes a long time. So to do that you have to create partnerships, or an ecosystem, with the research laboratories, with the academic institutions, with the private sector as well, and you need government involved because of the risk associated with getting through the valley of death. And that's what we are about. What we've created is a new model for commercialization of this tech that's required to keep the US ahead of the rest of the world, but also the funding models that make it attractive for private investment to come into these opportunities.
00:02:58 Speaker 1: Do we have that right now? I mean, when you look at sort of all the movement we've seen in AI lately, so much of it seems to be driven by the private sector. And even when the government got involved, who did they turn to first? They basically brought in the private sector and said, give us a solution to regulating this, which I think caused everyone to kind of scratch their heads.
00:03:15 Speaker 5: Well, those are two different questions; regulation is different than actually the creation of the tech. And on the creation of the tech, go back to the early days of In-Q-Tel and who created In-Q-Tel.
00:03:29 Speaker 4: That was for national security and defense.
00:03:31 Speaker 5: But it was the same sort of thing: what could you do long term in technology to make the country more secure? And sometimes you need government involvement because of the risk factors, no different than the Chips Act. You're talking about billions of dollars to create these facilities, which are necessary so that we control the resilience of the supply chain.
00:03:50 Speaker 4: But who's going to take the risk of forty or fifty billion in the private sector?
00:03:54 Speaker 5: I mean, Intel might invest twenty billion, but you're doubling it up now for these kinds of technologies going into the future. So this is why we've come up with this model that says, okay, we're going to collaborate in the science with the national laboratories and everybody else, and then when you have commercialization, we'll take it to the private sector. But where there's high risk involved, we're going to try to seek, which we've done in our first fund, government involvement and private sector involvement as well. Combined, they're going to create the first fund we're about to announce in two or three.
00:04:27 Speaker 1: Are there parallels between what you're trying to do and what we're seeing now, and maybe what we saw coming out of the sixties and seventies with some of the Defense Department projects, and some of the science-based projects that were government-based, that eventually created the foundations for what really is our tech industry?
00:04:41 Speaker 5: Well, you go through DARPA, right, and the internet? I mean, you go through those initiatives. I'm so old I was involved with these things, so I can take you through the history of how it all was done. But fundamentally, it's those kinds of core technologies. Did you ever think, when you were exchanging technical documents, that it would lead to what you do today on your Apple phone, which you just heard about?
00:05:00 Speaker 5: No, of course not. And this is, like, what, thirty or forty years later. But nonetheless it's the same sort of thing here. A lot of this stuff does originate, quite honestly, in the science organizations within the government and the academic research organizations. But you still need the private sector to commercialize, not just to fund but also to commercialize, because the talent and the skills that are required to build these companies don't exist in government. In fact, I would never let government do it if I had an opinion on that subject, because that's not what they do. So we created these things we call Roadrunner laboratories that actually will take what we're now talking about and drive the commercialization phase.
00:05:37 Speaker 2: So to that point, I've been wondering what the cycle, I don't know what to call it, the cycle of AI or the cycle of AI chips, what does that look like? Is it short? Is it long? How does that affect the private money coming in?
00:05:49 Speaker 5: Well, I'll give you a sense. Historically, the technology cycle, if I take you back fifty years, is fifteen to twenty years. So go back to the mainframe, to the client-server PC, to everything that preceded the cloud; those were fifteen-to-twenty-year cycles. They started slower. Televisions, phones, all started slower. Today, with generative AI, it is like on steroids relative to the pace of it. However, a lot of people argue that it's at the peak; there's even research out there that says we're peaking in the cycle. So then what happens next? And you hear it in the enterprise, talking about "we can't use this stuff, it hallucinates."
00:06:25 Speaker 4: That's an indication that this thing is peaking.
00:06:27 Speaker 5: And maybe it's okay for music and poetry or marketing material, but you're not going to go defend the United States' infrastructure with that kind of technology, and you're not going to have a bank dealing with fraud or cyber with that kind of technology. So my point is, this is where we are, but it's going really, really fast. Now the key, and we believe the key for it to really be adopted in the enterprise, is basically that you have to deal with what we'll call guardrails versus regulations, Romaine. And we would argue, and we do argue this, that the right source for the regulation is not the tech industry itself, because there are leaders in the tech industry, and they're going to show up with an opinion, which is a valid opinion, but it also serves the hyperscalers. We would say that you really need to go to the users of the technology, the businesses, the CEOs of these companies in healthcare, banking, government, whatever it happens to be. Let them innovate the technology, but regulate the use cases, the application; regulate the use of the data; regulate the privacy; establish the guardrails so that people are protected in their privacy and their data is protected, both the consumer as well as the enterprise.
00:07:37 Speaker 4: That should be done by the enterprises.
00:07:39 Speaker 5: Look, I was in the tech industry for forty years; not by me or my counterparts.
00:07:44 Speaker 4: All right, Sam, great conversation. I really appreciate you being here today.
00:07:47 Speaker 1: Sam Palmisano, former CEO over at IBM, and of course now helping to shepherd the effort of that public-private partnership to further our technological advancement here in the US.