Speaker 1: Bloomberg Audio Studios: podcasts, radio, news. Single Best Idea, and today we focus on one voice. It was a phenomenal day, all sorts of good people to talk to. Dan Ives was on with Wedbush, and he was just blistering about the significance of this Oracle move. To summarize here and not take a lot of time on it: Oracle leapt over thirty percent today on the market opening. I said this on air, I'll say it again: I have never seen a blue chip stock leap, have a complete restructure, reanalysis. After what we saw from Oracle last evening, profound, profound performance. I should mention for clarity, Oracle is one of the good sponsors of Bloomberg Surveillance. We thank them for that support. But that's just a stunning, stunning moment, and many others as well. Also, Anna Wong's important Bloomberg Economics essay, saying, look, the employment revision yesterday, guess what it means: we've been in some form of recession going back to the spring of twenty twenty four. She finishes her note by saying, well, things look better now. Maybe we're at the beginning of a more constructive business cycle.

Speaker 1: What a treat today to have in, from McKinsey Global Institute, Michael Chui. He's out of Stanford, and at Indiana did absolutely definitive academics on what we do every day, which is search. Michael Chui of McKinsey on search.

Speaker 2: For several years, you know, AI usage, regular usage of AI, had sort of plateaued around fifty percent. You know, since the advent of generative AI, ChatGPT, Claude, Gemini, that has really accelerated. And now we see about ninety percent of the companies in our survey saying that they're using AI regularly. At the same time, the majority are not saying it's having a significant impact on their EBIT yet. But what we do see is that at the individual use case or business function level, it's starting to create real value. And so what we say is, that's a process. To your point about industries, yeah, the tech industry, for instance, has moved ahead. But in fact, rather than an industry having quote unquote figured it out, what we're finding is lots of variation within industries. And so again, the companies that have figured out how to rewire themselves, right, whether it's travel and logistics or high tech, they're just accelerating past their competitors.

Speaker 1: We continue with Michael Chui of Stanford, of Indiana, of McKinsey. And basically, to a dummy like me, it's like, okay, late nineteen ninety four, early nineteen ninety five, a revolution. Chui on the view forward.

Speaker 2: I think one of the interesting things, and it's a bit of a, you know, a debate within the artificial intelligence community: a lot of the excitement now is with these quote unquote neural network models, and that's underpinning a lot of what, you know, we currently often describe as AI. But you know, AI started in the nineteen fifties, or the term was invented in nineteen fifty five. And for a lot of that time, you know, the term, you know, symbolic systems at Stanford comes from using logic, using this way that we've managed to formalize reasoning. You know, one of the things that people who use neural network models are trying to do is trying to make it reason better. But we also had all these other types of symbolic ways of reasoning, and I think we're also starting to see hybrid models. And so I think what we're seeing is, you know, history again doesn't repeat itself, but it does come back. We've had these debates about what the best way to create AI has been, and that's moving forward.

Speaker 1: McKinsey gives great access to those reports. Go to McKinsey and look for the report in March on artificial intelligence, AI, with, among others, Michael Chui. We're on podcasts, right? Apple. We're on Spotify and YouTube podcasts. It's Single Best Idea.