1 00:00:02,440 --> 00:00:08,360 Speaker 1: Bloomberg Audio Studios, podcasts, radio news. Questions over Apple's AI 2 00:00:08,400 --> 00:00:11,959 Speaker 1: strategy will hover over the earnings call, and for more 3 00:00:12,039 --> 00:00:14,320 Speaker 1: on what exactly the company is doing and how it's 4 00:00:14,360 --> 00:00:16,319 Speaker 1: all shaking out, we want to bring in someone with 5 00:00:16,400 --> 00:00:19,880 Speaker 1: a deep, deep understanding of the company, its culture, and 6 00:00:20,280 --> 00:00:23,240 Speaker 1: its innovation, and that is Steve Wozniak. He co-founded 7 00:00:23,280 --> 00:00:26,920 Speaker 1: Apple alongside Steve Jobs decades ago. Steve, such a pleasure 8 00:00:26,920 --> 00:00:29,680 Speaker 1: to speak with you. Thank you so much. You are 9 00:00:29,680 --> 00:00:32,880 Speaker 1: a former Apple insider and a current, devoted Apple user. 10 00:00:32,920 --> 00:00:33,440 Speaker 2: We know that. 11 00:00:33,680 --> 00:00:36,840 Speaker 1: And the initial version of Apple Intelligence was made available 12 00:00:36,880 --> 00:00:40,559 Speaker 1: through an iOS update to iPhone sixteen and iPhone fifteen 13 00:00:40,600 --> 00:00:43,559 Speaker 1: Pro users. I know you've praised the demos. Have you 14 00:00:43,560 --> 00:00:45,920 Speaker 1: gotten a chance to test out the new software, and 15 00:00:45,960 --> 00:00:47,199 Speaker 1: if so, what do you think of it? 16 00:00:49,280 --> 00:00:49,480 Speaker 2: Yeah? 17 00:00:49,520 --> 00:00:52,160 Speaker 3: I just installed the iPhone update along with everyone else 18 00:00:52,200 --> 00:00:55,960 Speaker 3: once it wasn't beta, and I tried a few little 19 00:00:56,000 --> 00:01:00,160 Speaker 3: Siri searches, Siri questions, to look for data.
I 20 00:01:00,200 --> 00:01:03,040 Speaker 3: made them a little more complicated in my wording than 21 00:01:03,120 --> 00:01:06,440 Speaker 3: Siri normally gets, and they worked fine, so maybe it 22 00:01:06,520 --> 00:01:07,000 Speaker 3: was better. 23 00:01:07,080 --> 00:01:08,360 Speaker 2: I couldn't tell you for sure. 24 00:01:08,560 --> 00:01:10,440 Speaker 3: I like to use things for a long time before 25 00:01:10,440 --> 00:01:12,800 Speaker 3: I really comment on whether they're good. 26 00:01:12,840 --> 00:01:14,040 Speaker 2: Are they worthwhile? This and that. 27 00:01:14,160 --> 00:01:16,240 Speaker 3: I mean, once in a while I can tell right away, 28 00:01:17,160 --> 00:01:19,000 Speaker 3: but not yet here. Not yet here. 29 00:01:19,040 --> 00:01:20,880 Speaker 1: Okay, so there's still a lot that you're going to 30 00:01:20,920 --> 00:01:23,240 Speaker 1: play around with. How does that compare with how you 31 00:01:23,360 --> 00:01:26,560 Speaker 1: use Apple tools or any AI tools right 32 00:01:26,560 --> 00:01:28,479 Speaker 1: now in your daily life? I'm just trying to get 33 00:01:28,480 --> 00:01:31,720 Speaker 1: a sense of how you anticipate using Apple AI tools 34 00:01:31,760 --> 00:01:34,240 Speaker 1: to enhance what you need to do right now. 35 00:01:36,319 --> 00:01:39,240 Speaker 3: I pretty much avoid large language models because I want 36 00:01:39,280 --> 00:01:42,480 Speaker 3: things to be reliable, where I know it's accurate, like 37 00:01:42,560 --> 00:01:44,640 Speaker 3: something works rather than it didn't work. 38 00:01:44,880 --> 00:01:46,160 Speaker 2: And I don't like to be surprised. 39 00:01:46,200 --> 00:01:49,120 Speaker 3: I want to think about everything that I read or 40 00:01:49,200 --> 00:01:51,840 Speaker 3: hear and really think it out, so that I understand 41 00:01:51,920 --> 00:01:54,720 Speaker 3: it and can express it in my own words.
That's AI, 42 00:01:55,560 --> 00:01:56,880 Speaker 3: actual intelligence. 43 00:01:58,200 --> 00:02:01,000 Speaker 4: Well, that's just it. Not everybody is, you know, 44 00:02:01,720 --> 00:02:05,120 Speaker 4: jumping on board the AI bandwagon. Is Apple going to 45 00:02:05,160 --> 00:02:08,360 Speaker 4: cater to those people who don't care about AI, who 46 00:02:08,440 --> 00:02:10,400 Speaker 4: would almost turn it off if it were on their phones, 47 00:02:10,800 --> 00:02:14,280 Speaker 4: to ensure that they will continue to upgrade each cycle? 48 00:02:16,360 --> 00:02:18,239 Speaker 3: Oh, I think most of us are on an AI 49 00:02:18,360 --> 00:02:22,920 Speaker 3: bandwagon and appreciate it very much, but for lightweight stuff. 50 00:02:23,000 --> 00:02:24,200 Speaker 2: I mean, like we've had 51 00:02:24,120 --> 00:02:26,919 Speaker 3: AI for quite a while, the highest intelligence of how 52 00:02:26,960 --> 00:02:29,040 Speaker 3: computers can think out and predict what you're going to 53 00:02:29,120 --> 00:02:31,040 Speaker 3: want in your life, kind of like our old Knowledge 54 00:02:31,200 --> 00:02:35,320 Speaker 3: Navigator going back, you know, thirty, forty years. It was 55 00:02:35,880 --> 00:02:38,000 Speaker 3: like a little demo we made up of what computers 56 00:02:38,000 --> 00:02:39,280 Speaker 3: could do if they could really think. 57 00:02:39,840 --> 00:02:43,960 Speaker 2: And we're moving closer towards that. But the inaccuracies. 58 00:02:44,000 --> 00:02:46,120 Speaker 3: Does it really help you? It doesn't create, it 59 00:02:46,200 --> 00:02:48,840 Speaker 3: just repeats things. So it's a very good search engine 60 00:02:48,880 --> 00:02:51,239 Speaker 3: for me. It finds things on the web.
I wish 61 00:02:51,320 --> 00:02:53,919 Speaker 3: it had citations, so you could click on any item 62 00:02:53,960 --> 00:02:56,720 Speaker 3: that came back to you from AI, or 63 00:02:56,960 --> 00:03:00,200 Speaker 3: just question it verbally and say, where did this come from? 64 00:03:00,639 --> 00:03:03,239 Speaker 3: And let it tell you, just like all the citations 65 00:03:03,240 --> 00:03:07,000 Speaker 3: in scientific journals that advance our knowledge. 66 00:03:07,360 --> 00:03:10,840 Speaker 4: So, as you know, Steve, Apple hasn't always been first. 67 00:03:10,880 --> 00:03:12,920 Speaker 4: It hasn't been the very first at certain things, but 68 00:03:12,960 --> 00:03:16,240 Speaker 4: it usually ends up being the best at something. What 69 00:03:16,280 --> 00:03:19,680 Speaker 4: would you advise, you know, the current team to do 70 00:03:19,800 --> 00:03:21,720 Speaker 4: in order to be the best at AI? 71 00:03:24,880 --> 00:03:27,959 Speaker 3: I don't know about being the best, but Apple 72 00:03:28,000 --> 00:03:30,560 Speaker 3: already shows that it cares so much about the employees 73 00:03:30,600 --> 00:03:35,920 Speaker 3: and the users and, you know, diversification, 74 00:03:35,960 --> 00:03:38,560 Speaker 3: and also not tracking you, being a little more private 75 00:03:38,600 --> 00:03:41,200 Speaker 3: than the others. So I think that's a good sign 76 00:03:41,280 --> 00:03:44,160 Speaker 3: that Apple is going to pay attention to, you know, 77 00:03:44,800 --> 00:03:49,200 Speaker 3: not taking advantage of you with AI. Largely, though, 78 00:03:49,200 --> 00:03:50,800 Speaker 3: it's up to the user. I think a lot of 79 00:03:50,880 --> 00:03:53,280 Speaker 3: user education should come along with this.
80 00:03:53,920 --> 00:03:57,000 Speaker 1: Yeah, Apple has definitely made privacy one of its differentiating 81 00:03:57,040 --> 00:03:59,800 Speaker 1: points when it comes to its innovation, especially when it 82 00:03:59,800 --> 00:04:02,760 Speaker 1: comes to AI. Do you think overall, just taking a step 83 00:04:02,760 --> 00:04:05,240 Speaker 1: back as we talk about these earnings from these big 84 00:04:05,240 --> 00:04:10,040 Speaker 1: tech companies, that investors and financial markets are impatient 85 00:04:10,040 --> 00:04:12,760 Speaker 1: when they punish companies for spending too much on AI 86 00:04:13,040 --> 00:04:14,960 Speaker 1: and not having enough to show for it in terms of, 87 00:04:15,000 --> 00:04:17,600 Speaker 1: you know, monetization or some kind of tangible return? 88 00:04:19,839 --> 00:04:22,200 Speaker 3: Well, what comes in the form of AI and how 89 00:04:22,240 --> 00:04:26,240 Speaker 3: good it is is really created by the upper executives 90 00:04:26,240 --> 00:04:28,840 Speaker 3: in a company. But the trouble is, the upper executives 91 00:04:28,880 --> 00:04:30,839 Speaker 3: in a company don't exactly own it. 92 00:04:30,839 --> 00:04:31,240 Speaker 2: It's all the 93 00:04:31,200 --> 00:04:35,000 Speaker 3: little shareholders like you and I and others and investors, 94 00:04:35,080 --> 00:04:38,120 Speaker 3: and so they're watching every little report and 95 00:04:38,120 --> 00:04:39,120 Speaker 3: everything, turning 96 00:04:39,120 --> 00:04:39,960 Speaker 2: the stock up and down. 97 00:04:40,240 --> 00:04:42,599 Speaker 3: Apple is a little bit that way, and I do recognize the 98 00:04:42,640 --> 00:04:46,840 Speaker 3: problem of iPhone being kind of the major product. We 99 00:04:46,960 --> 00:04:49,240 Speaker 3: are sort of closer to a one-product company. There 100 00:04:49,279 --> 00:04:51,640 Speaker 3: were times with the Apple II when we were a one-product company.
101 00:04:51,720 --> 00:04:54,320 Speaker 3: There were times with the Macintosh, and even as a Fortune, 102 00:04:54,480 --> 00:04:56,440 Speaker 3: you know, five hundred company way up on the list, 103 00:04:56,520 --> 00:04:59,200 Speaker 3: like Fortune twenty, our stock would drop by a third 104 00:04:59,320 --> 00:05:03,599 Speaker 3: in one day sometimes, and you didn't have a backup 105 00:05:03,640 --> 00:05:06,159 Speaker 3: of a lot of other products, you know, like 106 00:05:06,200 --> 00:05:09,400 Speaker 3: Apple Services and Apple Pay and all that, to kind 107 00:05:09,440 --> 00:05:13,320 Speaker 3: of balance things out a bit. So that is a concern 108 00:05:13,000 --> 00:05:15,800 Speaker 2: that I do agree with. You've got to keep your 109 00:05:15,800 --> 00:05:18,600 Speaker 2: eye on it. But Apple's still doing a good job 110 00:05:18,640 --> 00:05:19,599 Speaker 2: for the Apple community. 111 00:05:19,320 --> 00:05:22,840 Speaker 4: Well, exactly, and for investors, Steve, I wonder what 112 00:05:22,920 --> 00:05:24,600 Speaker 4: kind of a job you think Apple is doing. You've 113 00:05:24,640 --> 00:05:26,800 Speaker 4: been around, you know, the financial world a long time. 114 00:05:27,080 --> 00:05:29,240 Speaker 4: The price is at two twenty-five, there are thirty- 115 00:05:29,320 --> 00:05:31,440 Speaker 4: nine buys, and, you know, many of those price targets 116 00:05:31,480 --> 00:05:34,039 Speaker 4: are higher. Is it trading at a good valuation in 117 00:05:34,080 --> 00:05:34,799 Speaker 4: your estimation? 118 00:05:36,839 --> 00:05:37,239 Speaker 2: Yikes! 119 00:05:37,279 --> 00:05:39,760 Speaker 3: I'm a user-type person, not an investor. I have 120 00:05:39,880 --> 00:05:42,440 Speaker 3: never used Apple's stock app. When I was eighteen to 121 00:05:42,440 --> 00:05:44,719 Speaker 3: twenty years old,
I came up with a formula: 122 00:05:45,520 --> 00:05:48,560 Speaker 3: that I'd never be political and never vote, but also 123 00:05:48,720 --> 00:05:51,400 Speaker 3: that I wanted to be happy in life, 124 00:05:51,600 --> 00:05:54,799 Speaker 3: and that meant smiles minus frowns. If you're constantly watching 125 00:05:55,000 --> 00:05:57,520 Speaker 3: something go up and down, stock or valuation of 126 00:05:57,520 --> 00:06:01,760 Speaker 3: this and that, you're not going to be that happy, 127 00:06:02,080 --> 00:06:04,280 Speaker 3: you know. So I just like to let my head 128 00:06:04,480 --> 00:06:05,480 Speaker 3: run nice and smooth. 129 00:06:05,720 --> 00:06:06,240 Speaker 2: That's just me. 130 00:06:06,800 --> 00:06:09,039 Speaker 3: And I like to observe things and be an observer, 131 00:06:09,480 --> 00:06:13,039 Speaker 3: and obviously be affected by things. But I 132 00:06:13,160 --> 00:06:15,520 Speaker 3: just can't get into that. 133 00:06:15,520 --> 00:06:21,279 Speaker 1: That saves one's nerves in a different way, right? Exactly. Steve, 134 00:06:22,040 --> 00:06:22,760 Speaker 1: thank you for joining us. 135 00:06:22,760 --> 00:06:25,000 Speaker 2: One is frowns, and I like to avoid frowns. 136 00:06:25,160 --> 00:06:27,840 Speaker 1: Okay, sounds good, Steve. We'll leave you to it. Steve 137 00:06:27,839 --> 00:06:29,440 Speaker 1: Wozniak is co-founder of Apple.