1 00:00:02,960 --> 00:00:07,280 Speaker 1: Bloomberg Audio Studios, podcasts, radio news. 2 00:00:09,960 --> 00:00:13,840 Speaker 2: You're listening to the Bloomberg Intelligence podcast. Catch us live 3 00:00:13,920 --> 00:00:17,000 Speaker 2: weekdays at ten am Eastern on Apple CarPlay and Android 4 00:00:17,040 --> 00:00:20,120 Speaker 2: Auto with the Bloomberg Business app. Listen on demand wherever 5 00:00:20,200 --> 00:00:24,040 Speaker 2: you get your podcasts, or watch us live on YouTube. 6 00:00:25,000 --> 00:00:28,520 Speaker 3: Looking at Apple here, stock's down two point seven percent. 7 00:00:28,760 --> 00:00:32,040 Speaker 3: The European Union, which has always been tough on US 8 00:00:32,080 --> 00:00:34,520 Speaker 3: technology, I think twenty, twenty-five years ago with Microsoft 9 00:00:34,520 --> 00:00:38,159 Speaker 3: and this thing called Windows, and they're still beating up 10 00:00:38,200 --> 00:00:42,440 Speaker 3: on US tech companies. So another fine has come along. Let's 11 00:00:42,479 --> 00:00:45,520 Speaker 3: check in with Anurag Rana. He covers all things technology 12 00:00:45,560 --> 00:00:50,840 Speaker 3: for Bloomberg Intelligence. So put into context what happened 13 00:00:50,840 --> 00:00:52,839 Speaker 3: to Apple here from a regulatory standpoint and what it 14 00:00:52,880 --> 00:00:53,360 Speaker 3: means for them. 15 00:00:53,400 --> 00:00:55,600 Speaker 4: Just explain kind of what the fine is for. 16 00:00:56,800 --> 00:00:59,200 Speaker 5: Yeah, this has been, you know, going on for a while, 17 00:00:59,240 --> 00:01:01,880 Speaker 5: and expectations were that they were going to be fined 18 00:01:01,880 --> 00:01:04,920 Speaker 5: about five hundred million euros.
Now the more important thing 19 00:01:05,000 --> 00:01:08,840 Speaker 5: is on March seven, the Digital Markets Act goes live, 20 00:01:09,280 --> 00:01:12,160 Speaker 5: and Apple's already made concessions in terms of all the 21 00:01:12,200 --> 00:01:15,080 Speaker 5: things the European Union wants to do. I think the 22 00:01:15,120 --> 00:01:17,959 Speaker 5: surprise to our analyst Hamblin, who is based in Europe, 23 00:01:18,120 --> 00:01:20,399 Speaker 5: was the size of it. It's not, you know, what 24 00:01:20,400 --> 00:01:22,759 Speaker 5: was rumored was five hundred, and it came in around 25 00:01:22,760 --> 00:01:25,280 Speaker 5: one point eight billion. One thing is clear at 26 00:01:25,319 --> 00:01:27,640 Speaker 5: this point, that Apple is not going to just, you know, 27 00:01:27,959 --> 00:01:30,959 Speaker 5: pay this fine and walk away. With the DMA, or 28 00:01:31,000 --> 00:01:34,040 Speaker 5: the Digital Markets Act, it's possible that the App Store, 29 00:01:34,560 --> 00:01:37,160 Speaker 5: you know, policies are going to be under more scrutiny, 30 00:01:37,360 --> 00:01:40,080 Speaker 5: and I think that's what's driving the stock down. I mean, 31 00:01:40,120 --> 00:01:42,600 Speaker 5: two billion dollars is not that big of a deal 32 00:01:42,640 --> 00:01:46,040 Speaker 5: for Apple, but I think the twenty one billion dollars 33 00:01:46,080 --> 00:01:49,080 Speaker 5: of revenue that the App Store generates, I think that's 34 00:01:49,240 --> 00:01:50,360 Speaker 5: a bigger issue. 35 00:01:51,760 --> 00:01:54,080 Speaker 6: So here's my thing, though: the stock is 36 00:01:54,080 --> 00:01:56,840 Speaker 6: down over ten percent since its January high for this year. 37 00:01:57,280 --> 00:01:59,960 Speaker 6: So clearly something else was going on, not just this fine, 38 00:02:00,000 --> 00:02:02,360 Speaker 6: and that's just adding fuel to the fire here.
What 39 00:02:03,400 --> 00:02:04,360 Speaker 6: is happening to Apple? 40 00:02:05,320 --> 00:02:07,240 Speaker 5: Yeah, I mean, if you think about it, all the 41 00:02:07,640 --> 00:02:09,880 Speaker 5: bad things that can happen to Apple are happening all 42 00:02:09,919 --> 00:02:12,840 Speaker 5: at once. I mean, frankly speaking, right now, when 43 00:02:12,880 --> 00:02:15,320 Speaker 5: you look at it, their iPhone sales are slowing down, 44 00:02:15,480 --> 00:02:20,640 Speaker 5: China competition, a lot of supply chain dependency on China. 45 00:02:21,000 --> 00:02:23,840 Speaker 5: On top of that, the App Store stuff. They are not 46 00:02:23,960 --> 00:02:26,720 Speaker 5: a player in gen AI right now. So not a lot of 47 00:02:26,720 --> 00:02:30,000 Speaker 5: good things happening for Apple, and it's going to be 48 00:02:30,040 --> 00:02:31,920 Speaker 5: a slow year for them. It's probably going to be 49 00:02:31,960 --> 00:02:34,520 Speaker 5: another slow year next year. So Apple has to come 50 00:02:34,560 --> 00:02:37,760 Speaker 5: out in June when they do their developer conference 51 00:02:38,120 --> 00:02:41,880 Speaker 5: and put some fresh, I would say, air or life 52 00:02:41,880 --> 00:02:44,200 Speaker 5: into the company by saying, okay, we do have 53 00:02:44,240 --> 00:02:46,800 Speaker 5: a generative AI strategy and this is how we are 54 00:02:47,400 --> 00:02:49,880 Speaker 5: thinking of achieving it. Other than that, I think it's 55 00:02:49,880 --> 00:02:53,400 Speaker 5: going to be a difficult year for them. 56 00:02:54,080 --> 00:02:56,520 Speaker 3: I think this June conference that you reference, Anurag, 57 00:02:56,560 --> 00:02:58,680 Speaker 3: it's usually the conference maybe where they introduce a new 58 00:02:58,720 --> 00:03:01,359 Speaker 3: cool product or something like that.
But it feels like 59 00:03:01,440 --> 00:03:04,119 Speaker 3: the pressure on the company this year is as great 60 00:03:04,120 --> 00:03:06,120 Speaker 3: as I can ever remember it, because they're either going 61 00:03:06,200 --> 00:03:08,680 Speaker 3: to come out with something really cool as it relates 62 00:03:08,720 --> 00:03:10,840 Speaker 3: to AI, or they're just going to come out and 63 00:03:10,880 --> 00:03:12,840 Speaker 3: say no, we're not rushing into it, in which case 64 00:03:12,880 --> 00:03:15,960 Speaker 3: the stock is presumably, you know, going to be challenged. 65 00:03:16,000 --> 00:03:17,080 Speaker 4: What do you think is going to happen? 66 00:03:18,000 --> 00:03:19,880 Speaker 5: Yeah, the June conference is going to be a lot 67 00:03:19,880 --> 00:03:23,320 Speaker 5: of the software updates, the Worldwide Developers Conference, and you 68 00:03:23,400 --> 00:03:25,640 Speaker 5: know, they are bound to give a 69 00:03:25,680 --> 00:03:28,400 Speaker 5: lot more detail about what are the five, six, seven 70 00:03:28,440 --> 00:03:30,800 Speaker 5: things that they're going to be updating on their software 71 00:03:30,960 --> 00:03:34,360 Speaker 5: that has more generative AI capabilities. I think it's not 72 00:03:34,440 --> 00:03:37,080 Speaker 5: going to be that big just because these things take time. 73 00:03:37,440 --> 00:03:39,680 Speaker 5: We just heard from Mark Gurman, you know, not 74 00:03:39,880 --> 00:03:42,320 Speaker 5: that long ago, that they're shutting down the car and 75 00:03:42,480 --> 00:03:47,040 Speaker 5: allocating that headcount to generative AI. But you know, 76 00:03:47,120 --> 00:03:48,480 Speaker 5: what are you going to do in three months? I 77 00:03:48,480 --> 00:03:50,040 Speaker 5: think it's going to be a little longer than that.
78 00:03:50,080 --> 00:03:53,080 Speaker 5: They're going to make some promises, but frankly speaking, I mean, 79 00:03:53,120 --> 00:03:57,920 Speaker 5: they do have a distribution network, so you know, maybe 80 00:03:58,040 --> 00:04:01,240 Speaker 5: we should give them some points at this 81 00:04:01,280 --> 00:04:03,520 Speaker 5: point to see if they can pull this off. It's 82 00:04:03,560 --> 00:04:05,800 Speaker 5: going to lead to an iPhone refresh cycle that we 83 00:04:05,840 --> 00:04:08,880 Speaker 5: haven't seen before. But you know, I'm 84 00:04:08,880 --> 00:04:10,600 Speaker 5: not betting on anything big at this point. 85 00:04:12,560 --> 00:04:14,600 Speaker 6: So go to the AI strategy. If you had to 86 00:04:14,720 --> 00:04:18,320 Speaker 6: articulate what Apple's AI strategy is, what would you say? 87 00:04:18,880 --> 00:04:21,120 Speaker 5: Yeah, so let's think about all the different things that 88 00:04:21,160 --> 00:04:23,840 Speaker 5: we do on our phone. So, for example, Siri by itself, 89 00:04:24,520 --> 00:04:27,000 Speaker 5: you know, could be far more sophisticated than what it 90 00:04:27,080 --> 00:04:30,520 Speaker 5: is today. Imagine what all different things you can do 91 00:04:30,600 --> 00:04:33,880 Speaker 5: with OpenAI or ChatGPT. Imagine if you could 92 00:04:33,880 --> 00:04:37,039 Speaker 5: do that just on the phone itself without having to 93 00:04:37,320 --> 00:04:40,159 Speaker 5: go to a third party to do it now.
Apple 94 00:04:40,240 --> 00:04:43,080 Speaker 5: has the wherewithal in terms of dollars to do it; 95 00:04:43,279 --> 00:04:46,000 Speaker 5: whether they actually have the capacity from a technology point 96 00:04:46,040 --> 00:04:48,280 Speaker 5: of view is the question. If they can embed a lot of those 97 00:04:48,320 --> 00:04:50,640 Speaker 5: features, like you asking a question to your phone and 98 00:04:50,680 --> 00:04:54,039 Speaker 5: getting the answer right there, and they have the distribution 99 00:04:54,120 --> 00:04:57,440 Speaker 5: network of over one billion connected devices, then what happens 100 00:04:57,520 --> 00:04:59,120 Speaker 5: is you don't really need to go into an app 101 00:04:59,120 --> 00:05:01,240 Speaker 5: to do a lot of that stuff. We also heard 102 00:05:01,279 --> 00:05:03,560 Speaker 5: that they're going to develop software that can write 103 00:05:03,560 --> 00:05:06,039 Speaker 5: itself a little quicker. That's not going to help a 104 00:05:06,080 --> 00:05:08,560 Speaker 5: consumer as much; that's going to help more developers. 105 00:05:08,920 --> 00:05:10,760 Speaker 5: But you know, the thing that we want to see 106 00:05:10,839 --> 00:05:13,680 Speaker 5: is what can be done on the phone itself that 107 00:05:13,720 --> 00:05:16,120 Speaker 5: would mean I don't have to tap into an app and 108 00:05:16,160 --> 00:05:20,320 Speaker 5: I can get those answers immediately. 109 00:05:20,560 --> 00:05:23,960 Speaker 3: And that to me seems like something that Apple can absolutely do. 110 00:05:24,200 --> 00:05:27,240 Speaker 3: If I were a long term investor, should I be 111 00:05:27,279 --> 00:05:32,400 Speaker 3: buying the stock here? I mean, with AI, am I going to 112 00:05:32,440 --> 00:05:33,919 Speaker 3: get a monster bump 113 00:05:33,600 --> 00:05:34,240 Speaker 4: in the multiple?
114 00:05:35,120 --> 00:05:39,039 Speaker 5: I am absolutely in the same camp with you, you 115 00:05:39,080 --> 00:05:41,480 Speaker 5: know, at this point, Paul. But frankly, they have to 116 00:05:41,520 --> 00:05:44,640 Speaker 5: prove it. Right now, things haven't worked for them in 117 00:05:44,760 --> 00:05:47,000 Speaker 5: terms of, you know, being on the edge of 118 00:05:47,040 --> 00:05:50,440 Speaker 5: any technology at this point, so I think they have 119 00:05:50,520 --> 00:05:52,480 Speaker 5: to prove it in order to get it. Remember, they 120 00:05:52,520 --> 00:05:55,880 Speaker 5: have over eight hundred million phones that people have in 121 00:05:55,920 --> 00:06:00,440 Speaker 5: their hands in terms of cellular connections, and in 122 00:06:00,480 --> 00:06:02,640 Speaker 5: any given year they sell roughly about two hundred and 123 00:06:02,640 --> 00:06:06,520 Speaker 5: twenty million phones. If this thing really takes off, there 124 00:06:06,560 --> 00:06:09,040 Speaker 5: are over forty percent of phones that are iPhone twelve 125 00:06:09,160 --> 00:06:12,839 Speaker 5: and below, and that is very low processing power, low memory. 126 00:06:13,440 --> 00:06:16,120 Speaker 5: A lot gets upgraded, and that leads to the 127 00:06:16,200 --> 00:06:19,120 Speaker 5: next big, you know, iPhone cycle. I think that 128 00:06:19,279 --> 00:06:22,440 Speaker 5: can be done, but I think they'll have to prove it. Remember, 129 00:06:22,760 --> 00:06:25,080 Speaker 5: for a company that grows at two to five percent, 130 00:06:25,279 --> 00:06:27,640 Speaker 5: they're still trading at like twenty five times earnings. So 131 00:06:27,680 --> 00:06:30,000 Speaker 5: it's not cheap even from, you know, that 132 00:06:30,080 --> 00:06:30,720 Speaker 5: point of view.
133 00:06:32,360 --> 00:06:34,800 Speaker 6: But Anurag, I thought that Apple wanted to eventually become 134 00:06:35,520 --> 00:06:38,200 Speaker 6: primarily services revenue. Like, they didn't want to be dependent 135 00:06:38,240 --> 00:06:41,039 Speaker 6: on hardware. They wanted to have people pay them for stuff 136 00:06:41,080 --> 00:06:43,280 Speaker 6: on, like, a monthly or yearly basis. How would that 137 00:06:43,360 --> 00:06:46,080 Speaker 6: AI integration do that? 138 00:06:47,480 --> 00:06:50,360 Speaker 5: So the services revenue is an important piece, but it's 139 00:06:50,440 --> 00:06:53,240 Speaker 5: nothing without the hardware. You have to have the hardware 140 00:06:53,279 --> 00:06:53,760 Speaker 5: to get there. 141 00:06:53,720 --> 00:06:55,159 Speaker 6: Right, but don't they want it to be... don't they 142 00:06:55,200 --> 00:06:56,920 Speaker 6: want it to be about that and not the hardware? 143 00:06:57,960 --> 00:07:00,119 Speaker 5: Yeah, yeah. But at the same time, you know, if 144 00:07:00,160 --> 00:07:03,000 Speaker 5: you are able to drive these higher value services, your 145 00:07:03,040 --> 00:07:06,839 Speaker 5: ecosystem becomes stronger. People who have these older iPhones, let's 146 00:07:06,839 --> 00:07:09,080 Speaker 5: say four years old, five years old, six years old, 147 00:07:09,240 --> 00:07:11,280 Speaker 5: you're not going to be able to run those operations 148 00:07:11,280 --> 00:07:13,040 Speaker 5: on them because it's going to be either too slow 149 00:07:13,240 --> 00:07:15,200 Speaker 5: or your battery is going to die very quickly. 150 00:07:15,320 --> 00:07:17,720 Speaker 5: So you need to upgrade the hardware. Now, I just 151 00:07:17,760 --> 00:07:20,560 Speaker 5: talked about iPhones, but you can extend that to iPads, 152 00:07:20,760 --> 00:07:24,200 Speaker 5: to watches, to any other product that they have. It 153 00:07:24,280 --> 00:07:27,400 Speaker 5: is the critical portion of it.
Anybody who controls a 154 00:07:27,520 --> 00:07:30,920 Speaker 5: large portion of these large language models will have an 155 00:07:31,000 --> 00:07:32,800 Speaker 5: edge down the road. So they really want to be 156 00:07:32,840 --> 00:07:35,160 Speaker 5: a player in that market. You don't want to be 157 00:07:35,240 --> 00:07:36,120 Speaker 5: left behind. 158 00:07:38,400 --> 00:07:40,560 Speaker 3: So let's just go back to where we started here, 159 00:07:40,600 --> 00:07:43,360 Speaker 3: which is some of the regulatory issues. Is this like 160 00:07:43,720 --> 00:07:47,400 Speaker 3: twenty five years ago, when the EU came after Microsoft 161 00:07:47,440 --> 00:07:49,400 Speaker 3: for Windows and bundling and all that stuff, and they just 162 00:07:49,480 --> 00:07:51,720 Speaker 3: kind of write some checks and it goes away? Or 163 00:07:51,760 --> 00:07:54,760 Speaker 3: is this something more of a challenge, more of a 164 00:07:54,800 --> 00:07:56,280 Speaker 3: headwind for Apple, do you think? 165 00:07:57,320 --> 00:07:58,960 Speaker 5: I think it's more of a headline risk for the 166 00:07:59,040 --> 00:08:01,160 Speaker 5: next several years. A lot of these things are not 167 00:08:01,200 --> 00:08:03,720 Speaker 5: going to be just, you know... they can't go out 168 00:08:03,760 --> 00:08:06,360 Speaker 5: and tell them to reduce these fees. But having said that, 169 00:08:06,400 --> 00:08:08,000 Speaker 5: there is going to be an overhang on the stock. 170 00:08:08,120 --> 00:08:10,440 Speaker 5: But we think that they can do so much. They 171 00:08:10,440 --> 00:08:13,560 Speaker 5: can really boost the advertising revenue; they really haven't 172 00:08:13,640 --> 00:08:16,360 Speaker 5: pulled that lever. They can take iCloud pricing up, they 173 00:08:16,360 --> 00:08:18,680 Speaker 5: can do Apple Care.
I mean, Apple has a lot 174 00:08:18,720 --> 00:08:21,960 Speaker 5: of levers to pull, and it's going to be interesting 175 00:08:22,000 --> 00:08:25,080 Speaker 5: to see how different governments around the world approach this 176 00:08:25,160 --> 00:08:28,239 Speaker 5: issue, because Apple, I mean, they wrote a phenomenal piece 177 00:08:28,240 --> 00:08:33,400 Speaker 5: today about fighting this particular, you know, fine. I 178 00:08:33,440 --> 00:08:36,200 Speaker 5: think if you read the entire piece as it 179 00:08:36,240 --> 00:08:38,720 Speaker 5: relates to Spotify, I think they make a very good 180 00:08:38,760 --> 00:08:41,959 Speaker 5: case of why their ecosystem is important for developers and 181 00:08:42,000 --> 00:08:45,040 Speaker 5: why they should have, you know, at least some 182 00:08:45,600 --> 00:08:47,719 Speaker 5: skin in the game in terms of what 183 00:08:47,760 --> 00:08:49,600 Speaker 5: they bring to the party. 184 00:08:50,920 --> 00:08:53,240 Speaker 6: All right, Anurag, thanks so much. We super appreciate it. 185 00:08:53,320 --> 00:08:54,920 Speaker 6: The best person to go to when it comes to Apple. 186 00:08:54,960 --> 00:08:56,480 Speaker 6: I should say, I have an iPhone fourteen. 187 00:08:56,559 --> 00:08:56,959 Speaker 3: I forgot. 188 00:08:57,400 --> 00:08:57,679 Speaker 4: Really? 189 00:08:57,760 --> 00:09:01,280 Speaker 6: Yeah, yeah, yeah. I had an eleven, switched from Verizon 190 00:09:01,280 --> 00:09:02,920 Speaker 6: to T-Mobile. I got a free phone. 191 00:09:02,960 --> 00:09:05,640 Speaker 4: And how's it been? Oh, it's okay. 192 00:09:05,760 --> 00:09:08,560 Speaker 6: The Wi-Fi is a lot better, really, yes, but 193 00:09:08,720 --> 00:09:11,960 Speaker 6: sometimes the Wi-Fi, it's a little spottier. It's interesting. 194 00:09:12,040 --> 00:09:14,719 Speaker 6: It's an interesting little trade off there.
Anurag Rana joining us, 195 00:09:14,840 --> 00:09:17,840 Speaker 6: Bloomberg Intelligence senior technology analyst. 196 00:09:18,960 --> 00:09:22,840 Speaker 2: You're listening to the Bloomberg Intelligence Podcast. Catch us live 197 00:09:22,920 --> 00:09:26,440 Speaker 2: weekdays at ten am Eastern on Apple CarPlay and Android 198 00:09:26,480 --> 00:09:29,240 Speaker 2: Auto with the Bloomberg Business app. You can also listen 199 00:09:29,360 --> 00:09:32,480 Speaker 2: live on Amazon Alexa from our flagship New York station. 200 00:09:32,840 --> 00:09:36,640 Speaker 2: Just say, Alexa, play Bloomberg eleven thirty. 201 00:09:36,640 --> 00:09:38,600 Speaker 3: It's Alix Steel and Paul Sweeney. We're live here at the 202 00:09:38,600 --> 00:09:42,600 Speaker 3: New Jersey Institute of Technology here in Newark, New Jersey, 203 00:09:42,679 --> 00:09:44,400 Speaker 3: and we're also on that YouTube thing. Head over to 204 00:09:44,440 --> 00:09:48,720 Speaker 3: YouTube dot com, search Bloomberg, I think, Bloomberg Radio, 205 00:09:48,720 --> 00:09:50,280 Speaker 3: Bloomberg Podcasts, all those things. 206 00:09:50,280 --> 00:09:51,440 Speaker 4: I'll tell you, you know you've made it 207 00:09:51,480 --> 00:09:55,080 Speaker 3: when Donna Russo is in the house bringing it here. 208 00:09:55,160 --> 00:09:57,360 Speaker 3: She's kind of running everything over here, so we appreciate 209 00:09:57,679 --> 00:10:00,840 Speaker 3: Donna having us over here. Let's talk AI. Michael Johnson 210 00:10:00,880 --> 00:10:05,000 Speaker 3: joins us. He's the president of the New Jersey Innovation Institute. Michael, 211 00:10:05,000 --> 00:10:07,320 Speaker 3: thanks so much for joining us here. What is the 212 00:10:07,400 --> 00:10:09,439 Speaker 3: New Jersey Innovation Institute? 213 00:10:09,559 --> 00:10:12,240 Speaker 7: It's a great question.
So in the US, we have 214 00:10:12,280 --> 00:10:15,160 Speaker 7: lots of research universities and there's lots of smart people, 215 00:10:15,280 --> 00:10:19,320 Speaker 7: lots of great resources, but there's this fundamental problem in academia, 216 00:10:19,360 --> 00:10:21,480 Speaker 7: which is it's tough for the outside world to actually leverage 217 00:10:21,480 --> 00:10:25,120 Speaker 7: those resources. So governmental organizations, industry, they want 218 00:10:25,160 --> 00:10:27,680 Speaker 7: access to the cutting edge of AI, for example, but 219 00:10:27,720 --> 00:10:29,800 Speaker 7: it's tough for them to actually make those connections and 220 00:10:29,800 --> 00:10:33,280 Speaker 7: interact with faculty. So NJII is an organization, a 221 00:10:33,360 --> 00:10:36,080 Speaker 7: five oh one C three wholly owned by NJIT, and 222 00:10:36,120 --> 00:10:38,600 Speaker 7: the idea is that we are a standalone corporation that's 223 00:10:38,600 --> 00:10:41,880 Speaker 7: a conduit between the outside world and NJIT. So we 224 00:10:41,960 --> 00:10:45,080 Speaker 7: facilitate those connections, we create unique business models to work 225 00:10:45,080 --> 00:10:49,160 Speaker 7: with faculty, and we're a quick moving organization, unlike academia, 226 00:10:49,200 --> 00:10:50,880 Speaker 7: which, you know, tends to be slower and more 227 00:10:50,960 --> 00:10:53,240 Speaker 7: difficult to work with. So we're that conduit between them and 228 00:10:53,280 --> 00:10:55,400 Speaker 7: the outside world. We roughly have about one hundred and 229 00:10:55,440 --> 00:10:58,320 Speaker 7: twenty folks in the organization, and we're focused on that. 230 00:10:58,240 --> 00:10:59,760 Speaker 6: Can I just say, it's really cool, his three year old son... 231 00:10:59,800 --> 00:11:02,760 Speaker 6: I mean, what three year old is going 232 00:11:02,800 --> 00:11:04,680 Speaker 6: to come in and talk about AI?
I feel like that 233 00:11:04,800 --> 00:11:06,640 Speaker 6: just says it all at the end of the day, right? 234 00:11:06,840 --> 00:11:09,280 Speaker 6: That is the future. So am I a company that 235 00:11:09,360 --> 00:11:11,080 Speaker 6: goes to you and then you pair me up with 236 00:11:11,120 --> 00:11:13,720 Speaker 6: something, or is it sort of the technology that you're evolving 237 00:11:13,720 --> 00:11:15,280 Speaker 6: and then you go pitch it to companies? How does 238 00:11:15,280 --> 00:11:15,680 Speaker 6: that work? 239 00:11:15,760 --> 00:11:17,560 Speaker 7: It's a bit of inside out and outside in. So 240 00:11:17,600 --> 00:11:19,240 Speaker 7: we can go to corporations and try and find out 241 00:11:19,240 --> 00:11:21,199 Speaker 7: what their problems are, what their pain points are, and 242 00:11:21,200 --> 00:11:23,320 Speaker 7: then go and find faculty who can help out. Or 243 00:11:23,360 --> 00:11:25,000 Speaker 7: we might have a few faculty that have a very 244 00:11:25,040 --> 00:11:27,160 Speaker 7: specific problem, they need access to software, they need 245 00:11:27,200 --> 00:11:29,599 Speaker 7: access to resources, and we go externally and find a 246 00:11:29,640 --> 00:11:31,760 Speaker 7: way to work with corporations on that. But it's pairing 247 00:11:31,800 --> 00:11:34,640 Speaker 7: the two with each other. And faculty are really smart, 248 00:11:34,640 --> 00:11:36,640 Speaker 7: they're really focused on their research, but they don't always 249 00:11:36,679 --> 00:11:38,360 Speaker 7: have the mindset to go out and actually execute on 250 00:11:38,440 --> 00:11:41,280 Speaker 7: consultant type projects for industry. So we help form that 251 00:11:41,320 --> 00:11:43,560 Speaker 7: framework, and along the way we're trying to help with 252 00:11:43,640 --> 00:11:46,480 Speaker 7: tech transfers.
So getting technology out of the university into 253 00:11:46,480 --> 00:11:49,040 Speaker 7: products and services was always a pain point, and also 254 00:11:49,120 --> 00:11:53,040 Speaker 7: just generally accelerating innovation and also helping upskill workers. 255 00:11:53,640 --> 00:11:56,400 Speaker 3: You know, over the last several quarters, Bloomberg does this analysis. 256 00:11:56,400 --> 00:11:59,640 Speaker 3: It shows, you know, what are companies talking about on 257 00:11:59,679 --> 00:12:02,280 Speaker 3: their quarterly conference calls. And for the last several quarters, 258 00:12:02,520 --> 00:12:04,760 Speaker 3: every single company in the S and P five hundred 259 00:12:04,920 --> 00:12:07,880 Speaker 3: has talked about AI, with the exception of Apple, and last 260 00:12:07,920 --> 00:12:10,240 Speaker 3: quarter they mentioned AI, which is interesting. 261 00:12:10,280 --> 00:12:11,680 Speaker 6: What company is not doing that now? 262 00:12:11,800 --> 00:12:16,400 Speaker 3: Yeah, what are companies most commonly asking you for help with? 263 00:12:17,360 --> 00:12:17,560 Speaker 1: Oh, 264 00:12:17,600 --> 00:12:19,880 Speaker 7: man, that goes all over the place. It depends on 265 00:12:19,880 --> 00:12:21,880 Speaker 7: the companies. We have some small mom and pop businesses 266 00:12:21,920 --> 00:12:25,200 Speaker 7: that just want help with trying to move towards technology. 267 00:12:24,640 --> 00:12:25,479 Speaker 4: Towards computers. 268 00:12:25,720 --> 00:12:27,280 Speaker 7: We have other companies, for example, that want to be on 269 00:12:27,320 --> 00:12:30,200 Speaker 7: the bleeding edge of some of these fields.
So for example, 270 00:12:30,200 --> 00:12:32,440 Speaker 7: it might be life sciences, it might be AI, for example, 271 00:12:32,679 --> 00:12:35,000 Speaker 7: and they're asking us to help improve something that they're 272 00:12:35,000 --> 00:12:37,280 Speaker 7: already doing, or it's a very specific project they're pushing 273 00:12:37,360 --> 00:12:39,400 Speaker 7: us to find faculty to help out with. So it 274 00:12:39,440 --> 00:12:42,480 Speaker 7: kind of depends. We have other folks, for example Picatinny 275 00:12:42,600 --> 00:12:45,000 Speaker 7: Arsenal and the Department of Defense, that are looking for workers, 276 00:12:45,200 --> 00:12:48,200 Speaker 7: so they're asking us to help upskill workers for advanced manufacturing and all 277 00:12:48,240 --> 00:12:50,520 Speaker 7: sorts of different programs they need help finding talent for, 278 00:12:50,679 --> 00:12:52,080 Speaker 7: so we're trying to help with that as well. 279 00:12:52,280 --> 00:12:53,760 Speaker 6: So JPMorgan, I don't know if you saw this, 280 00:12:54,080 --> 00:12:56,080 Speaker 6: they had a great piece out that said that some 281 00:12:56,120 --> 00:12:59,199 Speaker 6: of its corporate customers are slashing manual work by almost 282 00:12:59,320 --> 00:13:03,520 Speaker 6: ninety percent with its cash flow management tool that runs 283 00:13:03,559 --> 00:13:06,760 Speaker 6: on AI. And that's the fear, right, that we're going 284 00:13:06,760 --> 00:13:08,560 Speaker 6: to use AI and replace all the workers, and those 285 00:13:08,600 --> 00:13:10,880 Speaker 6: workers don't have any jobs. Is there any truth to that? 286 00:13:11,679 --> 00:13:14,200 Speaker 7: It's a great question. So whenever you have technologies that 287 00:13:14,200 --> 00:13:16,480 Speaker 7: are disruptive, there are certainly going to be jobs 288 00:13:16,520 --> 00:13:18,160 Speaker 7: that are going to go away.
But if you look back 289 00:13:18,160 --> 00:13:20,679 Speaker 7: to when Excel first came out, or when computers first 290 00:13:20,720 --> 00:13:22,760 Speaker 7: came out, you look at accounting as a great use case. 291 00:13:23,080 --> 00:13:25,560 Speaker 7: Accountants didn't go away because we were going from a 292 00:13:25,640 --> 00:13:27,720 Speaker 7: ledger that was literally on paper to a computer based system. 293 00:13:27,760 --> 00:13:29,520 Speaker 7: We found new questions to answer, new ways that we 294 00:13:29,520 --> 00:13:31,359 Speaker 7: could look at our accounting and finances. 295 00:13:31,600 --> 00:13:32,199 Speaker 4: So I think the 296 00:13:32,240 --> 00:13:34,520 Speaker 7: jobs are going to change. But the overall number of 297 00:13:34,600 --> 00:13:36,679 Speaker 7: jobs, in net, I don't know if it will 298 00:13:36,720 --> 00:13:39,160 Speaker 7: actually reduce. It might increase in some cases. But we're 299 00:13:39,160 --> 00:13:41,200 Speaker 7: going to answer different questions. We're going to do things 300 00:13:41,280 --> 00:13:43,079 Speaker 7: much more quickly than we did in the past, for sure. 301 00:13:43,679 --> 00:13:46,880 Speaker 3: You know, I guess my lack of 302 00:13:46,920 --> 00:13:49,240 Speaker 3: knowledge, or full appreciation, of AI is, I'm just not 303 00:13:49,280 --> 00:13:51,760 Speaker 3: sure if it's something completely new or is it just 304 00:13:51,800 --> 00:13:55,199 Speaker 3: the next iteration of what the smart people at NJIT 305 00:13:55,840 --> 00:13:59,160 Speaker 3: typically do. I'm just not sure what's new about 306 00:13:59,160 --> 00:14:02,199 Speaker 3: it, other than, man, everybody's talking about it.
And it 307 00:14:02,360 --> 00:14:05,240 Speaker 3: was a theme for why the stock... one of the 308 00:14:05,240 --> 00:14:07,360 Speaker 3: themes that drove the stock market in twenty twenty three 309 00:14:07,640 --> 00:14:09,680 Speaker 3: was the concept of AI, and the average trader across 310 00:14:10,200 --> 00:14:12,960 Speaker 3: the river in New York has no idea what AI is, 311 00:14:13,040 --> 00:14:15,320 Speaker 3: but he's buying stocks because he thinks they're an AI play. 312 00:14:15,480 --> 00:14:18,120 Speaker 7: It's been around for decades, right, but we have a 313 00:14:18,120 --> 00:14:19,720 Speaker 7: couple of technologies that came out in the last two 314 00:14:19,760 --> 00:14:21,760 Speaker 7: years that have really transformed the way we see AI. 315 00:14:21,800 --> 00:14:23,280 Speaker 7: And we were talking about it 316 00:14:23,360 --> 00:14:25,960 Speaker 7: last night at my family's Sunday dinner, and the reason 317 00:14:26,040 --> 00:14:29,040 Speaker 7: is because now it's accessible. So, for example, two years ago, 318 00:14:29,040 --> 00:14:30,520 Speaker 7: if I go into Google and I type, how do 319 00:14:30,600 --> 00:14:33,120 Speaker 7: I make chicken parm, I get all these ads, I 320 00:14:33,160 --> 00:14:35,080 Speaker 7: get all these things that tell me about chicken parm. 321 00:14:35,080 --> 00:14:36,560 Speaker 7: I go into ChatGPT and I type that, 322 00:14:36,600 --> 00:14:38,280 Speaker 7: for example, and I get a perfect recipe on how 323 00:14:38,320 --> 00:14:41,200 Speaker 7: to actually make it, so it becomes very accessible to anyone. 324 00:14:41,240 --> 00:14:43,040 Speaker 7: And I think that go to market strategy 325 00:14:43,120 --> 00:14:45,960 Speaker 7: OpenAI had of making it accessible is what really changed the game.
326 00:14:46,280 --> 00:14:49,160 Speaker 7: And also, at the same time, computing power is exponentially increasing; 327 00:14:49,240 --> 00:14:51,840 Speaker 7: it's more accessible. We're now able to use it everywhere, 328 00:14:51,880 --> 00:14:54,000 Speaker 7: from making chicken parm to trying to do research. 329 00:14:54,520 --> 00:14:56,480 Speaker 6: So what kind of cool stuff are you guys working 330 00:14:56,520 --> 00:14:58,320 Speaker 6: on right now? Like, what are you most excited about? 331 00:14:58,720 --> 00:14:59,200 Speaker 2: For us, 332 00:14:59,280 --> 00:15:01,400 Speaker 7: as NJII, what we're very focused on is trying to 333 00:15:01,440 --> 00:15:03,400 Speaker 7: get things out of the university into the real world. 334 00:15:03,520 --> 00:15:06,200 Speaker 7: And one specific project that we're working on is actually 335 00:15:06,200 --> 00:15:09,200 Speaker 7: on law enforcement and body cams. So body cams are 336 00:15:09,240 --> 00:15:12,200 Speaker 7: a sensor that generates a huge amount of data, and 337 00:15:12,320 --> 00:15:15,000 Speaker 7: from those data sets, we're usually looking at them 338 00:15:15,440 --> 00:15:17,640 Speaker 7: after the fact. So after something bad happens, we're trying 339 00:15:17,640 --> 00:15:20,160 Speaker 7: to review that situation. What we're trying to do is, 340 00:15:20,160 --> 00:15:22,560 Speaker 7: can we look at that data and predict something bad 341 00:15:22,640 --> 00:15:24,480 Speaker 7: is going to happen before it happens? So if we 342 00:15:24,520 --> 00:15:28,080 Speaker 7: see a pattern between some behaviors... 343 00:15:28,000 --> 00:15:30,440 Speaker 6: Backing up for a second, so you're tracking behavior 344 00:15:30,520 --> 00:15:32,720 Speaker 6: to then model behavior later?
345 00:15:33,480 --> 00:15:35,560 Speaker 7: Yes. So, for example, let's say we see an officer 346 00:15:35,640 --> 00:15:38,560 Speaker 7: is running more frequently, they're yelling more frequently. That is 347 00:15:38,560 --> 00:15:41,560 Speaker 7: probably correlated to some behavioral outcomes, such as excessive use 348 00:15:41,600 --> 00:15:44,240 Speaker 7: of force. So, for example, we might identify this officer 349 00:15:44,280 --> 00:15:46,160 Speaker 7: as being at a much higher likelihood of excessive use of 350 00:15:46,240 --> 00:15:48,520 Speaker 7: force in the future. Let's intervene and get them training 351 00:15:48,520 --> 00:15:50,840 Speaker 7: before something bad happens. So we're trying to build that 352 00:15:50,880 --> 00:15:53,800 Speaker 7: as software we can actually put onto the hardware and 353 00:15:53,840 --> 00:15:56,640 Speaker 7: help with law enforcement and help with de-escalating situations. 354 00:15:56,680 --> 00:16:00,640 Speaker 6: Wow, that's really cool. What other stuff... what are 355 00:16:00,680 --> 00:16:03,160 Speaker 6: the things you're excited about? Oh man, there are so many. 356 00:16:03,760 --> 00:16:04,560 Speaker 6: Well, pick your second best. 357 00:16:04,880 --> 00:16:07,000 Speaker 7: My second best would definitely be in the drone space. 358 00:16:07,080 --> 00:16:09,680 Speaker 7: So drones are another sensor.
We're collecting huge amounts of 359 00:16:09,720 --> 00:16:11,720 Speaker 7: imagery data, and today a lot of that work is 360 00:16:11,720 --> 00:16:14,280 Speaker 7: actually a person looking at videos, scrolling through video like 361 00:16:14,280 --> 00:16:16,560 Speaker 7: you would through a VHS tape, and we're using computer 362 00:16:16,640 --> 00:16:19,440 Speaker 7: vision and AI to actually analyze that data and try 363 00:16:19,440 --> 00:16:22,000 Speaker 7: to predict what's happening, try to classify certain imagery, and 364 00:16:22,000 --> 00:16:24,760 Speaker 7: answer very specific questions, like: is a power line going 365 00:16:24,760 --> 00:16:27,240 Speaker 7: to fail, based upon a single picture from a simple drone? 366 00:16:27,280 --> 00:16:29,000 Speaker 6: Oh, now that could be really helpful, especially with all the 367 00:16:29,000 --> 00:16:31,360 Speaker 6: wildfires and stuff that we've had. And then as all 368 00:16:31,360 --> 00:16:34,000 Speaker 6: the utilities are kind of grappling with, like, old infrastructure 369 00:16:34,080 --> 00:16:35,800 Speaker 6: that is not easy to replace, kind of how you 370 00:16:35,840 --> 00:16:40,480 Speaker 6: manage that. Is it expensive for these companies to use this? 371 00:16:41,280 --> 00:16:44,240 Speaker 7: Usually the bottleneck today is data generation and data annotation, 372 00:16:44,360 --> 00:16:46,240 Speaker 7: because there's lots of data, but we have to annotate 373 00:16:46,240 --> 00:16:48,400 Speaker 7: the data to be actually able to use it. So, 374 00:16:48,440 --> 00:16:50,360 Speaker 7: for example, with the body cams, we have to know 375 00:16:50,400 --> 00:16:52,640 Speaker 7: what those events are that we're trying to predict and 376 00:16:52,680 --> 00:16:54,840 Speaker 7: actually classify them ahead of time. So that's the real 377 00:16:54,960 --> 00:16:56,480 Speaker 7: bottleneck for it in a lot of cases.
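The annotate-then-predict workflow described in this segment (hand-label events in sensor footage, then flag the behavior patterns that correlate with bad outcomes before an incident) can be sketched in a few lines. This is a minimal illustration, not NJII's actual system: the event labels, officer IDs, and threshold below are all made up for the example, and a production system would learn from video features rather than count hand-annotated tags.

```python
from collections import Counter

# Hypothetical annotated bodycam events: (officer_id, behavior_label).
# Producing these labels is the annotation bottleneck mentioned in the interview.
EVENTS = [
    ("A", "running"), ("A", "yelling"), ("A", "yelling"),
    ("B", "talking"), ("B", "running"),
    ("C", "yelling"),
]

RISK_LABELS = {"running", "yelling"}  # behaviors treated as risk signals

def risk_scores(events):
    """Fraction of each officer's annotated events that are risk signals."""
    total, risky = Counter(), Counter()
    for officer, label in events:
        total[officer] += 1
        risky[officer] += label in RISK_LABELS
    return {officer: risky[officer] / total[officer] for officer in total}

def flag_for_training(events, threshold=0.6):
    """Officers whose risk-signal rate exceeds the threshold get early intervention."""
    return sorted(o for o, s in risk_scores(events).items() if s > threshold)

print(flag_for_training(EVENTS))  # ['A', 'C']
```

The point is the shape of the workflow: annotate events, aggregate them into per-officer features, and intervene before something bad happens, rather than reviewing footage after the fact.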
378 00:16:57,320 --> 00:16:59,360 Speaker 6: All right, Michael, thanks so much. It was really great. 379 00:16:59,360 --> 00:17:01,080 Speaker 6: This was really fun, to get your perspective. Is your son 380 00:17:01,120 --> 00:17:01,640 Speaker 6: gonna stay? 381 00:17:01,520 --> 00:17:02,880 Speaker 7: Or is he gonna... he doesn't listen all day. 382 00:17:03,600 --> 00:17:07,320 Speaker 6: Well, awesome, we like that future generation. Michael Johnson, president 383 00:17:07,359 --> 00:17:11,560 Speaker 6: of the New Jersey Innovation Institute, NJII. Thanks very much. 384 00:17:11,600 --> 00:17:14,439 Speaker 6: It was really great to get that perspective. That's really interesting. 385 00:17:14,480 --> 00:17:19,040 Speaker 6: I think the bodycam situation too, like, it's not a 386 00:17:19,080 --> 00:17:21,320 Speaker 6: profiling thing. It's like you're gonna get the 387 00:17:21,320 --> 00:17:23,560 Speaker 6: help that you need down the road, which I think 388 00:17:23,640 --> 00:17:25,320 Speaker 6: is really cool. And it's good to hear these actual 389 00:17:25,440 --> 00:17:27,840 Speaker 6: use cases, because it's easy to just say AI is cool, 390 00:17:27,920 --> 00:17:29,560 Speaker 6: it's going to do stuff. But to get an actual 391 00:17:29,640 --> 00:17:32,119 Speaker 6: use case that you can do is quite interesting. 392 00:17:32,200 --> 00:17:32,920 Speaker 4: Yeah. Absolutely. 393 00:17:35,320 --> 00:17:39,200 Speaker 2: You're listening to the Bloomberg Intelligence Podcast. Catch us live 394 00:17:39,280 --> 00:17:42,000 Speaker 2: weekdays at ten am Eastern on Apple CarPlay and 395 00:17:42,119 --> 00:17:45,000 Speaker 2: Android Auto with the Bloomberg Business app. Listen on demand 396 00:17:45,040 --> 00:17:49,320 Speaker 2: wherever you get your podcasts, or watch us live on YouTube.
397 00:17:50,000 --> 00:17:52,159 Speaker 6: Paul and I are here broadcasting live from the campus 398 00:17:52,160 --> 00:17:54,840 Speaker 6: of the New Jersey Institute of Technology, NJIT, 399 00:17:55,000 --> 00:17:57,000 Speaker 6: where we're talking about all things AI, and sort of 400 00:17:57,000 --> 00:18:00,000 Speaker 6: how you create the thing and then move it outside 401 00:18:00,200 --> 00:18:02,200 Speaker 6: and bring it to companies or businesses that need it, 402 00:18:02,320 --> 00:18:05,439 Speaker 6: and bridging that gap between those two. And one person 403 00:18:05,640 --> 00:18:07,800 Speaker 6: in part very much responsible for that here in New 404 00:18:07,840 --> 00:18:12,119 Speaker 6: Jersey is Beth Simone Noveck. She is Chief AI Strategist 405 00:18:12,560 --> 00:18:16,199 Speaker 6: of the State of New Jersey. What a cool title, Beth. 406 00:18:16,560 --> 00:18:17,280 Speaker 6: What does that mean? 407 00:18:17,560 --> 00:18:18,480 Speaker 8: First in the country. 408 00:18:19,040 --> 00:18:19,600 Speaker 4: What does it mean? 409 00:18:19,680 --> 00:18:19,760 Speaker 3: So? 410 00:18:19,920 --> 00:18:22,040 Speaker 6: I mean, are you like, hey, business, you should use this, 411 00:18:22,160 --> 00:18:23,359 Speaker 6: or hey, government, let's use this? 412 00:18:23,600 --> 00:18:24,280 Speaker 8: All of the above. 413 00:18:24,359 --> 00:18:24,639 Speaker 6: Okay. 414 00:18:24,640 --> 00:18:27,560 Speaker 8: So Governor Murphy has said very loud and clear, we 415 00:18:27,640 --> 00:18:29,800 Speaker 8: have to do better when it comes to technology, in 416 00:18:29,880 --> 00:18:33,600 Speaker 8: terms of embracing the use of technology to grow the economy, 417 00:18:33,640 --> 00:18:35,639 Speaker 8: to grow jobs in the state, but also to improve 418 00:18:35,680 --> 00:18:39,159 Speaker 8: how government works.
So my job is to work on 419 00:18:39,240 --> 00:18:40,719 Speaker 8: all of the above and to see what we can 420 00:18:40,760 --> 00:18:43,159 Speaker 8: do as government to make that easier, to make that better, 421 00:18:43,520 --> 00:18:46,040 Speaker 8: and to embrace the responsible and ethical use of AI 422 00:18:46,200 --> 00:18:48,400 Speaker 8: to ensure that we're doing right by our residents. 423 00:18:48,800 --> 00:18:52,200 Speaker 3: So what are some of the applications that, you know, 424 00:18:52,320 --> 00:18:55,040 Speaker 3: the governor and the Governor's office think AI can do 425 00:18:55,200 --> 00:18:56,840 Speaker 3: over the next several years? 426 00:18:56,880 --> 00:18:58,960 Speaker 4: Where will the residents of New Jersey see it, 427 00:18:59,000 --> 00:18:59,440 Speaker 4: do you think? 428 00:18:59,560 --> 00:19:02,000 Speaker 8: So this is not several years from now. The 429 00:19:02,040 --> 00:19:04,720 Speaker 8: future is already here. And we've been using AI for 430 00:19:04,800 --> 00:19:07,280 Speaker 8: quite some time, and generative AI since the very beginning, 431 00:19:07,560 --> 00:19:09,520 Speaker 8: so in many ways that you don't even see or 432 00:19:09,560 --> 00:19:11,520 Speaker 8: know about. So, for example, if you're getting a letter 433 00:19:11,560 --> 00:19:13,359 Speaker 8: from the State of New Jersey about, let's say, your 434 00:19:13,400 --> 00:19:16,920 Speaker 8: unemployment benefits, you're getting a letter that has been simplified, 435 00:19:17,080 --> 00:19:20,439 Speaker 8: that has been written in plain English, that's been written, 436 00:19:20,440 --> 00:19:22,600 Speaker 8: we hope, more clearly than it would have been before, 437 00:19:22,920 --> 00:19:25,560 Speaker 8: because generative AI can help us do a first draft.
438 00:19:25,880 --> 00:19:28,840 Speaker 8: If you're calling up about your ANCHOR tax relief that 439 00:19:28,920 --> 00:19:31,080 Speaker 8: the State of New Jersey is giving out to residents, 440 00:19:31,359 --> 00:19:34,720 Speaker 8: you are hopefully getting your call resolved faster, because you 441 00:19:34,800 --> 00:19:37,360 Speaker 8: get a menu option that we've written with the help 442 00:19:37,359 --> 00:19:40,040 Speaker 8: of AI. Because, via voice-to-text, our call center operators 443 00:19:40,080 --> 00:19:42,600 Speaker 8: know people are calling in asking the following kinds of questions, 444 00:19:43,000 --> 00:19:46,280 Speaker 8: we should write these menu options and these instructions and 445 00:19:46,320 --> 00:19:50,360 Speaker 8: answers so people can get that information faster. When you're 446 00:19:50,400 --> 00:19:53,000 Speaker 8: going out, for example, and typing in on a website, 447 00:19:53,000 --> 00:19:55,159 Speaker 8: telling us in comments how we can do something 448 00:19:55,160 --> 00:19:58,000 Speaker 8: better on a website like business dot NJ dot gov, 449 00:19:58,280 --> 00:20:00,399 Speaker 8: where you can go to start, run, and grow your 450 00:20:00,400 --> 00:20:03,199 Speaker 8: business, everything you need from one place, we're taking the 451 00:20:03,240 --> 00:20:05,720 Speaker 8: comments we're getting from citizens about what they need, about 452 00:20:05,720 --> 00:20:08,480 Speaker 8: what they want, using AI to help us summarize those 453 00:20:08,960 --> 00:20:12,880 Speaker 8: comments, synthesize them, and turn that into the information 454 00:20:12,960 --> 00:20:14,960 Speaker 8: that people want and need, front and center.
So the 455 00:20:15,000 --> 00:20:20,040 Speaker 8: goal is government that's more responsive, more informative, and providing 456 00:20:20,040 --> 00:20:22,639 Speaker 8: services twenty-four seven that are responsive to what 457 00:20:22,680 --> 00:20:23,880 Speaker 8: people actually want and need. 458 00:20:23,960 --> 00:20:26,440 Speaker 6: That's a pretty good pitch. You were also the chief 459 00:20:26,480 --> 00:20:28,159 Speaker 6: of innovation, right, for New Jersey? 460 00:20:28,320 --> 00:20:30,960 Speaker 8: I was for many years the chief innovation officer, yes. 461 00:20:31,240 --> 00:20:34,479 Speaker 6: Did the chief innovation officer become the AI strategist, or 462 00:20:34,520 --> 00:20:36,440 Speaker 6: is there also an innovation officer? And I guess I'm 463 00:20:36,440 --> 00:20:39,480 Speaker 6: trying to understand, like, is the innovation thing now AI, 464 00:20:39,880 --> 00:20:41,280 Speaker 6: or can there be other stuff? 465 00:20:41,560 --> 00:20:43,800 Speaker 8: There is still other stuff. We have a wonderful new 466 00:20:43,880 --> 00:20:48,840 Speaker 8: Chief Innovation Officer; Dave Cole has taken over that title 467 00:20:49,200 --> 00:20:51,919 Speaker 8: and is leading our efforts to use technology to improve 468 00:20:51,960 --> 00:20:55,399 Speaker 8: how we bring services to residents. So projects like business 469 00:20:55,440 --> 00:20:58,240 Speaker 8: dot NJ dot gov, to take the business one for example, 470 00:20:58,320 --> 00:21:01,639 Speaker 8: or other digitization of resident services, so that instead of 471 00:21:01,680 --> 00:21:04,560 Speaker 8: having to go to a government office, you know, between 472 00:21:04,600 --> 00:21:06,520 Speaker 8: nine and five, you can come to a website, you 473 00:21:06,520 --> 00:21:07,600 Speaker 8: can use your mobile phone. 474 00:21:07,640 --> 00:21:08,880 Speaker 6: Oh my gosh, that'd be amazing.
475 00:21:08,760 --> 00:21:13,000 Speaker 8: To transact with government twenty-four seven. That's work that's 476 00:21:13,040 --> 00:21:14,919 Speaker 8: been underway for a long time, and that doesn't just 477 00:21:15,000 --> 00:21:18,800 Speaker 8: depend on AI. That is about, again, clearer instructions, plainer 478 00:21:18,880 --> 00:21:22,119 Speaker 8: English, things available online, giving you the information front and 479 00:21:22,119 --> 00:21:24,240 Speaker 8: center that you want and need in the way that 480 00:21:24,280 --> 00:21:27,359 Speaker 8: people have become accustomed to from the best businesses. We 481 00:21:27,440 --> 00:21:30,040 Speaker 8: think that government should serve citizens in much the same way, 482 00:21:30,119 --> 00:21:31,600 Speaker 8: except in the public interest. 483 00:21:31,840 --> 00:21:35,240 Speaker 3: Well, New Jersey's had a long history of technological innovation. 484 00:21:35,280 --> 00:21:38,800 Speaker 3: I think of telecommunications, with Bellcore and Bell Labs supporting 485 00:21:38,880 --> 00:21:39,800 Speaker 3: AT&T, Verizon. 486 00:21:39,800 --> 00:21:40,520 Speaker 4: I think about some of 487 00:21:40,520 --> 00:21:43,919 Speaker 3: the biotech and, you know, pharmaceutical companies, like Johnson and 488 00:21:43,960 --> 00:21:46,880 Speaker 3: Johnson, based here in New Jersey. I'm wondering, is there 489 00:21:46,960 --> 00:21:51,159 Speaker 3: support for the young NJIT grads that are in a 490 00:21:51,200 --> 00:21:53,800 Speaker 3: garage somewhere in Jersey City coming up with the next 491 00:21:53,840 --> 00:21:54,960 Speaker 3: AI-type thing? 492 00:21:54,960 --> 00:21:56,280 Speaker 4: How do we support those people? 493 00:21:56,520 --> 00:22:00,239 Speaker 8: Absolutely so. There are a whole number and range of 494 00:22:00,280 --> 00:22:04,040 Speaker 8: investments that are out there to support people starting new businesses.
495 00:22:04,359 --> 00:22:06,879 Speaker 8: That's what my colleagues at the EDA work on in particular: 496 00:22:07,119 --> 00:22:11,000 Speaker 8: ensuring that we're providing those kinds of incentives for 497 00:22:11,040 --> 00:22:13,320 Speaker 8: people who want to start their business in New Jersey 498 00:22:13,320 --> 00:22:15,720 Speaker 8: and grow their business in New Jersey. That's particularly why 499 00:22:15,760 --> 00:22:19,680 Speaker 8: the government is here, to help support those businesses going 500 00:22:19,720 --> 00:22:21,560 Speaker 8: out, and in particular now to look at how we 501 00:22:21,600 --> 00:22:25,399 Speaker 8: can support new AI businesses, or existing businesses who are 502 00:22:25,440 --> 00:22:28,520 Speaker 8: asking how we can turn around and use AI to 503 00:22:28,680 --> 00:22:31,240 Speaker 8: improve what we do. It's a question we've been answering 504 00:22:31,240 --> 00:22:33,280 Speaker 8: for a long time. Before we called it AI, we 505 00:22:33,359 --> 00:22:36,560 Speaker 8: called it big data, right? So, more and more, 506 00:22:36,560 --> 00:22:39,000 Speaker 8: a lot of businesses have asked themselves, how 507 00:22:39,080 --> 00:22:42,399 Speaker 8: can I go out and start using data to measure 508 00:22:42,440 --> 00:22:44,919 Speaker 8: what's working, to measure what customers want, and again to 509 00:22:44,920 --> 00:22:48,159 Speaker 8: deliver new kinds of services across a range of industries. 510 00:22:48,640 --> 00:22:52,240 Speaker 8: It's why we've been starting new partnerships, such as with 511 00:22:52,280 --> 00:22:55,440 Speaker 8: Princeton, around this new AI hub that's been set up, 512 00:22:55,640 --> 00:22:57,920 Speaker 8: so that we can connect some of that tremendous innovation 513 00:22:58,000 --> 00:23:02,320 Speaker 8: that's coming out of universities like NJIT, like Rutgers, like Princeton.
514 00:23:03,000 --> 00:23:04,840 Speaker 8: We're of course known in this state for having the 515 00:23:04,880 --> 00:23:07,600 Speaker 8: best universities and the best education system in the country, 516 00:23:07,920 --> 00:23:10,160 Speaker 8: and we want to connect that back to how we're 517 00:23:10,160 --> 00:23:12,280 Speaker 8: growing the economy and growing jobs here in the state. 518 00:23:12,359 --> 00:23:13,600 Speaker 6: What's the hardest part of your job? 519 00:23:15,280 --> 00:23:17,359 Speaker 8: There are only twenty-four hours in the day, and there's 520 00:23:17,400 --> 00:23:19,439 Speaker 8: a whole lot to do, both on the public 521 00:23:19,480 --> 00:23:21,480 Speaker 8: sector side and on the private sector side. 522 00:23:21,560 --> 00:23:25,359 Speaker 6: Do you feel like it's awareness? Is it implementation? Is 523 00:23:25,400 --> 00:23:28,160 Speaker 6: it finding the cool technology? Is it having too many 524 00:23:28,200 --> 00:23:28,920 Speaker 6: problems to solve? 525 00:23:30,000 --> 00:23:34,919 Speaker 8: Well, the cool technology is very much there, and I 526 00:23:34,960 --> 00:23:37,679 Speaker 8: think what we're trying to do now is to ensure that, 527 00:23:37,840 --> 00:23:41,199 Speaker 8: especially in government, we are building not just awareness but 528 00:23:41,280 --> 00:23:44,600 Speaker 8: actual use of these tools to improve how we serve 529 00:23:44,640 --> 00:23:47,800 Speaker 8: residents, across a whole range of domains and across agencies. 530 00:23:48,400 --> 00:23:51,679 Speaker 3: So we know that Governor Murphy feels that AI is 531 00:23:51,720 --> 00:23:54,000 Speaker 3: important and the administration feels that AI is important. Does the 532 00:23:54,040 --> 00:23:57,240 Speaker 3: rest of the government within the state share that as well? 533 00:23:57,520 --> 00:23:59,919 Speaker 3: Or does it require kind of a promotional pitch for 534 00:24:00,520 --> 00:24:01,240 Speaker 3: the office?
535 00:24:01,560 --> 00:24:04,840 Speaker 8: Well, you know, the governor is the salesman-in-chief 536 00:24:05,240 --> 00:24:07,280 Speaker 8: for the State of New Jersey, and of course he's setting 537 00:24:07,280 --> 00:24:10,520 Speaker 8: this message about the importance of AI, the ways we 538 00:24:10,560 --> 00:24:13,320 Speaker 8: should be embracing these tools, going out early. We were one 539 00:24:13,359 --> 00:24:15,280 Speaker 8: of the first states to actually put out a policy 540 00:24:15,680 --> 00:24:20,000 Speaker 8: that says we should responsibly and ethically use AI to 541 00:24:20,160 --> 00:24:22,840 Speaker 8: better serve residents. And one of the things we're doing 542 00:24:23,040 --> 00:24:26,639 Speaker 8: is promoting upskilling and learning across the whole of the public sector. 543 00:24:26,680 --> 00:24:28,960 Speaker 8: It's not enough to have just the governor supporting AI, 544 00:24:29,040 --> 00:24:31,960 Speaker 8: to have a chief AI strategist. We need every public 545 00:24:32,040 --> 00:24:34,800 Speaker 8: servant out there to be asking themselves, how can I 546 00:24:34,880 --> 00:24:39,000 Speaker 8: use these powerful new tools, again ethically and responsibly, safeguarding 547 00:24:39,040 --> 00:24:42,280 Speaker 8: privacy and security and people's data? How can I 548 00:24:42,280 --> 00:24:44,200 Speaker 8: go out and use these tools to write that better 549 00:24:44,240 --> 00:24:47,120 Speaker 8: first draft of the email, to write that clearer website, 550 00:24:47,400 --> 00:24:49,879 Speaker 8: to be able to write a better policy? This is 551 00:24:49,920 --> 00:24:53,439 Speaker 8: the next generation, if you will, of the word processor, to 552 00:24:53,520 --> 00:24:56,399 Speaker 8: put it very simply, but one that we should be using 553 00:24:56,480 --> 00:24:58,400 Speaker 8: to be able to serve residents better. And we need 554 00:24:58,440 --> 00:24:59,639 Speaker 8: everybody to know how to do that.
555 00:25:00,160 --> 00:25:02,040 Speaker 6: Beth, thanks so much. We really appreciate your time; we 556 00:25:02,080 --> 00:25:04,720 Speaker 6: know you're quite busy. Beth Simone Noveck, Chief AI 557 00:25:04,840 --> 00:25:07,760 Speaker 6: Strategist for the State of New Jersey. That was actually 558 00:25:07,760 --> 00:25:09,720 Speaker 6: really helpful. Okay, so this is like the next version 559 00:25:09,760 --> 00:25:11,600 Speaker 6: of the stuff that we do normally. Like, that really 560 00:25:11,600 --> 00:25:13,280 Speaker 6: helps me, because I think for people like you and me, 561 00:25:13,680 --> 00:25:16,520 Speaker 6: it's hard to understand the practical applications. It just becomes, 562 00:25:16,560 --> 00:25:19,800 Speaker 6: like, AI. Yep, whatever that winds up meaning, exactly. 563 00:25:21,560 --> 00:25:25,440 Speaker 2: You're listening to the Bloomberg Intelligence Podcast. Catch us live 564 00:25:25,520 --> 00:25:29,040 Speaker 2: weekdays at ten am Eastern on Apple CarPlay and Android 565 00:25:29,080 --> 00:25:32,240 Speaker 2: Auto with the Bloomberg Business app. You can also listen live 566 00:25:32,320 --> 00:25:35,520 Speaker 2: on Amazon Alexa from our flagship New York station. Just 567 00:25:35,560 --> 00:25:38,200 Speaker 2: say, Alexa, play Bloomberg eleven thirty. 568 00:25:39,760 --> 00:25:41,600 Speaker 6: But I am learning a lot of cool stuff about AI, 569 00:25:41,720 --> 00:25:44,439 Speaker 6: particularly the implementation. It's not just this thing that we 570 00:25:44,480 --> 00:25:46,639 Speaker 6: talk about; like, it can actually be used for certain 571 00:25:46,680 --> 00:25:50,200 Speaker 6: areas, and apparently it can also be used in sports. 572 00:25:50,640 --> 00:25:53,360 Speaker 6: AI in the world of sports. So here to help 573 00:25:53,440 --> 00:25:55,560 Speaker 6: us break down what that all means is 574 00:25:55,640 --> 00:26:00,960 Speaker 6: Ivana Seric.
She is Zelus Analytics' senior product scientist, also a 575 00:26:01,080 --> 00:26:05,199 Speaker 6: former basketball player. Right, basketball player. You also know all 576 00:26:05,240 --> 00:26:09,040 Speaker 6: the things about technology. How do you use AI in sports? 577 00:26:09,320 --> 00:26:14,320 Speaker 1: Yeah. So this field has expanded a lot in the last maybe ten 578 00:26:14,400 --> 00:26:18,000 Speaker 1: years in a lot of sports. Even before that, 579 00:26:18,119 --> 00:26:20,760 Speaker 1: it was in baseball; that was one of the first sports, 580 00:26:20,760 --> 00:26:21,760 Speaker 1: if you've seen Moneyball. 581 00:26:21,840 --> 00:26:25,200 Speaker 6: That's really... yes, yeah, okay, I like Moneyball, okay. 582 00:26:25,400 --> 00:26:27,919 Speaker 6: And so it's basically like how to position, like what 583 00:26:28,080 --> 00:26:30,800 Speaker 6: players to put where, combinations? Is it that kind of stuff? 584 00:26:30,840 --> 00:26:34,919 Speaker 1: Correct, correct. So player evaluation, in-game decision strategy, 585 00:26:35,480 --> 00:26:36,520 Speaker 1: that sort of thing. 586 00:26:36,600 --> 00:26:36,840 Speaker 8: Yeah. 587 00:26:37,000 --> 00:26:42,119 Speaker 3: So again, you played, you were a starter for NJIT's basketball team. You 588 00:26:42,160 --> 00:26:46,400 Speaker 3: also represented your native Croatia in youth basketball. So you're 589 00:26:46,520 --> 00:26:48,840 Speaker 3: great at basketball, but you're also a math nerd to 590 00:26:48,920 --> 00:26:51,959 Speaker 3: the nth degree. She got a BS and a PhD 591 00:26:52,040 --> 00:26:55,720 Speaker 3: in applied mathematics from NJIT, focusing on computational 592 00:26:56,200 --> 00:26:57,399 Speaker 3: fluid dynamics. 593 00:26:57,480 --> 00:26:58,160 Speaker 6: I don't know what that means. 594 00:26:58,160 --> 00:27:00,359 Speaker 4: I don't know what that means. That's... but okay, I 595 00:27:00,359 --> 00:27:00,679 Speaker 4: don't know.
596 00:27:01,119 --> 00:27:04,320 Speaker 3: So, a great mathematician, great basketball player. Let's put 597 00:27:04,320 --> 00:27:08,080 Speaker 3: it all together. What are some of the leagues, 598 00:27:08,160 --> 00:27:10,439 Speaker 3: what are some of the really good applications for some 599 00:27:10,520 --> 00:27:14,000 Speaker 3: of that technology? Because, you mentioned Moneyball, 600 00:27:14,160 --> 00:27:16,440 Speaker 3: you know, we've seen it in baseball. What other 601 00:27:16,480 --> 00:27:18,240 Speaker 3: applications are out there, do you think? It seems like 602 00:27:18,280 --> 00:27:19,880 Speaker 3: we're in the very early innings of that. 603 00:27:20,320 --> 00:27:26,040 Speaker 1: Yeah, yeah. So early on it started with just using basic data, 604 00:27:26,119 --> 00:27:28,360 Speaker 1: so box scores, play-by-play, and then a lot 605 00:27:28,400 --> 00:27:32,440 Speaker 1: of sports in recent years have what's called player tracking data, 606 00:27:32,840 --> 00:27:35,359 Speaker 1: meaning we have locations of the players on the court, 607 00:27:35,480 --> 00:27:38,000 Speaker 1: or on a pitch, on a field, whichever sport we're 608 00:27:38,000 --> 00:27:42,160 Speaker 1: talking about, at a high resolution. So from 609 00:27:42,200 --> 00:27:44,600 Speaker 1: that data we can extract not only things that are 610 00:27:44,600 --> 00:27:47,199 Speaker 1: counted in a box score, but also other things that 611 00:27:47,280 --> 00:27:51,040 Speaker 1: happened during the game that you wouldn't see counted 612 00:27:51,400 --> 00:27:53,800 Speaker 1: in a basic box score, for example. 613 00:27:54,160 --> 00:27:57,040 Speaker 6: What are some of the common questions that, like, coaches 614 00:27:57,160 --> 00:27:58,560 Speaker 6: or owners come to you 615 00:27:58,480 --> 00:28:03,240 Speaker 1: with? The biggest question is, how do we value players?
616 00:28:04,280 --> 00:28:06,280 Speaker 1: How do we find which players 617 00:28:06,280 --> 00:28:10,160 Speaker 1: teams should sign, how long of a contract, 618 00:28:10,280 --> 00:28:13,840 Speaker 1: how much money should be on a contract. So 619 00:28:13,880 --> 00:28:16,800 Speaker 1: that's one side; that's the player evaluation side, 620 00:28:16,880 --> 00:28:19,359 Speaker 1: and then the other side is coaching and in-game 621 00:28:19,400 --> 00:28:23,639 Speaker 1: decision making. So, which situations are producing the most value 622 00:28:23,720 --> 00:28:29,720 Speaker 1: for the teams, which situations are creating better 623 00:28:29,720 --> 00:28:30,840 Speaker 1: opportunities to score. 624 00:28:31,680 --> 00:28:34,040 Speaker 3: I know, like, in baseball, Major League Baseball and 625 00:28:34,119 --> 00:28:34,840 Speaker 3: minor league baseball. 626 00:28:34,840 --> 00:28:36,560 Speaker 4: Now it's coming into all the other parts of baseball. 627 00:28:36,800 --> 00:28:40,920 Speaker 3: The analytics people, the data people, versus maybe some of 628 00:28:40,920 --> 00:28:44,120 Speaker 3: the more traditionalists: they kind of 629 00:28:44,160 --> 00:28:45,240 Speaker 3: butt heads on occasion. 630 00:28:45,720 --> 00:28:48,320 Speaker 4: And how much analytics do you use? Ivana knows what 631 00:28:48,360 --> 00:28:49,280 Speaker 4: I'm talking about. 632 00:28:49,360 --> 00:28:51,800 Speaker 3: So how much analytics do you use versus just my 633 00:28:52,000 --> 00:28:53,960 Speaker 3: gut, I think this player will do well? 634 00:28:54,320 --> 00:28:56,960 Speaker 4: Or how do you kind of bridge that gap? Yeah? 635 00:28:57,040 --> 00:28:59,320 Speaker 1: Yeah. So that's a big, important thing, because 636 00:28:59,320 --> 00:29:03,280 Speaker 1: you can't just have data without the domain expertise.
And 637 00:29:03,280 --> 00:29:05,760 Speaker 1: I think that's something that we at Zelus have as a 638 00:29:05,800 --> 00:29:09,080 Speaker 1: really good strength: we have the experts in 639 00:29:08,720 --> 00:29:12,000 Speaker 1: data and statistics, in AI, in machine learning, but 640 00:29:12,040 --> 00:29:14,320 Speaker 1: we also have a lot of people who worked in 641 00:29:14,400 --> 00:29:16,960 Speaker 1: sports teams and have that sort of experience and know 642 00:29:17,400 --> 00:29:20,560 Speaker 1: which questions the teams want to answer, what's useful for 643 00:29:20,640 --> 00:29:23,719 Speaker 1: them, and how we can help them best. 644 00:29:23,920 --> 00:29:26,560 Speaker 6: So, yeah, because when you were saying what AI could 645 00:29:26,560 --> 00:29:28,480 Speaker 6: help you do, it feels like that's not what a 646 00:29:28,520 --> 00:29:30,280 Speaker 6: coach is supposed to do. But you're saying that you 647 00:29:30,320 --> 00:29:33,600 Speaker 6: need someone to interpret how to manage that and stuff. 648 00:29:33,480 --> 00:29:35,920 Speaker 1: Right, right. So you need, like, a bridge between the 649 00:29:36,000 --> 00:29:38,880 Speaker 1: data and what's happening on the court. 650 00:29:39,000 --> 00:29:41,520 Speaker 3: All right. If I'm an agent representing a player now, 651 00:29:41,640 --> 00:29:43,960 Speaker 3: this is... I've got to learn this stuff, because the 652 00:29:44,000 --> 00:29:44,800 Speaker 3: team's gonna come 653 00:29:44,720 --> 00:29:46,960 Speaker 6: at me and say, this is what the program tells 654 00:29:47,000 --> 00:29:48,040 Speaker 6: me: that, yeah, your 655 00:29:47,920 --> 00:29:51,800 Speaker 3: client's worth blank, because his or her OPS is this, 656 00:29:51,880 --> 00:29:53,920 Speaker 3: and blah blah blah blah blah blah. And you've got 657 00:29:53,960 --> 00:29:55,360 Speaker 3: to come back and say, no, I think he's better 658 00:29:55,400 --> 00:29:55,600 Speaker 3: than that.
659 00:29:55,600 --> 00:29:58,040 Speaker 4: And I think he's really worth more. So do 660 00:29:58,080 --> 00:29:58,320 Speaker 4: you... 661 00:29:58,400 --> 00:30:00,440 Speaker 3: do you work with the agents and players themselves 662 00:30:00,480 --> 00:30:02,760 Speaker 3: as well? Because they'd better be smart 663 00:30:02,600 --> 00:30:03,080 Speaker 4: on this stuff. 664 00:30:03,280 --> 00:30:05,720 Speaker 1: Yeah, yeah. It's a great area where Zelus 665 00:30:05,840 --> 00:30:08,880 Speaker 1: is growing as well, in some of our sports. But 666 00:30:08,880 --> 00:30:11,840 Speaker 1: yeah, it's, you know, an agent cannot 667 00:30:12,640 --> 00:30:15,520 Speaker 1: learn all of this on their own, so having 668 00:30:16,280 --> 00:30:18,200 Speaker 1: a company or a contractor who can... 669 00:30:18,280 --> 00:30:21,840 Speaker 3: So do you guys work with agents and players directly? 670 00:30:22,760 --> 00:30:24,000 Speaker 1: In certain sports, yes. 671 00:30:24,040 --> 00:30:26,880 Speaker 6: Yeah, but not all across the board. So you also, 672 00:30:26,880 --> 00:30:29,320 Speaker 6: as Paul mentioned earlier, got your BS and 673 00:30:29,360 --> 00:30:35,360 Speaker 6: your PhD in applied mathematics at NJIT. Because 674 00:30:35,400 --> 00:30:37,520 Speaker 6: we're here, and we're talking about how NJIT kind 675 00:30:37,520 --> 00:30:39,800 Speaker 6: of bridges the gap between learning stuff and then putting 676 00:30:39,800 --> 00:30:41,960 Speaker 6: it out into the world: how did this help you 677 00:30:42,080 --> 00:30:45,200 Speaker 6: evolve your career and lead you to where you are today? 678 00:30:45,560 --> 00:30:50,440 Speaker 1: Yeah. Even though I studied computational fluid dynamics, it's not exactly 679 00:30:50,520 --> 00:30:52,280 Speaker 1: data science, but I've learned a lot of skills that... 680 00:30:52,360 --> 00:30:55,200 Speaker 6: Were, by the way... so you can pretend it is.
681 00:30:56,240 --> 00:30:58,920 Speaker 1: There's a lot of skills that transfer from one field 682 00:30:58,920 --> 00:31:02,640 Speaker 1: to the other: coding, analyzing large data sets, 683 00:31:03,040 --> 00:31:08,440 Speaker 1: creating visualizations, and communicating scientific results to a regular audience, 684 00:31:08,440 --> 00:31:10,680 Speaker 1: to anybody else who needs to understand it. 685 00:31:10,800 --> 00:31:14,320 Speaker 3: Are there some sports that are embracing AI, or just 686 00:31:14,400 --> 00:31:16,520 Speaker 3: technology and analytics, more than others? 687 00:31:17,680 --> 00:31:21,600 Speaker 1: I think historically that's baseball, particularly because they 688 00:31:21,680 --> 00:31:25,560 Speaker 1: had the more advanced data for the longest time. But 689 00:31:25,920 --> 00:31:30,000 Speaker 1: other sports now also have the player tracking data and 690 00:31:30,200 --> 00:31:32,440 Speaker 1: are starting to get more on that side. 691 00:31:32,920 --> 00:31:35,000 Speaker 6: How did you wind up in this? Because you 692 00:31:35,040 --> 00:31:37,880 Speaker 6: played basketball, right? Because you're originally from Croatia, right? So 693 00:31:37,920 --> 00:31:40,720 Speaker 6: you played basketball and then you somehow wound up 694 00:31:40,760 --> 00:31:42,960 Speaker 6: deep in analytics. How did you do that? 695 00:31:43,960 --> 00:31:44,160 Speaker 4: Well? 696 00:31:44,160 --> 00:31:46,520 Speaker 1: I always loved math and I always loved basketball, and 697 00:31:46,600 --> 00:31:49,240 Speaker 1: this was a perfect combination of the two. 698 00:31:50,360 --> 00:31:54,480 Speaker 3: So I'm kind of wondering, where are we, do 699 00:31:54,480 --> 00:31:57,600 Speaker 3: you think, in terms of the evolution of applying data 700 00:31:57,640 --> 00:32:00,680 Speaker 3: and AI to sports? Because it's not just the statistics.
I've 701 00:32:00,680 --> 00:32:03,320 Speaker 3: been following sports my entire life, and I'm listening to 702 00:32:03,320 --> 00:32:05,720 Speaker 3: a broadcast and they're saying stuff 703 00:32:05,720 --> 00:32:08,080 Speaker 4: I have no idea what they're talking about. Like, now batting 704 00:32:07,800 --> 00:32:10,840 Speaker 3: average isn't important anymore to baseball, and now it's 705 00:32:10,840 --> 00:32:12,240 Speaker 3: on-base plus slugging. 706 00:32:13,560 --> 00:32:14,400 Speaker 4: I don't know. 707 00:32:14,520 --> 00:32:16,520 Speaker 3: I mean, it seems like we need a tutorial on a 708 00:32:16,560 --> 00:32:17,800 Speaker 3: lot of these broadcasts. 709 00:32:17,880 --> 00:32:20,360 Speaker 4: I mean, where can this go, 710 00:32:20,480 --> 00:32:21,440 Speaker 4: do you think? Yeah? 711 00:32:21,640 --> 00:32:24,000 Speaker 1: I wouldn't know about baseball, because I don't really understand 712 00:32:24,080 --> 00:32:27,280 Speaker 1: the rules, coming from Croatia. But in basketball, 713 00:32:27,320 --> 00:32:31,720 Speaker 1: you know, for now we have the player location data. 714 00:32:31,760 --> 00:32:36,800 Speaker 1: But it's also growing toward player kinematics data, which 715 00:32:36,840 --> 00:32:41,040 Speaker 1: the NBA has available for this season. Kinematics, 716 00:32:41,040 --> 00:32:46,920 Speaker 1: so the locations of a player's waist, elbow, shoulder, all of 717 00:32:46,920 --> 00:32:50,000 Speaker 1: the joints, the more detailed data of player movements 718 00:32:50,040 --> 00:32:54,200 Speaker 1: and, yeah, how players are shooting. 719 00:32:54,280 --> 00:32:57,520 Speaker 1: And you can extract all this more 720 00:32:57,520 --> 00:33:01,600 Speaker 1: detailed information, and that's the next step in basketball. 721 00:33:01,960 --> 00:33:04,480 Speaker 6: Wow, part of me thinks that's cool.
And also creepy 722 00:33:04,720 --> 00:33:07,800 Speaker 6: all at the same time. Any sports where this 723 00:33:07,920 --> 00:33:10,480 Speaker 6: doesn't work at all? I mean, this feels 724 00:33:10,480 --> 00:33:12,320 Speaker 6: like it makes sense in team sports. What about 725 00:33:12,360 --> 00:33:16,200 Speaker 6: more individual sports, like gymnastics, skiing? Like, how does 726 00:33:16,240 --> 00:33:17,240 Speaker 6: it work in those kinds of things? 727 00:33:17,360 --> 00:33:19,520 Speaker 1: Yeah, so it actually works in golf, which is 728 00:33:20,640 --> 00:33:24,320 Speaker 1: obviously an individual sport, and there we work directly with 729 00:33:24,360 --> 00:33:25,040 Speaker 1: the players. 730 00:33:25,520 --> 00:33:27,800 Speaker 3: And what kind of data do you look at there 731 00:33:27,800 --> 00:33:29,480 Speaker 3: for the golfer? I mean, to me, it's just, can I 732 00:33:29,560 --> 00:33:30,360 Speaker 3: keep it on the fairway? 733 00:33:30,400 --> 00:33:31,440 Speaker 6: Can you line it up and shoot it in? 734 00:33:32,480 --> 00:33:32,680 Speaker 4: Yeah? 735 00:33:32,880 --> 00:33:35,120 Speaker 6: What can you tell me about my non-game golf? 736 00:33:37,240 --> 00:33:40,440 Speaker 1: I actually, you know, don't play golf and don't completely 737 00:33:40,520 --> 00:33:41,720 Speaker 1: understand it, to be fair. 738 00:33:41,760 --> 00:33:44,440 Speaker 6: But the same idea, that they can store, like, how 739 00:33:44,520 --> 00:33:46,520 Speaker 6: you stand, like that kind of thing? Like, what kind 740 00:33:46,520 --> 00:33:48,239 Speaker 6: of tools, like where you hit it? 741 00:33:48,560 --> 00:33:52,560 Speaker 1: Where the ball, I guess it's called, falls, and. 742 00:33:53,160 --> 00:33:55,120 Speaker 4: Club head speed, and I mean they're. 743 00:33:54,960 --> 00:33:56,560 Speaker 6: Breaking it down, club head speed, all that stuff.
744 00:33:56,840 --> 00:33:58,680 Speaker 4: Yeah, they got it all now with the TrackMan. 745 00:33:58,840 --> 00:34:00,600 Speaker 4: Everybody's got a little computer sitting right 746 00:34:00,520 --> 00:34:03,280 Speaker 3: behind them on the driving range, and it measures basically everything. 747 00:34:03,320 --> 00:34:06,840 Speaker 3: So now it's all about spin rate, club head speed, 748 00:34:07,240 --> 00:34:09,239 Speaker 3: all this kind of stuff. But for those of us 749 00:34:09,360 --> 00:34:11,520 Speaker 3: just trying to hit it on the grass and 750 00:34:11,560 --> 00:34:14,640 Speaker 3: not the water or, like, the desert, it's 751 00:34:14,520 --> 00:34:15,160 Speaker 4: not very helpful. 752 00:34:15,239 --> 00:34:18,239 Speaker 6: Yeah, just, like, go that way. All right, Ivana, thanks 753 00:34:18,239 --> 00:34:21,160 Speaker 6: a lot. Ivana Šerić of Zelus Analytics, really appreciate it. That's 754 00:34:21,280 --> 00:34:24,040 Speaker 6: like an amazing story. I've never gotten into golf, though. 755 00:34:24,920 --> 00:34:25,239 Speaker 4: Why not? 756 00:34:26,000 --> 00:34:26,560 Speaker 6: It's boring. 757 00:34:26,800 --> 00:34:28,120 Speaker 4: Yeah, I mean, is it? 758 00:34:28,200 --> 00:34:30,040 Speaker 6: Is it boring to play it? It's boring to watch it. 759 00:34:30,160 --> 00:34:33,240 Speaker 3: No, people are passionate about it. And oh, 760 00:34:33,520 --> 00:34:33,800 Speaker 3: I had it 761 00:34:33,920 --> 00:34:35,319 Speaker 4: on, like, when my kids were young. You put golf on, 762 00:34:35,400 --> 00:34:39,120 Speaker 3: it's nice and serene, and it keeps them, yeah, 763 00:34:39,120 --> 00:34:40,760 Speaker 3: exactly, keeps them safe. 764 00:34:40,760 --> 00:34:43,040 Speaker 4: That was my strategy with the four when they 765 00:34:43,040 --> 00:34:43,359 Speaker 4: were young. 766 00:34:43,440 --> 00:34:45,399 Speaker 6: So brilliant. Yeah, why did I not think of that?
767 00:34:45,480 --> 00:34:47,480 Speaker 4: Yeah. So anyway, you got some good golf stories. 768 00:34:47,600 --> 00:34:51,720 Speaker 3: Yeah. Analytics in sports, it's everywhere. It's getting bigger, 769 00:34:52,400 --> 00:34:54,560 Speaker 3: and teams are investing more in it. 770 00:34:54,719 --> 00:34:56,920 Speaker 4: So it's just the future. 771 00:35:00,000 --> 00:35:04,080 Speaker 2: You're listening to the Bloomberg Intelligence podcast. Catch us live weekdays 772 00:35:04,120 --> 00:35:07,480 Speaker 2: at ten am Eastern on Apple CarPlay and Android Auto 773 00:35:07,520 --> 00:35:10,280 Speaker 2: with the Bloomberg Business app. You can also listen live 774 00:35:10,360 --> 00:35:13,560 Speaker 2: on Amazon Alexa from our flagship New York station. Just 775 00:35:13,600 --> 00:35:16,240 Speaker 2: say, Alexa, play Bloomberg eleven thirty. 776 00:35:17,400 --> 00:35:20,600 Speaker 3: We're live here today from the New Jersey Institute of 777 00:35:20,600 --> 00:35:24,840 Speaker 3: Technology, NJIT for the cool kids, in Newark, New Jersey, 778 00:35:24,840 --> 00:35:27,760 Speaker 3: talking about AI, and boy, there's a lot of smart people. 779 00:35:27,760 --> 00:35:30,040 Speaker 3: We came to the right place for that, including our 780 00:35:30,080 --> 00:35:33,360 Speaker 3: next guest, Anita Givanni, Global Head of Innovation 781 00:35:33,440 --> 00:35:38,800 Speaker 3: at Avanade. Avanade was founded by Microsoft and Accenture. Anita, 782 00:35:38,800 --> 00:35:40,000 Speaker 3: thanks so much for joining us here. 783 00:35:40,000 --> 00:35:40,600 Speaker 4: Could you talk to 784 00:35:40,640 --> 00:35:43,600 Speaker 3: us about how you guys at Avanade approach AI? Where 785 00:35:43,600 --> 00:35:45,440 Speaker 3: do you try to help out in the equation? 786 00:35:45,920 --> 00:35:47,040 Speaker 6: Yeah. So we.
787 00:35:46,920 --> 00:35:50,360 Speaker 9: Are a global consultancy, as you mentioned, a Microsoft-Accenture joint venture, 788 00:35:50,480 --> 00:35:52,879 Speaker 9: sixty thousand employees around the world, and what we do 789 00:35:52,960 --> 00:35:56,400 Speaker 9: is think about AI from a client perspective. How is 790 00:35:56,400 --> 00:36:00,880 Speaker 9: it that we can support organizations across sectors to be AI 791 00:36:00,920 --> 00:36:03,359 Speaker 9: first? And at the same time, we're all going through 792 00:36:03,360 --> 00:36:06,520 Speaker 9: this journey together. So thinking about ourselves as an organization, 793 00:36:07,160 --> 00:36:09,520 Speaker 9: how can we be AI first in our own business 794 00:36:09,520 --> 00:36:11,080 Speaker 9: processes and for our own people? 795 00:36:11,239 --> 00:36:13,239 Speaker 6: So I'm a company. Can I come to you? What 796 00:36:13,280 --> 00:36:13,799 Speaker 6: do you do for me? 797 00:36:14,560 --> 00:36:17,200 Speaker 9: We think about a lot of things. Are you guys 798 00:36:17,280 --> 00:36:20,640 Speaker 9: prepared from a people perspective, an organizational perspective, and a 799 00:36:20,680 --> 00:36:24,160 Speaker 9: process perspective? For example, a lot of people that we 800 00:36:24,239 --> 00:36:27,359 Speaker 9: interviewed in an AI readiness report said they were enthusiastic 801 00:36:27,400 --> 00:36:28,960 Speaker 9: and optimistic about AI. 802 00:36:29,280 --> 00:36:29,960 Speaker 8: That's great. 803 00:36:29,960 --> 00:36:33,120 Speaker 9: However, half of the leaders said they weren't ready, and 804 00:36:33,239 --> 00:36:37,160 Speaker 9: only a third of CEOs believe that their top leadership 805 00:36:37,200 --> 00:36:39,640 Speaker 9: is AI fluent. So there is a dissonance between the 806 00:36:39,920 --> 00:36:44,560 Speaker 9: excitement and enthusiasm and the reality of the preparedness of organizations.
807 00:36:44,600 --> 00:36:47,160 Speaker 9: And what we do is make sure that organizations have 808 00:36:47,239 --> 00:36:49,239 Speaker 9: the coaching and support they need to get there. 809 00:36:49,320 --> 00:36:51,160 Speaker 3: I would think one of the challenges, just speaking for 810 00:36:51,239 --> 00:36:53,520 Speaker 3: myself, is, I learned a whole lot today speaking to, 811 00:36:53,560 --> 00:36:57,799 Speaker 3: again, the smart people from NJIT about what AI is. 812 00:36:57,840 --> 00:36:59,319 Speaker 3: I'm one of those people that says, if you can't 813 00:36:59,360 --> 00:37:01,880 Speaker 3: explain it in a sentence or two, you don't understand it. And 814 00:37:01,960 --> 00:37:04,359 Speaker 3: I don't think I understand it. What's 815 00:37:04,400 --> 00:37:06,319 Speaker 3: the basic framework that you try to get across to your 816 00:37:06,320 --> 00:37:08,839 Speaker 3: clients about what AI is and what it can mean 817 00:37:08,920 --> 00:37:09,279 Speaker 3: for them? 818 00:37:09,680 --> 00:37:12,399 Speaker 9: Yeah, think about AI, and one 819 00:37:12,440 --> 00:37:15,400 Speaker 9: of the biggest generative AI tools right now through Microsoft 820 00:37:15,520 --> 00:37:18,600 Speaker 9: is Copilot. Think of it as a copilot, not 821 00:37:18,640 --> 00:37:23,080 Speaker 9: necessarily a replacement pilot, one that can, yeah, 822 00:37:23,160 --> 00:37:27,040 Speaker 9: allow you to do your job more effectively and more efficiently.
823 00:37:27,120 --> 00:37:30,239 Speaker 9: And so instead of thinking about AI as a job replacement, 824 00:37:30,520 --> 00:37:33,280 Speaker 9: think about it as a way to replace key tasks 825 00:37:33,600 --> 00:37:36,200 Speaker 9: and allow you to spend your days in ways that 826 00:37:36,239 --> 00:37:39,920 Speaker 9: you want to, talking to people, being more relationship focused, 827 00:37:40,000 --> 00:37:44,320 Speaker 9: rather than necessarily summarizing emails or going through data sets, 828 00:37:44,360 --> 00:37:44,759 Speaker 9: et cetera. 829 00:37:44,840 --> 00:37:48,000 Speaker 6: So it's a partner. So basically I could have some AI, 830 00:37:48,080 --> 00:37:51,080 Speaker 6: you know, go through my email and, like, collate the important 831 00:37:51,120 --> 00:37:53,440 Speaker 6: parts, for example, and take it 832 00:37:53,480 --> 00:37:54,799 Speaker 6: and give it to me, so I don't have to 833 00:37:54,800 --> 00:37:57,800 Speaker 6: spend my whole morning going through and reading reports. Yeah, exactly. 834 00:37:57,880 --> 00:37:58,800 Speaker 6: That's really cool. 835 00:37:58,920 --> 00:37:59,400 Speaker 4: Yeah, and that 836 00:37:59,400 --> 00:38:01,520 Speaker 6: would free up how much time to go do other stuff? 837 00:38:01,600 --> 00:38:01,799 Speaker 4: Yeah? 838 00:38:01,880 --> 00:38:03,480 Speaker 9: I mean, think about when you come back from vacation. 839 00:38:03,600 --> 00:38:06,279 Speaker 9: You probably check your email when you're on vacation. I don't, 840 00:38:06,320 --> 00:38:07,719 Speaker 9: but for that exact 841 00:38:07,480 --> 00:38:09,799 Speaker 6: reason, because if I come back, I have like two 842 00:38:09,960 --> 00:38:12,160 Speaker 6: thousand emails being gone for like a week, and I 843 00:38:12,200 --> 00:38:13,359 Speaker 6: can't keep up with that. I can't do it.
844 00:38:13,400 --> 00:38:15,560 Speaker 9: If you had the AI tool, what you could do 845 00:38:15,640 --> 00:38:17,640 Speaker 9: after being away for two weeks. I don't check my 846 00:38:17,719 --> 00:38:19,399 Speaker 9: email, and I probably get in trouble for that, but I don't. 847 00:38:19,440 --> 00:38:21,480 Speaker 6: I can come back and say, what did I miss 848 00:38:21,239 --> 00:38:23,560 Speaker 9: over the last two weeks? Go through all my pings 849 00:38:23,560 --> 00:38:25,640 Speaker 9: on Teams, go through all my Outlook, and can you 850 00:38:25,680 --> 00:38:27,799 Speaker 9: prepare for me a summary, so that now that I 851 00:38:27,840 --> 00:38:30,719 Speaker 9: come back, I can actually be ready and can prioritize. 852 00:38:30,760 --> 00:38:32,640 Speaker 9: That's where it really comes into play. 853 00:38:33,160 --> 00:38:34,160 Speaker 6: Wow, that's really cool. 854 00:38:34,239 --> 00:38:36,560 Speaker 3: Yeah. So when you sit down with your clients, 855 00:38:36,920 --> 00:38:39,439 Speaker 3: what are some of the common requests you get 856 00:38:39,440 --> 00:38:41,560 Speaker 3: from them? Or, you know, what are some of the, 857 00:38:42,360 --> 00:38:44,080 Speaker 3: what do they ask for the most help with, 858 00:38:44,160 --> 00:38:44,479 Speaker 3: I guess? 859 00:38:44,840 --> 00:38:45,040 Speaker 8: Yeah. 860 00:38:45,080 --> 00:38:47,000 Speaker 9: One of the things that's really top of mind for 861 00:38:47,040 --> 00:38:50,400 Speaker 9: people is skill set and training and capability building. 862 00:38:50,480 --> 00:38:54,080 Speaker 9: So in our survey, we found that eight out of 863 00:38:54,239 --> 00:38:57,560 Speaker 9: ten people said that twenty hours of their work week, 864 00:38:57,640 --> 00:39:00,600 Speaker 9: almost fifty percent of their work week, can be replaced 865 00:39:00,640 --> 00:39:03,239 Speaker 9: with AI tools.
The challenge is they don't know how to 866 00:39:03,320 --> 00:39:05,759 Speaker 9: use the tools in the most effective and efficient way, 867 00:39:06,000 --> 00:39:09,040 Speaker 9: so the training around that is critical in the process. 868 00:39:09,080 --> 00:39:13,080 Speaker 9: The other is responsible AI, the governance aspect, right? Yeah, 869 00:39:13,080 --> 00:39:15,120 Speaker 9: what are the guardrails that we have to put 870 00:39:15,160 --> 00:39:18,360 Speaker 9: into place so that people can play creatively in the space? 871 00:39:18,520 --> 00:39:21,880 Speaker 6: Do you feel like people, at CEO or board levels, 872 00:39:22,520 --> 00:39:25,200 Speaker 6: do they now know what they don't know? Are they 873 00:39:25,239 --> 00:39:27,319 Speaker 6: beginning to figure it out, or are we still in the 874 00:39:27,320 --> 00:39:28,080 Speaker 6: beginning part of that? 875 00:39:28,160 --> 00:39:30,359 Speaker 9: I believe we're in the infancy of it. I think 876 00:39:30,360 --> 00:39:32,560 Speaker 9: there's an infancy of the learning curve, but also an 877 00:39:32,600 --> 00:39:35,439 Speaker 9: infancy of having the right people in the room, having 878 00:39:35,480 --> 00:39:38,279 Speaker 9: diverse perspectives, as we think about responsible AI. 879 00:39:38,239 --> 00:39:40,719 Speaker 3: And we're hearing, you mentioned the, I guess, the ethical 880 00:39:41,040 --> 00:39:42,319 Speaker 3: use of AI. 881 00:39:43,239 --> 00:39:44,560 Speaker 4: I don't know how that's going to evolve. 882 00:39:45,120 --> 00:39:47,600 Speaker 3: Is that going to be some partnership between public, private, 883 00:39:48,080 --> 00:39:48,919 Speaker 3: the individual? 884 00:39:49,480 --> 00:39:51,120 Speaker 6: I'm not sure I actually know what that means. 885 00:39:51,640 --> 00:39:53,560 Speaker 8: Boy, it just seems like, ethical use of AI.
886 00:39:53,840 --> 00:39:56,279 Speaker 3: Yeah, it just seems like the technology could get out 887 00:39:56,280 --> 00:39:56,920 Speaker 3: of control. 888 00:39:57,160 --> 00:39:59,960 Speaker 9: Well, look, as AI and generative AI become more 889 00:40:00,080 --> 00:40:03,799 Speaker 9: ubiquitous, with increased scale comes increased risk. That's just 890 00:40:03,880 --> 00:40:06,440 Speaker 9: the reality of things. So how do you mitigate those risks? 891 00:40:06,760 --> 00:40:08,680 Speaker 9: I think one of the most important ways to do 892 00:40:08,719 --> 00:40:11,239 Speaker 9: that is to have the right people in the room. So, 893 00:40:11,360 --> 00:40:14,680 Speaker 9: whether that's from a diversity perspective of gender, whether that's 894 00:40:14,719 --> 00:40:17,839 Speaker 9: having people of color in the room, people from diverse backgrounds. 895 00:40:18,040 --> 00:40:20,600 Speaker 9: It's one of the reasons that we do the scholarship 896 00:40:20,640 --> 00:40:23,600 Speaker 9: program for women in STEM at this very institute, because 897 00:40:23,600 --> 00:40:25,239 Speaker 9: we want to make sure that they're not brought in 898 00:40:25,640 --> 00:40:28,000 Speaker 9: as a second thought, but rather at 899 00:40:27,840 --> 00:40:29,880 Speaker 6: the very beginning of the conversation. So, what's, like, an 900 00:40:30,000 --> 00:40:34,040 Speaker 6: unethical use of AI? Like, where does AI get bad? 901 00:40:34,520 --> 00:40:34,759 Speaker 4: Yeah? 902 00:40:34,760 --> 00:40:37,319 Speaker 9: Well, I mean, look, you can use 903 00:40:37,360 --> 00:40:40,120 Speaker 9: AI to create images that don't actually exist. You can 904 00:40:40,160 --> 00:40:43,000 Speaker 9: put voices on people to say things in their own 905 00:40:43,080 --> 00:40:45,440 Speaker 9: voice when they maybe have not said that, in video.
906 00:40:46,000 --> 00:40:49,120 Speaker 9: You can think about putting questions into generative AI 907 00:40:49,239 --> 00:40:51,920 Speaker 9: that perhaps share data with the broader public that you 908 00:40:51,920 --> 00:40:54,480 Speaker 9: didn't want to share, that's company-specific data. So there's 909 00:40:54,520 --> 00:40:58,360 Speaker 9: a security component, there's a falsification component. There's lots of 910 00:40:58,360 --> 00:41:00,719 Speaker 9: different ways. You kind of have to be proactive. 911 00:41:00,719 --> 00:41:03,360 Speaker 3: On this front, once again, maybe at no fault of 912 00:41:03,400 --> 00:41:07,880 Speaker 3: their own, the government is generations behind where the technology is. 913 00:41:08,480 --> 00:41:10,759 Speaker 3: I don't know how this plays out, I really don't. 914 00:41:10,800 --> 00:41:13,600 Speaker 3: I mean, is there a feeling that the industry, for 915 00:41:13,640 --> 00:41:15,880 Speaker 3: a while, is going to have to police itself, or 916 00:41:15,920 --> 00:41:18,520 Speaker 3: is there going to be some, again, public-private partnership 917 00:41:18,560 --> 00:41:20,600 Speaker 3: in terms of regulating this? Because this is not the 918 00:41:20,640 --> 00:41:22,360 Speaker 3: FCC regulating the airwaves. 919 00:41:23,040 --> 00:41:26,560 Speaker 4: This is really, really difficult. 920 00:41:26,880 --> 00:41:28,120 Speaker 8: Yeah, it gets complicated. 921 00:41:28,360 --> 00:41:31,400 Speaker 9: Look, I think there's an individual level to it, an 922 00:41:31,400 --> 00:41:33,799 Speaker 9: individual level of responsibility. But at the end of the day, 923 00:41:33,840 --> 00:41:36,279 Speaker 9: it's going to fall on the leaders, the leaders of 924 00:41:36,440 --> 00:41:39,920 Speaker 9: organizations across the board.
If the senior leaders are not 925 00:41:39,960 --> 00:41:44,279 Speaker 9: thinking about responsible AI, they're not thinking about the AI fluency, 926 00:41:44,680 --> 00:41:46,920 Speaker 9: no one else is going to think about that. So 927 00:41:47,040 --> 00:41:49,040 Speaker 9: the responsibility on the leaders is very high. 928 00:41:49,320 --> 00:41:51,759 Speaker 6: How do they get fluent, aside from talking to you? 929 00:41:52,239 --> 00:41:56,680 Speaker 9: Yeah, well, there's defining AI, understanding and feeling comfortable 930 00:41:56,680 --> 00:41:58,719 Speaker 9: with the language of AI. And then there's some very 931 00:41:58,760 --> 00:42:01,759 Speaker 9: tactical things, like prompt engineering. When you put a 932 00:42:01,800 --> 00:42:04,680 Speaker 9: question into gen AI and get a response, there are 933 00:42:04,680 --> 00:42:07,160 Speaker 9: ways you can position that question in a more intelligent 934 00:42:07,200 --> 00:42:09,640 Speaker 9: way to get a response that more aligns with your need. 935 00:42:09,680 --> 00:42:12,080 Speaker 9: So there's very tactical things you can do to become 936 00:42:12,120 --> 00:42:12,960 Speaker 9: more AI fluent. 937 00:42:13,440 --> 00:42:15,799 Speaker 4: What's that? I wonder about the technology. 938 00:42:15,840 --> 00:42:18,560 Speaker 3: Do we know what technology investments will be required to 939 00:42:18,560 --> 00:42:20,600 Speaker 3: be proficient in AI? Because I feel like there's gonna 940 00:42:20,600 --> 00:42:24,640 Speaker 3: be a lot of companies, a lot of people left behind. 941 00:42:24,760 --> 00:42:26,640 Speaker 3: It's not just having the ability to have a laptop 942 00:42:26,680 --> 00:42:28,440 Speaker 3: on your desk. It feels like it's a 943 00:42:28,400 --> 00:42:28,920 Speaker 4: lot more than that. 944 00:42:29,120 --> 00:42:31,480 Speaker 9: Yeah, I mean, we talked about this very briefly.
945 00:42:31,560 --> 00:42:33,080 Speaker 6: But data is going to be critical. 946 00:42:33,480 --> 00:42:36,960 Speaker 9: Data and AI are interlinked. So without strong data sources, 947 00:42:37,239 --> 00:42:39,279 Speaker 9: the AI won't be as powerful as it has the 948 00:42:39,320 --> 00:42:41,920 Speaker 9: potential to be. And so a lot of the technology 949 00:42:42,000 --> 00:42:44,840 Speaker 9: investment right now, besides the people investment in training, is 950 00:42:44,880 --> 00:42:47,040 Speaker 9: going to be on cleaning up and making sure that 951 00:42:47,080 --> 00:42:48,960 Speaker 9: we have good, strong data to work off of. 952 00:42:49,360 --> 00:42:54,120 Speaker 6: So interesting. Anita, thank you so much, really appreciate it. Anita Vann, 953 00:42:54,400 --> 00:42:59,120 Speaker 6: did I say that right? We'll get it eventually. Okay, 954 00:42:59,120 --> 00:43:01,520 Speaker 6: all right, we're going to get it. We will, Paul, 955 00:43:01,560 --> 00:43:04,239 Speaker 6: and our quest is to get that correct. 956 00:43:04,360 --> 00:43:04,440 Speaker 3: You know. 957 00:43:04,520 --> 00:43:06,359 Speaker 6: I have to say, I feel like I learned a lot. 958 00:43:06,400 --> 00:43:08,640 Speaker 6: I have a little bit of an understanding as to, like, okay, 959 00:43:08,680 --> 00:43:10,520 Speaker 6: now this is how people like you and I can 960 00:43:10,600 --> 00:43:11,160 Speaker 6: understand it a 961 00:43:11,160 --> 00:43:13,600 Speaker 3: little bit, which is very cool, right? But who gets 962 00:43:13,600 --> 00:43:15,640 Speaker 3: a PhD in, like, applied mathematics? Not 963 00:43:15,760 --> 00:43:19,399 Speaker 6: us. Or what was it, fluid dynamics or something like that? 964 00:43:19,680 --> 00:43:24,160 Speaker 2: This is the Bloomberg Intelligence podcast, available on Apple, Spotify, 965 00:43:24,400 --> 00:43:28,040 Speaker 2: and anywhere else you get your podcasts.
Listen live each weekday, 966 00:43:28,160 --> 00:43:31,120 Speaker 2: ten am to noon Eastern, on Bloomberg dot com, the 967 00:43:31,239 --> 00:43:34,479 Speaker 2: iHeartRadio app, TuneIn, and the Bloomberg Business app. 968 00:43:34,600 --> 00:43:37,640 Speaker 2: You can also watch us live every weekday on YouTube, 969 00:43:37,840 --> 00:43:39,720 Speaker 2: and always on the Bloomberg Terminal.