1 00:00:00,600 --> 00:00:07,280 Speaker 1: Bloomberg Audio Studios, podcasts, radio news. 2 00:00:10,600 --> 00:00:14,400 Speaker 2: This is Bloomberg Intelligence with Alix Steel and Paul Sweeney. 3 00:00:14,520 --> 00:00:17,720 Speaker 3: The real outperformance has been in US corporate high yield. 4 00:00:17,880 --> 00:00:20,240 Speaker 4: Are the companies lean enough? Have they trimmed all the fat? 5 00:00:20,320 --> 00:00:24,119 Speaker 3: The semiconductor business is a really cyclical business. 6 00:00:23,600 --> 00:00:27,200 Speaker 2: Breaking market headlines and corporate news from across the globe. 7 00:00:27,240 --> 00:00:29,880 Speaker 4: Do investors like the M and A that we've seen? 8 00:00:30,080 --> 00:00:33,159 Speaker 3: These are two big time blue chip companies. 9 00:00:33,320 --> 00:00:36,920 Speaker 4: The window between the peak and cut is changing super fast. 10 00:00:37,080 --> 00:00:41,720 Speaker 2: Bloomberg Intelligence with Alix Steel and Paul Sweeney on Bloomberg Radio. 11 00:00:42,560 --> 00:00:45,200 Speaker 3: On today's Bloomberg Intelligence show, we dig inside the big 12 00:00:45,240 --> 00:00:47,560 Speaker 3: business stories impacting Wall Street and the global markets. 13 00:00:47,720 --> 00:00:49,760 Speaker 4: Each and every week, we're going to provide in depth 14 00:00:49,760 --> 00:00:51,840 Speaker 4: research and data on some of the two thousand companies 15 00:00:51,880 --> 00:00:54,520 Speaker 4: and one hundred and thirty industries our analysts cover worldwide. 16 00:00:54,520 --> 00:00:56,920 Speaker 3: Today, we'll look at why Apple says its iPhone sales 17 00:00:56,920 --> 00:00:57,800 Speaker 3: in China are falling. 18 00:00:57,960 --> 00:00:59,800 Speaker 4: Plus we're going to dive into the world of artificial 19 00:01:00,320 --> 00:01:03,600 Speaker 4: intelligence and how it's going to impact jobs and organizations going forward. 
20 00:01:03,680 --> 00:01:06,280 Speaker 3: But first we'll begin in the retail sector. Earlier in 21 00:01:06,319 --> 00:01:10,039 Speaker 3: the week, Target reported fourth quarter profit that beat analyst expectations. 22 00:01:10,040 --> 00:01:12,280 Speaker 4: So this comes as the company reduced its stockpile of 23 00:01:12,319 --> 00:01:15,160 Speaker 4: merchandise by about twelve percent during the quarter. So Target 24 00:01:15,200 --> 00:01:17,840 Speaker 4: also confirmed it's going to launch a paid membership program, 25 00:01:17,959 --> 00:01:20,240 Speaker 4: going up against rivals like Amazon and Walmart. 26 00:01:20,280 --> 00:01:22,360 Speaker 3: For more, co host Bailey Lipschultz and I were joined 27 00:01:22,360 --> 00:01:26,160 Speaker 3: by Jennifer Bartashus, Bloomberg's senior retail analyst. We first asked 28 00:01:26,240 --> 00:01:28,400 Speaker 3: Jennifer why investors are excited about Target. 29 00:01:28,640 --> 00:01:31,759 Speaker 5: They've really revealed their plan for growth over the next 30 00:01:31,959 --> 00:01:35,399 Speaker 5: several years and that is resonating well with investors, and 31 00:01:35,480 --> 00:01:39,040 Speaker 5: it's all about recapturing top line growth, traffic, market share, 32 00:01:39,400 --> 00:01:41,200 Speaker 5: and they're a little bit more upbeat on where they 33 00:01:41,280 --> 00:01:42,479 Speaker 5: see the consumer right now. 34 00:01:43,000 --> 00:01:45,399 Speaker 6: And when we look at inventory being an issue that 35 00:01:45,480 --> 00:01:47,920 Speaker 6: was such a big problem for Target during the pandemic, 36 00:01:48,040 --> 00:01:49,880 Speaker 6: it does seem like that's being worked through. Kind of 37 00:01:49,880 --> 00:01:53,840 Speaker 6: what's your expectation with how Target is handling their inventory, 38 00:01:54,480 --> 00:01:56,480 Speaker 6: ahead of schedule or not ahead of schedule? 
39 00:01:56,640 --> 00:01:59,840 Speaker 5: Yeah, they've done a major pivot with inventory, and obviously 40 00:01:59,840 --> 00:02:03,000 Speaker 5: they had huge issues a couple of years ago with markdowns, 41 00:02:03,080 --> 00:02:05,720 Speaker 5: and they've really been able to right size inventory. That's 42 00:02:05,720 --> 00:02:09,080 Speaker 5: shown up where inventory right now is actually lower than 43 00:02:09,120 --> 00:02:11,560 Speaker 5: it was last year, and yet in-stocks are better 44 00:02:11,600 --> 00:02:14,640 Speaker 5: than they've been. That focus on those retail fundamentals of 45 00:02:14,680 --> 00:02:16,800 Speaker 5: making sure that you have things in the stores and 46 00:02:16,880 --> 00:02:20,360 Speaker 5: in stock has really been playing through and driving some 47 00:02:20,480 --> 00:02:22,800 Speaker 5: of the results that they've seen, and a lot of 48 00:02:22,800 --> 00:02:25,320 Speaker 5: that stems from inventory. And so right now they're very 49 00:02:25,320 --> 00:02:28,680 Speaker 5: well positioned going into twenty twenty four and that should 50 00:02:28,880 --> 00:02:30,880 Speaker 5: hopefully be a tailwind for them for the rest of 51 00:02:30,880 --> 00:02:31,639 Speaker 5: this fiscal year. 52 00:02:31,919 --> 00:02:34,160 Speaker 3: And the affinity card, can you explain what's going on there? 53 00:02:34,480 --> 00:02:37,760 Speaker 5: They're kind of reimagining their loyalty program. So the baseline 54 00:02:37,800 --> 00:02:40,520 Speaker 5: was Target Circle, which was free to join, and they 55 00:02:40,520 --> 00:02:43,600 Speaker 5: have one hundred million users that have joined it, and 56 00:02:43,720 --> 00:02:46,320 Speaker 5: that's where you can go in and save different offers 57 00:02:46,360 --> 00:02:49,000 Speaker 5: and you get an extra ten percent off your paper 58 00:02:49,040 --> 00:02:51,639 Speaker 5: towels, you know, that week and that brand. 
What they're 59 00:02:51,680 --> 00:02:53,520 Speaker 5: doing is they're making it so you don't have to 60 00:02:53,600 --> 00:02:56,200 Speaker 5: search and save. You get the things automatically when you 61 00:02:56,320 --> 00:02:59,000 Speaker 5: check out, and if you're a Red Card holder, for 62 00:02:59,000 --> 00:03:03,079 Speaker 5: forty nine dollars, you can get unlimited free delivery, same 63 00:03:03,160 --> 00:03:06,320 Speaker 5: day free delivery to your house, and they're calling this 64 00:03:06,840 --> 00:03:09,920 Speaker 5: Target Circle three sixty. You can also subscribe to just that 65 00:03:09,960 --> 00:03:12,480 Speaker 5: if you're not a Red Card holder, also at an introductory 66 00:03:12,480 --> 00:03:14,480 Speaker 5: price of forty nine, but it will go up over time, 67 00:03:14,639 --> 00:03:17,400 Speaker 5: although they didn't say to what. So it's a move 68 00:03:17,400 --> 00:03:19,320 Speaker 5: where they're trying to get, you know, a little bit 69 00:03:19,400 --> 00:03:22,600 Speaker 5: deeper into consumers' lives, you know, doing more home delivery. 70 00:03:22,880 --> 00:03:25,760 Speaker 5: There's some additional perks in there where you can access 71 00:03:26,200 --> 00:03:28,320 Speaker 5: all of what Shipt does, which is their same day 72 00:03:28,320 --> 00:03:30,840 Speaker 5: delivery service, and that includes being able to shop at 73 00:03:30,840 --> 00:03:34,079 Speaker 5: other retailers. So that's kind of the differentiator here with 74 00:03:34,120 --> 00:03:36,720 Speaker 5: regards to the membership program: that once you have it, 75 00:03:36,760 --> 00:03:39,120 Speaker 5: you could also have your Target run include something from 76 00:03:39,120 --> 00:03:44,440 Speaker 5: Costco, for example. But it's early days and we'll see 77 00:03:44,440 --> 00:03:47,640 Speaker 5: how much appetite consumers have for yet another membership that 78 00:03:47,680 --> 00:03:48,400 Speaker 5: they have to pay for. 
79 00:03:49,040 --> 00:03:51,600 Speaker 6: Well, what are your expectations on that? Because I feel 80 00:03:51,600 --> 00:03:54,800 Speaker 6: like everyone pays for Amazon Prime, and Walmart Plus, I 81 00:03:54,840 --> 00:03:56,880 Speaker 6: know, has had a strong rollout and has partnerships 82 00:03:56,920 --> 00:03:59,520 Speaker 6: with different credit cards. Like, what can Target really do 83 00:03:59,600 --> 00:04:02,560 Speaker 6: to get people to pay another annual fee when it 84 00:04:02,640 --> 00:04:05,320 Speaker 6: seems like, at least looking at the streaming platforms, they're 85 00:04:05,360 --> 00:04:07,280 Speaker 6: struggling to maintain users? 86 00:04:07,680 --> 00:04:09,760 Speaker 5: It's a great question, and I think that's one of 87 00:04:09,800 --> 00:04:11,960 Speaker 5: the big questions that we're going to need to see 88 00:04:11,960 --> 00:04:15,560 Speaker 5: as they really talk about what differentiates this program, because 89 00:04:15,760 --> 00:04:17,680 Speaker 5: you know, forty nine dollars is cheap, but it is 90 00:04:17,680 --> 00:04:20,080 Speaker 5: an introductory price, and there are a lot of people 91 00:04:20,120 --> 00:04:23,080 Speaker 5: out there that do delivery, and so you know, when 92 00:04:23,120 --> 00:04:24,960 Speaker 5: I look at it, I think in order for it 93 00:04:25,000 --> 00:04:26,800 Speaker 5: to be a successful program, they're going to have to 94 00:04:26,960 --> 00:04:30,839 Speaker 5: really do something extraordinary that appeals to people to prompt 95 00:04:30,839 --> 00:04:34,599 Speaker 5: them to pay extra. And historically, you know, Target's formula 96 00:04:34,640 --> 00:04:37,280 Speaker 5: for success has never been mimicking others. It's been about 97 00:04:37,400 --> 00:04:39,680 Speaker 5: kind of forging their own path. 
And I think this 98 00:04:39,760 --> 00:04:41,680 Speaker 5: is an example where it's going to be very, very 99 00:04:41,680 --> 00:04:44,240 Speaker 5: important for them to do that. But it's not clear 100 00:04:44,320 --> 00:04:46,440 Speaker 5: yet how that's going to be realized. 101 00:04:46,680 --> 00:04:50,120 Speaker 6: And how does Target play into a recession? Are they 102 00:04:50,200 --> 00:04:52,240 Speaker 6: one of the companies that can weather the storm because 103 00:04:52,240 --> 00:04:54,120 Speaker 6: of their loyalty, or how does that play out just 104 00:04:54,160 --> 00:04:56,640 Speaker 6: given their typical demographics? 105 00:04:56,200 --> 00:04:59,440 Speaker 5: If there is a recession, with their typical demographics, Target's 106 00:04:59,440 --> 00:05:03,920 Speaker 5: customer base is extremely loyal, but Target's product mix does 107 00:05:03,960 --> 00:05:07,200 Speaker 5: skew to more discretionary items, and so whenever there's a 108 00:05:07,240 --> 00:05:10,200 Speaker 5: pullback in spending, it tends to have an outsized impact 109 00:05:10,240 --> 00:05:13,880 Speaker 5: on Target versus other big box retailers. Now, Target has 110 00:05:13,920 --> 00:05:17,480 Speaker 5: been really focused, in the last eighteen months especially, on value. 111 00:05:17,640 --> 00:05:20,240 Speaker 5: They've launched some new private label brands that are meant 112 00:05:20,279 --> 00:05:22,880 Speaker 5: to be the cheapest option in every category in the store. 113 00:05:23,320 --> 00:05:26,160 Speaker 5: So they're trying to respond to that. 
So they have 114 00:05:26,279 --> 00:05:28,880 Speaker 5: the breadth of category and they have the loyalty of 115 00:05:28,920 --> 00:05:31,960 Speaker 5: consumers that they can weather a recession, but it will 116 00:05:32,000 --> 00:05:34,280 Speaker 5: make it harder for them to get back to that 117 00:05:34,440 --> 00:05:37,200 Speaker 5: top line growth and that expansion of margin that they're 118 00:05:37,200 --> 00:05:39,120 Speaker 5: talking about if we find ourselves in the midst of 119 00:05:39,160 --> 00:05:39,760 Speaker 5: a recession. 120 00:05:40,160 --> 00:05:43,520 Speaker 3: Hey, Jen, how do I differentiate, as a shopper, between 121 00:05:44,120 --> 00:05:45,160 Speaker 3: Target and Walmart? 122 00:05:45,680 --> 00:05:48,520 Speaker 5: It's all about perception, right? When you think about Walmart, 123 00:05:48,600 --> 00:05:51,960 Speaker 5: it is usually about lowest price. Walmart is an EDLP. 124 00:05:52,640 --> 00:05:55,720 Speaker 5: Every day you get the lowest possible price. They've got 125 00:05:55,760 --> 00:05:58,839 Speaker 5: a very big assortment and they have some perks with 126 00:05:58,880 --> 00:06:02,000 Speaker 5: regards to their Walmart Plus program. When people think of Target, 127 00:06:02,040 --> 00:06:05,160 Speaker 5: they think a little bit more of discovery, they think 128 00:06:05,200 --> 00:06:08,640 Speaker 5: a little bit more about inspiration than functionality, and that's 129 00:06:08,680 --> 00:06:11,400 Speaker 5: where you really see the divergence between those two retailers. 130 00:06:11,760 --> 00:06:13,400 Speaker 6: I will say, when I go to Target, I always 131 00:06:13,480 --> 00:06:15,080 Speaker 6: end up buying more than I want, where if I 132 00:06:15,120 --> 00:06:17,640 Speaker 6: go to Walmart, I'm kind of in and out. 
Jen, 133 00:06:17,800 --> 00:06:20,159 Speaker 6: question off of that comparison, I always talk up the 134 00:06:20,160 --> 00:06:22,960 Speaker 6: fact that Walmart owns Sam's Club. When you look at 135 00:06:22,960 --> 00:06:26,640 Speaker 6: the company makeup, how much does that differentiate Walmart versus 136 00:06:26,680 --> 00:06:28,719 Speaker 6: some of the other peers, just given, as you mentioned, 137 00:06:29,000 --> 00:06:32,960 Speaker 6: Target potentially partnering with Costco if you're using their membership card. 138 00:06:33,480 --> 00:06:36,919 Speaker 5: Yeah, so obviously there's a great benefit to Walmart of 139 00:06:36,960 --> 00:06:40,479 Speaker 5: having both its US Walmart stores as well as the 140 00:06:40,480 --> 00:06:44,000 Speaker 5: Sam's Club stores, because they really can capture spending by 141 00:06:44,080 --> 00:06:47,560 Speaker 5: different groups of people and for different shopping reasons. Now, 142 00:06:47,560 --> 00:06:49,720 Speaker 5: when I talked about Shipt, it's not so much that 143 00:06:49,760 --> 00:06:52,680 Speaker 5: you get the benefits of a Costco membership, for example, 144 00:06:52,720 --> 00:06:55,200 Speaker 5: with Target. It's just more that you can have your 145 00:06:55,440 --> 00:06:59,160 Speaker 5: delivery person pick up purchases from other retailers, which is 146 00:06:59,200 --> 00:07:02,200 Speaker 5: something that isn't embedded in a Walmart program, for example. 147 00:07:02,600 --> 00:07:06,120 Speaker 5: But that Target membership fee is going to be competing 148 00:07:06,160 --> 00:07:09,039 Speaker 5: with your Costco membership fee. It'll be, you know, competing 149 00:07:09,040 --> 00:07:11,880 Speaker 5: with a BJ's membership fee along with all of your 150 00:07:11,880 --> 00:07:14,920 Speaker 5: streaming services and everything else out there. 
So you know, 151 00:07:14,960 --> 00:07:17,960 Speaker 5: Walmart definitely benefits from the scale of having both the 152 00:07:17,960 --> 00:07:20,720 Speaker 5: core stores and Sam's Club, but it is a little 153 00:07:20,720 --> 00:07:22,680 Speaker 5: bit of a different business model than what we're talking 154 00:07:22,720 --> 00:07:23,480 Speaker 5: about with Target. 155 00:07:23,560 --> 00:07:23,680 Speaker 7: Oh. 156 00:07:23,720 --> 00:07:26,800 Speaker 3: Thanks to Jen Bartashus, Bloomberg Intelligence senior retail analyst. 157 00:07:26,800 --> 00:07:28,200 Speaker 4: All right, we turn now to big tech. So this 158 00:07:28,240 --> 00:07:31,280 Speaker 4: week we learned that Apple iPhone sales in China fell 159 00:07:31,320 --> 00:07:33,560 Speaker 4: by a surprising twenty four percent. Now this was over 160 00:07:33,600 --> 00:07:35,840 Speaker 4: the first six weeks of twenty twenty four. Those figures 161 00:07:35,840 --> 00:07:37,840 Speaker 4: came from the Counterpoint Research firm. 162 00:07:38,040 --> 00:07:39,880 Speaker 3: For more on this, co host Bailey Lipschultz and I 163 00:07:40,000 --> 00:07:43,800 Speaker 3: were joined by Anurag Rana, Bloomberg Intelligence senior technology analyst. 164 00:07:43,960 --> 00:07:46,600 Speaker 3: We first asked Anurag what's causing this decline in 165 00:07:46,640 --> 00:07:48,200 Speaker 3: the early part of twenty twenty four. 166 00:07:48,440 --> 00:07:50,360 Speaker 8: This is the same story that's been going on for 167 00:07:50,400 --> 00:07:53,320 Speaker 8: the last six to nine months, ever since Huawei released 168 00:07:53,320 --> 00:07:56,920 Speaker 8: their higher end phone last year. So Apple's been losing 169 00:07:56,960 --> 00:07:59,720 Speaker 8: market share there; that has been, you know, you could 170 00:07:59,760 --> 00:08:03,200 Speaker 8: say coupled with consumer weakness in China. 
And 171 00:08:03,240 --> 00:08:06,080 Speaker 8: it's all that, you know, driving these problems right now. 172 00:08:06,440 --> 00:08:10,560 Speaker 6: When I look at the PGEO function on the terminal, 173 00:08:11,000 --> 00:08:14,120 Speaker 6: iPhone accounts for more than half of Apple's revenue, China 174 00:08:14,200 --> 00:08:17,120 Speaker 6: call it a fifth of their sales. Anurag, when you 175 00:08:17,160 --> 00:08:20,200 Speaker 6: look at those numbers, what does this actually mean for 176 00:08:20,320 --> 00:08:23,640 Speaker 6: Apple if they're going to see that competition eating into 177 00:08:23,680 --> 00:08:24,239 Speaker 6: their sales? 178 00:08:24,480 --> 00:08:25,880 Speaker 8: Yeah, I think it's going to be a tough time 179 00:08:25,920 --> 00:08:28,840 Speaker 8: for Apple in China, at least this year 180 00:08:29,080 --> 00:08:31,640 Speaker 8: and maybe into next year, before they can, let's say, 181 00:08:31,640 --> 00:08:34,560 Speaker 8: make some inroads in India and other markets. Emerging markets 182 00:08:34,640 --> 00:08:37,079 Speaker 8: is the real growth driver for Apple, there are no 183 00:08:37,440 --> 00:08:39,920 Speaker 8: two ways about it. And, you know, iPhone is 184 00:08:39,920 --> 00:08:42,680 Speaker 8: the big growth driver. So if phones are not selling 185 00:08:42,760 --> 00:08:45,600 Speaker 8: in China, that's a problem for Apple. It means numbers 186 00:08:45,640 --> 00:08:47,960 Speaker 8: need to come down even more for Apple this year. 187 00:08:48,280 --> 00:08:51,680 Speaker 8: You know, we saw about a double digit sales drop in 188 00:08:51,800 --> 00:08:56,120 Speaker 8: China last quarter. I looked it up on MODL; consensus is 189 00:08:56,160 --> 00:08:59,319 Speaker 8: about a seven percent drop in China for this quarter. 
I 190 00:08:59,400 --> 00:09:01,800 Speaker 8: think that number needs to creep up to somewhere in 191 00:09:01,800 --> 00:09:05,280 Speaker 8: the low double digit decline going forward. It's a painful 192 00:09:05,320 --> 00:09:09,400 Speaker 8: situation for Apple, and frankly, there aren't that many rosy 193 00:09:09,480 --> 00:09:11,600 Speaker 8: things looking forward, at least for twenty twenty four. 194 00:09:11,960 --> 00:09:16,760 Speaker 3: So I guess the biggest concern for Apple obviously is competition, 195 00:09:16,800 --> 00:09:19,920 Speaker 3: because they haven't really had that robust of a competitor 196 00:09:19,920 --> 00:09:23,120 Speaker 3: in their part of the market. Are there any responses 197 00:09:23,160 --> 00:09:25,960 Speaker 3: Apple can make here from a competitive landscape other than 198 00:09:26,520 --> 00:09:27,880 Speaker 3: lowering the price point? 199 00:09:28,120 --> 00:09:30,360 Speaker 8: Yeah, Paul, I think there is a little more hype 200 00:09:30,360 --> 00:09:34,360 Speaker 8: in that competition news than reality, only because Huawei didn't 201 00:09:34,440 --> 00:09:36,719 Speaker 8: release a phone for many years. So you can 202 00:09:36,760 --> 00:09:38,880 Speaker 8: think about it. If you have an install base of 203 00:09:38,920 --> 00:09:41,400 Speaker 8: Huawei phones, I mean, let's say, you know, that's X, 204 00:09:41,480 --> 00:09:44,520 Speaker 8: and that hasn't been updated for four or five, six years, 205 00:09:44,520 --> 00:09:46,720 Speaker 8: and you just suddenly get a brand new phone, all 206 00:09:46,760 --> 00:09:48,760 Speaker 8: of those people are going to go and refresh that. 207 00:09:49,080 --> 00:09:51,959 Speaker 8: So I think you should take that as a bigger factor, 208 00:09:52,240 --> 00:09:55,600 Speaker 8: plus the subsidies they are getting in China from the 209 00:09:55,600 --> 00:09:58,800 Speaker 8: local providers. 
So I think Apple will do okay in 210 00:09:58,920 --> 00:10:01,480 Speaker 8: China over the long term. But I think that's not 211 00:10:01,520 --> 00:10:03,120 Speaker 8: going to be a twenty twenty four story. 212 00:10:03,280 --> 00:10:06,400 Speaker 6: And you mentioned emerging markets as the next driver. What countries, 213 00:10:06,440 --> 00:10:08,800 Speaker 6: what regions are going to be able to pay up 214 00:10:08,840 --> 00:10:11,520 Speaker 6: for what, I would say, is quite an expensive phone? 215 00:10:11,760 --> 00:10:13,960 Speaker 8: Yeah, I think that's the most important question, and I 216 00:10:13,960 --> 00:10:16,120 Speaker 8: think, you know, just by the sheer size of it, 217 00:10:16,280 --> 00:10:19,480 Speaker 8: India is the next one. But frankly speaking, right now, 218 00:10:19,559 --> 00:10:22,400 Speaker 8: Apple doesn't even operate in, you know, five percent of 219 00:10:22,440 --> 00:10:25,480 Speaker 8: the entire market because of the price point of the phone. 220 00:10:25,600 --> 00:10:27,800 Speaker 8: I think the strategy in India is going to be a 221 00:10:27,840 --> 00:10:30,679 Speaker 8: mix of the lower priced phone, the SE, as well as 222 00:10:30,760 --> 00:10:33,440 Speaker 8: the refurbished phone, where the price point is even lower. 223 00:10:33,559 --> 00:10:36,040 Speaker 8: But having said that, I think India is a developing country. 224 00:10:36,080 --> 00:10:38,559 Speaker 8: The middle class is getting richer, so I think 225 00:10:38,600 --> 00:10:40,520 Speaker 8: that's going to be the next big growth catalyst. But 226 00:10:40,600 --> 00:10:42,760 Speaker 8: this is not going to play out in twenty 227 00:10:42,800 --> 00:10:45,080 Speaker 8: four to twenty five. That's more of, I would 228 00:10:45,080 --> 00:10:46,320 Speaker 8: submit, a long term story. 
229 00:10:46,679 --> 00:10:49,520 Speaker 3: We've talked about this before, that Apple might introduce a 230 00:10:49,679 --> 00:10:52,880 Speaker 3: lower priced phone into India for just that reason. Is 231 00:10:52,880 --> 00:10:56,120 Speaker 3: that something they're still considering, or will they just wait 232 00:10:56,160 --> 00:10:58,160 Speaker 3: for the market to kind of move up to where 233 00:10:58,160 --> 00:10:59,400 Speaker 3: the Apple phone price point is? 234 00:10:59,800 --> 00:11:01,600 Speaker 8: I think it's going to be the latter. I've done 235 00:11:01,640 --> 00:11:04,040 Speaker 8: a lot of analysis of how much share they can 236 00:11:04,080 --> 00:11:06,720 Speaker 8: gain if they drop the SE price by fifty dollars, 237 00:11:06,840 --> 00:11:09,400 Speaker 8: a hundred dollars, and you know, when I published that stuff, 238 00:11:09,400 --> 00:11:11,440 Speaker 8: I think they actually raised the price by fifty bucks. 239 00:11:11,480 --> 00:11:14,280 Speaker 8: So they don't believe in dropping prices. They are more 240 00:11:14,320 --> 00:11:17,760 Speaker 8: of a margin story. So think of Apple more on 241 00:11:17,760 --> 00:11:20,520 Speaker 8: a long term basis right now, not on the short term. 242 00:11:20,920 --> 00:11:23,320 Speaker 8: I don't think they're going to, you know, I would 243 00:11:23,360 --> 00:11:27,640 Speaker 8: say, swap margins for market share. They've never done that 244 00:11:27,679 --> 00:11:30,040 Speaker 8: in their history, whether it was on the Mac side 245 00:11:30,400 --> 00:11:31,600 Speaker 8: or on the phone side. 246 00:11:31,840 --> 00:11:34,920 Speaker 6: And quickly, Anurag, I look at the news, there's no 247 00:11:35,040 --> 00:11:39,040 Speaker 6: more Apple Car, there's tepid reception to the Vision Pro. 248 00:11:39,120 --> 00:11:41,960 Speaker 6: What actually drives the stock, what drives sales in the 249 00:11:42,000 --> 00:11:43,400 Speaker 6: next twelve to eighteen months? 
250 00:11:43,800 --> 00:11:46,640 Speaker 8: Yeah, I think that's the most important question for Apple investors. 251 00:11:46,840 --> 00:11:48,520 Speaker 8: And I think there's going to be an event in June, 252 00:11:48,520 --> 00:11:51,920 Speaker 8: the Worldwide Developers Conference, where they have pledged that 253 00:11:51,960 --> 00:11:54,040 Speaker 8: they're going to show a lot of AI enhancements to 254 00:11:54,080 --> 00:11:56,920 Speaker 8: the operating system. I think that really is the one 255 00:11:57,000 --> 00:12:00,720 Speaker 8: wildcard that can completely change the sentiment of the company, 256 00:12:00,720 --> 00:12:03,520 Speaker 8: both in terms of sales and the gloomy investor sentiment, 257 00:12:04,000 --> 00:12:07,440 Speaker 8: only because, remember, Apple has a distribution network that stands, 258 00:12:07,520 --> 00:12:10,040 Speaker 8: you know, next to nobody out there in terms of, 259 00:12:10,200 --> 00:12:13,520 Speaker 8: you know, affluent people using their phones. More than one 260 00:12:13,559 --> 00:12:16,720 Speaker 8: billion devices connected just on the smartphone. I think that 261 00:12:16,840 --> 00:12:18,840 Speaker 8: really is the big driver. One of the things I 262 00:12:18,920 --> 00:12:21,240 Speaker 8: was thinking about was, if you go back, you know, 263 00:12:21,600 --> 00:12:24,280 Speaker 8: five years, seven years, there were apps such as Trip 264 00:12:24,320 --> 00:12:27,080 Speaker 8: Advisor and Yelp where people used to go for their, 265 00:12:27,400 --> 00:12:31,600 Speaker 8: you know, specialized functions for restaurant advice and tourism. But 266 00:12:31,640 --> 00:12:34,040 Speaker 8: when you look at somebody like Google, a lot of 267 00:12:34,040 --> 00:12:36,920 Speaker 8: that traffic has moved on to them because they control 268 00:12:36,960 --> 00:12:40,160 Speaker 8: the distribution. 
I think Apple has the same ability, but 269 00:12:40,280 --> 00:12:43,680 Speaker 8: they have to show up with some AI products, otherwise 270 00:12:43,720 --> 00:12:44,280 Speaker 8: that's not going to flow. 271 00:12:44,280 --> 00:12:47,160 Speaker 3: All right. Thanks to Anurag Rana, Bloomberg Intelligence Senior 272 00:12:47,200 --> 00:12:48,440 Speaker 3: Technology Analyst. 273 00:12:48,320 --> 00:12:50,160 Speaker 4: Coming up, we're going to break down the role of AI 274 00:12:50,320 --> 00:12:51,680 Speaker 4: in sports analytics. 275 00:12:51,760 --> 00:12:54,560 Speaker 3: You're listening to Bloomberg Intelligence on Bloomberg Radio, providing in 276 00:12:54,600 --> 00:12:56,760 Speaker 3: depth research and data on two thousand companies and one 277 00:12:56,840 --> 00:12:59,640 Speaker 3: hundred and thirty industries. You can access Bloomberg Intelligence via 278 00:12:59,679 --> 00:13:01,880 Speaker 3: BI GO on the terminal. I'm Paul Sweeney and 279 00:13:01,920 --> 00:13:03,920 Speaker 4: I'm Alix Steel, and this is Bloomberg. 280 00:13:08,240 --> 00:13:12,120 Speaker 2: You're listening to the Bloomberg Intelligence podcast. Catch us live 281 00:13:12,200 --> 00:13:14,880 Speaker 2: weekdays at ten am Eastern on Apple CarPlay and 282 00:13:14,880 --> 00:13:17,959 Speaker 2: Android Auto with the Bloomberg Business App. Listen on demand 283 00:13:17,960 --> 00:13:22,360 Speaker 2: wherever you get your podcasts, or watch us live on YouTube. 284 00:13:22,840 --> 00:13:25,560 Speaker 4: Earlier in the week, Bloomberg Intelligence broadcast live from the 285 00:13:25,600 --> 00:13:28,679 Speaker 4: campus of the New Jersey Institute of Technology. Paul was 286 00:13:28,720 --> 00:13:31,240 Speaker 4: pumped, and the main topic of conversation was AI. 
287 00:13:31,520 --> 00:13:34,320 Speaker 3: We took a look at Zelus Analytics, an Austin based 288 00:13:34,320 --> 00:13:38,080 Speaker 3: sports analytics company that evaluates, predicts, and improves player and 289 00:13:38,120 --> 00:13:39,400 Speaker 3: team performance in sports. 290 00:13:39,440 --> 00:13:42,360 Speaker 4: So one of our guests was Ivana Seric, senior product 291 00:13:42,440 --> 00:13:45,840 Speaker 4: scientist at Zelus Analytics, and we asked Ivana how Zelus 292 00:13:46,000 --> 00:13:47,200 Speaker 4: uses AI in sports. 293 00:13:47,600 --> 00:13:51,000 Speaker 1: This is a field that has expanded in the last maybe ten years 294 00:13:51,040 --> 00:13:54,040 Speaker 1: a lot in other sports. Even before that, it 295 00:13:54,080 --> 00:13:56,560 Speaker 1: was in baseball; that was one of the first sports. 296 00:13:56,559 --> 00:13:58,920 Speaker 1: If you've seen Moneyball, that's really it, yes. 297 00:13:58,880 --> 00:14:02,120 Speaker 4: Yeah, okay, Moneyball, okay. And so it's basically like 298 00:14:02,160 --> 00:14:05,719 Speaker 4: how to position, like what players to put where, combinations? 299 00:14:05,720 --> 00:14:06,600 Speaker 4: Is it that kind of stuff? 300 00:14:06,640 --> 00:14:10,720 Speaker 1: Correct, correct. So player evaluation, in game decision strategy, 301 00:14:11,320 --> 00:14:12,800 Speaker 1: that sort of thing. Yeah. 302 00:14:12,840 --> 00:14:17,319 Speaker 3: So again, you played, you were a starter for NJIT's basketball team. You 303 00:14:17,360 --> 00:14:20,960 Speaker 3: also represented your native Croatia in youth basketball. So you're 304 00:14:21,080 --> 00:14:23,400 Speaker 3: great at basketball, but you're also a math nerd to 305 00:14:23,480 --> 00:14:26,520 Speaker 3: the nth degree. She got a BS and a PhD 306 00:14:26,600 --> 00:14:31,960 Speaker 3: in applied mathematics from NJIT, focusing on computational fluid dynamics. 
307 00:14:32,000 --> 00:14:32,720 Speaker 4: I don't know what that means. 308 00:14:32,720 --> 00:14:34,480 Speaker 3: I don't know what that means. That's art, but okay, 309 00:14:34,840 --> 00:14:38,200 Speaker 3: I don't know. So a great mathematician, great basketball player. 310 00:14:38,280 --> 00:14:40,200 Speaker 3: Let's put it all together. What are some of the 311 00:14:40,320 --> 00:14:43,480 Speaker 3: really good applications for some of that technology we've seen? 312 00:14:43,560 --> 00:14:45,600 Speaker 3: You mentioned Moneyball, you know, that we've seen it 313 00:14:45,600 --> 00:14:48,320 Speaker 3: in baseball. What other applications are out there, do you think? 314 00:14:48,480 --> 00:14:50,520 Speaker 3: It seems like we're in the very early innings of that. 315 00:14:50,880 --> 00:14:54,400 Speaker 1: Early on, it started with just using basic data, so 316 00:14:54,600 --> 00:14:56,680 Speaker 1: box scores, play by play, and then a lot of 317 00:14:56,720 --> 00:15:00,720 Speaker 1: sports in recent years have what's called player tracking data, 318 00:15:01,080 --> 00:15:03,600 Speaker 1: meaning we have locations of the players on the court 319 00:15:03,720 --> 00:15:06,240 Speaker 1: or on a pitch, on a field, whichever sport we're 320 00:15:06,280 --> 00:15:09,520 Speaker 1: talking about, at a high resolution. So from that data 321 00:15:09,560 --> 00:15:11,840 Speaker 1: we can extract not only things that are counted in 322 00:15:11,840 --> 00:15:14,040 Speaker 1: a box score, but also other things that happened during 323 00:15:14,040 --> 00:15:17,440 Speaker 1: the game that you wouldn't see counted in, like, a 324 00:15:17,720 --> 00:15:19,280 Speaker 1: basic box score, for example. 325 00:15:19,640 --> 00:15:22,360 Speaker 4: What are some of the common questions that, like, coaches 326 00:15:22,480 --> 00:15:24,440 Speaker 4: or owners come to you with? 
327 00:15:24,360 --> 00:15:26,960 Speaker 1: The biggest question is how do we value players? How do 328 00:15:26,960 --> 00:15:29,280 Speaker 1: we find which players teams should sign, how long of 329 00:15:29,320 --> 00:15:32,040 Speaker 1: a contract, how much money should be on a contract. 330 00:15:32,280 --> 00:15:35,080 Speaker 1: That's one side, so that's the player evaluation side, and 331 00:15:35,080 --> 00:15:38,240 Speaker 1: then the other side is coaching and in game decision making. 332 00:15:38,600 --> 00:15:42,440 Speaker 1: So which situations are producing the most value for the teams? 333 00:15:42,720 --> 00:15:46,080 Speaker 1: Which situations are creating better opportunities to score? 334 00:15:47,000 --> 00:15:49,360 Speaker 3: I know, like in baseball, Major League Baseball and in 335 00:15:49,400 --> 00:15:51,120 Speaker 3: minor league baseball, now it's coming into all the other 336 00:15:51,120 --> 00:15:55,760 Speaker 3: parts of baseball, the analytics people, the data people versus 337 00:15:55,800 --> 00:15:58,120 Speaker 3: maybe some of the more traditionalists. They kind of butt 338 00:15:58,160 --> 00:16:01,040 Speaker 3: heads on occasion. So how much analytics do you use 339 00:16:01,120 --> 00:16:04,160 Speaker 3: versus just my gut, I think this player will do well? 340 00:16:04,880 --> 00:16:06,640 Speaker 3: How do you kind of bridge that gap? 341 00:16:06,920 --> 00:16:09,440 Speaker 1: Yeah, yeah. So that's a big important thing, because 342 00:16:09,440 --> 00:16:13,400 Speaker 1: you can't just have data without the domain expertise. And 343 00:16:13,440 --> 00:16:16,200 Speaker 1: I think that's something where we at Zelus have a really 344 00:16:16,200 --> 00:16:19,200 Speaker 1: good strength, is that we have the experts in data 345 00:16:19,240 --> 00:16:22,000 Speaker 1: and statistics, in AI, in machine learning. 
But we also 346 00:16:22,040 --> 00:16:24,680 Speaker 1: have a lot of people who worked in sports teams 347 00:16:24,680 --> 00:16:27,800 Speaker 1: and have that sort of experience and know which questions 348 00:16:27,880 --> 00:16:30,680 Speaker 1: the teams want to answer, what's useful for them, and 349 00:16:30,760 --> 00:16:32,120 Speaker 1: how can we help them best. 350 00:16:32,320 --> 00:16:34,920 Speaker 4: So yeah, because when you were saying what AI could 351 00:16:34,920 --> 00:16:36,840 Speaker 4: help you do, it feels like that's not what a 352 00:16:36,880 --> 00:16:38,640 Speaker 4: coach is supposed to do. But you're saying that you 353 00:16:38,680 --> 00:16:41,960 Speaker 4: need someone to interpret how to manage that and stuff. 354 00:16:41,880 --> 00:16:44,280 Speaker 1: Right, right. So you need like a bridge between the 355 00:16:44,360 --> 00:16:46,680 Speaker 1: data and what's happening on the court. 356 00:16:46,720 --> 00:16:49,280 Speaker 3: All right. If I'm an agent representing a player, now 357 00:16:49,400 --> 00:16:51,680 Speaker 3: I've got to learn this stuff, because the 358 00:16:51,720 --> 00:16:52,760 Speaker 3: teams are gonna come at 359 00:16:52,600 --> 00:16:54,880 Speaker 4: me and say, this is what the program tells me. 360 00:16:54,880 --> 00:16:58,160 Speaker 3: Yeah, your client's worth blank because his or her 361 00:16:58,480 --> 00:17:01,360 Speaker 3: OPS is this and blah blah blah blah blah. And 362 00:17:01,400 --> 00:17:02,720 Speaker 3: you've got to come back and say, no, I think 363 00:17:02,720 --> 00:17:04,000 Speaker 3: he's better than that, and I think he's really more. 364 00:17:04,200 --> 00:17:06,560 Speaker 3: So do you work with the agents and players themselves 365 00:17:06,600 --> 00:17:09,240 Speaker 3: as well? Because they better be smart on this stuff.
366 00:17:09,440 --> 00:17:11,879 Speaker 1: Yeah, yeah, that's a great area where Zelus 367 00:17:11,960 --> 00:17:14,720 Speaker 1: is growing as well in some of our sports. But 368 00:17:14,720 --> 00:17:16,840 Speaker 1: yeah, an agent cannot learn all of this on 369 00:17:16,880 --> 00:17:21,199 Speaker 1: their own, so having a company or a contractor who can. 370 00:17:21,280 --> 00:17:24,840 Speaker 3: So do you guys work with agents and players directly? 371 00:17:24,760 --> 00:17:25,679 Speaker 1: In certain sports. 372 00:17:25,720 --> 00:17:28,560 Speaker 4: Yes, yeah, but not all across the board. So you also, 373 00:17:28,560 --> 00:17:31,000 Speaker 4: as Paul mentioned earlier, you got your BS and 374 00:17:31,040 --> 00:17:36,800 Speaker 4: your PhD in applied mathematics at NJIT. Because we're 375 00:17:36,840 --> 00:17:39,199 Speaker 4: here and we're talking about NJIT, which kind of bridges the 376 00:17:39,200 --> 00:17:41,400 Speaker 4: gap between learning stuff and then putting it out into 377 00:17:41,400 --> 00:17:44,400 Speaker 4: the world. How did this help you evolve your career 378 00:17:44,560 --> 00:17:46,080 Speaker 4: and lead you to where you are today? 379 00:17:46,359 --> 00:17:46,560 Speaker 3: Yeah? 380 00:17:46,640 --> 00:17:50,320 Speaker 1: Even though I studied computational fluid dynamics, it's not exactly data science, 381 00:17:50,359 --> 00:17:51,359 Speaker 1: but I've learned a lot of skills. 382 00:17:51,400 --> 00:17:54,080 Speaker 4: There are transferable skills, by the way, so you can pretend 383 00:17:54,080 --> 00:17:54,399 Speaker 4: it is. 384 00:17:55,440 --> 00:17:58,119 Speaker 1: There are a lot of skills that transfer from one field 385 00:17:58,119 --> 00:18:01,520 Speaker 1: to the other, for example, coding, analyzing large data sets, 386 00:18:01,840 --> 00:18:06,520 Speaker 1: creating visualizations, and communicating scientific results to a regular audience.
387 00:18:06,720 --> 00:18:09,760 Speaker 3: Are there some sports that are embracing AI or just 388 00:18:09,840 --> 00:18:11,959 Speaker 3: technology analytics more than others? 389 00:18:12,359 --> 00:18:15,639 Speaker 1: That's historically been baseball, particularly because they had more advanced 390 00:18:15,720 --> 00:18:19,520 Speaker 1: data for the longest time. But other sports now also 391 00:18:19,640 --> 00:18:23,480 Speaker 1: have the player tracking data and are starting to get 392 00:18:23,560 --> 00:18:24,399 Speaker 1: more on that side. 393 00:18:24,560 --> 00:18:26,639 Speaker 4: How did you wind up in this? Because you 394 00:18:26,680 --> 00:18:29,520 Speaker 4: played basketball, right? You're originally from Croatia, right? So 395 00:18:29,520 --> 00:18:32,360 Speaker 4: you played basketball and then you somehow wound up 396 00:18:32,400 --> 00:18:34,320 Speaker 4: deep into analytics. How did you do that? 397 00:18:34,560 --> 00:18:34,720 Speaker 7: Well, 398 00:18:34,760 --> 00:18:37,080 Speaker 1: I always loved math and I always loved basketball, and 399 00:18:37,160 --> 00:18:39,400 Speaker 1: this was a perfect combination of the two. 400 00:18:39,720 --> 00:18:41,200 Speaker 3: Where are we, do you think, in terms of the 401 00:18:41,240 --> 00:18:45,280 Speaker 3: evolution of applying data and AI to sports? Because it's 402 00:18:45,400 --> 00:18:48,159 Speaker 3: just the statistics. I've been following sports my entire life, 403 00:18:48,240 --> 00:18:50,800 Speaker 3: and I'm listening to a broadcast and they're saying stuff, 404 00:18:50,840 --> 00:18:52,520 Speaker 3: and I have no idea what they're talking about. Like now 405 00:18:52,560 --> 00:18:55,680 Speaker 3: batting average isn't important anymore to baseball, and now 406 00:18:55,720 --> 00:18:59,280 Speaker 3: it's on-base plus slugging. I don't know.
I mean, 407 00:18:59,600 --> 00:19:01,240 Speaker 3: it seems like we need a tutorial on a lot of 408 00:19:01,240 --> 00:19:03,480 Speaker 3: these broadcasts. Where can this go, do you think? 409 00:19:03,760 --> 00:19:06,040 Speaker 1: Yeah, I wouldn't know about baseball, because I don't really 410 00:19:06,119 --> 00:19:09,280 Speaker 1: understand the rules, coming from Croatia. But in basketball, 411 00:19:09,880 --> 00:19:12,920 Speaker 1: you know, for now we have player location data, 412 00:19:12,960 --> 00:19:17,679 Speaker 1: but it's also growing towards player kinematics data, which the NBA 413 00:19:17,760 --> 00:19:22,240 Speaker 1: has available for this season. Kinematics, so the locations 414 00:19:22,280 --> 00:19:27,080 Speaker 1: of a player's waist, elbow, shoulder, all of the joints, so 415 00:19:27,200 --> 00:19:30,639 Speaker 1: more detailed data of player movements, and, yeah, 416 00:19:31,000 --> 00:19:33,520 Speaker 1: how players are shooting. And you can extract all 417 00:19:33,560 --> 00:19:35,960 Speaker 1: this more detailed information. 418 00:19:35,800 --> 00:19:39,040 Speaker 4: Our thanks now to Ivana Seric, senior product scientist at 419 00:19:39,119 --> 00:19:40,000 Speaker 4: Zelus Analytics. 420 00:19:40,119 --> 00:19:43,080 Speaker 3: Let's stick with our conversations on artificial intelligence. At the 421 00:19:43,080 --> 00:19:46,160 Speaker 3: New Jersey Institute of Technology, we looked at the company Avanade, 422 00:19:46,200 --> 00:19:49,679 Speaker 3: a leading provider of cloud and advisory services. Avanade was 423 00:19:49,720 --> 00:19:52,720 Speaker 3: founded as a joint venture between Microsoft and Accenture.
424 00:19:52,880 --> 00:19:55,680 Speaker 4: We were joined by Anita Giovanni, global head of Innovation 425 00:19:55,760 --> 00:19:58,200 Speaker 4: at Avanade, and we asked her how the company approaches 426 00:19:58,240 --> 00:20:01,520 Speaker 4: AI and how AI will impact organizations going forward. 427 00:20:01,720 --> 00:20:04,679 Speaker 9: Yeah, so, we are a global consultancy, as you mentioned, 428 00:20:04,680 --> 00:20:08,000 Speaker 9: a Microsoft-Accenture joint venture, sixty thousand employees around the world, 429 00:20:08,000 --> 00:20:10,480 Speaker 9: and what we do is think about AI from a 430 00:20:10,520 --> 00:20:13,639 Speaker 9: client perspective. How is it that we can support organizations 431 00:20:13,680 --> 00:20:17,040 Speaker 9: across sectors to be AI first? And at the same time, 432 00:20:17,280 --> 00:20:19,800 Speaker 9: we're all going through this journey together. So thinking about 433 00:20:19,840 --> 00:20:23,399 Speaker 9: ourselves as an organization, how can we be AI first 434 00:20:23,400 --> 00:20:25,800 Speaker 9: in our own business processes and for our own people? 435 00:20:25,960 --> 00:20:27,960 Speaker 9: So I'm a company and I come to you, what 436 00:20:28,000 --> 00:20:30,280 Speaker 9: do you do for me? We think about a lot 437 00:20:30,280 --> 00:20:33,280 Speaker 9: of things. Are you guys prepared from a people perspective, 438 00:20:33,320 --> 00:20:37,199 Speaker 9: an organizational perspective, and a process perspective? For example, a 439 00:20:37,280 --> 00:20:40,000 Speaker 9: lot of people that we interviewed in an AI readiness 440 00:20:40,000 --> 00:20:44,640 Speaker 9: report said they were enthusiastic and optimistic about AI. That's great. However, 441 00:20:45,280 --> 00:20:47,640 Speaker 9: half of the leaders said they weren't ready, and only 442 00:20:47,720 --> 00:20:51,480 Speaker 9: a third of CEOs believe that their top leadership is 443 00:20:51,520 --> 00:20:54,680 Speaker 9: AI fluent.
So there is a dissonance between the excitement 444 00:20:54,720 --> 00:20:58,800 Speaker 9: and enthusiasm and the reality of the preparedness of organizations. 445 00:20:58,800 --> 00:21:01,399 Speaker 9: And what we do is make sure that organizations have 446 00:21:01,440 --> 00:21:03,440 Speaker 9: the coaching and support they need to get there. 447 00:21:03,560 --> 00:21:05,399 Speaker 3: I would think one of the challenges, just speaking for 448 00:21:05,480 --> 00:21:07,720 Speaker 3: myself, is, I learned a whole lot speaking, again, 449 00:21:07,720 --> 00:21:11,360 Speaker 3: to the smart people from NJIT about what AI is. I'm 450 00:21:11,359 --> 00:21:13,040 Speaker 3: one of those people that says, if you can't explain 451 00:21:13,080 --> 00:21:15,040 Speaker 3: it in one sentence, you don't understand it. And I 452 00:21:15,080 --> 00:21:17,560 Speaker 3: don't think I understand it. What's the basic framework that 453 00:21:17,600 --> 00:21:19,600 Speaker 3: you try to get across to your clients about what AI 454 00:21:19,840 --> 00:21:21,320 Speaker 3: is and what it can mean for them? 455 00:21:21,720 --> 00:21:24,680 Speaker 9: Yeah, think about AI, and one of the biggest generative 456 00:21:24,680 --> 00:21:27,760 Speaker 9: AI tools right now through Microsoft is Copilot. Think of 457 00:21:27,800 --> 00:21:31,920 Speaker 9: it as a copilot, not necessarily a replacement pilot, that 458 00:21:31,920 --> 00:21:33,920 Speaker 4: can allow you to articulate. 459 00:21:34,000 --> 00:21:37,359 Speaker 9: Yeah, allow you to do your job more effectively and 460 00:21:37,400 --> 00:21:40,320 Speaker 9: more efficiently.
And so instead of thinking about AI as 461 00:21:40,359 --> 00:21:42,879 Speaker 9: a job replacement, think about it as a way to 462 00:21:42,960 --> 00:21:46,480 Speaker 9: replace key tasks and allow you to spend your days 463 00:21:46,800 --> 00:21:49,680 Speaker 9: in ways that you want to, talking to people, being 464 00:21:49,680 --> 00:21:54,240 Speaker 9: more relationship focused rather than necessarily summarizing emails or going 465 00:21:54,280 --> 00:21:55,880 Speaker 9: through data sets, et cetera. 466 00:21:55,960 --> 00:21:57,159 Speaker 3: So it's a partner. 467 00:21:57,400 --> 00:22:00,240 Speaker 4: So basically I could have some AI thing go through 468 00:22:00,320 --> 00:22:02,920 Speaker 4: my email and, like, correlate the important parts, 469 00:22:02,920 --> 00:22:04,959 Speaker 4: for example, and take it and give it 470 00:22:04,960 --> 00:22:06,440 Speaker 4: to me, so I don't have to spend my whole 471 00:22:06,480 --> 00:22:08,000 Speaker 4: morning going through and reading reports. 472 00:22:08,200 --> 00:22:08,919 Speaker 3: Yeah exactly. 473 00:22:09,000 --> 00:22:11,159 Speaker 4: That's really cool. Yeah, and that would save me so 474 00:22:11,240 --> 00:22:12,640 Speaker 4: much time to go do other stuff. 475 00:22:12,720 --> 00:22:12,960 Speaker 3: Yeah. 476 00:22:13,000 --> 00:22:14,600 Speaker 9: I mean, think about when you come back from vacation. 477 00:22:14,720 --> 00:22:17,400 Speaker 9: You probably check your email when you're on vacation. I don't, 478 00:22:17,440 --> 00:22:18,840 Speaker 9: but for an exact 479 00:22:18,600 --> 00:22:21,000 Speaker 4: reason, because if I come back, I have like two 480 00:22:21,080 --> 00:22:23,720 Speaker 4: thousand emails being gone for like a week, and I can't.
481 00:22:23,560 --> 00:22:25,159 Speaker 9: Keep up with that. I can't do it. If you had the 482 00:22:25,200 --> 00:22:27,560 Speaker 9: AI tool, think what you could do after being away for 483 00:22:27,600 --> 00:22:29,640 Speaker 9: two weeks. I don't check my email and probably get 484 00:22:29,640 --> 00:22:30,240 Speaker 9: in trouble for that. 485 00:22:30,160 --> 00:22:30,560 Speaker 5: But I don't. 486 00:22:30,560 --> 00:22:32,359 Speaker 9: I can come back and say, what did I miss 487 00:22:32,359 --> 00:22:34,679 Speaker 9: over the last two weeks, go through all my pings 488 00:22:34,680 --> 00:22:36,760 Speaker 9: on Teams, go through all my Outlook, and can you 489 00:22:36,800 --> 00:22:38,919 Speaker 9: prepare for me a summary so that now that I 490 00:22:39,000 --> 00:22:41,879 Speaker 9: come back, I can actually be ready and can prioritize. 491 00:22:41,880 --> 00:22:43,760 Speaker 9: That's where it really comes in. 492 00:22:44,280 --> 00:22:45,280 Speaker 4: Wow, that's really cool. 493 00:22:45,400 --> 00:22:47,680 Speaker 3: Yeah. So when you sit down with your clients, 494 00:22:48,040 --> 00:22:50,560 Speaker 3: I mean, what are some of the common requests you get 495 00:22:50,560 --> 00:22:52,359 Speaker 3: from them, or what do they ask for the most 496 00:22:52,400 --> 00:22:53,399 Speaker 3: help with, I guess? 497 00:22:53,680 --> 00:22:53,920 Speaker 8: Yeah. 498 00:22:53,960 --> 00:22:55,840 Speaker 9: One of the things that's really top of mind for 499 00:22:55,880 --> 00:22:59,240 Speaker 9: people is about skill set and training and capability building. 500 00:22:59,359 --> 00:23:02,480 Speaker 9: So in our survey, we found that eight out of 501 00:23:02,640 --> 00:23:06,000 Speaker 9: ten people said that twenty hours of their work week, 502 00:23:06,040 --> 00:23:09,000 Speaker 9: almost fifty percent of their work week, could be replaced 503 00:23:09,040 --> 00:23:11,639 Speaker 9: with AI tools.
The challenge is they don't know how to 504 00:23:11,720 --> 00:23:14,160 Speaker 9: use the tools in the most effective and efficient way, 505 00:23:14,400 --> 00:23:17,440 Speaker 9: so the training around that is critical in the process. 506 00:23:17,480 --> 00:23:21,480 Speaker 9: The other is responsible AI, a governance piece, right? Yeah, 507 00:23:21,480 --> 00:23:23,520 Speaker 9: what are the guardrails that we have to put 508 00:23:23,560 --> 00:23:26,760 Speaker 9: into place so that people can play creatively in the space. 509 00:23:26,920 --> 00:23:30,280 Speaker 4: Do you feel like people and CEOs or boards 510 00:23:30,320 --> 00:23:32,480 Speaker 4: now know what they don't know? Are they 511 00:23:32,480 --> 00:23:34,680 Speaker 4: beginning to figure it out, or are we still in 512 00:23:34,720 --> 00:23:35,560 Speaker 4: the beginning part of that? 513 00:23:35,600 --> 00:23:37,800 Speaker 9: I believe we're in the infancy of it. I think 514 00:23:37,840 --> 00:23:39,840 Speaker 9: there is an infancy of the learning curve, but also 515 00:23:39,920 --> 00:23:42,600 Speaker 9: an infancy of having the right people in the room, 516 00:23:42,680 --> 00:23:45,760 Speaker 9: having diverse perspectives, as we think about responsible AI. 517 00:23:45,680 --> 00:23:48,200 Speaker 3: And we're hearing, you mentioned, I guess, the ethical 518 00:23:48,480 --> 00:23:51,480 Speaker 3: use of AI. I don't know how that's going to evolve. 519 00:23:52,040 --> 00:23:54,520 Speaker 3: Is that going to be some partnership between public, private, 520 00:23:54,800 --> 00:23:55,640 Speaker 3: the individual? 521 00:23:55,960 --> 00:23:58,080 Speaker 4: I'm not sure I actually know what that means. Well, 522 00:23:58,080 --> 00:23:59,760 Speaker 4: it just seems like ethical use of AI. 523 00:24:00,160 --> 00:24:02,520 Speaker 3: Yeah, it just seems like the technology could get out 524 00:24:02,520 --> 00:24:03,200 Speaker 3: of control.
525 00:24:03,480 --> 00:24:07,080 Speaker 9: Look, as AI and generative AI becomes more ubiquitous, with 526 00:24:07,280 --> 00:24:10,720 Speaker 9: increased scale comes increased risk. That's just the reality of things. 527 00:24:10,720 --> 00:24:13,240 Speaker 9: So how do you mitigate those risks? I think one 528 00:24:13,240 --> 00:24:15,080 Speaker 9: of the most important ways to do that is to 529 00:24:15,080 --> 00:24:18,000 Speaker 9: have the right people in the room. So, whether that's 530 00:24:18,040 --> 00:24:21,119 Speaker 9: from a diversity perspective of gender, whether that's having people 531 00:24:21,160 --> 00:24:24,160 Speaker 9: of color in the room, people from diverse backgrounds. It's 532 00:24:24,160 --> 00:24:26,879 Speaker 9: one of the reasons that we do the scholarship program 533 00:24:26,920 --> 00:24:29,560 Speaker 9: for women in STEM at this very institute, because we 534 00:24:29,600 --> 00:24:31,720 Speaker 9: want to make sure that they're not brought in as 535 00:24:31,760 --> 00:24:34,520 Speaker 9: a second thought, but rather at the very beginning of 536 00:24:34,520 --> 00:24:35,080 Speaker 9: the conversation. 537 00:24:35,200 --> 00:24:37,840 Speaker 4: So, what's like an unethical use of AI? Like, where 538 00:24:37,880 --> 00:24:39,000 Speaker 4: does AI get bad? 539 00:24:39,200 --> 00:24:39,360 Speaker 5: Yeah? 540 00:24:39,440 --> 00:24:41,600 Speaker 9: Well, I mean, look, you can use 541 00:24:41,640 --> 00:24:44,400 Speaker 9: AI to create images that don't actually exist. You can 542 00:24:44,400 --> 00:24:47,240 Speaker 9: put voices on people to say things in their own 543 00:24:47,320 --> 00:24:49,720 Speaker 9: voice in a video when they maybe never said that.
544 00:24:50,240 --> 00:24:53,400 Speaker 9: You can think about putting in questions into generative AI 545 00:24:53,480 --> 00:24:56,159 Speaker 9: that perhaps share data with the broader public that you 546 00:24:56,200 --> 00:24:58,760 Speaker 9: didn't want to share, that's company-specific data. So there's 547 00:24:58,760 --> 00:25:02,600 Speaker 9: a security component, there's a falsification component. There's lots of 548 00:25:02,600 --> 00:25:04,720 Speaker 9: different ways you kind of have to be proactive about. 549 00:25:04,840 --> 00:25:07,520 Speaker 3: And on this front, once again, maybe through no fault 550 00:25:07,560 --> 00:25:11,040 Speaker 3: of their own, the government is generations behind where the 551 00:25:11,080 --> 00:25:14,240 Speaker 3: technology is. I don't know how this plays out, I 552 00:25:14,320 --> 00:25:17,080 Speaker 3: really don't. I mean, is there a feeling that the 553 00:25:17,119 --> 00:25:18,879 Speaker 3: industry for a while is going to have to police 554 00:25:18,920 --> 00:25:21,520 Speaker 3: itself, or is there going to be some, again, public- 555 00:25:21,600 --> 00:25:24,280 Speaker 3: private partnership in terms of regulating this? Because this is 556 00:25:24,320 --> 00:25:29,760 Speaker 3: not the FCC regulating the airwaves. This is really, really difficult. 557 00:25:30,119 --> 00:25:33,760 Speaker 9: Yeah, it gets complicated. Look, I think there's an individual 558 00:25:33,840 --> 00:25:36,399 Speaker 9: level to it, an individual level of responsibility. But at 559 00:25:36,400 --> 00:25:37,919 Speaker 9: the end of the day, it's going to fall on 560 00:25:38,000 --> 00:25:41,920 Speaker 9: the leaders, the leaders of organizations across the board.
If 561 00:25:41,920 --> 00:25:44,640 Speaker 9: the senior leaders are not thinking about responsible AI, they're 562 00:25:44,640 --> 00:25:48,359 Speaker 9: not thinking about the AI fluency, no one else is 563 00:25:48,400 --> 00:25:50,280 Speaker 9: going to think about that. So the responsibility on the 564 00:25:50,359 --> 00:25:51,280 Speaker 9: leaders is very high. 565 00:25:51,320 --> 00:25:54,399 Speaker 3: Our thanks to Anita Giovanni, global head of Innovation at Avanade. 566 00:25:54,440 --> 00:25:56,960 Speaker 4: Coming up on the program, our conversation on AI continues 567 00:25:57,040 --> 00:25:59,359 Speaker 4: as we speak with Michael Johnson, President of the New Jersey 568 00:25:59,440 --> 00:26:00,640 Speaker 4: Innovation Institute. 569 00:26:00,800 --> 00:26:03,600 Speaker 3: You're listening to Bloomberg Intelligence on Bloomberg Radio, providing in- 570 00:26:03,640 --> 00:26:05,800 Speaker 3: depth research and data on two thousand companies and one 571 00:26:05,880 --> 00:26:09,320 Speaker 3: hundred and thirty industries. You can access Bloomberg Intelligence via BI <GO> on 572 00:26:09,320 --> 00:26:11,800 Speaker 3: the Bloomberg Terminal. I'm Paul Sweeney, with Alex Steele. 573 00:26:11,800 --> 00:26:12,840 Speaker 4: This is Bloomberg. 574 00:26:17,880 --> 00:26:21,760 Speaker 2: You're listening to the Bloomberg Intelligence podcast. Catch us live 575 00:26:21,840 --> 00:26:25,360 Speaker 2: weekdays at ten am Eastern on Apple CarPlay and Android 576 00:26:25,400 --> 00:26:28,160 Speaker 2: Auto with the Bloomberg Business app. You can also listen 577 00:26:28,280 --> 00:26:31,399 Speaker 2: live on Amazon Alexa from our flagship New York station. 578 00:26:31,760 --> 00:26:35,520 Speaker 2: Just say, Alexa, play Bloomberg eleven thirty.
579 00:26:36,440 --> 00:26:39,280 Speaker 4: Earlier in the week, Bloomberg Intelligence broadcast live from the 580 00:26:39,280 --> 00:26:42,520 Speaker 4: campus of the New Jersey Institute of Technology, and the 581 00:26:42,560 --> 00:26:44,480 Speaker 4: main topic of conversation was AI. 582 00:26:44,680 --> 00:26:47,359 Speaker 3: We spoke with Michael Johnson, president of the New Jersey 583 00:26:47,480 --> 00:26:53,040 Speaker 3: Innovation Institute, or NJII. It's a standalone corporation owned by NJIT. 584 00:26:53,600 --> 00:26:56,520 Speaker 3: We first asked Michael what NJII does. 585 00:26:56,720 --> 00:26:59,199 Speaker 10: In the US, we have lots of research universities, and 586 00:26:59,240 --> 00:27:02,359 Speaker 10: there's lots of smart people, lots of great resources, but 587 00:27:02,400 --> 00:27:05,440 Speaker 10: there's this fundamental problem in academia, which is it's tough 588 00:27:05,480 --> 00:27:08,320 Speaker 10: for the outside world to actually leverage those resources. So for 589 00:27:08,440 --> 00:27:11,520 Speaker 10: governmental organizations, for industry, they want access to the cutting 590 00:27:11,600 --> 00:27:13,920 Speaker 10: edge of AI, for example, but it's tough for them 591 00:27:13,920 --> 00:27:16,520 Speaker 10: to actually make those connections and interact with faculty. So 592 00:27:16,760 --> 00:27:19,359 Speaker 10: NJII is an organization.
It's a five oh one c 593 00:27:19,480 --> 00:27:22,359 Speaker 10: three wholly owned by NJIT, and the idea is that 594 00:27:22,400 --> 00:27:24,840 Speaker 10: we are a standalone corporation that's a conduit between the 595 00:27:24,880 --> 00:27:28,919 Speaker 10: outside world and NJIT. So we make those facilitations, we 596 00:27:29,000 --> 00:27:31,920 Speaker 10: create unique business models to work with faculty, and we're 597 00:27:31,920 --> 00:27:35,159 Speaker 10: a quick-moving organization, unlike academia, which, you know, 598 00:27:35,280 --> 00:27:37,280 Speaker 10: tends to be slower and more difficult to work with. 599 00:27:37,560 --> 00:27:39,440 Speaker 10: So we're that conduit between them and the outside world, 600 00:27:39,520 --> 00:27:41,680 Speaker 10: and roughly have about one hundred and twenty folks in 601 00:27:41,680 --> 00:27:43,720 Speaker 10: our organization who are focused on that. 602 00:27:43,640 --> 00:27:45,240 Speaker 4: Can I just say, it's really cool his three-year-old son 603 00:27:45,320 --> 00:27:48,159 Speaker 4: is here. I mean, what three-year-old is going 604 00:27:48,240 --> 00:27:50,080 Speaker 4: to come and talk about AI? I feel like that 605 00:27:50,200 --> 00:27:52,160 Speaker 4: just says it all at the end of the day, right? 606 00:27:52,280 --> 00:27:54,719 Speaker 4: That is the future. So am I a company that 607 00:27:54,760 --> 00:27:56,480 Speaker 4: goes to you and then you pair me up with 608 00:27:56,520 --> 00:27:59,119 Speaker 4: something, or is it sort of the technology that you're evolving, 609 00:27:59,119 --> 00:28:00,399 Speaker 4: and then you go pitch it to companies? 610 00:28:00,440 --> 00:28:01,120 Speaker 5: How does that work? 611 00:28:01,200 --> 00:28:03,000 Speaker 10: It's a bit of inside out and outside in.
So 612 00:28:03,040 --> 00:28:04,680 Speaker 10: we can go to corporations and try and find out 613 00:28:04,680 --> 00:28:06,639 Speaker 10: what their problems are, what their pain points are, and 614 00:28:06,640 --> 00:28:08,760 Speaker 10: then go and find faculty who can help out. Or 615 00:28:08,800 --> 00:28:10,440 Speaker 10: we might have a few faculty that have a very 616 00:28:10,440 --> 00:28:12,600 Speaker 10: specific problem. They need access to software, they need 617 00:28:12,600 --> 00:28:15,040 Speaker 10: access to resources, and we go externally to find a 618 00:28:15,040 --> 00:28:17,200 Speaker 10: way to work with corporations on that, but it's pairing 619 00:28:17,200 --> 00:28:20,040 Speaker 10: the two with each other. And faculty are really smart, 620 00:28:20,080 --> 00:28:22,080 Speaker 10: they're really focused on their research, but they don't always 621 00:28:22,119 --> 00:28:23,800 Speaker 10: have the mind to go out and actually execute on 622 00:28:23,840 --> 00:28:26,679 Speaker 10: consultant-type projects for industry. So we help form that 623 00:28:26,720 --> 00:28:29,000 Speaker 10: framework, and along the way we're trying to help with 624 00:28:29,040 --> 00:28:31,880 Speaker 10: tech transfers, so getting technology out of the university into 625 00:28:31,920 --> 00:28:34,440 Speaker 10: products and services is always a pain point, and also 626 00:28:34,560 --> 00:28:38,440 Speaker 10: just generally accelerating innovation and also helping upskill workers. 627 00:28:39,040 --> 00:28:41,800 Speaker 3: You know, over the last several quarters, Bloomberg does this analysis.
628 00:28:41,800 --> 00:28:44,760 Speaker 3: It shows what are companies talking about on their quarterly 629 00:28:44,800 --> 00:28:47,760 Speaker 3: conference calls. And for the last several quarters, every single 630 00:28:47,800 --> 00:28:49,800 Speaker 3: company in the S and P five hundred has talked 631 00:28:49,800 --> 00:28:52,160 Speaker 3: about AI, with the exception of Apple, which last quarter 632 00:28:52,240 --> 00:28:55,240 Speaker 3: hadn't mentioned AI, which is interesting. What are companies most 633 00:28:55,440 --> 00:28:57,320 Speaker 3: commonly asking you for help with? 634 00:28:58,280 --> 00:28:58,400 Speaker 4: Oh. 635 00:28:58,520 --> 00:28:58,720 Speaker 2: Man. 636 00:28:58,800 --> 00:29:01,240 Speaker 10: That goes all over the place. It depends on the companies. 637 00:29:01,240 --> 00:29:03,080 Speaker 10: We have some small mom-and-pop businesses that just 638 00:29:03,120 --> 00:29:06,400 Speaker 10: want help trying to move towards technology, towards computers. 639 00:29:06,640 --> 00:29:08,200 Speaker 10: We have other companies, for example, that want to be 640 00:29:08,240 --> 00:29:11,120 Speaker 10: on the bleeding edge of some sort of field. So for example, 641 00:29:11,120 --> 00:29:13,360 Speaker 10: it might be life sciences, it might be AI, for example, 642 00:29:13,600 --> 00:29:15,920 Speaker 10: and they're asking us to help improve something that they're 643 00:29:15,920 --> 00:29:18,240 Speaker 10: already doing, or it's a very specific project they're pushing 644 00:29:18,280 --> 00:29:20,280 Speaker 10: us to find faculty to help out with. So it 645 00:29:20,360 --> 00:29:23,400 Speaker 10: kind of depends. We have other folks.
For example, Picatinny 646 00:29:23,480 --> 00:29:25,920 Speaker 10: Arsenal and the Department of Defense are just looking for workers, 647 00:29:26,120 --> 00:29:29,120 Speaker 10: so they're asking us to help upskill workers for advanced manufacturing and all 648 00:29:29,160 --> 00:29:31,440 Speaker 10: sorts of different programs they need help finding talent for, 649 00:29:31,600 --> 00:29:33,000 Speaker 10: so we're trying to help with that as well. 650 00:29:33,200 --> 00:29:35,320 Speaker 4: So JPMorgan, I saw this. They had 651 00:29:35,320 --> 00:29:37,280 Speaker 4: a great piece out that said that some of its 652 00:29:37,280 --> 00:29:41,240 Speaker 4: corporate customers are slashing manual work by almost ninety percent 653 00:29:41,840 --> 00:29:45,640 Speaker 4: with its cash flow management tool that runs on AI. 654 00:29:46,160 --> 00:29:47,920 Speaker 4: And that's the fear, right, that we're going to use 655 00:29:47,960 --> 00:29:50,080 Speaker 4: AI and replace all the workers, and those workers don't 656 00:29:50,080 --> 00:29:51,760 Speaker 4: have any jobs. Is there any truth to that? 657 00:29:52,600 --> 00:29:55,080 Speaker 10: It's a great question. So whenever you have technologies that 658 00:29:55,120 --> 00:29:57,400 Speaker 10: are disruptive, there are going to be jobs certainly that 659 00:29:57,440 --> 00:29:59,040 Speaker 10: are going to go away. But if you look back 660 00:29:59,080 --> 00:30:01,640 Speaker 10: to when Excel first came out, or when computers first 661 00:30:01,640 --> 00:30:03,680 Speaker 10: came out, you look at accounting as a great use case. 662 00:30:04,000 --> 00:30:06,480 Speaker 10: Accountants didn't go away because we were going from a 663 00:30:06,520 --> 00:30:08,640 Speaker 10: ledger that was literally on paper to a computer-based system.
664 00:30:08,680 --> 00:30:10,440 Speaker 10: We found new questions to answer, new ways that we 665 00:30:10,440 --> 00:30:12,880 Speaker 10: could look at our accounting and finances. So I think 666 00:30:12,960 --> 00:30:15,320 Speaker 10: the jobs are going to change. But the overall number 667 00:30:15,360 --> 00:30:17,440 Speaker 10: of jobs, on net, I don't know if it 668 00:30:17,440 --> 00:30:19,880 Speaker 10: will actually go down. It might increase in some cases. But 669 00:30:19,880 --> 00:30:21,880 Speaker 10: we're going to answer different questions. We're going to do 670 00:30:21,920 --> 00:30:23,560 Speaker 10: things much more quickly than we did in the past, 671 00:30:23,600 --> 00:30:23,960 Speaker 10: for sure. 672 00:30:24,520 --> 00:30:26,840 Speaker 3: I guess my lack of knowledge, or full appreciation, of 673 00:30:26,840 --> 00:30:29,120 Speaker 3: AI is, I'm just not sure if it's something completely 674 00:30:29,160 --> 00:30:31,479 Speaker 3: new or is it just the next iteration of what 675 00:30:31,520 --> 00:30:35,240 Speaker 3: the smart people at NJIT typically do. I'm 676 00:30:35,280 --> 00:30:38,520 Speaker 3: just not sure what's new about it other than, man, 677 00:30:38,560 --> 00:30:41,560 Speaker 3: everybody's talking about it, and it was a theme. One 678 00:30:41,600 --> 00:30:43,440 Speaker 3: of the themes that drove the stock market in twenty 679 00:30:43,480 --> 00:30:45,600 Speaker 3: twenty-three was the concept of AI, and the average 680 00:30:45,640 --> 00:30:48,000 Speaker 3: trader has no idea what AI is, but he's buying 681 00:30:48,040 --> 00:30:49,720 Speaker 3: stocks because he thinks they're an AI play.
682 00:30:49,880 --> 00:30:52,160 Speaker 10: It's been around for decades, right, but we have a 683 00:30:52,160 --> 00:30:53,760 Speaker 10: couple of technologies that came out in the last two 684 00:30:53,840 --> 00:30:55,840 Speaker 10: years that have really transformed the way we see AI 685 00:30:55,840 --> 00:30:57,440 Speaker 10: and why we're talking about it, and the reason is 686 00:30:57,480 --> 00:31:00,440 Speaker 10: because now it's accessible. So for example, two years ago, 687 00:31:00,440 --> 00:31:01,800 Speaker 10: if I go into Google and I ask how 688 00:31:01,800 --> 00:31:04,360 Speaker 10: do I make chicken parm, I get all these ads, 689 00:31:04,400 --> 00:31:06,400 Speaker 10: I get all these things that tell me about chicken parm. 690 00:31:06,440 --> 00:31:07,880 Speaker 10: I go into ChatGPT and I type that, 691 00:31:07,920 --> 00:31:09,640 Speaker 10: for example, and I get a perfect recipe on how 692 00:31:09,680 --> 00:31:12,560 Speaker 10: to actually make that. So it becomes very accessible to anyone. 693 00:31:12,600 --> 00:31:14,400 Speaker 10: And I think that go-to-market strategy 694 00:31:14,440 --> 00:31:17,280 Speaker 10: OpenAI had of making it accessible is what really changed the game. 695 00:31:17,640 --> 00:31:20,520 Speaker 10: And also at the same time computing power is exponentially increasing, 696 00:31:20,560 --> 00:31:23,200 Speaker 10: it's more accessible. We're now able to use it everywhere, 697 00:31:23,240 --> 00:31:25,320 Speaker 10: from making chicken parm to trying to do research. 698 00:31:25,560 --> 00:31:27,080 Speaker 4: So what kind of cool stuff are you guys working 699 00:31:27,120 --> 00:31:28,960 Speaker 4: on right now? Like, what are you most excited about?
700 00:31:29,320 --> 00:31:31,680 Speaker 10: For us at NJII, what we're very focused on is 701 00:31:31,720 --> 00:31:33,560 Speaker 10: trying to get things out of the university into the 702 00:31:33,560 --> 00:31:36,040 Speaker 10: real world. And one specific project that we're working on 703 00:31:36,400 --> 00:31:39,360 Speaker 10: is actually on law enforcement and body cams. So body cams 704 00:31:39,480 --> 00:31:41,760 Speaker 10: are a sensor that generates a huge amount of 705 00:31:41,840 --> 00:31:44,800 Speaker 10: data, and from those data sets, we're usually looking at 706 00:31:44,840 --> 00:31:47,400 Speaker 10: them after the fact, so after something bad happens, we're 707 00:31:47,400 --> 00:31:49,959 Speaker 10: trying to review that situation. What we're trying to do 708 00:31:50,080 --> 00:31:52,320 Speaker 10: is, can we look at that data and predict something 709 00:31:52,360 --> 00:31:54,360 Speaker 10: bad is going to happen before it happens? So if 710 00:31:54,400 --> 00:31:57,960 Speaker 10: we see a pattern between some behaviors. 711 00:31:57,920 --> 00:31:59,760 Speaker 4: Back up for a second, so you're tracking 712 00:31:59,840 --> 00:32:02,160 Speaker 4: the behavior to then model behavior later. 713 00:32:02,520 --> 00:32:04,600 Speaker 10: Yes. So for example, let's say we see an officer 714 00:32:04,640 --> 00:32:07,560 Speaker 10: is running more frequently, they're yelling more frequently. That is 715 00:32:07,600 --> 00:32:10,600 Speaker 10: probably correlated with some behavioral outcome, such as excessive use 716 00:32:10,600 --> 00:32:13,240 Speaker 10: of force. So for example, we might identify this officer 717 00:32:13,280 --> 00:32:15,200 Speaker 10: as at a much higher likelihood of excessive use of 718 00:32:15,240 --> 00:32:17,520 Speaker 10: force in the future. Let's intervene and get them training 719 00:32:17,560 --> 00:32:19,880 Speaker 10: before something bad happens.
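[Editor's note: the pattern-to-risk idea the guest describes, turning per-officer behavior frequencies into a score that triggers early training, can be sketched in a few lines. Everything below is a hypothetical illustration: the feature names, weights, bias, and threshold are invented, not NJII's actual system, which would learn these from labeled body-cam data.]

```python
import math

# Invented example weights: more frequent running/yelling -> higher risk.
WEIGHTS = {"runs_per_shift": 0.8, "yells_per_shift": 1.1}

def risk_score(features, weights, bias=0.0):
    """Logistic score in [0, 1] from weighted behavior features."""
    z = bias + sum(weights[k] * features.get(k, 0.0) for k in weights)
    return 1.0 / (1.0 + math.exp(-z))

def flag_for_training(officer_features, threshold=0.7):
    """Flag an officer for early-intervention training if risk is high."""
    return risk_score(officer_features, WEIGHTS, bias=-3.0) >= threshold

# An officer running and yelling frequently crosses the threshold;
# a quieter profile does not.
print(flag_for_training({"runs_per_shift": 4.0, "yells_per_shift": 2.5}))
print(flag_for_training({"runs_per_shift": 0.5, "yells_per_shift": 0.2}))
```

The key design point the guest raises later in the conversation applies here too: the model is only as good as the annotated events used to set those weights.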
So we're trying to build that 720 00:32:19,880 --> 00:32:22,800 Speaker 10: as software we can actually put onto the hardware and 721 00:32:22,880 --> 00:32:25,640 Speaker 10: help law enforcement and help with de-escalating situations. 722 00:32:25,720 --> 00:32:27,000 Speaker 4: Wow, that's really cool. 723 00:32:27,440 --> 00:32:28,120 Speaker 3: What other stuff, like, 724 00:32:28,120 --> 00:32:29,720 Speaker 4: what other things are you excited about? 725 00:32:29,920 --> 00:32:32,160 Speaker 10: Oh man, there's so many. I'll take my second best. My 726 00:32:32,240 --> 00:32:34,360 Speaker 10: second best would definitely be in the drone space. So 727 00:32:34,480 --> 00:32:37,680 Speaker 10: drones are another sensor. We're collecting huge amounts of imagery data, 728 00:32:37,840 --> 00:32:39,240 Speaker 10: and today a lot of that work is actually a 729 00:32:39,280 --> 00:32:41,760 Speaker 10: person looking at videos, scrolling through video like you would 730 00:32:41,760 --> 00:32:44,280 Speaker 10: through a VHS tape, and we're using computer vision and 731 00:32:44,320 --> 00:32:47,120 Speaker 10: AI to actually analyze that data and try to predict 732 00:32:47,120 --> 00:32:49,480 Speaker 10: what's happening and try to classify certain imagery and answer 733 00:32:49,560 --> 00:32:52,040 Speaker 10: very specific questions, like is a power line going to 734 00:32:52,080 --> 00:32:54,440 Speaker 10: fail based upon a single picture from a simple drone? 735 00:32:54,480 --> 00:32:56,160 Speaker 4: Oh, now that could be really helpful, especially with all the 736 00:32:56,200 --> 00:32:58,560 Speaker 4: wildfires and stuff that we've had. And then as all 737 00:32:58,560 --> 00:33:01,200 Speaker 4: the utilities are kind of grappling with, like, old infrastructure 738 00:33:01,280 --> 00:33:03,000 Speaker 4: that is not easy to replace, kind of how you 739 00:33:03,040 --> 00:33:06,240 Speaker 4: manage that. Is it expensive for these companies to use this?
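[Editor's note: the drone workflow the guest describes, replacing a person scrolling through footage with a classifier that surfaces only suspicious frames, amounts to a triage loop. The sketch below is purely illustrative; the `classify_frame` heuristic stands in for a trained vision model, and the frame fields are invented inspection measurements.]

```python
def classify_frame(frame):
    """Stand-in for a vision model: returns a failure probability.

    `frame` is a dict of invented inspection measurements; a real
    system would run a trained image classifier on raw pixels.
    """
    # Toy heuristic: heavier corrosion and line sag -> higher probability.
    p = 0.5 * frame.get("corrosion", 0.0) + 0.5 * frame.get("line_sag", 0.0)
    return min(1.0, p)

def triage(frames, threshold=0.6):
    """Return indices of frames a human inspector should review first."""
    return [i for i, f in enumerate(frames) if classify_frame(f) >= threshold]

frames = [
    {"corrosion": 0.1, "line_sag": 0.2},  # healthy-looking span
    {"corrosion": 0.9, "line_sag": 0.7},  # likely failure candidate
    {"corrosion": 0.3, "line_sag": 0.1},
]
print(triage(frames))  # flags only the suspicious frame
```

The value is in the ratio: instead of reviewing every frame, the inspector looks only at the small flagged subset, which also bears on the cost question raised above.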
740 00:33:07,040 --> 00:33:10,000 Speaker 10: Usually the bottleneck today is data generation and data annotation, 741 00:33:10,120 --> 00:33:12,000 Speaker 10: because there's lots of data, but we have to annotate 742 00:33:12,000 --> 00:33:14,160 Speaker 10: the data to actually be able to use it. So, 743 00:33:14,200 --> 00:33:16,120 Speaker 10: for example, with the body cams, we have to know 744 00:33:16,160 --> 00:33:18,400 Speaker 10: what those events are that we're trying to predict and 745 00:33:18,440 --> 00:33:20,360 Speaker 10: how we're classifying them ahead of time. So that's the 746 00:33:20,440 --> 00:33:22,280 Speaker 10: real bottleneck for it in a lot of cases. 747 00:33:22,480 --> 00:33:24,600 Speaker 4: All right, thanks to Michael Johnson, president of the New 748 00:33:24,680 --> 00:33:25,800 Speaker 4: Jersey Innovation 749 00:33:25,480 --> 00:33:28,920 Speaker 3: Institute. Let's stick with our conversations on artificial intelligence at 750 00:33:28,960 --> 00:33:31,840 Speaker 3: the New Jersey Institute of Technology. New Jersey Governor Phil 751 00:33:31,920 --> 00:33:34,720 Speaker 3: Murphy recently laid out further details of a so-called 752 00:33:34,840 --> 00:33:38,640 Speaker 3: AI moonshot plan. The proposed plan would include seven million 753 00:33:38,680 --> 00:33:41,800 Speaker 3: dollars to advance AI utilization and opportunities in the state. 754 00:33:41,920 --> 00:33:44,640 Speaker 4: We were joined by AI expert Beth Simone Noveck, who 755 00:33:44,880 --> 00:33:48,760 Speaker 4: was recently appointed as New Jersey's first-ever Chief AI Strategist, 756 00:33:48,960 --> 00:33:51,480 Speaker 4: and we asked Beth how she helps implement Governor Phil 757 00:33:51,560 --> 00:33:54,000 Speaker 4: Murphy's vision of having New Jersey lead the nation in 758 00:33:54,040 --> 00:33:55,000 Speaker 4: the advancement of AI.
759 00:33:55,440 --> 00:33:58,320 Speaker 7: Governor Murphy has said very loud and clear, we have 760 00:33:58,440 --> 00:34:00,680 Speaker 7: to do better when it comes to technology in terms 761 00:34:00,680 --> 00:34:04,200 Speaker 7: of embracing the use of technology to grow the economy, 762 00:34:04,240 --> 00:34:06,240 Speaker 7: to grow jobs in the state, but also to improve 763 00:34:06,280 --> 00:34:09,759 Speaker 7: how government works. So my job is to work on 764 00:34:09,840 --> 00:34:11,319 Speaker 7: all of the above and to see what we can 765 00:34:11,360 --> 00:34:13,720 Speaker 7: do as government to make that easier, to make that better, 766 00:34:14,120 --> 00:34:16,640 Speaker 7: and to embrace the responsible and ethical use of AI 767 00:34:16,800 --> 00:34:19,000 Speaker 7: to ensure that we're doing right by our residents. 768 00:34:19,400 --> 00:34:23,360 Speaker 3: So what are some of the applications that the governor 769 00:34:23,400 --> 00:34:26,040 Speaker 3: and the Governor's office think AI can do over the 770 00:34:26,120 --> 00:34:29,000 Speaker 3: next several years? Where will the residents of New Jersey 771 00:34:29,000 --> 00:34:29,799 Speaker 3: see it, do you think? 772 00:34:29,920 --> 00:34:32,360 Speaker 7: So this is not several years from now. The 773 00:34:32,400 --> 00:34:35,040 Speaker 7: future is already here. And we've been using AI for 774 00:34:35,160 --> 00:34:37,640 Speaker 7: quite some time, and generative AI since the very beginning, 775 00:34:37,880 --> 00:34:39,880 Speaker 7: so in many ways that you don't even see or 776 00:34:39,920 --> 00:34:41,879 Speaker 7: know about.
So, for example, if you're getting a letter 777 00:34:41,920 --> 00:34:43,680 Speaker 7: from the State of New Jersey about, let's say, your 778 00:34:43,760 --> 00:34:47,279 Speaker 7: unemployment benefits, you're getting a letter that has been simplified, 779 00:34:47,440 --> 00:34:50,800 Speaker 7: that has been written in plain English, that's been written, 780 00:34:50,800 --> 00:34:52,960 Speaker 7: we hope, more clearly than it would have been before, 781 00:34:53,280 --> 00:34:55,840 Speaker 7: because generative AI can help us to do a first draft. 782 00:34:56,239 --> 00:34:59,160 Speaker 7: If you're calling up about your ANCHOR tax relief that 783 00:34:59,239 --> 00:35:01,440 Speaker 7: the State of New Jersey is giving out to residents, 784 00:35:01,719 --> 00:35:05,040 Speaker 7: you are hopefully getting your call resolved faster because you 785 00:35:05,120 --> 00:35:07,719 Speaker 7: get a menu option that we've written with the help 786 00:35:07,719 --> 00:35:10,400 Speaker 7: of AI. Because, through voice-to-text, our call center operators 787 00:35:10,440 --> 00:35:12,960 Speaker 7: know people are calling in asking the following kinds of questions, 788 00:35:13,360 --> 00:35:16,600 Speaker 7: we should write these menu options and these instructions and 789 00:35:16,640 --> 00:35:20,719 Speaker 7: answers so people can get that information faster. When you're 790 00:35:20,760 --> 00:35:23,360 Speaker 7: going out, for example, and typing in on a website 791 00:35:23,360 --> 00:35:25,480 Speaker 7: and telling us in comments how we can do something 792 00:35:25,520 --> 00:35:28,360 Speaker 7: better on a website like business dot NJ dot gov, 793 00:35:28,600 --> 00:35:30,560 Speaker 7: where you can go to start and run and grow 794 00:35:30,600 --> 00:35:33,440 Speaker 7: your business, everything you need from one place.
We're taking 795 00:35:33,480 --> 00:35:35,799 Speaker 7: the comments we're getting from citizens about what they need 796 00:35:35,840 --> 00:35:38,680 Speaker 7: and what they want, using AI to help us summarize 797 00:35:38,719 --> 00:35:42,640 Speaker 7: those comments, synthesize them, and turn that into the information 798 00:35:42,719 --> 00:35:44,719 Speaker 7: that people want and need, front and center. So the 799 00:35:44,760 --> 00:35:49,800 Speaker 7: goal is government that's more responsive, more informative, and providing 800 00:35:49,840 --> 00:35:52,400 Speaker 7: services twenty four to seven that are responsive to what 801 00:35:52,440 --> 00:35:53,640 Speaker 7: people actually want and need. 802 00:35:53,719 --> 00:35:56,200 Speaker 4: That's a pretty good pitch. You were also the chief 803 00:35:56,239 --> 00:35:57,960 Speaker 4: innovation officer, right, for New Jersey? 804 00:35:58,080 --> 00:36:00,000 Speaker 7: I was for many years the Chief Innovation Officer. 805 00:36:00,520 --> 00:36:04,040 Speaker 4: So did the Chief Innovation Officer become the AI strategist, 806 00:36:04,160 --> 00:36:06,040 Speaker 4: or is there also an innovation officer? And I guess 807 00:36:06,040 --> 00:36:08,839 Speaker 4: I'm trying to understand, like, is the innovation thing now 808 00:36:08,920 --> 00:36:11,040 Speaker 4: AI, or can there be other stuff? 809 00:36:11,320 --> 00:36:13,560 Speaker 7: There is still other stuff. We have a wonderful new 810 00:36:13,640 --> 00:36:17,800 Speaker 7: Chief Innovation Officer, Dave Cole, who has taken over that title 811 00:36:18,160 --> 00:36:20,880 Speaker 7: and is leading our efforts to use technology to improve 812 00:36:20,920 --> 00:36:24,360 Speaker 7: how we bring services to residents.
So projects like business 813 00:36:24,400 --> 00:36:27,200 Speaker 7: dot NJ dot gov, to take the business one, for example, 814 00:36:27,320 --> 00:36:30,600 Speaker 7: or other digitization of resident services, so that instead of 815 00:36:30,640 --> 00:36:33,040 Speaker 7: having to go to a government office, you know, between 816 00:36:33,120 --> 00:36:35,000 Speaker 7: nine and five, you can come to a website, you 817 00:36:35,040 --> 00:36:36,080 Speaker 7: can use your mobile phone. 818 00:36:36,160 --> 00:36:36,960 Speaker 4: Oh my gosh, that'd be amazing. 819 00:36:36,920 --> 00:36:41,319 Speaker 7: Transact with government twenty four to seven. That's work 820 00:36:41,360 --> 00:36:43,280 Speaker 7: that's been underway for a long time, and that doesn't 821 00:36:43,360 --> 00:36:46,799 Speaker 7: just depend on AI. That is about, again, clearer instructions, 822 00:36:46,880 --> 00:36:50,520 Speaker 7: plainer English, things available online, giving you the information front 823 00:36:50,520 --> 00:36:52,680 Speaker 7: and center that you want and need in the way 824 00:36:52,680 --> 00:36:55,400 Speaker 7: that people have become accustomed to from the best businesses. 825 00:36:55,800 --> 00:36:58,040 Speaker 7: We think that government should serve citizens in much the 826 00:36:58,080 --> 00:36:59,959 Speaker 7: same way, except in the public interest. 827 00:37:00,360 --> 00:37:03,760 Speaker 3: Well, New Jersey's had a long history of technological innovation. 828 00:37:03,800 --> 00:37:07,360 Speaker 3: I think of telecommunications with Bellcore and Bell Labs supporting 829 00:37:07,360 --> 00:37:09,600 Speaker 3: AT&T and Verizon. I think about some of the biotech 830 00:37:09,640 --> 00:37:12,839 Speaker 3: and pharmaceutical companies like Johnson and Johnson based here in 831 00:37:12,840 --> 00:37:17,200 Speaker 3: New Jersey.
I'm wondering, is there support for the young 832 00:37:17,320 --> 00:37:20,120 Speaker 3: NJ grads that are in a garage somewhere in Jersey 833 00:37:20,160 --> 00:37:22,799 Speaker 3: City coming up with the next AI-type thing? How 834 00:37:22,800 --> 00:37:24,040 Speaker 3: do we support those people? 835 00:37:24,239 --> 00:37:27,920 Speaker 7: Absolutely. So there are a whole number and range of 836 00:37:27,960 --> 00:37:31,759 Speaker 7: investments that are out there to support people starting new businesses. 837 00:37:32,080 --> 00:37:34,600 Speaker 7: That's what my colleagues at the EDA work on in particular, 838 00:37:34,800 --> 00:37:37,799 Speaker 7: ensuring that we're providing those kinds of incentives for 839 00:37:37,840 --> 00:37:40,120 Speaker 7: people who want to start their business in New Jersey 840 00:37:40,120 --> 00:37:42,480 Speaker 7: and grow their business in New Jersey. That's precisely why 841 00:37:42,520 --> 00:37:45,560 Speaker 7: the government is here, to help support those businesses, 842 00:37:45,560 --> 00:37:47,440 Speaker 7: and in particular now to look at how we 843 00:37:47,440 --> 00:37:51,000 Speaker 7: can support new AI businesses, or existing businesses who are 844 00:37:51,040 --> 00:37:54,120 Speaker 7: asking how we can turn around and use AI to 845 00:37:54,280 --> 00:37:56,840 Speaker 7: improve what we do. It's a question we've been answering 846 00:37:56,840 --> 00:37:58,880 Speaker 7: for a long time. Before we called it AI, we 847 00:37:58,960 --> 00:38:00,400 Speaker 7: called it big data. 848 00:38:00,040 --> 00:38:01,200 Speaker 3: Yep, right.
849 00:38:01,239 --> 00:38:03,360 Speaker 7: So for a long time, a lot of 850 00:38:03,400 --> 00:38:05,319 Speaker 7: businesses have asked themselves, how could I go out and 851 00:38:05,320 --> 00:38:08,480 Speaker 7: start using data to measure what's working, to measure what 852 00:38:08,560 --> 00:38:11,080 Speaker 7: customers want, and again to deliver new kinds of services 853 00:38:11,080 --> 00:38:15,400 Speaker 7: across a range of industries. It's why we've been starting 854 00:38:15,400 --> 00:38:18,080 Speaker 7: new partnerships, such as with Princeton, around this new AI 855 00:38:18,200 --> 00:38:20,760 Speaker 7: hub that's been set up, so that we can connect 856 00:38:20,760 --> 00:38:23,680 Speaker 7: some of that tremendous innovation that's coming out of universities 857 00:38:23,920 --> 00:38:27,359 Speaker 7: like NJIT, like Rutgers, like Princeton. We're, of course, known 858 00:38:27,400 --> 00:38:29,440 Speaker 7: in this state for having the best universities and the 859 00:38:29,440 --> 00:38:31,800 Speaker 7: best education system in the country, and we want to 860 00:38:31,840 --> 00:38:34,520 Speaker 7: connect that back to how we're growing the economy and 861 00:38:34,560 --> 00:38:36,080 Speaker 7: growing jobs here in the state. 862 00:38:36,080 --> 00:38:39,280 Speaker 4: All right, thanks to Beth Simone Noveck, Chief AI Strategist of the State 863 00:38:39,360 --> 00:38:40,040 Speaker 4: of New Jersey. 864 00:38:40,760 --> 00:38:45,279 Speaker 2: This is the Bloomberg Intelligence Podcast, available on Apple, Spotify, 865 00:38:45,480 --> 00:38:48,400 Speaker 2: and anywhere else you get your podcasts. Listen live 866 00:38:48,480 --> 00:38:52,080 Speaker 2: each weekday ten a.m. to noon Eastern on Bloomberg dot com, 867 00:38:52,200 --> 00:38:55,600 Speaker 2: the iHeartRadio app, TuneIn, and the Bloomberg Business app.
868 00:38:55,719 --> 00:38:58,840 Speaker 2: You can also watch us live every weekday on YouTube 869 00:38:58,920 --> 00:39:00,600 Speaker 2: and always on the Bloomberg Terminal.