Speaker 1: Bloomberg Audio Studios, podcasts, radio, news.

Speaker 2: Thanks. Sure.

Speaker 3: A report from a little-known firm called Citrini Research sent shockwaves across multiple sectors this week, outlining scenarios in which AI agents remove all friction in the economy. Among the scenarios: AI agents seek to save users money by eliminating transaction fees charged by payment processing firms like Mastercard and Visa. Let's get insight on how one executive is navigating potential risks from AI. Stephanie Ferris joins us at the desk. She's CEO of FIS, a fintech that helps banks, merchants, and capital markets firms process payments globally. Stephanie, so great to have you. It's been a very fun week for you, too, with everything that's going on. And look, of course your revenues don't come from interchange fees directly, but a lot of your customers grapple with this. How are you thinking about the likelihood that this could be the future, that the payments world, processing, and interchange fees get completely disrupted by AI?

Speaker 1: Well, it's a great question, and it's certainly the hot topic of the day. Pulling way out, if we think about FIS, we're the largest financial technology firm that serves financial services broadly. We see money as it moves across the entire money life cycle; something like twelve percent of the world's economy uses our technology, so we have a great purview in terms of what's happening. I think the other thing to note that's important as we tackle this question is why we're so excited about twenty twenty-six and beyond, and so optimistic. And I'll bring it back to AI. But financial services in itself is growth-on. We are really seeing a generational moment here.
Speaker 1: So you think about the banking industry and where it sits today in terms of its ability to have less regulatory burden, to grow organically or inorganically, fifty billion dollars' worth of M&A transactions last year, to innovate inside the financial services industry with crypto and tokenized deposits. So financial services as a sector, which payments is part of, has a real growth-on agenda. At the same time, there's a tectonic shift happening in technology, which is AI. You put those two things together and, for me, it's a generational moment. So when I think about financial services, including payments, you put those two trends together and it's pretty exciting, especially for FIS, because we serve that industry. Now, with respect to AI, which seems to be the topic of the day, and I certainly also read the report: look, I think what you're going to hear from many CEOs is that we are eternal optimists. We believe in technology, and as the CEO of a technology company, we think innovations in technology like this are really good. We have also survived them over many, many decades. So when you think about it from an FIS standpoint, we process transactions and run technology for systems of record. They're multi-decade systems, and they're AI enablers. So we don't see a world where we're disrupted by AI; we actually see a world where AI is a strategic accelerant. And I think that's what you're going to hear from a lot of us that run technology across the financial services industry. What you need to have AI capabilities is fantastic data, a data moat, and a data strategy, and that's what we have. We have more data at FIS than anyone in the financial services industry. We have over a billion cardholder accounts on file, we process seventy-three billion transactions, we see over nine hundred million bank accounts on file. So when we are talking to our customers about AI, it's about how do we help them use their data?
Speaker 1: How do we help them get to their data? How do we help them build AI agents on top of that data? So for us, we see it as an accelerant.

Speaker 4: I was thinking about this when I read the Citrini piece. He obviously brings up crypto, with his reference to Ether and Solana as maybe payment rails with less friction. And you've seen this before, right, in your career. You were at Fifth Third dealing with payment processing, and at PwC before that. So AI is just another, I mean, maybe it's generational, right, but you've seen generational shifts, with bitcoin in two thousand and nine and with the Internet in the late nineties. So do you adopt those technologies and then use them to do a better job serving your clients, or do you worry that they will put you out of business?

Speaker 1: Yeah, well, we absolutely have to adopt them, right? That's our job in terms of payment capabilities, and you're exactly right. Whether it's a check, ACH, wire, or real-time payments, crypto, tokenized deposits, our job as the technology provider is to make those capabilities available to all of our financial institutions. So we have to make sure that we innovate so that they can provide those capabilities. If their clients want to use crypto, they need to be able to use crypto. If they want to use Circle, if they want to use a tokenized deposit, you have to have that capability. The great news is we have all of it. And the good news is we're not seeing any of the other things go away. It's an "and," to your point, right? We haven't seen ACH go away. We haven't seen branches go away. We haven't seen ATMs go away. It's an "and," so for us, it's an "and." Now, when you think about us as a technology company, the thing that I'm most focused on as a CEO is making sure that I enable my entire fifty-five thousand people to be not only AI-literate but AI-strong.
Speaker 1: So we're investing four times as much in AI capabilities, both in terms of, Matt, like you talked about, delivering the products to our customers, but also making sure that our technologists are the best in the industry in terms of how to use this technology. It's like putting a computer on your desk. I talk to them all the time about it. Well, most of them don't even remember, but remember when you had a desk and you did paperwork, and then the next day you had a desk and you had a computer? No one taught you how to use it. You powered it on and figured it out. And so that's how we're thinking about it. We're making a lot of investment in our colleagues to make sure that we keep them upskilled and that they can actually be on the cutting edge.

Speaker 3: As a technology company, how are you thinking about the growth of your talent? Because there's one theory that you add more AI, fewer people, and there's the other one, which maybe is more interesting: if AI can do more and can produce more code, you need more people to oversee that code. How are you thinking about things like headcount as you add technology to what people are already doing at FIS?

Speaker 1: Yeah, it's a great question. We get it all the time. For me, it's most important, as a technology company CEO, that everyone in my company is using the most cutting-edge technology, whether we're using it to deliver our products or we're using it internally for ourselves. I put it in every single person's hands. Everybody has digital capabilities and digital agent capabilities. Now, the challenge is, and we give a lot of learning, but you have to have a natural curiosity to want to do it. I talk a lot about my dad, who lost his job in the nineties and didn't know how to work on a computer.
Speaker 1: I came into my career in the nineties, and I had a computer. I never knew how not to work with a computer. He never knew how to work with a computer. So that's really where we are in terms of AI. You have to want to learn how to do it. And it's my job as a CEO to make sure that I put the tools in your hands and I encourage you to do it, and then it's really up to you to make sure that you embrace it.

Speaker 4: One of the things you need at FIS is obviously for consumers to keep powering this economy, for sure, right? And you know, "never count the American consumer out" is something that I've been hearing for twenty-five years. But part of the alarm that the Citrini piece raised is that the white-collar jobs are the ones that are going to be lost, and they're the ones that power the bulk of the spending. How much do you worry about that? You've got to think about it. It's so big-picture and it's so kind of nebulous, but it would be a concern, right, if the top ten percent were reduced in terms of workforce?

Speaker 1: Yes and no. So you're absolutely right. First of all, I agree with you; I never count the US consumer out. And in fact, because of the large amount of data that we see, not only are financial services and banks booming globally, but the consumer continues to remain healthy. I think you just talked about that in your previous piece. We're not seeing signs that the consumer is under stress or duress. So that's good.

Speaker 4: That bears repeating, because you have a great viewpoint to judge this. You're not seeing signs that the consumer is under stress?

Speaker 2: No, we're not.

Speaker 1: The consumer remains healthy, banks remain healthy, financial services remains healthy and is growth-on, and that is true globally.
Speaker 1: I think that's really important to reinforce, and it's also why we at FIS, serving that sector, feel so good about what's going to happen for our company. But also, I think it's important to know we aren't seeing any signs of that yet. So then, in terms of how to think about the, you know, workers, frankly, like myself, that are older and need to adopt new technologies...

Speaker 2: You're not older.

Speaker 1: Oh, thank you, thank you so much. Look, this is my point, where I tell the story about my dad, and I talk about it, by the way, to everybody.

Speaker 2: Tell everybody. It's a great ending. He retired.

Speaker 1: He couldn't figure out how to do Save or Save As on the computer, but my mom could, and he was very happy. So it's a happy story. But I do think it's incumbent, and I talk to my teams about it a lot. I'm not worried about the younger workers. My daughter's about to graduate and come into the workforce. She needs to get a job. She will know how to use AI, just like I figured out how to use a computer. It's really incumbent upon me, as the CEO of a technology company, to put that in everybody's hands. Do I worry about it? Yes and no. But we need to make sure that we're giving everybody an equal opportunity to get after it. The Citrini Research story, as I told you before we came on, it was super depressing. And then I woke up the next morning and I was, you know, again going back to my optimistic point as a CEO, and the markets did too. Yeah, it was too much to take, and so I think we're all going to be just fine.

Speaker 2: Stephanie, this has been so amazing. We could have you on for another hour.

Speaker 3: I also think, by the way, part of this conversation is why it's important that schools are teaching AI and not resisting it.

Speaker 2: Stephanie, please, please come back again. This has been so wonderful.
Speaker 4: Are you actually from the great state of Ohio?

Speaker 2: I am.

Speaker 4: All right, that's why. Oh wow.

Speaker 2: You and I have a connection there, don't we?

Speaker 4: In that I used to go to parties at Miami.

Speaker 2: Exactly. And Miami's, oh, I have to mention basketball. Great job, go RedHawks! I love this. Stephanie, thank you so much. Again, that's Stephanie Ferris, the CEO of FIS.