Malcolm Gladwell: Hello, hello, hello. This is Smart Talks with IBM, a podcast from Pushkin Industries, iHeartMedia, and IBM about what it means to look at today's most challenging problems in a new way. I'm Malcolm Gladwell. Today I'm chatting with Anil Bhatt, the senior vice president and chief technology officer of Anthem, one of the most prominent health insurance companies in the United States.

Anil Bhatt: We have been now pivoting more around: okay, we are building these capabilities, we are building these solutions. How are they fundamentally changing and improving the lives of our members, our communities, and really making a difference to the people we serve?

Malcolm Gladwell: Anil has been with Anthem for over thirteen years and has spearheaded efforts to improve customer experience and meet members' needs. I'll also be chatting with Glenn Finch, Managing Partner of Global Business Services at IBM.

Glenn Finch: How you deal with empathy in an AI system is all based on the choice of words that you use and the verbal inflections that are present when you have a voice response.

Malcolm Gladwell: Glenn is a twenty-five-year IBM veteran. His work focuses on the most challenging and transformative engagements at IBM. I'm excited to share my conversation with Anil and Glenn about artificial intelligence and how it's influencing customers to interact with their healthcare in a new way. All right, guys, let's get started.

Hi everyone. Thanks, guys, for joining me today. Why don't we start with the two of you just introducing yourselves? Tell me, tell me what you do.

Anil Bhatt: Great. I'm glad to be here today; thanks for hosting us. I basically lead the technology practice here at Anthem as the CTO: managing all the roadmaps for technology, making sure that we're building solutions that are meeting our business needs on a day-to-day basis, making sure that we are catering to the needs of our members. So, the overall technology roadmap, and making sure that we work with partners like IBM to bring new technology to the forefront.

Malcolm Gladwell: And how long have you been with Anthem?
Anil Bhatt: I've been with Anthem for thirteen years, actually, and the company has evolved. While we are in the healthcare business, our focus has become more member-centric now. So it's really understanding how a big organization like Anthem can make sure that we pivot from being a normal, traditional listed company, which definitely is meeting the expectations of the stockholders, to one that is also catering to the needs of our members and the communities that we serve.

Malcolm Gladwell: Yeah. Why don't you introduce yourself?

Glenn Finch: I'm Glenn Finch. I look after data and AI on the services side of the IBM company. We take a lot of wicked cool technology and bring it to life for clients like Anthem. And, you know, I get the great pleasure of working with Anil on a daily basis to really fundamentally change the member experience using artificial intelligence. So we're usually on the cutting edge of things, and I just love coming to work every day.

Malcolm Gladwell: Yeah. So, you said something: the two of you have been working together for some time. When did you first get to know each other?

Anil Bhatt: As I said, the industry has been evolving a lot, Malcolm. So a couple of years back, we were basically figuring out, as the consumer experience changes, as people get so used to Netflix and Amazon and the way they do their day-to-day shopping, the way they experience things, we were looking for a partner with whom we could really explore the power of AI, really use our data in a way wherein we can create these personalized experiences. So that's where Glenn and I actually talked a little bit, and we figured out that there was a possibility of us partnering, with IBM bringing its technology. That's when we figured out there was a definite role to play and a journey to partner on together. And it's been great.
Over the last two years, we have been able to deliver some great, exceptional experiences for our members, and we are now moving beyond that to other constituents, really making sure that we make it awesome for members to connect with us.

Malcolm Gladwell: Yeah. Glenn, had you worked with an insurance provider before?

Glenn Finch: Yeah. So we have a variety of clients around the world, so yes. But Anthem is special to my heart. We started thinking through this because when you work with Anthem, this concept of member and member experience, you need to show up every day with that front and center in your mind. There are other clients who focus on cost or technical debt or something like that, but that's not true at Anthem. You need to show up front and center every day with how you are going to radically improve the member experience, first and foremost.

Malcolm Gladwell: The relationship between the two companies, and the two of you, goes back so far that I'm really curious to get a sense of how the kinds of questions you've been asking and the problems you've been trying to solve have evolved over that time. Tell me about ten years ago. What were you guys talking about?

Anil Bhatt: So I think ten years back, the conversation was more around: okay, how many servers do we have in our data center, how much in licensing are we going to be spending this year, what will be our footprint, what is our network speed, are we able to manage the new capability that we're delivering? It really was a very technologically focused conversation that we used to have. And what has happened over the years, Malcolm, is that we have been now pivoting to more around: okay, we are building these capabilities, we are building these solutions. How are they fundamentally changing and improving the lives of our members, our communities, and really making a difference to the people we serve?
So as we looked at technology and engineering, we pivoted from that to more of a platform and a product that we are building for our constituents. And as that pivot happened, I would say around three or four years back, the conversation then evolved to more around: okay, how are we improving the experience? How are we making sure that we're making it easier for the members? And it pivoted from being reactive, from what I call sick-care management, to a more wellness-oriented conversation: how do we keep our members healthy? And that's where the overall pioneering of personalized experiences and predictive, proactive healthcare management kind of started. And as we had interacted with IBM, we knew that they had the technology, and they had the real backbone that could support the needs we wanted to bring forward.

Malcolm Gladwell: Talk about that pivot. I'm curious what's driving it. Did you go to Anil and say, look, you have an opportunity to do so much more here? Did Anil come to you and say, I don't want to be just focused on technology, our members are telling us X, Y, and Z? Take me back to that transformative moment when you started thinking about this project in a different way.

Glenn Finch: There's been a massive shift at the IBM company in general to move away from pure technology and towards technology on behalf of a workflow. When you think about artificial intelligence and you are trying to have a conversation with someone, right, you don't need just deep artificial intelligence programmers. You need to have people attached to that who know how to have a conversation with people, what sequence of words is going to elicit a response, and how that experience feels to a member.
That's a very different type of program than just dropping in a chatbot and hoping it works, right, to answer the twelve questions that you get most of the time. And you mentioned this concept of personalization, right? Just making sure we put the right people together on the program is half the battle. And that's a shift that IBM has made very consciously, starting about five years ago. We really, in earnest, called out intelligent workflows about two or three years ago, and that's when we started doing this together.

Anil Bhatt: It was tough. It was ambitious compared to anything else that we had done out here. And one thing, Malcolm, which was very, very beautiful and has been very important for us: learning on the go. When you have so much data that you're capturing, when you have a technology that really can give you, in a nanosecond, the response to what exactly is happening, the beauty of it is that you can pivot and change on the fly. The agility that you build into your systems, the agility that you build into your operations, is key, and that's what we have been able to do. And fortunately, at Anthem we have been really at the forefront of that: investing the right dollars, bringing the agility, bringing the way we can pivot to what is more important to the constituents. That has been a great thing that has been happening here.

Malcolm Gladwell: Let's go through some very specific examples. So: I am a member of Anthem, I am on your website, and I would like to accomplish something. Tell me a specific expectation a member might have, and how you have set about trying to satisfy that expectation. And let's get super specific. Give me a scenario, a tough scenario.

Anil Bhatt: Yeah, yeah. Well, I think I can give you a comparison to the past.
Right. So, when you enrolled as a member, we would send you an ID card, which was a hard piece of paper, a very good piece of paper which cost us a lot. Beyond that, there was nothing that we would let you know, other than: hey, if you want to register on our website, please, you're welcome, right? And that's where our first interaction with you as a member used to happen. And frankly, there was nothing after that. There was a vacuum. You would probably try to understand your benefits, you would make sure that you knew what your copay was, and then we would not hear from you for a long time. And all of a sudden, someday, unfortunately, somebody is sick in your family, and then you pick up the card, go to a provider, and basically have a visit there, and you go from there. So that's the traditional experience that somebody would have had.

Now we have totally revamped that. As a member, when you enroll with us, we send you a welcome kit, a digital welcome kit. We send you an ID card which is available on your phone. We send you a link to our Sydney Health app, which you can download and register on in a minute. And if you've been an existing member, you will get a personalized, curated news feed which is specific to you, based on your prior experience, your claims history, and other things that we know about you. We worked with IBM on the AI chat piece, which is basically a Watson-enabled chatbot that you can ask questions like "what is my copay?" You don't have to call us, you don't have to send us an email; you can really ask the question right there. You can ask for the providers near you, and we'll match a provider to you based on your past history.
And that's where AI comes in: what do Malcolm's age group and Malcolm's prior history tell us about who would be the right provider to take care of him? So that interactive, more personalized, more engaging experience is what is different.

Malcolm Gladwell: Let me give you an example; I'd love for both of you to weigh in on this. I'm fifty-seven years old. It is indicated for someone my age to get a shingles vaccine. I didn't know this; it never occurred to me. Then a friend of mine got shingles. It was like the worst experience of his life. He lost three weeks, it was so painful, and he said, whatever you do, Malcolm, you need to get a shingles vaccine right now. So I went out and got my shingles vaccine, and then I had to get the booster, and I remembered the booster, and blah, blah, blah. Now, when you're talking about Sydney and about drawing on past experience: if I were a long-time Anthem subscriber, would you reach out to me and say, Malcolm, you've got to get your shingles vaccine? Is that what you're thinking about?

Anil Bhatt: Exactly, exactly. Not only will we tell you that you need to get the shingles vaccine, we'll tell you exactly which provider is probably the right one for you. And that is where the beauty is right now: the care gaps. How does the data tell us that these are the care gaps in Malcolm's journey? You know, you pay a lot for your insurance company to take care of you, and how do we make sure that we take care of you, that we are your advocate, your journey partner, rather than just waiting for you to come to us when you feel that you're sick?

Malcolm Gladwell: So this AI system is called Sydney. First of all, who came up with Sydney? I actually love the name. Whose decision was it to call this system Sydney?

Anil Bhatt: Actually, you know, it was our team.
We did some research in terms of what could be a very neutral name that we could put out there. And Malcolm, I can tell you that I love the name so much that at the beginning, when COVID hit us, my daughter, who had been asking for a dog for a long time, got a dog, and we actually named the dog Sydney. So that's how much I care about the name and how much I love the name.

Malcolm Gladwell: Thank you very much for that. Just so we're clear: I'm getting the AI assistant, but not your dog. That's all I want to be clear on.

Anil Bhatt: That's right. We may supply you with a picture of Sydney when you come to the app, but yeah, you're getting the AI.

Glenn Finch: Yeah. So what we find is that to build trust in AI systems, and to build the willingness for a member to go along a journey experience, there are some things we have to do at the table-stakes level, at the grassroots level, that we have to get right, inexorably. And I'm going to go back to Anil's comment about the ID card. What happens if you've lost your ID card? You don't want to wait on the phone for anybody to get a replacement ID card. You'd like to be able to do that once and done, on the web or on mobile: maybe answer a couple of questions and have it done lights-out, right? So there's this combination of doing the more routine things with absolute decisioning, lights-out, complete ease for the member, and that builds the trust to have this more longitudinal journey: to answer your questions, or to recommend the shingles vaccine to you, or a variety of other things based on, you know, your health challenges. So it's kind of a double-edged sword: taking care of the table stakes and taking people along the journey.
Malcolm Gladwell: Your point is, you start with the very prosaic stuff and you build trust in the system, and then you can move to the more high-end stuff. Tell me about how you build an AI system like this. This is not a trivial accomplishment. What went into building Sydney?

Anil Bhatt: Yeah. So, you know, Malcolm, traditionally we have a lot of data that we have accumulated over the years for every member. We have eighty million lives and multiple petabytes of data sitting on our systems, and that data basically allows us to learn. You know, data is just data as long as you don't touch it, you don't do anything with it. But once you start really using technologies, and when we say AI, these are mathematical models that you can run on this data to give you insights, those insights are the key at the end of the day. And as we get those insights, we have to make sure that we have a way to use them to make a difference in the life of any member that we have, or any constituent, actually: our sales experience, our brokers, our providers. Getting to know exactly what they need to know is very, very important. So we are making sure that this data, and the mining of this data, is constant. When we talk about the partnership with IBM, we're talking about the ability for us to mine this data on the fly, at a very, very quick speed, and that is what is key. Then we're able to use AI in a different context. And I'm going to give you an example of something that we're really bringing to the forefront, what we call a nutrition tracker. Imagine that you have your phone in front of you, a plate of food comes in front of you, and you can open Sydney and show the plate of food to Sydney, and Sydney can tell you, based on what's on the plate...

Malcolm Gladwell: You take a picture of the plate, and Sydney looks at the photo and says, why are you loading up on carbs? I mean, is that what we're talking about?
Anil Bhatt: Exactly. That's what I'm talking about. So, you know, this is a great partnership we have with one of our ecosystem partners, and it's actually in pilot with our house account, which is eighty thousand members, right now. You can show it a cup and it can tell you: this is a coffee with no milk, and it's going to be seventy calories. It keeps track of what you're eating, and basically that's how we build the healthy habits out there. So the advancement in the field of technology, and how we make sure that we move away from legacy information technology to really the exponential technology that is in front of us, is the key.
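What Anil is describing is, at its core, an image classifier feeding a running nutrition log. Here is a minimal sketch of that loop in Python, assuming a hypothetical `classify_food` recognition service and an invented carb threshold; the ecosystem partner's actual API is not public.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class FoodEstimate:
    label: str        # e.g. "coffee, no milk"
    calories: int     # estimated calories
    carbs_g: float    # estimated grams of carbohydrate

def classify_food(photo: bytes) -> FoodEstimate:
    """Hypothetical stand-in for the partner's food-recognition model."""
    raise NotImplementedError("call the image-recognition service here")

@dataclass
class NutritionTracker:
    log: dict = field(default_factory=dict)  # date -> list of FoodEstimate

    def log_photo(self, photo: bytes) -> FoodEstimate:
        """Classify one plate photo and add it to today's log."""
        estimate = classify_food(photo)
        self.log.setdefault(date.today(), []).append(estimate)
        return estimate

    def daily_summary(self, day: date) -> str:
        """Summarize the day, with a nudge if carbs run high."""
        meals = self.log.get(day, [])
        total = sum(m.calories for m in meals)
        carbs = sum(m.carbs_g for m in meals)
        summary = f"{total} kcal logged today."
        if carbs > 300:  # illustrative threshold, not medical guidance
            summary += " Why are you loading up on carbs?"
        return summary
```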
Glenn Finch: We needed to take all of that AI and persist a conversation with a member, right? And that's where Watson came in: to help Sydney persist conversations with members. Because crunching through data and knowing about your claim is one thing, but being able to talk to you about that claim and understand your responses back, whether you're on a keyboard, whether you're speaking, whether you're doing whatever, that's where Watson came in to help augment Sydney. And again, it's about designing those conversations. I don't know if you've ever been in a situation where you're sitting next to somebody and they're talking, and you say, oh my god, I can't believe they said that. Well, you have to engineer that out of the conversations that you have with members, so that all of the members are delighted. One of the things I'm proudest of from our work together is that we have members who thank Sydney when we're working with them with artificial intelligence, responding to their questions, just as if Sydney were a fully human worker. That's what I get delight from: when we've been able to change a member experience and work through all of the things members might ask.

Malcolm Gladwell: Now, are you taking real-life conversations, looking at them, and feeding them to Sydney, saying, okay, in the last two years, these are all the phone conversations we've had with our members, and these are the kinds of things they ask? Is that where it starts?

Anil Bhatt: We build what we call the anthology of the conversations. How are we making sure that the interactions with our members or providers get noted down into our system, whether it's a phone call, whether it's a chat, whether it's basically even that they came to the website and clicked through specific things, right? So we are noting those down, and we are creating what we call a graph model, a flow: when a member asks this, the next question, probably, is going to be this. If you give a yes to that answer, or a no to that answer, they're probably going to ask you this.

Malcolm Gladwell: So with that kind of flow, Sydney can be thinking two or three steps ahead.

Anil Bhatt: Exactly. Sydney is thinking two or three steps ahead, anticipating what you're going to be doing. And beyond Sydney, our overall system is thinking two or three steps ahead, proactively predicting those conversations as well as the interventions that we need to give to the members. So really, using AI, you know, Watson, as a backbone to this, Sydney is basically what we call a human-centered-design-focused interaction and engagement system that sits on top of the backbone of the AI, with the data at the bottom. That layering is how Sydney is able to answer whatever type of question you ask, because our ontology of the data, as well as the AI that we have built, is very, very rock solid. And the good thing is that it's a gift that keeps on giving, because the more data we collect, the better the system gets.
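The graph model Anil describes can be pictured as a directed graph of member intents, with edges weighted by how often one question historically follows another; walking the heaviest edges is what lets the system think two or three steps ahead. A toy sketch under that assumption, with invented intent names:

```python
from collections import defaultdict

class ConversationGraph:
    """Directed graph of member intents, weighted by observed transitions."""

    def __init__(self):
        self.transitions = defaultdict(lambda: defaultdict(int))

    def observe(self, conversation):
        """Record one historical conversation as a sequence of intents."""
        for current, following in zip(conversation, conversation[1:]):
            self.transitions[current][following] += 1

    def likely_next(self, intent, k=3):
        """The k most frequent follow-up intents: what to prepare for."""
        followers = self.transitions[intent]
        return sorted(followers, key=followers.get, reverse=True)[:k]

    def lookahead(self, intent, steps=2):
        """Follow the single most likely path `steps` questions ahead."""
        path = []
        for _ in range(steps):
            candidates = self.likely_next(intent, k=1)
            if not candidates:
                break
            intent = candidates[0]
            path.append(intent)
        return path

# Illustrative usage with made-up intents:
graph = ConversationGraph()
graph.observe(["what_is_my_copay", "find_provider", "book_appointment"])
graph.observe(["what_is_my_copay", "find_provider", "check_coverage"])
print(graph.lookahead("what_is_my_copay"))  # ['find_provider', 'book_appointment']
```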
Malcolm Gladwell: Yeah. Wait, can you stump Sydney? Can you ask it a question it can't answer?

Anil Bhatt: You can; I'm sure it's possible to. And when we get into that situation, what we want to do is bring the member to a human agent, seamlessly, so that the member is satisfied, so that there's no daylight at all, regardless of how the member wants to connect with the human agent. A lot of members, you know, are dealing with time challenges, and they don't want to call anymore; they just want somebody to be able to chat with. We try to respond to all of that, and then, if somebody needs a human agent, we go there.
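In conversational-AI terms, the hand-off Anil describes is typically a confidence-threshold escalation: when no intent scores high enough, the session, transcript included, is routed to a human agent so the member never has to repeat themselves. A minimal sketch under those assumptions; the threshold, the `answer_for` lookup, and the queue shape are all invented for illustration.

```python
import queue

CONFIDENCE_THRESHOLD = 0.7  # illustrative cut-off; tuned in practice

def answer_for(intent: str, session: dict) -> str:
    """Placeholder for the assistant's normal answer lookup."""
    return f"(answer for {intent})"

def handle_message(text: str, session: dict, classifier, agents: queue.Queue) -> str:
    """Answer with the AI assistant, or escalate seamlessly to a human."""
    intent, confidence = classifier(text)  # e.g. ("what_is_my_copay", 0.92)
    session.setdefault("history", []).append(text)

    if confidence >= CONFIDENCE_THRESHOLD:
        return answer_for(intent, session)

    # Stumped: hand the member to a human agent *with* the full context,
    # so there is "no daylight" between the bot and the person.
    agents.put({"member_id": session.get("member_id"),
                "transcript": session["history"]})
    return "Let me connect you with a teammate who can help with that."
```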
Malcolm Gladwell: Yeah, yeah. What did your members tell you, either explicitly or implicitly, about what they wanted? You know, we've just been through a year and a half of craziness, where everything has been turned upside down. I'm curious what you have learned from them over that stretch. Is what a member wants today very different from what it was two years ago?

Anil Bhatt: Definitely, Malcolm. If you look at it, the terms that we use in healthcare are very, very complex, and it's very difficult for people to understand: what is my copay, what is out-of-network, what is in-network, what does a claim that needs a pre-authorization mean to me? So if you look at the conversations that we were having before, they were really very hardcore, healthcare-oriented conversations, and the transparency into what I'm going to pay was not there. This was an industry wherein you were going to buy insurance, buy a product, without really understanding what you were going to get at the end of the day. What we did, and basically what our customers actually demanded from us, is that irrespective of the channel through which they come to us, we build what we call here at Anthem connected experiences: whether they come to us through a phone call, whether they're chatting with us, whether they're having a web interaction, whether they're in the provider's office, how do we make sure that we connect the experience end to end? Now, once we connect the experience, we want to make sure that we are building a very human-centered-design way of answering their questions. It is as simple as making sure that we provide them a nudge, "probably this is what you're looking for," and that clicks with them and they say, yeah, that's what I was looking for. That simple interaction really helps make the member feel good. Having the ability to text, having the ability to get answers while you're cooking your dinner: you can text and say, hey, could you please tell me what the copay for my daughter's next visit will be, then go ahead and start cooking your dinner, and when you come back, you have a text waiting which tells you exactly what it is. And the beauty of it is that we had a very constant loop out there; the technologies that we use allowed us to have constant feedback on those complex interactions that we were having.
And that's where the IBM team and we worked together and figured out, okay, what will be our game plan.

Malcolm Gladwell: What did you learn from working with other people on the Watson platform that helped Anil and Anthem? What did you bring to them from what you've learned from others?

Glenn Finch: So, when Watson first started, we thought that everybody wanted a bespoke suit, and so we would go on a journey together to make a bespoke suit. And what we found clients really wanted was: look, I want you to show up with the suit partially done, to answer some of the basic things, and then I want to make it my own. So show up ready to go, so that we can get into production answering questions in a few months, and then we will work together to radically customize and tailor that experience. That's been my biggest learning. So whether it was in financial services, or healthcare, or telco, you know, there are about seven or eight dominant industries, we tried to make a series of industry-specific cartridges so that Watson came kind of pre-trained, so that we were ready to go quickly. And the second learning was that we needed to show up with the right people. Because, remember, you're creating a conversational interaction with someone, so you've got to make sure that people are designing the words correctly and the user experience right. Those are the two things, I think, that we brought to try to help Anthem accelerate.

Malcolm Gladwell: Anil said something that I thought was fascinating. You're talking about designing a system with empathy, and I'm curious: first of all, what does empathy look like in an AI system? And B, has any non-healthcare player ever asked you, Glenn, to put empathy in the system?

Glenn Finch: Clients outside of healthcare are less focused on empathy. They are focused more on making sure to get, you know, the information out there correctly, especially in highly regulated industries. How you deal with empathy in an AI system is all based on the choice of words that you use and the verbal inflections that are present when you have a voice response. When you and I are talking right now, with Malcolm, with Anil, with whomever, just by the words that somebody chooses, we can tell whether what we're talking about matters to them. And so we try to build a lot of those human characteristics into all the responses, as compared to just getting the information right. It isn't just telling people you're sorry, right? Those are the types of things that you have to engineer in, as compared to just being flawlessly precise about the answer.
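Glenn's point that empathy is engineered through word choice can be made concrete: the same correct answer gets framed differently depending on the tone of the member's message. A toy sketch of that idea, with a deliberately crude keyword check standing in for a real sentiment model:

```python
DISTRESS_WORDS = {"sick", "pain", "scared", "worried", "emergency", "denied"}

def sounds_distressed(message: str) -> bool:
    """Crude stand-in for a real sentiment or tone model."""
    return any(word in message.lower() for word in DISTRESS_WORDS)

def respond(message: str, factual_answer: str) -> str:
    """Wrap one factual answer in empathetic or neutral phrasing."""
    if sounds_distressed(message):
        return ("I'm sorry you're dealing with this, and we'll sort it out "
                "together. " + factual_answer)
    return factual_answer

# Same kind of fact, two framings:
print(respond("My claim was denied and I'm worried about the bill.",
              "Your claim can be resubmitted with a pre-authorization."))
print(respond("What's my copay?", "Your copay for that visit is $20."))
```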
Malcolm Gladwell: Yeah. One last question for both of you; this has been such a fun conversation. We talked about ten years ago, when you guys started talking, and then this sort of transition moment five years ago. Now let's go five years into the future. Let's imagine it's twenty twenty-six and the three of us are talking again. I want to know what problems you're trying to solve then.

Anil Bhatt: The problems we're trying to solve at that time would definitely be much different from where we are. But what I can tell you before I get there is that we want to make sure that in the next five years, Anthem is treated like a platform company, one focused on creating, with the help of our partners, solutions that really meet the needs of the members in the journey that they have from a healthcare perspective. And we do want to pivot from sick care to more proactive and predictive care and wellness for members. So we're going to drill down and keep working on that, because it's something that never ends, and it's going to keep going in the years to come.
Malcolm Gladwell: I'm still processing this fantastic idea about taking a photo of your meal, of your plate of food, and getting instant feedback and analysis on it. So Sydney gets all these pictures of my food and gets a sense of what I'm eating over the course of a given day. Is the idea, and I'm thinking about this five-years-from-now conversation, that five years from now I might be taking a photo of everything, and then at the end of every day, Sydney texts me and says: Malcolm, you should be aware of your nutritional patterns of the last few days; you need to eat a few more vegetables? Is that what we're talking about here? I think this idea is fantastic, because we have no way of making any nutritional sense of the stuff we eat, unless you spend two hours on Google before you make your dinner. How do you know whether the sum total of the things you eat in a given day is optimal? I love this. I want this now. Do I have to wait five years? Guys, it's been a really, really fun conversation. I really appreciate you taking the time. Anil, Glenn, have a wonderful day. The future cannot come fast enough, at least for me, so bring it on. I'm waiting for it.

Anil Bhatt: Thank you very much, Malcolm.

Glenn Finch: Oh yeah, awesome. Thanks, Malcolm. Bye, guys.

Malcolm Gladwell: Understanding customer needs has become even more important in the wake of COVID-19. Companies like IBM and Anthem are learning to leverage technology to deliver a more personal experience, a crucial part of our evolving healthcare system.
Thanks again to Anil Bhatt and Glenn Finch for talking with me; I learned a lot.

Smart Talks with IBM is produced by Emily Rostek, with Carly Migliori. Edited by Karen Shakerdge. Engineering by Martin Gonzalez. Mixed and mastered by Jason Gambrell and Ben Tolliday. Music by Gramoscope. Special thanks to Molly Socha, Andy Kelly, Mia Lobel, Jacob Weisberg, Heather Fain, Eric Sandler, and Maggie Taylor, and the teams at 8 Bar and IBM. Smart Talks with IBM is a production of Pushkin Industries and iHeartMedia. You can find more Pushkin podcasts on the iHeartRadio app, Apple Podcasts, or wherever you like to listen. I'm Malcolm Gladwell. See you next time.