Malcolm Gladwell: Hello, hello. This is Smart Talks with IBM, a podcast from Pushkin Industries, iHeartMedia, and IBM about what it means to look at today's most challenging problems in a new way. I'm Malcolm Gladwell. Today I'll be discussing the innovations around hybrid cloud with Lumen's David Shacochis and IBM's Howard Boville. David is Vice President for Enterprise Technology and Field CTO at Lumen, where he's helped clients across industries create new business opportunities through unique digital interactions. David has been immersed in cloud computing since long before his time with Lumen, working with companies such as UUNET, Digital Island, and FusePoint.

David Shacochis: You're putting computing capacity in places that didn't used to be thought of as data centers before. There's an element of novel challenge, and so inherently there's more complexity.

Malcolm Gladwell: Howard is the head of IBM Cloud Platform. In this role, Howard has focused on driving digital transformation for enterprises, especially in highly regulated industries.
Malcolm Gladwell: Before joining IBM, Howard was Chief Technology Officer for Bank of America, where he led the transformation of the bank's infrastructure and developed one of the largest internal private clouds.

Howard Boville: At times you have to kind of be a technology evangelist, in terms of what the art of the possible is against the problems.

Malcolm Gladwell: In this episode, we'll explore working and living in a world of cloud technology. We'll show you how new innovations in cloud computing have reimagined a world where computing can happen anywhere, and businesses can use data to accelerate innovation to improve service and performance. Let's dive in.

Welcome, everyone, Howard and David. Thank you for joining us today. Let's jump in. David, I'm going to ask you to define some terms, and that will be easy for you but useful for the rest of us. First of all: the Fourth Industrial Revolution. What is it?

David Shacochis: Yeah, good one. So we can really look back, I think, on history in these periods of technology advancement, right? You know, the period of industry that was defined by steam power, and the industrial period of history that was defined by electrical distribution.
We commonly think of the third one as really this information age: the information revolution of digitization of process creation and online connectivity of data. That's the third industrial revolution of the digital age, information technology systems communicating with each other, and the advent of all that you can do in industry with those technologies. And the Fourth Industrial Revolution is really this reflection of the explosion of data that gets created by all that connectivity. Taking data and being able to acquire it, analyze it, and take action upon it is opening up a wide range of new industries and new business opportunities and new regulatory challenges. That's what we mean when we say the Fourth Industrial Revolution.

Malcolm Gladwell: How did Lumen and IBM come together, and what's the logic behind your collaboration in this field?

Howard Boville: If you take the heritage of both companies: Lumen are a world-class global networking company. They connect things together at the highest level of quality, lowest latency, and so on. And through all the actual transformations that IBM has been through, IBM is a compute company on which the software runs, and we write the software in certain contexts as well. So the combination of the two capabilities solves for the problem.

David Shacochis: We've been working together for years. I think the advent of what we've been focused on with IBM Cloud Satellite has really been initiated by Lumen's investment in making our network a place where you can run software workloads more readily and easily. And IBM Cloud Satellite is a great modality that just snaps right into that network.

Malcolm Gladwell: Yeah, you work for Lumen. Is the simplest way to describe Lumen that Lumen is a Fourth Industrial Revolution company?

David Shacochis: We're a Fourth Industrial Revolution company because we believe at the core of all of it is connectivity. All that data, and all those sources of data, and all the ways that you need to interact with that data, requires a substantial amount of aggregate networking capacity.
We're now kind of hitting this tipping point in the Fourth Industrial Revolution where the amount of data coming inbound, from cameras and from sensors and from devices and gaming consoles and a variety of input sources like that, is actually exceeding capacity in the other direction. So that's really why, for Fourth Industrial Revolution work, you need massive amounts of network connectivity. And that's what Lumen does.

Malcolm Gladwell: So this brings up the second word I want you to define, and that's edge computing. I'm assuming edge computing is a technological response to the phenomenon you've just described?

David Shacochis: It is. It's one way to think about edge computing. The way we talk about it a lot is that it's moving workloads, software workloads, closer to digital interactions. And a digital interaction could be between things and people and business models.

Howard Boville: Yes. I mean, just to add to some of David's points with some kind of practical use cases that we're involved in.
So, first and foremost, the edge computing piece actually is joining the analog world to the digital world. Whereas until this point you would look at the digital world through the screens that we all spend so much time looking at, on the edge it's actually looking at physical locations: like retail branches, like shipping containers, like welds on an automobile. And there are two practical examples. There are thermal imaging techniques that we now use to look at the quality of a weld all the way through a production process in an automotive plant, which wasn't possible before. That connects in that local location, gathers that data, and determines that the weld is at the right quality. Or, on a shipping-container basis, it's the combination of RFID tags connecting to networks that can track with that level of accuracy and give you that experience.
In terms of how this has come about: as we've become more familiar with the amount of data that we can capture through a digital interaction through a screen, whether that's a mobile phone or a computer, and all of the analytics that you can then do on human behavior, the same questions get posed of physical locations, physical interactions, and physical assets. And it's the wedding of those two things that creates this IT problem that companies like Lumen and IBM solve for at the edge, so that you can actually tie together the digital world and the physical world in the same way as you were capturing the data purely from a digital world. And it's then human curiosity that says: okay, we've got these questions answered from the Third Industrial Revolution we went through; how do we apply that, through the Fourth Industrial Revolution, to the analog world?

Malcolm Gladwell: Yeah.
You know what this makes me think of? If I was, and stop me if this is too speculative, if I was a basketball coach, I would love to have an edge computing system which picked up data from my players on the court in real time and told me who was getting tired, told me whose performance was subpar, told me how quickly someone was responding on defense. That's an example of what you're talking about, isn't it? It's the analysis of a world that had previously been entirely analog. Perhaps they'd bang on a trash can when they see something.

Howard Boville: But that, in part, to the point you're making, is now reality, because there are tracking devices on athletes in practically all disciplines, tracking how many kilometers or how many miles they're running, their average pace. That's being tracked, and it will be analyzed at the halftime break or the quarter-time break, depending upon the actual sport being followed, or the third-period break, I guess, if it's ice hockey. So that has been tracked.
What isn't tracked is the physiological elements that you talked about. But I guess that will come at some point, because human curiosity will drive into that element, to say: okay, what level of fatigue are we at, and therefore what was the optimal moment to actually make a substitution and bring a different player onto the pitch?

Malcolm Gladwell: Yeah, yeah. Or if I'm a hospital, I want to monitor the performance of my surgeons. I mean, in hour four of a complex operation, I would love to be able to crunch, in real time, a whole series of data that tells me, you know, who's working well and who's flagging.

David Shacochis: Another kind of hallmark of edge computing is when you really need to correlate things locally. You know, a public safety use case, where a gunshot rings out and an audio sensor picks that up. Well, correlating that with all the stoplights in the area, all the lights in the area, any other public safety device that is within a particular geographic boundary.
That intense correlation of events to other outcomes may need to happen within split seconds for a public safety outcome to be achieved. So it's not just the fact that we're tracking and analyzing data and then getting lessons learned at halftime about which one of our players ran around. The more fine-grained, milliseconds-matter kind of use cases are another place where edge computing really shines.

Malcolm Gladwell: In step one, you analyze that kind of data, say for the basketball player or the surgeon, after the fact. So you have the meeting the next day and you say, "You didn't perform very well yesterday, Malcolm, on the court." But if I can do it in real time, then I can actually affect the outcome of the game as it's happening. And that shift, from making those judgments after the fact to making those judgments immediately, is huge. I could win a game I'd otherwise lose.

Howard Boville: And I'm echoing, I'm capturing your excitement.
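The local correlation David describes, matching a detected event against whatever devices sit within a tight radius, on an edge node rather than in a faraway cloud region, can be sketched roughly as follows. This is an illustrative sketch only: the Event and Device shapes, the 300-meter radius, and the function names are invented here, not taken from any Lumen or IBM product.

```python
import math
from dataclasses import dataclass


@dataclass
class Event:
    kind: str    # e.g. "gunshot_audio"
    lat: float
    lon: float


@dataclass
class Device:
    device_id: str
    kind: str    # e.g. "streetlight", "traffic_signal", "camera"
    lat: float
    lon: float


def distance_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Equirectangular approximation in meters; accurate enough at city scale."""
    earth_r = 6_371_000.0
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return earth_r * math.hypot(x, y)


def correlate(event: Event, devices: list[Device], radius_m: float = 300.0) -> list[Device]:
    """Find every registered device close enough to the event to act on it.

    Running this on an edge node next to the sensors avoids a round trip
    to a distant data center, which is what keeps the decision inside the
    split-second budget the conversation describes.
    """
    return [d for d in devices
            if distance_m(event.lat, event.lon, d.lat, d.lon) <= radius_m]
```

A usage sketch: an audio sensor reports a gunshot, `correlate` returns the streetlights and cameras within 300 meters, and the edge node can signal those devices directly.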
Howard Boville: You are kind of echoing that position, where we've gone from the digital perspective, where people are playing online sporting games and making judgments based upon what they can see from the analytics they get in that digital realm, and then translating that into ideas that could be extended into the analog world. And therefore that desire: you can imagine there are people, as we speak now, putting together innovative solutions that can address that very problem.

David Shacochis: Yeah. The other thing, too, is that we're talking about all the whiz-bang use cases, and there's sort of a subtext to everything we just said, which is that there's good software, designed at scale, able to run and achieve those outcomes: better basketball performance, public safety use cases. There's software that needs to go and collect all that data and take action against it. And the other dimension, certainly a big dimension of the IBM and Lumen relationship, is being able to enable great software development anywhere that the network can reach. All these use cases don't happen unless there's software that goes and runs that business logic, or runs that analysis, or processes those inputs into actionable outputs and responds to an event stream.

Malcolm Gladwell: Yeah, yeah. Talk a little bit about the cloud piece of this. Hybrid cloud: how does it fit into this puzzle that you've been describing?

Howard Boville: So the hybrid cloud space essentially encapsulates all the points that David has gone through. A cloud, essentially, is a building with computers in it that run applications. And the paradigm until probably about ten to fifteen years ago was that a large corporate would have a big data center, with its own computers in it, and would have that capability. And then what created a huge innovation was the ability for a developer to come up with an idea and not need to go build a big data center full of computers: they could actually rent the space, and then turn that idea into software, and turn that software into a Facebook or a Netflix or whatever it may be. So it reduced barriers to entry, and that was the first phase. The phase that we're now in is this kind of synthesis between the digital and the analog at the edge, and that's hybrid cloud computing, where we can actually create many data centers, specific to particular needs, all around the world, not just within the assets that IBM has or the assets that the cloud service providers have. And it's these partnerships. There are also new economic models in the marketplace, where companies can operate with humility and recognize: okay, we may be large companies, but actually we can see the assets in another company, and the brilliant people that exist there, and if we partner with them, we can create something valuable for the marketplace.

Malcolm Gladwell: What's the challenge? If I'm a company and I want to do something sophisticated with all of this data, what's going to keep me up at night?
Howard Boville: Part of this puzzle, when you have this explosion of data and it can be at the edge, is that the key thing we need to be very mindful of is cybersecurity risk: that that data gets into the hands of the wrong people, who can then use it for their own gain, or for whatever purpose they want. So every solution that gets built has to be built to a very high grade of cybersecurity, ensuring that we protect our customers' data and also that we keep them within the laws, rules, and regs they're obligated to.

David Shacochis: Broadly speaking, you're putting computing capacity in places that didn't used to be thought of as data centers before. There's an element of newness, there's an element of novel challenge that you may be running into, and so inherently there's more complexity. The other thing that keeps a lot of IT leaders up at night is whether they are going to be able to write software and deliver it at a pace of change that is actually going to let them take advantage of, or solve, the problem they're trying to run at.

Malcolm Gladwell: So I want to go back, I want to do a for-example here, because this seems to be a really interesting and important point. When I raise that example of the surgeon: we want to gather data from the surgical suite, we want to make sense of it in real time, we want to inform the surgery itself. But then you also want to share that data with lots of other hospitals and use it to build some kind of system that can improve surgery generally. So what you're saying is, in order to do that last piece, which is arguably the most important of the pieces, everyone's going to be reading from the same book, right?

Howard Boville: Right. And the key around that is there's a level of complexity, because reading from the same book means that the format is the same, the language is the same, the typeface, to carry that analogy on. So getting consistency in terms of the data models, as it's known, is super important, as is the provenance, so that you know the quality of the data is at the highest level, and its security. And the reason why that's important is that you would take all of that insight, all of those lessons that are turned into data, and put them into an artificial intelligence model, to do what's known as training that model, so that it can come up with hypotheses that are continually, iteratively improved based upon the amount of data. But if there's any issue, any corruption in that data, it will compromise the actual outcomes.
And because the volumes of data can be so large, it is actually difficult to ensure that the outcomes are trained correctly. So a huge amount of work has to go into ensuring that the integrity of the data and the provenance of the data are correct, so the AI doesn't get trained in the wrong way.

David Shacochis: Yeah, that idea of software distribution. In our data analytics practice, one of the industries we work with extensively is manufacturing. And one of the things that we see organizations challenged by, and this is a phrase one of our data scientists uses all the time: it's actually kind of easy to go and collect a lot of data locally on the shop floor, and it's kind of easy, once it's all available in your data center, to have a data scientist analyze the data you've ever had and come up with, you know, widely held best practices, the source of what should be the most efficient way to do things, and the most efficient data model that can analyze all the sensors in the factory.
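The provenance and integrity checks Howard describes can be made concrete with a simple pattern: fingerprint each batch of training data when it is collected, and refuse to train on any batch whose fingerprint no longer matches. A minimal sketch, with an invented manifest layout rather than any real IBM tooling:

```python
import hashlib
import json


def fingerprint(records: list[dict]) -> str:
    """Deterministic content hash for a batch of training records."""
    canonical = json.dumps(records, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest()


def verify_batches(batches: dict, manifest: dict) -> tuple[list, list]:
    """Split batch IDs into (trainable, quarantined) by recomputing each hash.

    A batch whose recomputed fingerprint no longer matches the one recorded
    at collection time is held out, so silent corruption can't skew training.
    """
    trainable, quarantined = [], []
    for batch_id, records in batches.items():
        ok = fingerprint(records) == manifest.get(batch_id)
        (trainable if ok else quarantined).append(batch_id)
    return trainable, quarantined
```

The design choice here is that the manifest is written once, at collection time, and everything downstream only ever re-verifies against it; no later stage can quietly rewrite what "correct" means for a batch.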
The challenge is getting it from the top floor to the shop floor. It's fine to get that lesson in your core data center. It's fine to go collect a lot of data. The challenge is connecting them together. And that's really where this idea of consistent delivery of new software comes in: when you learn the lesson on the top floor that says this is the way it ought to be, how do you get that code out into your built environment so that the software is actually taking effect? So it's not just a theory, a model in a data center, but a model that can make a difference.

Malcolm Gladwell: Tell me about how this collaboration between your two companies addresses that problem. Can you give me an example?

David Shacochis: Well, yeah. To what Howard was alluding to, one of the customers we're working with right now is in the financial services industry. This is a digital interaction between the financial services business model of banking and the people that walk up to it.
And there's a security risk out there in the world whereby bad actors will target ATMs. It's called skimming: they'll walk up to an ATM, put a device, same color, same fitting, over the card slot, and surreptitiously scan the card as it's being inserted into the machine. The user doesn't know that it happened, and the bank doesn't necessarily know that it happened. The point at which they can take the most effective action against that bad actor is the point at which he's walking up to the machine, which has a video camera inside of it, and inserting that device. And so there are certain patterns you can be looking for. Are they walking up to it with a bag? Are they reaching into the bag? Are they taking on a certain posture against that ATM interface? To know: maybe there's further correlation we need to take against this person. But financial companies would look at that and say, you know, that could be a needle-in-a-haystack kind of analysis problem.
And if you get better and 334 00:18:41,400 --> 00:18:44,000 Speaker 1: better at getting closer to figuring out who is skimming 335 00:18:44,040 --> 00:18:46,680 Speaker 1: off your ATM machines and who isn't, once 336 00:18:46,720 --> 00:18:48,880 Speaker 1: you get good at building that model and then deploying 337 00:18:48,880 --> 00:18:51,120 Speaker 1: that software to all your ATMs, you're 338 00:18:51,160 --> 00:18:54,440 Speaker 1: in a situation where the payoff to your overall risk, to your customers 339 00:18:54,480 --> 00:18:57,760 Speaker 1: and your brand, becomes immeasurable. So that's 340 00:18:57,760 --> 00:18:59,520 Speaker 1: one of the things that we're working on with IBM 341 00:18:59,560 --> 00:19:01,840 Speaker 1: and some of the great video analytics software they have 342 00:19:01,960 --> 00:19:04,560 Speaker 1: that we can put out closer to some of these 343 00:19:04,560 --> 00:19:08,800 Speaker 1: financial institutions, to acquire, analyze, but then act upon the data 344 00:19:08,920 --> 00:19:16,240 Speaker 1: that's involved. Oh, I see. So to your point, 345 00:19:16,280 --> 00:19:20,199 Speaker 1: insight number one is this particular ATM has 346 00:19:20,200 --> 00:19:23,320 Speaker 1: been compromised. But the much more useful bit of information 347 00:19:23,400 --> 00:19:27,199 Speaker 1: is it's been compromised by such and such a person, 348 00:19:27,880 --> 00:19:31,560 Speaker 1: and we're observing that person compromising it in real time. 349 00:19:32,119 --> 00:19:36,160 Speaker 1: Right, right, right. So whether that ATM learns 350 00:19:36,280 --> 00:19:38,640 Speaker 1: what a bad actor looks like walking up to it 351 00:19:38,920 --> 00:19:42,359 Speaker 1: in Minneapolis, well that's good.
But the key is then learning, 352 00:19:42,560 --> 00:19:45,080 Speaker 1: updating the model, getting that new software tested, and then 353 00:19:45,080 --> 00:19:47,639 Speaker 1: getting it deployed consistently to all the other places that 354 00:19:47,640 --> 00:19:50,080 Speaker 1: can benefit. I want to go back to this partnership 355 00:19:50,119 --> 00:19:53,439 Speaker 1: between Lumen and IBM. You said you guys have been 356 00:19:53,440 --> 00:19:55,840 Speaker 1: working together for some time. How long? When 357 00:19:55,880 --> 00:19:59,320 Speaker 1: did it first start? We've had relationships with IBM and 358 00:19:59,400 --> 00:20:01,600 Speaker 1: some of its affiliate companies in one way, shape 359 00:20:01,680 --> 00:20:05,240 Speaker 1: or form for a few decades. The other thing to 360 00:20:05,280 --> 00:20:07,840 Speaker 1: remember is Lumen is a service provider, right, so we 361 00:20:08,040 --> 00:20:10,680 Speaker 1: contract with our customers to go deliver services for them. 362 00:20:10,680 --> 00:20:13,120 Speaker 1: In a lot of cases, those services have always involved 363 00:20:13,600 --> 00:20:18,720 Speaker 1: IBM software, IBM data capabilities, working with the IBM cloud, 364 00:20:19,240 --> 00:20:21,920 Speaker 1: and so IBM as a technology entity has been connected 365 00:20:21,960 --> 00:20:25,040 Speaker 1: to the endpoints of Lumen networks, you know, for 366 00:20:25,080 --> 00:20:29,080 Speaker 1: all that time. Yeah. From a customer standpoint, 367 00:20:29,119 --> 00:20:32,200 Speaker 1: what does the partnership between Lumen and IBM look like? 368 00:20:33,000 --> 00:20:35,520 Speaker 1: I mean, if I'm that financial services company 369 00:20:35,600 --> 00:20:37,639 Speaker 1: that's trying to stop my ATMs 370 00:20:37,680 --> 00:20:40,480 Speaker 1: from being hacked?
Am I dealing with a 371 00:20:40,600 --> 00:20:45,240 Speaker 1: kind of task force made up of Lumen and IBM folks? 372 00:20:45,680 --> 00:20:47,879 Speaker 1: The solution that we're putting together there is 373 00:20:47,920 --> 00:20:52,600 Speaker 1: precisely that. As technology companies continue to evolve, 374 00:20:53,160 --> 00:20:54,959 Speaker 1: they have these kinds of task forces that you talk 375 00:20:55,000 --> 00:20:57,960 Speaker 1: about that actually work on problems and then reapply the 376 00:20:58,000 --> 00:21:01,639 Speaker 1: latest technology innovations to those problems. We then create new 377 00:21:01,680 --> 00:21:04,679 Speaker 1: go-to-market offerings. As I mentioned earlier, kind of 378 00:21:04,920 --> 00:21:06,719 Speaker 1: the business models that really work now 379 00:21:06,800 --> 00:21:09,520 Speaker 1: are where you actually understand, with humility, the 380 00:21:09,520 --> 00:21:11,520 Speaker 1: assets that you have as a company and combine them 381 00:21:11,520 --> 00:21:14,200 Speaker 1: with the assets of other companies. And the thing that really 382 00:21:14,240 --> 00:21:16,120 Speaker 1: makes it come alive is getting two very 383 00:21:16,119 --> 00:21:18,600 Speaker 1: smart groups of people together to actually face off against 384 00:21:18,640 --> 00:21:21,480 Speaker 1: those business problems. So the problem that Dave was 385 00:21:21,520 --> 00:21:24,480 Speaker 1: going through, there was a conversation in the meeting room, 386 00:21:24,480 --> 00:21:26,600 Speaker 1: which is: we have this problem, how would you think 387 00:21:26,640 --> 00:21:30,240 Speaker 1: about this?
And then we combine our engineers and the various components 388 00:21:30,280 --> 00:21:32,800 Speaker 1: that we have, and work up what we call proofs of 389 00:21:32,840 --> 00:21:35,920 Speaker 1: concept to kind of work through whether there's 390 00:21:35,920 --> 00:21:37,919 Speaker 1: a there there in some of the solutions that we can put together, 391 00:21:38,280 --> 00:21:40,199 Speaker 1: and then increasingly that becomes something that we would call 392 00:21:40,240 --> 00:21:42,960 Speaker 1: a production offering, which actually becomes more generally available in 393 00:21:43,000 --> 00:21:51,200 Speaker 1: the marketplace. What's the hardest problem? It's always, 394 00:21:51,400 --> 00:21:54,159 Speaker 1: I think, latency. Latency is always the hardest thing, and 395 00:21:54,200 --> 00:21:57,399 Speaker 1: it's in both domains, but probably primarily in the 396 00:21:57,400 --> 00:22:00,000 Speaker 1: Lumen domain, and that's where you're kind of forever 397 00:22:00,040 --> 00:22:04,080 Speaker 1: pushing physics to actually get as close to the speed 398 00:22:04,119 --> 00:22:07,200 Speaker 1: of light in terms of how quickly you're transmitting data. 399 00:22:07,840 --> 00:22:10,480 Speaker 1: And it's a tough problem to solve for, but 400 00:22:10,560 --> 00:22:13,359 Speaker 1: because of the huge volumes of data and because of, 401 00:22:13,560 --> 00:22:17,440 Speaker 1: increasingly, human nature's desire for instant gratification, we want 402 00:22:17,480 --> 00:22:21,080 Speaker 1: everything now, we want it immediately.
And what's hard about 403 00:22:21,119 --> 00:22:24,800 Speaker 1: that is latency is a particularly hard problem from a 404 00:22:24,840 --> 00:22:30,760 Speaker 1: technical standpoint. Latency 405 00:22:31,040 --> 00:22:33,440 Speaker 1: is the amount of time it takes, usually 406 00:22:33,480 --> 00:22:37,159 Speaker 1: measured in milliseconds, which is less than the blink of 407 00:22:37,160 --> 00:22:38,960 Speaker 1: an eye, for a 408 00:22:38,960 --> 00:22:43,280 Speaker 1: packet to traverse between two particular endpoints on a network. 409 00:22:43,800 --> 00:22:45,919 Speaker 1: But those all add up, right? You can sort of 410 00:22:45,960 --> 00:22:48,760 Speaker 1: think of it, in a computer or a brain context, 411 00:22:48,800 --> 00:22:52,119 Speaker 1: as processing speed. How fast can I react to things? Well, 412 00:22:52,160 --> 00:22:54,240 Speaker 1: if it takes a while for the packets to travel 413 00:22:54,320 --> 00:22:59,520 Speaker 1: through the neurons, to use a brain analogy for a network, 414 00:22:59,800 --> 00:23:01,920 Speaker 1: the longer it takes for the packets to process through, 415 00:23:02,280 --> 00:23:04,280 Speaker 1: the longer it takes for an outcome to occur. And 416 00:23:04,359 --> 00:23:06,879 Speaker 1: if an outcome takes too long to process, then it 417 00:23:06,960 --> 00:23:11,199 Speaker 1: becomes fairly useless. Yeah, yeah. My first question is: do 418 00:23:11,280 --> 00:23:15,520 Speaker 1: customers always realize what the potential of all of these 419 00:23:15,520 --> 00:23:19,240 Speaker 1: different pieces is, or is part of your job 420 00:23:19,400 --> 00:23:23,320 Speaker 1: helping people, opening people's eyes to what's possible?
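The point that per-hop delays "all add up" is just a sum, which a few lines of arithmetic make concrete. The hop counts and millisecond figures below are invented for illustration; real paths vary.

```python
# Illustration of why latency adds up: end-to-end delay is roughly
# the sum of the delays on each link a packet crosses. All hop
# values here are made-up examples, not measured figures.

def end_to_end_latency_ms(per_hop_ms):
    """Total one-way latency as the sum of per-hop delays (ms)."""
    return sum(per_hop_ms)


# A short path to a nearby edge node versus a long path to a
# distant centralized data center.
edge_path = [2.0, 3.0]                    # two short metro hops
core_path = [2.0, 8.0, 12.0, 15.0, 10.0]  # many longer hops

print(end_to_end_latency_ms(edge_path))  # 5.0
print(end_to_end_latency_ms(core_path))  # 47.0
```

Since each hop's delay is bounded below by the speed of light over its distance, the main lever is the one Howard and David describe: fewer, shorter hops, which is what putting compute at the edge buys you.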
You know, 421 00:23:23,640 --> 00:23:25,680 Speaker 1: at times you have to kind of be a 422 00:23:25,720 --> 00:23:27,840 Speaker 1: technology evangelist in terms of what the art of 423 00:23:27,880 --> 00:23:30,680 Speaker 1: the possible is against the problems. And it's not 424 00:23:30,720 --> 00:23:34,199 Speaker 1: because customers don't have the same ability to see that. It's 425 00:23:34,240 --> 00:23:36,000 Speaker 1: just very often they don't see the breadth of 426 00:23:36,040 --> 00:23:37,600 Speaker 1: things that we see when we're working with lots of 427 00:23:37,600 --> 00:23:40,760 Speaker 1: different industries, where we can apply solutions from one place 428 00:23:41,280 --> 00:23:44,479 Speaker 1: to another. The other element in terms of the 429 00:23:44,520 --> 00:23:48,639 Speaker 1: pace of adoption in organizations is less about the actual 430 00:23:48,920 --> 00:23:51,479 Speaker 1: people within them, but also the technology decisions that were 431 00:23:51,480 --> 00:23:54,119 Speaker 1: made in the past. Large investments will have already been 432 00:23:54,200 --> 00:23:56,679 Speaker 1: made to actually build the technology environments that they have. 433 00:23:56,840 --> 00:23:59,720 Speaker 1: They're known as legacy environments, and it's getting from a 434 00:23:59,760 --> 00:24:02,640 Speaker 1: legacy environment to the new environment that's 435 00:24:02,920 --> 00:24:04,639 Speaker 1: tricky, in the sense that you have to 436 00:24:04,640 --> 00:24:06,119 Speaker 1: look at your balance sheet, you have to look at 437 00:24:06,160 --> 00:24:08,000 Speaker 1: the amount of work that would be necessary to do that. 438 00:24:08,560 --> 00:24:10,960 Speaker 1: You've got to change everything from infrastructure to lines of 439 00:24:11,359 --> 00:24:14,679 Speaker 1: application code to data sets and so on.
So 440 00:24:14,720 --> 00:24:18,320 Speaker 1: it's a very complex environment for our customers to be 441 00:24:18,359 --> 00:24:21,000 Speaker 1: thinking about, and therefore, what do they 442 00:24:21,080 --> 00:24:24,560 Speaker 1: prioritize as their next area of innovation relative to the 443 00:24:24,560 --> 00:24:26,760 Speaker 1: actual value that they would get for their customers, 444 00:24:26,800 --> 00:24:30,000 Speaker 1: for their shareholders, or whatever the drivers are. It's really 445 00:24:30,040 --> 00:24:33,320 Speaker 1: interesting, that word I was going to pick out 446 00:24:33,320 --> 00:24:37,359 Speaker 1: of it: enterprise IT. It's the only context in 447 00:24:37,400 --> 00:24:42,480 Speaker 1: which legacy is an epithet. Right? Like, you say legacy 448 00:24:42,600 --> 00:24:45,040 Speaker 1: to an IT person, they roll their eyes and, 449 00:24:45,119 --> 00:24:47,359 Speaker 1: you know, their blood pressure goes up. It's like 450 00:24:47,440 --> 00:24:50,080 Speaker 1: nails on a chalkboard. But to most individuals, like, what 451 00:24:50,240 --> 00:24:53,680 Speaker 1: is your legacy? The word legacy means it's something 452 00:24:53,720 --> 00:24:56,439 Speaker 1: to be honored, right? In an enterprise context, 453 00:24:56,560 --> 00:24:58,800 Speaker 1: legacy just means you've made a lot of decisions already, 454 00:24:59,000 --> 00:25:00,560 Speaker 1: you've made a lot of 455 00:25:00,600 --> 00:25:02,880 Speaker 1: implementations, you're bringing a lot behind you. 456 00:25:02,920 --> 00:25:05,199 Speaker 1: That should be a good thing. But in an enterprise 457 00:25:05,240 --> 00:25:08,320 Speaker 1: IT context and a technology domain, it's really challenging.
Yeah, 458 00:25:08,359 --> 00:25:10,080 Speaker 1: I mean, what I've heard played back to me 459 00:25:10,200 --> 00:25:12,719 Speaker 1: is kind of, yeah, God may have created 460 00:25:12,720 --> 00:25:14,399 Speaker 1: the Earth in seven days, but he didn't have to deal 461 00:25:14,440 --> 00:25:19,000 Speaker 1: with legacy. Uh, yeah, so it kind of gives you 462 00:25:19,040 --> 00:25:22,440 Speaker 1: a sense as to the differences in an IT context. Yeah. 463 00:25:22,640 --> 00:25:25,280 Speaker 1: One last question. I want you guys to jump ahead 464 00:25:25,280 --> 00:25:28,199 Speaker 1: ten years from now. I've gathered the two of you 465 00:25:28,480 --> 00:25:32,280 Speaker 1: ten years from now. Tell me what's top of mind. 466 00:25:33,359 --> 00:25:42,119 Speaker 1: I think what's really a huge challenge in 467 00:25:42,800 --> 00:25:47,080 Speaker 1: business, and in the ways that businesses and organizations collaborate, 468 00:25:47,359 --> 00:25:51,320 Speaker 1: is this concept of composability. And I think composability 469 00:25:51,359 --> 00:25:54,320 Speaker 1: is the ability to go break things down into 470 00:25:54,359 --> 00:25:58,240 Speaker 1: simple functions and have them be intercombined. We're just 471 00:25:58,280 --> 00:26:00,520 Speaker 1: still even at the outset of that.
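David's notion of breaking things down into simple functions that can be intercombined is, at small scale, ordinary function composition. A minimal sketch, with a pipeline whose steps are invented purely for illustration:

```python
# Minimal sketch of composability: small single-purpose functions
# chained into a larger capability without rewriting any of them.
from functools import reduce


def compose(*funcs):
    """Chain functions left to right: compose(f, g)(x) == g(f(x))."""
    return lambda x: reduce(lambda acc, f: f(acc), funcs, x)


# Three simple, independently useful steps...
normalize = str.strip
lowercase = str.lower
tokenize = str.split

# ...intercombined into one pipeline.
pipeline = compose(normalize, lowercase, tokenize)
assert pipeline("  Hello Edge  ") == ["hello", "edge"]
```

The same idea scales up: the composed pieces can come from different parties, an IBM service, another vendor's software, edge capacity at the end of a link, as long as each exposes a simple, combinable interface.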
You're starting to see 472 00:26:00,520 --> 00:26:01,960 Speaker 1: that a lot in the cloud, but as we get 473 00:26:01,960 --> 00:26:04,160 Speaker 1: out closer to edge computing and some of these Fourth 474 00:26:04,160 --> 00:26:09,280 Speaker 1: Industrial Revolution use cases, the ability to take and compose 475 00:26:10,119 --> 00:26:14,280 Speaker 1: different capabilities from an IBM, from another software company, 476 00:26:14,359 --> 00:26:17,960 Speaker 1: from a real estate company that's selling you access to 477 00:26:18,040 --> 00:26:20,919 Speaker 1: run computing capacity at the end of a 478 00:26:20,920 --> 00:26:25,240 Speaker 1: physical link. The ability to compose services together, whether it's 479 00:26:25,320 --> 00:26:28,880 Speaker 1: through multiple parties or the ways organizations even present themselves 480 00:26:28,880 --> 00:26:31,680 Speaker 1: to the world: take advantage of us in any way, 481 00:26:31,760 --> 00:26:35,080 Speaker 1: in any slice that you so choose. Composability is 482 00:26:35,080 --> 00:26:37,359 Speaker 1: going to open up a massive amount of possibilities. 483 00:26:37,359 --> 00:26:39,440 Speaker 1: It's maybe a little rooted in the here and now, 484 00:26:39,480 --> 00:26:41,560 Speaker 1: but it's something that I'm excited about over the 485 00:26:41,560 --> 00:26:44,240 Speaker 1: next five to ten. Yeah, the thing I'm interested in is 486 00:26:44,320 --> 00:26:46,240 Speaker 1: kind of this: we're in the midst of artificial 487 00:26:46,240 --> 00:26:50,560 Speaker 1: intelligence that is increasingly starting to tax its inventors, 488 00:26:50,600 --> 00:26:53,120 Speaker 1: which is human beings.
So the prefrontal cortex only 489 00:26:53,160 --> 00:26:55,360 Speaker 1: has so much energy it can burn in a day, 490 00:26:55,640 --> 00:26:57,000 Speaker 1: and it is being burnt out at the end of 491 00:26:57,040 --> 00:26:58,840 Speaker 1: every day through the actual amount of data that's 492 00:26:58,840 --> 00:27:04,119 Speaker 1: bombarding it. So intelligence augmentation, flipping the two 493 00:27:04,280 --> 00:27:08,080 Speaker 1: letters from artificial intelligence to intelligence augmentation so that we 494 00:27:08,119 --> 00:27:10,800 Speaker 1: can actually work within these environments in a far 495 00:27:10,880 --> 00:27:14,080 Speaker 1: more cumulative style relative to what we can biologically do, 496 00:27:14,400 --> 00:27:16,160 Speaker 1: is going to be where there's a lot of advancements. 497 00:27:16,440 --> 00:27:19,120 Speaker 1: And I talked about the partnerships between two technology companies, 498 00:27:19,119 --> 00:27:23,679 Speaker 1: such as Lumen and ourselves, but there will be increasing partnerships between health 499 00:27:23,760 --> 00:27:28,880 Speaker 1: and bio companies as well as it relates to technology. Yeah, wonderful. Well, 500 00:27:29,280 --> 00:27:31,600 Speaker 1: thank you so much. This has been really fun. Thank 501 00:27:31,600 --> 00:27:38,440 Speaker 1: you very much. It's been a pleasure. Thanks again 502 00:27:38,480 --> 00:27:42,160 Speaker 1: to David Shakochus and Howard Boville for talking with me. 503 00:27:42,680 --> 00:27:46,760 Speaker 1: It's fascinating to consider how quickly data analysis can change 504 00:27:46,840 --> 00:27:51,200 Speaker 1: performance in real time, and the endless possibilities of hybrid 505 00:27:51,200 --> 00:27:56,359 Speaker 1: cloud and edge computing, and I look forward to witnessing their evolution.
506 00:28:01,040 --> 00:28:04,000 Speaker 1: Smart Talks with IBM is produced by Emile Rostak with 507 00:28:04,119 --> 00:28:09,800 Speaker 1: Carlie Mcgliori and Katherine Gurda, edited by Karen Shakerge, engineering 508 00:28:09,840 --> 00:28:14,320 Speaker 1: by Martin Gonzalez, mixed and mastered by Jason Gambrell and 509 00:28:14,440 --> 00:28:20,000 Speaker 1: Ben Tolliday. Music by Granmascope. Special thanks to Molly Sosha, Andy, 510 00:28:20,080 --> 00:28:24,000 Speaker 1: Kelly Neil, LaBelle, Jacob Weisberg, Heather Fain, Eric Sandler, 511 00:28:24,200 --> 00:28:27,959 Speaker 1: and Maggie Taylor, and the teams at Eight Bar and IBM. 512 00:28:28,119 --> 00:28:32,360 Speaker 1: Smart Talks with IBM is a production of Pushkin Industries 513 00:28:32,760 --> 00:28:36,359 Speaker 1: and iHeart Media. You can find more Pushkin podcasts 514 00:28:36,640 --> 00:28:41,240 Speaker 1: on the iHeartRadio app, Apple Podcasts, or wherever 515 00:28:41,720 --> 00:28:45,920 Speaker 1: you like to listen. I'm Malcolm Gladwell. See you next time.