Jonathan: I'm talking about blockchain. As a tech podcaster, I've found it challenging to explain blockchain from a technical perspective, because it requires a certain level of familiarity with several other concepts, and by the time I've finished explaining those basic concepts, there's very little time left for the star attraction. Jason, as you will hear, has a very different approach, and one that I find particularly effective. I chatted with Jason about his work at IBM, how the pandemic continues to present enormous challenges to businesses and the average person alike, and how blockchain can play a role in industries that found themselves in a rapidly shifting and dynamic situation. Have a listen.

Jonathan: Jason, before we get started, I was hoping maybe you could give us a little background information on yourself and what you do with IBM. So take it away, tell us all about yourself.

Jason: Sure, John. Let me see if I can keep this contained in a minute blurb. Probably not. So, IBM has been my career for the last number of years. My first career out of school was the military; I'm an Army veteran, an airborne ranger.
Jason: All that fun stuff; my body gave up before I did. From the military, I then went through a number of careers: running a subsidiary of a company that was ultimately sold to Berkshire Hathaway, doing a stint as an exec on loan in public service, and then starting my own company, which I later sold. Then I ended up at IBM, starting new businesses through solutions throughout North America and now globally, having started parts of our business in Southeast Asia as well as China. What's consistent in all of that is a lot of starting things up and growing them, based on solid teams and technology grounded in data. And so here we are now. In the last five years, while I was running our solutions group for all of our services, I ran an experiment on this thing called blockchain, and so here I am now, almost four years later, at the head of our blockchain business globally, enjoying some of the most talented people I've ever worked with in my multiple careers.

Jonathan: I love that we've already let the cat out of the bag with the blockchain term.
Jonathan: I find in technology that every few years a new term will emerge that the mainstream finds difficult to really grok; cloud computing was one of those examples, right? And blockchain, I feel, is one of those things for a lot of people, where they hear the term and might be somewhat familiar with it, and for a lot of people their touchstone is cryptocurrency, but they don't really have any grasp of what it actually means. So when you're describing your work to someone who is unfamiliar with blockchain, how do you explain what it is?

Jason: Well, it's great you asked that question, Jonathan, because I usually leave out the B word. Let's take the word apart and get rid of the "block" part, because people hear blockchain and they start thinking crypto, and that's not what we're talking about; it's what's behind that. So just think of "chain." I'll give you an example, and this example hopefully will help you understand the outcome we get from this technology.
Jason: Just put "supply" where the B word was: supply chain. And let me ask you, Jonathan, do you drink coffee or tea or anything of that sort?

Jonathan: I do, indeed. I drink both tea and coffee.

Jason: Okay, great. Now let me ask you: at any time, do you have organic tea or coffee?

Jonathan: I have had it on occasion, yes.

Jason: On occasion. Well, you probably paid a little bit more for it because it's organic. Now, here's a fun fact that many people share: there's more organic food consumed every year than is produced. So think on that for a second. Somewhere in there, in this supply chain. And now think of the linear supply chain; I'm trying to paint a visual picture for you. Let's think, in this case, from farm to cup, using that farm-to-fork kind of idea.
Jason: Think of that supply chain (forget the B word; it's "supply" chain), and think of how that coffee bean or tea leaf grew someplace and then went through a number of stages to finally get to a store where you picked it up, or to a location where you bought that cup of tea or coffee. So that chain is typically these multiple handshakes. Just think about the matter of moving it from one place to the next, very literally in handshakes, passing it from one place to the next, so that at that final handshake, where you get to consume it, Jonathan, you say: okay, with that handshake, I know that this coffee is organic, because it says so on the label, or it said so on the menu. That's all you have at this point. And think of those handshakes: each is just a passing of data. Forget the tea or coffee; it's just data about the tea or coffee that you're going to consume, something you're putting in your body. That data is going from one place to the next until you consume it. And in that chain, those handshakes are what you trust.
Jason: And there are some things that you're trusting there: first, that the data is shared accurately, and then that no one has changed it. That chain of data is something you just trust inherently. Now, what if, Jonathan, you could guarantee that the bean or the tea leaf you consumed was actually organic, by ensuring that the data could be accurately shared, and you wouldn't have to worry about it going from handshake to handshake? You would know, from where you consumed it all the way back to that farm, or to that tree, or wherever it was grown. If you could get rid of the handshakes, then instead of having a supply chain, you would have a blockchain, because each one of those handshakes would be digital technology that is unchangeable, often called immutable, and you would be able to make sure that that tea is organic, because you, as the consumer, would be able to see digitally all the way back to the point of origination. So there you go.

Jonathan: Excellent. Well, I love that explanation. I love that narrative approach.
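The "unchangeable" chain Jason describes can be pictured as a hash chain: each handshake's data is hashed together with the hash of the previous handshake, so altering any earlier record breaks every later link. This is a minimal illustrative sketch, not IBM's implementation, and the record fields are hypothetical.

```python
import hashlib
import json

def record_hash(record: dict, prev_hash: str) -> str:
    """Hash a record's data together with the previous link's hash."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records):
    """Link each handshake's data to the one before it."""
    chain, prev = [], "0" * 64  # genesis value before the first handshake
    for rec in records:
        h = record_hash(rec, prev)
        chain.append({"data": rec, "prev": prev, "hash": h})
        prev = h
    return chain

def verify_chain(chain) -> bool:
    """Recompute every link; any tampering invalidates all later hashes."""
    prev = "0" * 64
    for link in chain:
        if link["prev"] != prev or record_hash(link["data"], prev) != link["hash"]:
            return False
        prev = link["hash"]
    return True

handshakes = [
    {"step": "farm", "certified_organic": True},
    {"step": "roaster", "lot": "A17"},
    {"step": "retailer", "label": "organic"},
]
chain = build_chain(handshakes)
assert verify_chain(chain)

# Quietly "fixing" an early handshake is detectable by anyone downstream.
chain[0]["data"]["certified_organic"] = False
assert not verify_chain(chain)
```

A real network adds distribution and consensus on top of this linking, but the tamper-evidence that lets a consumer trust the organic label comes from exactly this structure.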
Jonathan: From a certain perspective, I think a lot of people can immediately identify with the importance of this technology. I mean, we've seen depictions of, or perhaps even encountered, the "hey kid, you want to buy a watch" experience, right? The guy who says, "trust me, these are all Rolexes, these are all brands that you want," and immediately the red flag should come up: really? That's probably not true. So in a lot of ways, from what you're describing, this approach is a way of verifying, through every single step of the process, that the thing you are trying to take possession of, or the thing you're trying to purchase or sell, is genuine. It's the actual thing. And you can easily see where that would be a really valuable approach.

Jason: That's exactly it, Jonathan. And to simplify it even more, the outcome is what's key here. Think of it almost as a chain of custody going from one to the next; or, here you go, watch the dad joke: a chain of trustody.
Jason: So it's that trust going from the source to the consumer, and that's called provenance. It's one of the three letters that we call out here, which you've touched on in your luxury-good, or Rolex, example. First, there's the thought of tokenization; I just mean tokenizing an asset, digitizing something that has physical value. So you've now tokenized that watch, because you've made it digital. And here comes the second word: identity. Whether it's the identity of a good, or of a service, or of a person, you have identity. And then I mentioned the P: provenance. So, T-I-P. If you can remember those three things, that's all you need to know about that B word, because that's what it delivers as an outcome. And not mutually exclusively, either, because in your Rolex example there is a digitization of it, so it's tokenized; there is, you could say, an identity, to make sure it's that same watch; and then, from the source of manufacture to your wrist, you know the provenance.
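The T-I-P outcomes can be pictured with a small data model: a token (the digital twin of the physical asset) that carries a unique identity and an append-only custody log for provenance. A rough sketch under those assumptions; the class name `AssetToken` and the ID `RLX-0042` are illustrative, not part of any real product.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AssetToken:
    """A digital twin of a physical asset: the T (tokenization)."""
    token_id: str      # the I (identity): a unique, checkable reference
    description: str
    provenance: list = field(default_factory=list)  # the P: custody events

    def transfer(self, holder: str) -> None:
        # Append a custody event rather than overwriting state,
        # so the full history stays visible end to end.
        self.provenance.append({
            "holder": holder,
            "at": datetime.now(timezone.utc).isoformat(),
        })

watch = AssetToken(token_id="RLX-0042", description="luxury watch")
for holder in ["manufacturer", "distributor", "retailer", "buyer"]:
    watch.transfer(holder)

# The buyer can trace identity and provenance back to the source.
assert watch.provenance[0]["holder"] == "manufacturer"
assert watch.provenance[-1]["holder"] == "buyer"
```

On an actual ledger these custody events would be the immutably linked records, rather than entries in an ordinary in-memory list.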
Jason: So those three things will be common outcomes of using this new technology.

Jonathan: To shift a little bit: we've also seen, obviously, this past year, some tumultuous changes, as companies have had to find new ways to be flexible in the coronavirus outbreak. And that ended up being a big news story in itself, that the coronavirus outbreak and COVID-19 were having a major impact on companies and their supply chains. Can you go into a little bit about why that was? Why was there such an enormous impact early on with the outbreak of COVID-19, and why does it continue to be for several companies?

Jason: I think you hit a key point there, Jonathan: it continues to be. This thought of supply chains and supply chain visibility, you know, having that understanding, has always been elusive, always talked about. So what happened with our current pandemic is that I think everyone realized just how fragile, or in this case how brittle, their supply chains are: not flexible at all.
Jason: And what we also realized was that current demand is different than it's ever been before. In the pandemic, to give an example, everyone needed hand sanitizer and masks, right? You had PPE, personal protective equipment, and everyone was looking for it, and no one could make enough. And we had challenged companies that said, well, maybe if I make some of that, I can save our company. So a brewery says, now I'm going to make hand sanitizer. Now you have a brewery that is creating something it's never created before, trying to tie into an ecosystem and a supply chain it's never tied into, and it's going to try to get that product to, let's say, a hospital that needs a lot of hand sanitizer. To keep the story going, but not too long: they finally get that hand sanitizer to the hospital. And then the hospital has too much, because they've overbought. Now they need to get rid of it. They want to make sure they're not just going to throw it out; they need to get it to another hospital, perhaps. So now, instead of being a buyer, they're a supplier.
Jason: So how do they tie into a different type of business process, where they're supplying other hospitals?

Jonathan: I imagine that visibility also comes in very handy when you are trying to identify the specific points within a supply chain where you've got either active problems or potential problems down the line. Since I cover technology, I cover a lot of companies that are building products with numerous components in them, potentially all of them from different suppliers or different manufacturers, and any one point of failure can cause a delay in production and shipping. So something like this, where you're able to have that kind of overview of the full supply chain because of this approach, I would imagine ends up being really valuable from that perspective as well: being able to detect where any potential problems are going to be, and to be more proactive in finding ways to solve those challenges. So, can you talk a little more about how IBM is specifically helping organizations like hospitals manage their supply chains?
Jonathan: We kind of touched on it a little bit, but do you have any other stories or examples you can share with us?

Jason: Well, I think you started with PPE, but right now we're all talking about the vaccine, right? And this thought crosses hospitals, but then also your pharmaceutical dispensaries, whatever that may be: your local drugstore, or a healthcare provider for the elderly. And so the thought is: how do you take this capability that we've just talked about, this new operating system for trust, and look at this value chain? That does, as I mentioned before, cross industries as well as ecosystems. And so we started with the thought of bringing together the healthcare providers and the first responders with all the suppliers for PPE as well as ventilators.
Jason: That's the example I gave before, and I left out its name: it was called Rapid Supplier Connect. It really just took some IBM components we had in blockchain, something called Sterling from a supply chain perspective, and a solution called Trust Your Supplier, and brought them together. So that was the first one; when you ask what we were doing, we created, call it, a matchmaking service on steroids, with trust. And then you progress to the example I'm giving now, where we say, well, what about vaccinations? Think about the challenge we have in just the United States. You have people at the federal level working really hard to get things into states, and we've seen dynamically how challenging that is: how can you get to a state, or how can you see into a state? Back to this: how do you share trust among players in what we think of as the same ecosystem? And then, even within that state, there are even more entities moving a vaccine around.
Jason: And so we have what's called the Vaccine Accountability Network, which does just that. What you have is a capability that gives you this trusted data and partners with what you already run. Think of your classic enterprise resource management systems; you have your contact management systems, in this case contact tracing systems. You have all of these systems that are in place. And what's great is that because this is delivered in the cloud (you know, I was dropping those hybrid words: it can be on any cloud, it can be multi-cloud, on premise or off premise, so that's the hybrid term), you don't have to rip things out. This is a data-sharing and data-trust layer that goes across what already exists. So hopefully you now see, Jonathan, when I talk about this thought of a Vaccine Accountability Network, that it brings the capabilities together. This is truly where we as IBM see this as good tech and tech for good, because it's bringing great outcomes for our citizens. It's tech that matters, for us and for the world.
Jonathan: And on top of that, just as my own little anecdotal addition here, I think it's incredibly valuable not just from the supply chain and technical side of things, but for the fact that you have that reassurance that this is, you know, all accounted for; we understand exactly where this has been every step of the way. It helps combat something that we have unfortunately seen become more and more prevalent, which is just the vast amount of misinformation that can come out, whether purposefully or, you know, unconsciously, and then spread so quickly. Having the capability of saying, this is trustworthy, and we have the technology to prove that it's trustworthy, and having that reassurance there, I think that ends up being a valuable tool as well. It's sort of an additional benefit: we're fortunate that the technology has that capability while we are, unfortunately, in an era of misinformation.
Jonathan: So having something that you can actually depend upon is incredibly reassuring and refreshing, in a world that occasionally feels like that is a luxury as opposed to something you should just expect. So to me, that is another benefit of this technology, on top of the logistical side of it; it's sort of a psychological side.

Jason: Right, right. And personalizing it like that, when you start talking about trust and information, brings to light something else that we've seen in this pandemic. When people think of trust and information, they go straight to, you know, maybe media and news, but we think of something all of us can personalize, and that's credentials. I talked about identity, but it's not just "are you who you say you are, Jonathan"; what about your credentials? What about, you know, your diploma, your learning credentials, something that sits somewhere out there, and we go: wait, what government agency do we go to for that? And do I have to go back to my college?
Jason: You know, all of those things you start thinking about with credentials, and then the thought of, you know, information about my health. Back to the current pandemic: how do I know if I've been vaccinated? Or how do I know if I'm COVID negative or positive? Don't we want to document that in a credentialing type of way? Well, this type of information starts to become even more powerful when we take what I've just said and apply it to a vaccine or to your health status. And we have this wonderful way: if we went through that vaccine scenario, you know, how do you know if you have been immunized? You could have a health pass, and that's something else you can google, as we work with that capability. But beyond health, think about opening up new opportunities with this technology and using information that can help others, where you and I may use it to validate that we went to high school or that we have graduate or college degrees.
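A portable credential like this is, at heart, a set of claims signed by an issuer, which anyone who trusts that issuer can check without calling the college. Real credential systems use public-key signatures so verification needs no shared secret; as a stand-in, this sketch uses an HMAC with an issuer-held key, and every name in it is hypothetical.

```python
import hashlib
import hmac
import json

# Hypothetical issuer secret; in practice this would be a private
# signing key, with the matching public key published for verifiers.
ISSUER_KEY = b"registrar-secret-key"

def issue(claims: dict) -> dict:
    """The issuer signs the claims; the holder carries the result around."""
    payload = json.dumps(claims, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "signature": sig}

def verify_credential(credential: dict) -> bool:
    """A verifier who trusts the issuer can check the credential offline."""
    payload = json.dumps(credential["claims"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["signature"])

cert = issue({"holder": "jonathan", "certification": "welding-level-2"})
assert verify_credential(cert)

# An altered or forged claim fails verification.
forged = {"claims": {**cert["claims"], "certification": "phd"},
          "signature": cert["signature"]}
assert not verify_credential(forged)
```

The credential itself can live on a phone; the signature is what makes it both portable and trustworthy.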
Jason: What about using this to empower information that allows someone who may not go to school for four years, but who just needs a certification to get a great job? You can't say, okay, well, great, go back to college and come back to me in four years. So what if we could put that personal credential on a device that you have in your pocket? So that's what we're talking about: accuracy, reliability, and trust, as well as portability of information. That's what this capability enables, and that's where I think it should, you know, really get people excited. Real, trusted information is empowering and opens up markets. Think not of something that's a blue-collar job or a white-collar job, but of these new-collar jobs that just require trusted certifications.

Jonathan: I love that you have really gone to kind of the endpoint, the individual person, with that sort of application of this technology and how it can make an impact. Because a lot of the time, when we talk about these big ideas in tech, it can seem distant.
It can seem like, oh, well, that's something that's going on at the enterprise level, but I don't see how that has any impact on me. You've turned that around entirely with that description, to show that this is a way the impact of this technology can have a real, positive influence on the individual, as opposed to these big financial or big industrial ideas that otherwise seemed distant. That being said, with the new normal that we have, how is blockchain making an impact in industries outside of the medical industry? Are we starting to see other sectors adopt this technology, and if so, in what ways is it manifesting?

Jason: You called out one of the first industries that really embraced this, which was the financial services sector, banking, for example: this thought of transacting digitally, of being able to verify, validate, and settle financial transactions.
And ultimately, when we think about that original example I gave you, those handshakes are transactions, and the more you can make those trusted the better, because more often than not those handshakes involve an exchange of some monetary value. And so industries such as banking have picked this up to say, look, can we transact quickly and make sure it's digital? We all appreciate this now. When I used to say it, people would say, oh yeah, we'll go digital someday. Now we're all trying to figure out how to do everything digitally and touchless. Well, we.trade is a company built by some of the largest banks in Europe that came together. These are banks that typically compete, and back to this thought of blockchain, using the B word, being a team sport: you have competitive banks that say, look, we can actually raise the tide here and raise all boats by transacting digitally and availing trade finance quicker, better, faster to small and medium-sized enterprises, because it's digital and we can make it happen quicker.
So you see this type of transacting happening in financial services, and as I've given a financial services example, it's no different from what you have in retail here in North America. You have some forward-thinking entities that say, my accounts receivable and accounts payable, that's a process that takes thirty days to settle. But what if I could see all the way through those handshakes, as we mentioned before? I could settle in ten days, and I don't have to make phone calls and send emails, and that would reduce my overhead. And then finally, when you look at where that's going, Jonathan, as we've looked at different industries, I've told you it's going cross-industry. There are words that come up in most industries, words that people will nod at and say, yeah, we're trying to figure that out, such as sustainability. And so that's where we're starting to see big concepts become material when you can put, as you mentioned, trust and accountability in front of them.
Jonathan: Jason, unfortunately, we're recording this as there's a new surge of COVID-19 in North America. The UK has recently entered a month-long lockdown period. Clearly we're at a new phase that's at least as difficult as the initial challenges were when we first started seeing regions around the world lock down. Are you seeing any difference in approach this time around? Are companies getting better at handling this with their supply chains? I assume those that are reliant on things like blockchain are having less of a challenge in adapting than others. But what are you seeing out there?
Jason: We're seeing what you just alluded to, where those that have existing capabilities knew before that they were challenged in their supply chain, and now they're thinking beyond the supply chain. They're aware of the value chain I mentioned: it's not just that layer of getting a package, good, or service across a linear spectrum, but something very dynamic, more like a series of concentric circles of trying to get from point A to point B, wherever that is. They realize how flexible it needs to be, and that there's this capability, as you call out, that says, look, you don't have to do a rip-and-replace of what you have; you can actually work with what you have and connect it in a permissioned way to use data with more power. And no one has to ask me to explain the B word. Instead, what they're saying is, how can I get the outcomes? And so as you talk about things such as supply chain management, you start to think about using your value chain, and value chain integration of these ecosystems, for outcome management.
Not just exchange that data, but also think about the business processes that have to change with it, because, as we said, we're pulling out some of those handshakes, which means you're changing the workflows. You're making those workflows a little bit more intelligent as well as transparent. And if you're doing that, that's not easy, because regardless of how cool it sounds when we talk about it on this podcast, Jonathan, it's change, and we know that change is not easy. A fortunate outcome of an unfortunate situation is that the sense of urgency, and also the collaboration and teamwork, are overcoming that challenge right now. That's what we're seeing as a positive outcome here: people are coming together and saying, I'm not going to get all wound up in the who, what, why, and how of the technology; I'm going to focus on the outcome and what it takes to put what I currently have on steroids, to that next level of capability around data and trust.
And that's the magic spot we're in right now with this capability.

Jonathan: I think what you say really lends credence to the old saying that necessity is the mother of invention, the idea that we have found ourselves in this incredible circumstance and that it calls for really innovative solutions. This is an example of how, when we are faced with these massive challenges, there is a way to rise up to meet them, and it requires that sort of flexible thinking, that adaptability, and that determination to adopt these new approaches rather than a fear of them. There comes a point where you really need to look at the options and say, what makes the most sense? What is going to work? With that in mind, to conclude our conversation: we're seeing a kind of trial by fire with this particular implementation of the value chain concept under extreme circumstances. But where do you see the future of this?
What's in store for us once we have cleared this incredibly important hurdle and we're beyond it? What is the future of IBM's approach, and how is industry going to make use of this moving forward?

Jason: Well, I would say the operative word there is "this," because this is beyond that bucket of blockchain that I mentioned. I believe it's this thought of an open platform for innovation. And platform then becomes, oh, he said the big word, platform; what does that mean for you? Just think of it as this new capability that says, I'm not limited to the walls of my enterprise. I'm not just a pharma company. I can pull down my barriers of business and work collaboratively with a regulatory body, a government, or a shipping group. I can work all the way down to the retail store or pharmacy that's distributing my goods, because I can see, or I'm somewhere in, that value chain, where I'm a supplier to the supplier to the supplier to the cooperative that's supplying one of the players.
And if you can get that far down, it opens up the market, it allows for new players and new areas of collaboration, and it really allows us to think about what that platform of business looks like tomorrow. It blurs the line between industries, as I've called out, because you have them working together. It becomes a blurring of my ecosystem and your ecosystem, so it's less about my enterprise and my ecosystem, and it becomes a collaborative team sport of outcomes. Your competitiveness becomes based on your ability to transact in that ecosystem and convene outcomes around how you deliver value in that value chain. It sounds complicated, but think about getting outcomes out of a much more level and transparent ecosystem, versus me putting my arms around a walled capability. That's where the future is, and it's going to be built on open platforms of capability with data always in the center.
So you can call out all the buzzwords, from AI to robotic process automation, IoT, and advanced analytics: all of those use data, and they're going to be operating in an open platform. That's where we as IBM see the future, and that's where we're running hard and fast and leading as quickly as we can.

Jonathan: Well, Jason, that to me sounds incredibly exciting. I love the idea of this approach opening up new opportunities, because ultimately that's going to manifest as benefits to everyone down the road in some way, whether it's a consumer good or a service or something like life-saving pharmaceuticals. Being able to find new ways to collaborate and new ways to innovate, to have that enabler there, really makes me excited about the future, which I think is sorely needed. Honestly, the idea of being excited about the future is a great thing.
So I thank you for coming onto my podcast and explaining this in a way that doesn't make someone's eyes glaze over as they start to think about hashing and making up blocks and adding another block to the chain, and all of the ways that you can try to explain that process that tend to lose the audience right off the bat. I appreciate your approach very much. Thank you for joining the show.

Jason: Jonathan, thanks for the conversation, thoroughly enjoyed it, and all the best to you and the rest of you: stay safe, stay healthy, stay happy.

Jonathan: Jason's focus on outcomes rather than the technical process is a useful one to explain how important blockchain is from a practical application standpoint. As we've seen, flexibility and adaptability are two important qualities for businesses in general, but particularly during times of rapid change. With respect to that, I think it's pretty clear how blockchain technology can play a part in helping businesses understand exactly what's going on throughout the entire supply chain.
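The "hashing and adding another block to the chain" process Jonathan alludes to can be sketched minimally. This is a generic illustration of hash-linking, not any particular product: each block stores the hash of its predecessor, so rewriting any past record breaks every later link. The ledger entries shown are made-up supply chain events, and a real network would add consensus, signatures, and replication on top of this.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministic SHA-256 digest of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, record: str) -> None:
    """Link a new block to the current tip by embedding the tip's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"record": record, "prev_hash": prev})

def chain_is_valid(chain: list) -> bool:
    """Re-derive each link; any edited block breaks the chain after it."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

ledger: list = []
for record in ["shipment received", "customs cleared", "payment settled"]:
    append_block(ledger, record)

print(chain_is_valid(ledger))          # True
ledger[0]["record"] = "shipment lost"  # tamper with history...
print(chain_is_valid(ledger))          # False: later hashes no longer match
```

This is the whole mechanical trick behind the tamper-evidence discussed in the episode; everything else (permissioning, shared governance, the "team sport") is about who gets to append and who gets to read.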
I would like to thank Jason Kelly again for coming onto the show and explaining his work. To learn more about IBM's work in blockchain, visit www.ibm.com/blockchain. TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.