1 00:00:04,400 --> 00:00:07,800 Speaker 1: Welcome to Tech Stuff, a production from I Heart Radio. 2 00:00:11,840 --> 00:00:14,480 Speaker 1: This season of Smart Talks with IBM is all about 3 00:00:14,600 --> 00:00:18,480 Speaker 1: new creators, the developers, data scientists, c t o s, 4 00:00:18,560 --> 00:00:23,280 Speaker 1: and other visionaries creatively applying technology in business to drive change. 5 00:00:23,800 --> 00:00:26,680 Speaker 1: They use their knowledge and creativity to develop better ways 6 00:00:26,680 --> 00:00:30,280 Speaker 1: of working, no matter the industry. Join hosts from your 7 00:00:30,320 --> 00:00:33,880 Speaker 1: favorite Pushkin Industries podcasts as they use their expertise to 8 00:00:33,960 --> 00:00:37,480 Speaker 1: deepen these conversations, and of course Malcolm Gladwell will guide 9 00:00:37,520 --> 00:00:39,839 Speaker 1: you through the season as your host and provide his 10 00:00:39,920 --> 00:00:43,000 Speaker 1: thoughts and analysis along the way. Look out for new 11 00:00:43,040 --> 00:00:45,480 Speaker 1: episodes of Smart Talks with IBM on the I Heart 12 00:00:45,560 --> 00:00:49,159 Speaker 1: Radio app, Apple Podcasts, or wherever you get your podcasts, 13 00:00:49,360 --> 00:00:55,760 Speaker 1: and learn more at IBM dot com slash smart talks. Hello, Hello, 14 00:00:55,800 --> 00:00:59,840 Speaker 1: Welcome to Smart Talks with IBM, a podcast from Pushkin Industries, 15 00:01:00,160 --> 00:01:04,720 Speaker 1: I Heart Radio and ib M. I'm Malcolm Glamo. This season, 16 00:01:04,880 --> 00:01:09,160 Speaker 1: we're talking to new creators, the developers, data scientists, ct 17 00:01:09,319 --> 00:01:13,080 Speaker 1: os and other visionaries who are creatively applying technology in 18 00:01:13,160 --> 00:01:17,400 Speaker 1: business to drive change. Channeling their knowledge and expertise, they're 19 00:01:17,440 --> 00:01:21,880 Speaker 1: developing more creative and effective solutions, no matter the industry. 20 00:01:22,640 --> 00:01:26,760 Speaker 1: Our guest today is Nicholas Renaut, Senior Data science and 21 00:01:26,920 --> 00:01:32,119 Speaker 1: AI technical specialist at IBM. Nicholas's job is to help 22 00:01:32,160 --> 00:01:36,240 Speaker 1: companies formulate a data strategy that streamlines the way they 23 00:01:36,240 --> 00:01:41,200 Speaker 1: do business and prepares them to use sophisticated AI technologies. 24 00:01:41,840 --> 00:01:44,120 Speaker 1: But beyond his day to day, Nick is also a 25 00:01:44,160 --> 00:01:47,600 Speaker 1: content creator on YouTube, where his channel has over a 26 00:01:47,680 --> 00:01:52,520 Speaker 1: hundred thousand subscribers. His videos explain computer science concepts in 27 00:01:52,560 --> 00:01:56,240 Speaker 1: a way beginners can understand, and he often demonstrates how 28 00:01:56,280 --> 00:02:01,040 Speaker 1: to use machine learning and data science to solve novel problems. 29 00:02:01,840 --> 00:02:05,200 Speaker 1: On today's show, How Nicholas Learned Data science from the 30 00:02:05,240 --> 00:02:09,440 Speaker 1: bottom up, the fundamentals of data management, and how an 31 00:02:09,480 --> 00:02:15,840 Speaker 1: innovative data strategy can help businesses create novels solutions, Nick 32 00:02:15,919 --> 00:02:20,639 Speaker 1: spoke with Ronald Young Jr. Host of the Pushkin podcast Solvable. 33 00:02:21,200 --> 00:02:25,079 Speaker 1: Along with being a frequent contributor to NPR, Ronald also 34 00:02:25,160 --> 00:02:29,400 Speaker 1: hosts and produces the podcast Time Well Spent and Leaving 35 00:02:29,400 --> 00:02:36,200 Speaker 1: the Theater. Okay, let's get to the interview. So tell 36 00:02:36,240 --> 00:02:38,960 Speaker 1: me a little bit about how you got into data 37 00:02:39,000 --> 00:02:41,000 Speaker 1: and when you found out like the power that it 38 00:02:41,040 --> 00:02:43,520 Speaker 1: really harnesses. Do you have a story or anything that 39 00:02:43,639 --> 00:02:47,520 Speaker 1: kind of like when you first pique your interest in data. 40 00:02:48,440 --> 00:02:51,960 Speaker 1: My first interaction with data and with coding was actually 41 00:02:52,040 --> 00:02:56,359 Speaker 1: when I was around about eleven years old, So this 42 00:02:56,520 --> 00:03:00,320 Speaker 1: was really just getting started with just looking at spread sheets. 43 00:03:00,360 --> 00:03:03,880 Speaker 1: So my dad would come home and after working a 44 00:03:04,080 --> 00:03:08,680 Speaker 1: nine or five job, he actually started working with investing 45 00:03:08,840 --> 00:03:13,040 Speaker 1: in stocks and doing value based trading that way. I'll 46 00:03:13,040 --> 00:03:15,520 Speaker 1: always remember I walked up to his desk one time 47 00:03:15,880 --> 00:03:17,720 Speaker 1: and he said, Nick, if there's one thing that you 48 00:03:17,760 --> 00:03:21,399 Speaker 1: should learn, I'm seeing all these people work on these 49 00:03:21,440 --> 00:03:27,800 Speaker 1: things called macros in spreadsheets, and these people like wizards 50 00:03:27,840 --> 00:03:31,840 Speaker 1: inside of my business. I know that you're still you're 51 00:03:31,880 --> 00:03:33,720 Speaker 1: still in high school, but I really think you should 52 00:03:33,760 --> 00:03:36,880 Speaker 1: learn this stuff. And I started doubling in some Excel 53 00:03:36,960 --> 00:03:41,200 Speaker 1: spreadsheets and started just recording macros and tweaking stuff, and 54 00:03:41,240 --> 00:03:43,520 Speaker 1: that that's where it all started. But from there, it's 55 00:03:43,720 --> 00:03:47,240 Speaker 1: it's always been a recurring vein throughout my career that 56 00:03:47,280 --> 00:03:51,680 Speaker 1: I've done some sort of wizardry with data, whether it 57 00:03:51,760 --> 00:03:55,080 Speaker 1: be coding or business intelligence or data views. It's it's 58 00:03:55,160 --> 00:03:58,880 Speaker 1: always had a bit of a strain throughout throughout whatever 59 00:03:58,920 --> 00:04:02,240 Speaker 1: I've done, whether based start ups so YouTube or what 60 00:04:02,320 --> 00:04:05,160 Speaker 1: I'm doing now at IBM. Your dad was right. Let 61 00:04:05,160 --> 00:04:07,600 Speaker 1: me just say that, because that's someone who's trying to 62 00:04:07,600 --> 00:04:10,520 Speaker 1: put together a spreadsheet just to manage my personal finances. 63 00:04:10,960 --> 00:04:13,720 Speaker 1: Trying to look up the formula to actually bring a 64 00:04:13,800 --> 00:04:17,640 Speaker 1: value from what what uhet to another is enough of 65 00:04:17,640 --> 00:04:20,480 Speaker 1: a struggle for me. So I'm glad to do. Really 66 00:04:20,600 --> 00:04:24,799 Speaker 1: is it's like it's absolutely is uh so like knowing 67 00:04:24,839 --> 00:04:27,120 Speaker 1: that you know, this was how you started getting into spreadsheets. 68 00:04:27,360 --> 00:04:29,920 Speaker 1: You know you're looking at stocks and all of that. Um, 69 00:04:30,040 --> 00:04:32,960 Speaker 1: can you talk to me about how you found out 70 00:04:33,160 --> 00:04:37,120 Speaker 1: the importance of data literacy, how you begin to value 71 00:04:37,520 --> 00:04:41,400 Speaker 1: understanding what the numbers meant and what power that could have. 72 00:04:42,440 --> 00:04:46,160 Speaker 1: I got a cadet ship at one of the big 73 00:04:46,160 --> 00:04:49,480 Speaker 1: four accounting firms and started out as an orditor there, 74 00:04:49,520 --> 00:04:54,040 Speaker 1: which is pretty much data focus. So I saw that 75 00:04:54,880 --> 00:05:00,240 Speaker 1: these numbers ultimately fed into a significantly big at PA Chill, 76 00:05:00,360 --> 00:05:04,960 Speaker 1: which was a formal annual report, and numbers being wrong 77 00:05:05,040 --> 00:05:08,920 Speaker 1: in an annual report can move markets, right, those numbers 78 00:05:08,920 --> 00:05:12,159 Speaker 1: need to be absolutely bang on. But I think that 79 00:05:12,279 --> 00:05:14,920 Speaker 1: is sort of where it started. Where it really culminated 80 00:05:15,240 --> 00:05:18,680 Speaker 1: was when I started doing some work at the Reserve 81 00:05:18,720 --> 00:05:23,400 Speaker 1: Bank of Australia. And those numbers don't just impact the 82 00:05:24,440 --> 00:05:29,640 Speaker 1: metrics for a particular organization, they impact the entire countries metrics. 83 00:05:29,680 --> 00:05:32,560 Speaker 1: Getting those numbers wrong on a particular chart, or getting 84 00:05:32,560 --> 00:05:36,360 Speaker 1: them right on a particular chart can move entire organizations, 85 00:05:36,440 --> 00:05:39,560 Speaker 1: or can shift an entire country. It's kind of crazy 86 00:05:39,680 --> 00:05:45,200 Speaker 1: what the value that doing things correctly with data has. 87 00:05:45,320 --> 00:05:48,640 Speaker 1: So when you're presenting a metric, you have to ensure 88 00:05:48,760 --> 00:05:51,640 Speaker 1: that you are portraying the appropriate message. It's not just 89 00:05:51,720 --> 00:05:56,200 Speaker 1: about the wrong number, because correlation does not necessarily imply causation. 90 00:05:56,279 --> 00:05:59,520 Speaker 1: So understanding what it is that you're saying is so 91 00:05:59,520 --> 00:06:03,360 Speaker 1: so important, and it is so much more powerful now 92 00:06:03,360 --> 00:06:06,280 Speaker 1: that we've got so much more data available at our fingertips. 93 00:06:06,279 --> 00:06:07,960 Speaker 1: It's really easy to go and grab a bunch of 94 00:06:07,960 --> 00:06:10,080 Speaker 1: metrics and go, hey, I'm going to grab this data 95 00:06:10,120 --> 00:06:11,880 Speaker 1: from over here, and grab that data from over here 96 00:06:11,920 --> 00:06:14,640 Speaker 1: for a measured together. Hey, look, these two lines follow 97 00:06:14,680 --> 00:06:16,800 Speaker 1: the same trend. They must be related. Do you find 98 00:06:16,839 --> 00:06:20,480 Speaker 1: yourself ever looking at data points and say those this, 99 00:06:20,920 --> 00:06:22,720 Speaker 1: How do I don't understand this chart? Why did they 100 00:06:22,720 --> 00:06:24,560 Speaker 1: where did they pull this from? Do you find yourself 101 00:06:24,560 --> 00:06:27,160 Speaker 1: doing that a lot of your regular life. Oh yeah, 102 00:06:27,200 --> 00:06:29,680 Speaker 1: that There's there's some great charts out there as well 103 00:06:29,760 --> 00:06:31,880 Speaker 1: that you always see, and they they plot like the 104 00:06:31,960 --> 00:06:36,040 Speaker 1: number of Nicolas Cage movies against the g d P 105 00:06:36,520 --> 00:06:39,760 Speaker 1: of Bolivia or something, and it's like, well, they're going 106 00:06:39,800 --> 00:06:42,720 Speaker 1: in the same direction. They must have some relationship. But 107 00:06:43,320 --> 00:06:45,680 Speaker 1: people can really quickly look at a picture and go 108 00:06:46,200 --> 00:06:48,640 Speaker 1: and make an assumption about what that is saying without 109 00:06:48,680 --> 00:06:51,880 Speaker 1: actually interpreting. Hey, are these on the same scales? Are 110 00:06:51,880 --> 00:06:54,560 Speaker 1: they what time period is being displayed? What am I 111 00:06:54,600 --> 00:06:58,120 Speaker 1: actually looking at here? And I find myself doing this 112 00:06:58,160 --> 00:06:59,920 Speaker 1: more and more often when I just see a child 113 00:07:00,080 --> 00:07:02,799 Speaker 1: them like, hold on, let's just not make any assumption. 114 00:07:03,040 --> 00:07:05,279 Speaker 1: What is this chart actually trying to say? What is 115 00:07:05,320 --> 00:07:09,000 Speaker 1: it actually trying to portray? Because you can lie with 116 00:07:09,080 --> 00:07:12,720 Speaker 1: statistics if you know what you're doing. It is they're 117 00:07:12,760 --> 00:07:16,720 Speaker 1: so powerful and people can gloss over them so quickly. 118 00:07:16,720 --> 00:07:19,600 Speaker 1: We've got attention spans that are so much shorter these 119 00:07:19,680 --> 00:07:24,720 Speaker 1: days that it can be very very easy to take 120 00:07:24,720 --> 00:07:29,400 Speaker 1: away the wrong message. So you also produce content across 121 00:07:29,640 --> 00:07:33,960 Speaker 1: various platforms, including YouTube and your personal blog. UH as 122 00:07:34,000 --> 00:07:36,200 Speaker 1: a content creator, how did you get started in that 123 00:07:36,240 --> 00:07:38,920 Speaker 1: field and what type of content are you creating? Yeah, 124 00:07:38,960 --> 00:07:43,240 Speaker 1: that's a crazy story, right. So I always wanted to 125 00:07:43,280 --> 00:07:45,680 Speaker 1: get into tech and said, hey, I'd really really like 126 00:07:46,320 --> 00:07:49,480 Speaker 1: to work for IBM. I saw what they were doing 127 00:07:49,480 --> 00:07:52,760 Speaker 1: with Watson, and I'm like, why are people talking about 128 00:07:52,800 --> 00:07:56,640 Speaker 1: this more? And I had no affiliation with with IBM 129 00:07:56,680 --> 00:07:58,840 Speaker 1: at the time, and I'm like, well, this is so cool. 130 00:07:58,880 --> 00:08:01,600 Speaker 1: There used to be this thing or or this service 131 00:08:01,640 --> 00:08:06,160 Speaker 1: available on the cloud platform called Personality Insights, and you 132 00:08:06,200 --> 00:08:09,040 Speaker 1: could plug in a little bit of text and from 133 00:08:09,080 --> 00:08:12,920 Speaker 1: that piece of text, it would analyze that particular person's 134 00:08:12,960 --> 00:08:16,080 Speaker 1: personality based on the Big five personality traits. And there 135 00:08:16,080 --> 00:08:18,880 Speaker 1: actually used to be this demo app where you could 136 00:08:18,960 --> 00:08:20,880 Speaker 1: cook it up to a Twitter account, so I could 137 00:08:21,080 --> 00:08:25,960 Speaker 1: pass through Oprah's Twitter account or Lebron's Twitter account and 138 00:08:25,960 --> 00:08:29,960 Speaker 1: it would actually analyze their profiles. And this is so cool. 139 00:08:30,880 --> 00:08:34,360 Speaker 1: It was nuts, and I was like and a lot 140 00:08:34,360 --> 00:08:36,400 Speaker 1: of people don't know how to use this. So that 141 00:08:36,440 --> 00:08:38,720 Speaker 1: was quite possibly one of the first true toils that 142 00:08:38,760 --> 00:08:44,560 Speaker 1: I made on YouTube, and actually used a bunch of 143 00:08:44,640 --> 00:08:48,920 Speaker 1: videos that I made following after that too. Finally land 144 00:08:48,920 --> 00:08:51,360 Speaker 1: a job at IBM. I actually spammed a bunch of 145 00:08:51,400 --> 00:08:54,040 Speaker 1: links in my resume and my coverle that I was like, hey, 146 00:08:54,080 --> 00:08:56,880 Speaker 1: I'm already working with this stuff and I could do it. 147 00:08:57,240 --> 00:09:00,959 Speaker 1: And the person that hired me, she actually said that 148 00:09:00,960 --> 00:09:05,640 Speaker 1: that was like such an amazing way to portray what 149 00:09:05,640 --> 00:09:08,760 Speaker 1: what you love about what you do, that that that 150 00:09:08,840 --> 00:09:11,679 Speaker 1: had such an influencing factor in actually getting the job. 151 00:09:11,720 --> 00:09:14,400 Speaker 1: But yeah, I did it because one the tech was 152 00:09:14,600 --> 00:09:16,840 Speaker 1: so cool and I thought it was so interesting and 153 00:09:16,880 --> 00:09:22,040 Speaker 1: so powerful, and yeah, eventually that helped me land that job. 154 00:09:22,640 --> 00:09:25,240 Speaker 1: So you do a lot of tutorials where you're you're 155 00:09:25,280 --> 00:09:29,440 Speaker 1: breaking down complex topics to kind of a wider audience. 156 00:09:30,040 --> 00:09:33,240 Speaker 1: Why is that important for you to do? Yeah? I 157 00:09:33,280 --> 00:09:37,080 Speaker 1: think one of the amazing things about knowledge is it's 158 00:09:37,160 --> 00:09:38,680 Speaker 1: one of the things that you can give away and 159 00:09:38,720 --> 00:09:44,079 Speaker 1: never lose, right, And I think one of the trickiest 160 00:09:44,120 --> 00:09:48,120 Speaker 1: things about the whole data science and machine learning field 161 00:09:48,280 --> 00:09:52,400 Speaker 1: is that it can be pretty tricky to get started, 162 00:09:52,559 --> 00:09:57,960 Speaker 1: and sometimes we get hung up with learning from the 163 00:09:58,000 --> 00:10:02,599 Speaker 1: bottom up right, And there's nothing wrong with learning fundamentals 164 00:10:02,640 --> 00:10:06,200 Speaker 1: and learning foundations and really getting stuck in. But in 165 00:10:06,320 --> 00:10:09,119 Speaker 1: order to stick with something, you have to find it interesting. 166 00:10:09,240 --> 00:10:11,679 Speaker 1: So if you can see the end result and then 167 00:10:11,760 --> 00:10:14,400 Speaker 1: work your way back up and work out how that's worked. 168 00:10:15,080 --> 00:10:17,439 Speaker 1: Then it is so much more attractive because you get 169 00:10:17,480 --> 00:10:20,040 Speaker 1: that instant gratification and go, hey, I've just built this 170 00:10:20,120 --> 00:10:23,920 Speaker 1: machine learning app that is able to decode sign language. 171 00:10:23,960 --> 00:10:25,720 Speaker 1: It's so cool. Now I'm going to go and work 172 00:10:25,720 --> 00:10:28,080 Speaker 1: out the tech behind it. Admittedly, not everyone goes and 173 00:10:28,120 --> 00:10:30,920 Speaker 1: works out the tech behind it, but what I'm trying 174 00:10:30,920 --> 00:10:33,880 Speaker 1: to do is make it so that more people can 175 00:10:33,880 --> 00:10:36,680 Speaker 1: get involved and get started with it. Lately, I've been 176 00:10:36,720 --> 00:10:40,640 Speaker 1: doing these things called code that challenges, and they're kind 177 00:10:40,640 --> 00:10:44,160 Speaker 1: of crazy, right, but I love doing them. So I 178 00:10:44,240 --> 00:10:48,359 Speaker 1: have to build entire machine learning or data science applications 179 00:10:48,800 --> 00:10:54,360 Speaker 1: without looking at any reference code, stack overflow, or looking 180 00:10:54,360 --> 00:10:59,600 Speaker 1: at any documentation within fifteen minutes. So it is literally 181 00:10:59,640 --> 00:11:02,199 Speaker 1: just a trial by fire. I'll have my phone, I'll 182 00:11:02,200 --> 00:11:04,440 Speaker 1: set a time, and I'm like, all right, guys, we're on. 183 00:11:04,640 --> 00:11:08,520 Speaker 1: Like the edit is literally just cording NonStop and me 184 00:11:08,640 --> 00:11:10,600 Speaker 1: explaining on the go. But it allows people to see 185 00:11:10,840 --> 00:11:13,839 Speaker 1: and explain my thought process as I'm developing it. Um. 186 00:11:13,880 --> 00:11:17,800 Speaker 1: That's obviously super fun, right because it's highly engaging and 187 00:11:17,800 --> 00:11:20,120 Speaker 1: it shows people that, hey, you can get cited in 188 00:11:20,200 --> 00:11:24,720 Speaker 1: this relatively quickly. Nicholas is the kind of person whose 189 00:11:24,720 --> 00:11:28,080 Speaker 1: passion for data science is so great it spills over 190 00:11:28,200 --> 00:11:31,959 Speaker 1: from his professional life onto his YouTube channel. But when 191 00:11:32,000 --> 00:11:35,520 Speaker 1: he's not making videos, he's using that same expertise to 192 00:11:35,559 --> 00:11:40,200 Speaker 1: help his clients make their businesses work better. At IBM, 193 00:11:40,400 --> 00:11:44,880 Speaker 1: Nicholas works with businesses to formulate a data strategy, preparing 194 00:11:44,880 --> 00:11:47,880 Speaker 1: them to get the most out of technology like machine 195 00:11:47,960 --> 00:11:52,080 Speaker 1: learning or deep learning. He explained to Ronald why thinking 196 00:11:52,200 --> 00:11:55,480 Speaker 1: critically about the data it generates can help a company 197 00:11:55,720 --> 00:11:59,079 Speaker 1: run more efficiently. So there's a quote that you've used 198 00:11:59,080 --> 00:12:02,960 Speaker 1: in your presentation say their firms are trying to become 199 00:12:03,000 --> 00:12:07,360 Speaker 1: insights driven, but only one third report succeeding. What is 200 00:12:07,400 --> 00:12:11,320 Speaker 1: the role of creativity in the successful one third and 201 00:12:11,559 --> 00:12:14,199 Speaker 1: how are you at IBM helping to increase that number. 202 00:12:15,600 --> 00:12:19,120 Speaker 1: I remember going to a talk by our previous CEO, 203 00:12:19,360 --> 00:12:22,559 Speaker 1: and she said that there's a large number of organizations 204 00:12:22,600 --> 00:12:26,160 Speaker 1: that are just experimenting with random acts of digital So 205 00:12:26,240 --> 00:12:29,559 Speaker 1: they're just testing out some of these news technologies are 206 00:12:29,559 --> 00:12:33,120 Speaker 1: seeing kind of what's possible. But the ones that are 207 00:12:33,120 --> 00:12:36,520 Speaker 1: truly being successful are the ones that are getting there 208 00:12:37,840 --> 00:12:41,480 Speaker 1: the data ready their data strategy in play. They're the 209 00:12:41,520 --> 00:12:44,080 Speaker 1: ones that are starting to collect their data. They're starting 210 00:12:44,080 --> 00:12:46,960 Speaker 1: to get it ready and organized. They're starting to take 211 00:12:47,000 --> 00:12:50,240 Speaker 1: a look at it and starting to iterate and prototype 212 00:12:50,280 --> 00:12:53,640 Speaker 1: and in a structured manner. They're starting to roll this 213 00:12:53,720 --> 00:12:59,360 Speaker 1: stuff out. The journey to get something as sophisticated as 214 00:13:00,000 --> 00:13:03,719 Speaker 1: machine learning into production is a lot more difficult than 215 00:13:03,760 --> 00:13:08,280 Speaker 1: I think people realize because you're now building a box 216 00:13:08,480 --> 00:13:12,440 Speaker 1: that has its own rules. You haven't defined those rules yourself, 217 00:13:12,720 --> 00:13:16,200 Speaker 1: So how do you explain that when something goes right? 218 00:13:16,600 --> 00:13:19,360 Speaker 1: But how do you explain when something goes wrong? And 219 00:13:19,400 --> 00:13:23,800 Speaker 1: having governance around that is absolutely critical, which is really 220 00:13:23,800 --> 00:13:27,120 Speaker 1: whether the data strategy does come into play. So let's 221 00:13:27,200 --> 00:13:31,400 Speaker 1: let's get into more business focused data strategies. Why is 222 00:13:31,400 --> 00:13:33,720 Speaker 1: it so important to have a data strategy in place 223 00:13:34,160 --> 00:13:37,680 Speaker 1: to fuel AI modeling and how does data literacy play 224 00:13:37,679 --> 00:13:42,160 Speaker 1: a role in getting value from these models. We've got 225 00:13:42,320 --> 00:13:45,520 Speaker 1: algorithms left, right and center these days, but I think 226 00:13:45,559 --> 00:13:48,240 Speaker 1: the thing that people forget is that you can't use 227 00:13:48,280 --> 00:13:53,400 Speaker 1: any of these algorithms unless you've got data. So ensuring 228 00:13:53,440 --> 00:13:57,960 Speaker 1: that you have a structure in place too one collect 229 00:13:57,960 --> 00:14:02,400 Speaker 1: your data to organ is it three, analyze it, and 230 00:14:02,440 --> 00:14:07,600 Speaker 1: then or infused to machine learning or deep learning into it. 231 00:14:07,679 --> 00:14:10,120 Speaker 1: Is absolutely critical because if you don't collect it, you 232 00:14:10,160 --> 00:14:12,320 Speaker 1: can't do anything with it. If you don't organize it, 233 00:14:12,720 --> 00:14:15,600 Speaker 1: you can't discover what you've actually got, what the quality 234 00:14:15,640 --> 00:14:18,040 Speaker 1: looks like. You don't analyze it, you don't know whether 235 00:14:18,120 --> 00:14:20,760 Speaker 1: or not you can trust it. Um and then he 236 00:14:20,880 --> 00:14:23,080 Speaker 1: infused is always like the icing on the cake, right 237 00:14:23,360 --> 00:14:25,640 Speaker 1: to the machine learning, the deep learning, all the cool 238 00:14:25,720 --> 00:14:30,720 Speaker 1: buzzwords that people throw around. That is like the last step, 239 00:14:31,120 --> 00:14:35,120 Speaker 1: and it is always the coolest step. But you can't 240 00:14:35,400 --> 00:14:38,160 Speaker 1: ever get to that last cool step unless you've gone 241 00:14:38,200 --> 00:14:42,280 Speaker 1: through that the hard work that that's come before. Let's 242 00:14:42,400 --> 00:14:44,880 Speaker 1: like expand a little bit on the pain points for 243 00:14:44,920 --> 00:14:49,040 Speaker 1: companies when they're developing or implementing a data strategy. What 244 00:14:49,120 --> 00:14:52,360 Speaker 1: do those pain points look like? Honestly, the biggest pain 245 00:14:52,440 --> 00:14:56,800 Speaker 1: point that I see organizations actually the top two that 246 00:14:56,880 --> 00:15:00,000 Speaker 1: I see them coming back to over and over again, 247 00:15:00,040 --> 00:15:05,920 Speaker 1: and is collecting and organizing their data. So let's say, 248 00:15:05,960 --> 00:15:13,080 Speaker 1: for example, you've got a manufacturing type organization and what 249 00:15:13,120 --> 00:15:18,560 Speaker 1: they want to do is they want to improve the 250 00:15:18,680 --> 00:15:24,200 Speaker 1: production quality on a particular manufacturing line. So ideally, if 251 00:15:24,240 --> 00:15:27,360 Speaker 1: they see that they've got defective products on the manufacturing line, 252 00:15:27,360 --> 00:15:29,200 Speaker 1: they want to get rid of those sooner rather than 253 00:15:29,280 --> 00:15:31,040 Speaker 1: later because they don't want to be shipping him out 254 00:15:31,080 --> 00:15:34,520 Speaker 1: to the customer going through the whole warranty and claims 255 00:15:34,560 --> 00:15:38,440 Speaker 1: process that just costs a ton of money. So they're like, well, 256 00:15:38,840 --> 00:15:41,480 Speaker 1: it would be great to use some computer vision or 257 00:15:41,560 --> 00:15:44,240 Speaker 1: some deep learning to detect when we've got defects on 258 00:15:44,280 --> 00:15:46,360 Speaker 1: the product line, and then we can grab those and 259 00:15:46,440 --> 00:15:50,280 Speaker 1: rip them out. Somebody along the line is like, great, 260 00:15:50,520 --> 00:15:53,360 Speaker 1: let's go and do it. The first stumbling block that 261 00:15:53,440 --> 00:15:55,960 Speaker 1: you're going to trip up at is, hold on, do 262 00:15:56,000 --> 00:16:00,440 Speaker 1: you have any images of defective products from example cameras 263 00:16:00,520 --> 00:16:03,280 Speaker 1: that are looking at that production line. So if you 264 00:16:03,360 --> 00:16:06,320 Speaker 1: haven't gone and collected images of that or video of that, 265 00:16:07,440 --> 00:16:10,040 Speaker 1: there is no way in hell that you can actually 266 00:16:10,080 --> 00:16:16,080 Speaker 1: go and build that system to improve your organizational productivity. 267 00:16:16,160 --> 00:16:20,080 Speaker 1: So knowing well in advance what data you're likely to 268 00:16:20,240 --> 00:16:24,320 Speaker 1: need is absolutely critical. It is the first step in 269 00:16:24,360 --> 00:16:29,720 Speaker 1: the data science life cycle. So collecting, understanding, and exploring 270 00:16:29,760 --> 00:16:33,880 Speaker 1: your data is the absolute first step. The second one 271 00:16:34,160 --> 00:16:37,760 Speaker 1: is a little bit more interesting. So let's say, for example, 272 00:16:38,880 --> 00:16:41,480 Speaker 1: you sort of want to get in on the craze 273 00:16:41,520 --> 00:16:44,680 Speaker 1: that is data science or machine learning, and you bring 274 00:16:44,680 --> 00:16:49,600 Speaker 1: on a data science team. The next biggest stumbling block 275 00:16:49,680 --> 00:16:52,480 Speaker 1: that I find a lot of organizations trip up on 276 00:16:52,760 --> 00:16:55,560 Speaker 1: is discovering their data. They've got a ton of data, 277 00:16:55,600 --> 00:16:59,160 Speaker 1: but nobody knows what they've got. So being able to 278 00:16:59,240 --> 00:17:03,520 Speaker 1: find so to discover, rate, review, and rank that information 279 00:17:04,520 --> 00:17:09,400 Speaker 1: is paramount because you'll have people come in and go okay. 280 00:17:09,480 --> 00:17:12,800 Speaker 1: So a line managers approached me and said that we 281 00:17:12,840 --> 00:17:15,480 Speaker 1: want to take a look at our top performing customers 282 00:17:15,600 --> 00:17:18,320 Speaker 1: and we want to build a retention strategy so we're 283 00:17:18,359 --> 00:17:22,119 Speaker 1: not losing customers anymore. Well, your data scientists is then 284 00:17:22,160 --> 00:17:24,359 Speaker 1: going to go, well, do we have data of customers 285 00:17:24,359 --> 00:17:27,480 Speaker 1: that have left previously. If you can't easily search and 286 00:17:27,520 --> 00:17:29,640 Speaker 1: find out what you've got, that makes it pretty hard 287 00:17:29,680 --> 00:17:33,919 Speaker 1: to go and build those models. So collecting, organizing, and 288 00:17:33,960 --> 00:17:38,800 Speaker 1: discovering really absolutely critical, but that they can be a 289 00:17:38,800 --> 00:17:42,400 Speaker 1: little bit tricky to handle in a large number of organizations. 290 00:17:42,920 --> 00:17:46,480 Speaker 1: What kind of supporting technology and new solutions do we 291 00:17:46,560 --> 00:17:50,679 Speaker 1: need to meet growing data management issues? It really comes 292 00:17:50,720 --> 00:17:53,679 Speaker 1: down to two a few things. So ensuring that you 293 00:17:53,680 --> 00:17:56,520 Speaker 1: can one collect the types of data that you're looking at. 294 00:17:56,680 --> 00:17:59,959 Speaker 1: So I think when people think of data, they're always 295 00:18:00,119 --> 00:18:02,280 Speaker 1: thinking of hate it's just going to be a bunch 296 00:18:02,320 --> 00:18:04,600 Speaker 1: of spreadsheets. It might just be stuff that we can 297 00:18:04,600 --> 00:18:08,080 Speaker 1: throw into a database. But there is so much more 298 00:18:08,160 --> 00:18:10,399 Speaker 1: out there. Right, there's video, how do we store that? 299 00:18:10,480 --> 00:18:15,160 Speaker 1: How do we hold that? There is images, there's natural text. 300 00:18:15,240 --> 00:18:19,159 Speaker 1: Like we're just talking about ensuring that you've got appropriate 301 00:18:19,200 --> 00:18:22,400 Speaker 1: processes in place to be able to store holding catalog 302 00:18:22,520 --> 00:18:27,000 Speaker 1: that I think is absolutely critical. We talked a little 303 00:18:27,000 --> 00:18:30,359 Speaker 1: bit about data cataloging and the need to be able 304 00:18:30,440 --> 00:18:35,280 Speaker 1: to search and discover that data that is absolutely paramount. 305 00:18:35,359 --> 00:18:37,199 Speaker 1: Once you've got it collected, how do you find it? 306 00:18:38,960 --> 00:18:42,680 Speaker 1: What is IBM's unique approach to facilitating access to data 307 00:18:42,720 --> 00:18:48,440 Speaker 1: within companies. So one of the biggest things, and one 308 00:18:48,480 --> 00:18:50,840 Speaker 1: of the my favorite things that I get to work with, 309 00:18:51,160 --> 00:18:54,479 Speaker 1: is a particular tool set, right, and this tool set 310 00:18:54,560 --> 00:18:57,480 Speaker 1: is called cloud Path for Data. So, without getting too 311 00:18:57,480 --> 00:19:00,880 Speaker 1: pitchy that the absolutely amazing thing about this is that 312 00:19:01,640 --> 00:19:05,720 Speaker 1: those stages that I was talking about, right, so collect, organized, analyzing, infused, 313 00:19:06,200 --> 00:19:09,040 Speaker 1: it actually helps facilitate each one of those stages, right, 314 00:19:09,520 --> 00:19:13,440 Speaker 1: So you can actually collect, store, and hold your data 315 00:19:13,480 --> 00:19:16,960 Speaker 1: in a secure and government place. You've got data catalog 316 00:19:17,080 --> 00:19:19,480 Speaker 1: in capabilities which allows you to search. Like one of 317 00:19:19,480 --> 00:19:23,080 Speaker 1: my favorite things is that you might have a data set, right, 318 00:19:23,119 --> 00:19:24,920 Speaker 1: So I might be a data scientist, and then we 319 00:19:25,000 --> 00:19:28,040 Speaker 1: might have another data scientist on the team. I can 320 00:19:28,080 --> 00:19:30,399 Speaker 1: have a data set inside of there, and I can 321 00:19:30,440 --> 00:19:33,000 Speaker 1: actually rank it and add comments and go, hey, just 322 00:19:33,040 --> 00:19:35,040 Speaker 1: be wary of this column with lot certain features that 323 00:19:35,080 --> 00:19:38,879 Speaker 1: you need to be mindful of, and that provides additional 324 00:19:38,960 --> 00:19:43,480 Speaker 1: metadata understand what is what my data actually looks like 325 00:19:43,560 --> 00:19:47,120 Speaker 1: and and things that I should be mindful for. So 326 00:19:47,359 --> 00:19:52,119 Speaker 1: I'm I'm Joe employee. How can data be helpful to me? 327 00:19:53,160 --> 00:19:57,320 Speaker 1: Great question? So, I mean data is impacting everyone, right, 328 00:19:57,359 --> 00:20:02,040 Speaker 1: whether you you like it or not. Um and more 329 00:20:02,080 --> 00:20:04,520 Speaker 1: often than not, what you're going to find is that 330 00:20:04,600 --> 00:20:09,000 Speaker 1: you can improve whatever it is that you do by 331 00:20:09,080 --> 00:20:12,880 Speaker 1: by looking at that data, whether it's let's take an 332 00:20:12,960 --> 00:20:17,080 Speaker 1: organization out of it. If you use sleep trackers, you 333 00:20:17,119 --> 00:20:20,640 Speaker 1: can begin to see when you're sleep or when you're 334 00:20:20,640 --> 00:20:23,919 Speaker 1: getting good quality sleep versus when you're getting bad quality sleep. 335 00:20:24,240 --> 00:20:27,520 Speaker 1: If you start to collect additional data points like hey, 336 00:20:27,840 --> 00:20:31,600 Speaker 1: am I drinking enough water during the day, am I 337 00:20:31,720 --> 00:20:34,399 Speaker 1: doing certain things like looking at my phone just before 338 00:20:34,480 --> 00:20:37,199 Speaker 1: I go to bed? Are these things influencing my sleep? 339 00:20:37,680 --> 00:20:41,800 Speaker 1: And is that causing a negative impact on my quality 340 00:20:41,840 --> 00:20:45,120 Speaker 1: of life? So that's taking a broader view of it. 341 00:20:45,440 --> 00:20:48,560 Speaker 1: But when you step into a team or a business view, 342 00:20:49,600 --> 00:20:52,959 Speaker 1: data can can make your life a billion times easier. 343 00:20:53,440 --> 00:20:56,359 Speaker 1: If you know that there's a particular issue in a 344 00:20:56,440 --> 00:20:59,960 Speaker 1: system earlier on in a data pipeline, before something crosses 345 00:21:00,000 --> 00:21:02,359 Speaker 1: your desk, you might go and say, hey, look, if 346 00:21:02,359 --> 00:21:05,159 Speaker 1: we just changed how we collected these pieces of information, 347 00:21:05,640 --> 00:21:08,159 Speaker 1: if we just transformed what we actually did with it, 348 00:21:08,400 --> 00:21:11,000 Speaker 1: this is going to streamline my entire workflow and and 349 00:21:11,119 --> 00:21:14,439 Speaker 1: help me out. But not only that. Right, So I 350 00:21:14,560 --> 00:21:18,080 Speaker 1: work a little bit with the automation team, and they're 351 00:21:18,119 --> 00:21:21,320 Speaker 1: really big on robotic process automation. Let's say you're doing 352 00:21:21,400 --> 00:21:25,080 Speaker 1: something each and every single day. You're copying a far 353 00:21:25,359 --> 00:21:28,600 Speaker 1: from here to there. You're grabbing some information from a website, 354 00:21:28,600 --> 00:21:30,919 Speaker 1: You're throwing it into a form, and you have to 355 00:21:30,960 --> 00:21:34,000 Speaker 1: do that twenty times a day. There are tools that 356 00:21:34,119 --> 00:21:37,040 Speaker 1: can automate that entire process for you, and they're smart. 357 00:21:37,160 --> 00:21:39,399 Speaker 1: They're not just looking at where you're clicking on the page. 358 00:21:39,440 --> 00:21:42,479 Speaker 1: They're looking at what applications you're opening, They're looking at 359 00:21:42,480 --> 00:21:45,600 Speaker 1: what fields you're pulling data out of. You can automate 360 00:21:45,640 --> 00:21:47,919 Speaker 1: those entire workflows. That means that you don't have to 361 00:21:47,960 --> 00:21:51,200 Speaker 1: do that repetitive, kind of boring work that you don't 362 00:21:51,240 --> 00:21:53,760 Speaker 1: really want to do. You can palm that off and 363 00:21:54,000 --> 00:21:56,240 Speaker 1: do the robot and do the stuff that you actually 364 00:21:56,240 --> 00:21:59,359 Speaker 1: really want to get involved in. As Nicholas said, the 365 00:21:59,359 --> 00:22:02,359 Speaker 1: way a company leverages this data has an impact on 366 00:22:02,560 --> 00:22:06,359 Speaker 1: every level of the business. Data informs how we do 367 00:22:06,440 --> 00:22:08,960 Speaker 1: our jobs day to day and how we plan for 368 00:22:08,960 --> 00:22:12,560 Speaker 1: the future. Having an open mindset about data makes it 369 00:22:12,640 --> 00:22:16,240 Speaker 1: easier for a business to come up with creative solutions. 370 00:22:17,040 --> 00:22:20,680 Speaker 1: In the next part of their conversation, Ronald asked Nicholas 371 00:22:20,720 --> 00:22:25,520 Speaker 1: how data science and creativity come together. So let's talk 372 00:22:25,520 --> 00:22:27,440 Speaker 1: a little bit more about creativity. We talked a little 373 00:22:27,440 --> 00:22:29,760 Speaker 1: bit about your YouTube channel, UH and how you use 374 00:22:29,800 --> 00:22:32,760 Speaker 1: that to help people get started with data science. What 375 00:22:32,880 --> 00:22:35,879 Speaker 1: does creativity mean to you and do you see your 376 00:22:35,880 --> 00:22:39,920 Speaker 1: work as creative? I definitely say my work as creative, 377 00:22:40,240 --> 00:22:47,960 Speaker 1: and I think creativity is truly thinking outside of the 378 00:22:48,000 --> 00:22:52,439 Speaker 1: box and looking at just different ways of doing things. 379 00:22:53,119 --> 00:22:57,080 Speaker 1: I think the biggest thing that I try to embody 380 00:22:57,240 --> 00:23:00,879 Speaker 1: is having an open mindset and really in never being 381 00:23:00,880 --> 00:23:04,720 Speaker 1: willing to shut something down or not look at a 382 00:23:04,760 --> 00:23:10,320 Speaker 1: particular solution or option, because you really never know where 383 00:23:10,440 --> 00:23:13,280 Speaker 1: a particular solution might come from. If you look at 384 00:23:13,600 --> 00:23:17,920 Speaker 1: where some of the advancements in that the medical field 385 00:23:17,920 --> 00:23:21,679 Speaker 1: are coming from, it's because they're being open to new ideas, 386 00:23:22,400 --> 00:23:28,240 Speaker 1: new materials, new ingredients, new recipes, new technologies. Having an 387 00:23:28,240 --> 00:23:32,159 Speaker 1: open mindset really helps improve that that that ability to 388 00:23:32,200 --> 00:23:35,800 Speaker 1: solve complex problems. And I think for me, creativity is 389 00:23:35,800 --> 00:23:38,280 Speaker 1: really just having that that open mindset. Tell me a 390 00:23:38,320 --> 00:23:42,080 Speaker 1: little bit about how you approach novel problems. What do 391 00:23:42,160 --> 00:23:46,040 Speaker 1: you do when you get stuck? I think the most 392 00:23:46,160 --> 00:23:51,480 Speaker 1: important thing I really like when I push myself to 393 00:23:51,600 --> 00:23:55,479 Speaker 1: do something that I've personally never done before, and a 394 00:23:55,520 --> 00:24:01,600 Speaker 1: lot of the time that yields new solutions to problems 395 00:24:01,680 --> 00:24:04,760 Speaker 1: that that that might be really difficult to solve. It 396 00:24:04,800 --> 00:24:07,640 Speaker 1: doesn't necessarily need to be using this particular set of techniques. 397 00:24:07,720 --> 00:24:10,280 Speaker 1: It's what else can we do to solve this problem? 398 00:24:10,359 --> 00:24:13,280 Speaker 1: And sometimes like it'll be staring you in the face 399 00:24:13,359 --> 00:24:16,160 Speaker 1: and you'll just have no idea until you go, hey, 400 00:24:16,160 --> 00:24:17,680 Speaker 1: I'm going to throw everything out of the box and 401 00:24:17,920 --> 00:24:20,760 Speaker 1: just give it a crack and see what is possible. Um. 402 00:24:21,040 --> 00:24:24,159 Speaker 1: But sometimes it does require that that little bit of 403 00:24:24,280 --> 00:24:28,600 Speaker 1: grit to to push yourself to see just what is possible. 404 00:24:28,680 --> 00:24:32,080 Speaker 1: And I think that's when I've come up with some 405 00:24:32,240 --> 00:24:36,359 Speaker 1: of my favorite things that I've ever done, so something 406 00:24:36,359 --> 00:24:39,399 Speaker 1: that I'm trying to adopt in my in my daily life. 407 00:24:39,440 --> 00:24:43,160 Speaker 1: And I'm reading a lot more about stoicism and philosophy, 408 00:24:43,280 --> 00:24:46,120 Speaker 1: and I'm seeing that you kind of really just got 409 00:24:46,119 --> 00:24:48,800 Speaker 1: to push through sometimes to to see what what's on 410 00:24:48,840 --> 00:24:52,280 Speaker 1: the other side. We talked a little bit earlier about 411 00:24:52,640 --> 00:24:56,480 Speaker 1: how folks can take bits of data and kind of 412 00:24:56,480 --> 00:24:59,320 Speaker 1: tell their own story with it, especially if they if 413 00:24:59,320 --> 00:25:02,679 Speaker 1: they know the story that they're trying to tell. But 414 00:25:02,800 --> 00:25:06,800 Speaker 1: let's talk about using that for good. How does creativity 415 00:25:06,840 --> 00:25:10,520 Speaker 1: play a role in data storytelling. I think there's just 416 00:25:11,359 --> 00:25:14,520 Speaker 1: so much good that you can do with data that 417 00:25:15,800 --> 00:25:20,080 Speaker 1: if you have that in your core ethos, then the 418 00:25:20,080 --> 00:25:22,879 Speaker 1: world's your oyster, right. I always come back to my 419 00:25:22,960 --> 00:25:26,399 Speaker 1: favorite project that I've ever done, and that was using 420 00:25:26,520 --> 00:25:29,240 Speaker 1: computer vision to try to decode sign language. It is 421 00:25:29,280 --> 00:25:31,800 Speaker 1: by no means a state of the art model. But 422 00:25:31,840 --> 00:25:34,680 Speaker 1: I forget hold on why is never nobody ever approached 423 00:25:34,720 --> 00:25:37,160 Speaker 1: this or at least shared how they've tried to do it. 424 00:25:37,320 --> 00:25:40,440 Speaker 1: And I've kind of just had to get real creative 425 00:25:40,440 --> 00:25:44,080 Speaker 1: and trying to build that I had. I literally spent 426 00:25:44,280 --> 00:25:47,120 Speaker 1: weeks just trying to install stuff then trying to get 427 00:25:47,160 --> 00:25:49,480 Speaker 1: it running on my computer before I even got anywhere 428 00:25:49,600 --> 00:25:54,200 Speaker 1: near building that particular model. And and it's super hardcore 429 00:25:54,480 --> 00:25:56,240 Speaker 1: in terms of trying to get it set up. But 430 00:25:56,520 --> 00:26:01,000 Speaker 1: there's so many opportunities for good, whether that's improve accessibility 431 00:26:01,240 --> 00:26:06,280 Speaker 1: to certain technologies, improving the quality of life for people 432 00:26:06,359 --> 00:26:09,160 Speaker 1: that could benefit from us using data a little bit better. 433 00:26:09,520 --> 00:26:12,919 Speaker 1: There's a large body of work with a bunch of 434 00:26:12,920 --> 00:26:18,400 Speaker 1: different data scientists where they're actually building language translation models 435 00:26:18,400 --> 00:26:23,560 Speaker 1: for languages which aren't hyper popular or aren't as widely 436 00:26:23,600 --> 00:26:27,200 Speaker 1: spread as we might see in our day to day lives. 437 00:26:27,240 --> 00:26:31,719 Speaker 1: If you look at India, there are a turn of dialects. 438 00:26:31,760 --> 00:26:35,359 Speaker 1: If you look at even where my parents from Mauritius, 439 00:26:35,359 --> 00:26:41,120 Speaker 1: there's there's a whole completely separate dialect where if you've 440 00:26:41,160 --> 00:26:44,439 Speaker 1: never heard it before, you were like this just slang French, 441 00:26:44,560 --> 00:26:48,120 Speaker 1: but no it's it. It's like um, it's its whole 442 00:26:48,160 --> 00:26:52,320 Speaker 1: separate language that obviously allows or improves the ability for 443 00:26:52,359 --> 00:26:56,560 Speaker 1: people to to tap into data and do a little 444 00:26:56,600 --> 00:26:59,359 Speaker 1: bit of good. But there's so much I mean, people 445 00:26:59,359 --> 00:27:03,600 Speaker 1: are using medical image data to improve medical segmentation and 446 00:27:03,680 --> 00:27:08,800 Speaker 1: improve diagnoses. There's just so much amazing work that that's 447 00:27:08,800 --> 00:27:12,680 Speaker 1: happening in that space. There is obviously the temptation or 448 00:27:12,760 --> 00:27:15,280 Speaker 1: used data for bad, but I'd like to think that 449 00:27:15,680 --> 00:27:19,040 Speaker 1: the large majority of the community are really trying to 450 00:27:19,160 --> 00:27:22,920 Speaker 1: use it for good. You started talking about a little 451 00:27:22,960 --> 00:27:25,439 Speaker 1: bit just now, But what are some future trends and 452 00:27:25,520 --> 00:27:29,560 Speaker 1: challenges and future topics or projects you're excited about anything 453 00:27:29,560 --> 00:27:35,119 Speaker 1: in particular looking real further forward, what I'm super excited 454 00:27:35,160 --> 00:27:37,920 Speaker 1: about And I still don't know how it's necessarily going 455 00:27:37,960 --> 00:27:40,000 Speaker 1: to impact me, whether or not that's going to change 456 00:27:40,000 --> 00:27:42,919 Speaker 1: my experience as a developer or not. That we've got 457 00:27:43,000 --> 00:27:45,880 Speaker 1: quantum computers coming right, there's a ton of work that's 458 00:27:45,920 --> 00:27:50,960 Speaker 1: happening in that space. It's going to radically shift how 459 00:27:51,160 --> 00:27:55,120 Speaker 1: large a machine learning model we're able to create, how 460 00:27:55,200 --> 00:27:58,359 Speaker 1: fast we're able to train them. I'm just excited to 461 00:27:58,400 --> 00:28:01,800 Speaker 1: see what happens in that space. I'm not a quantum 462 00:28:01,800 --> 00:28:05,440 Speaker 1: physicist by any means, but I'm still excited to see 463 00:28:05,480 --> 00:28:07,199 Speaker 1: what I'll be able to do with him in the future. 464 00:28:08,480 --> 00:28:12,040 Speaker 1: I love that, as you'all contingent filt this technology, you're 465 00:28:12,080 --> 00:28:14,720 Speaker 1: excited to play with it after it's built, which I'm 466 00:28:15,920 --> 00:28:19,119 Speaker 1: I don't want to have to build it. Nicholas for 467 00:28:19,240 --> 00:28:22,080 Speaker 1: not thank you so much for a talk of me today. 468 00:28:22,320 --> 00:28:25,320 Speaker 1: It's been an absolute pleasure. Thank you so much for 469 00:28:25,520 --> 00:28:31,360 Speaker 1: your insightful questions. It's it's been awesome. Ronald Nick made 470 00:28:31,400 --> 00:28:34,480 Speaker 1: a point that I think is important to remember when 471 00:28:34,520 --> 00:28:38,520 Speaker 1: it comes to technologies ability to improve our businesses, or 472 00:28:38,560 --> 00:28:41,800 Speaker 1: make our jobs easier, or even do social good, A 473 00:28:41,880 --> 00:28:47,040 Speaker 1: thoughtful data strategy is always the first stepping stone. Without 474 00:28:47,040 --> 00:28:51,280 Speaker 1: good data, using machine learning or artificial intelligence to create 475 00:28:51,440 --> 00:28:58,160 Speaker 1: innovative solutions becomes much much harder. Our technology gets more 476 00:28:58,280 --> 00:29:02,040 Speaker 1: sophisticated every day, but that doesn't mean we should lose 477 00:29:02,080 --> 00:29:05,080 Speaker 1: sight of the fundamentals. If we want to get the 478 00:29:05,160 --> 00:29:10,520 Speaker 1: most out of smarter technologies, better business decisions, more optimized technology, 479 00:29:10,760 --> 00:29:16,760 Speaker 1: fresh and unexpected insights, we're going to need smarter data strategy. 480 00:29:17,960 --> 00:29:21,040 Speaker 1: On the next episode of Smart Talks, with IBM the 481 00:29:21,120 --> 00:29:26,200 Speaker 1: power of Salesforce to transform the customer experience. We talked 482 00:29:26,200 --> 00:29:30,320 Speaker 1: with Phil Weinmeister had a product for Salesforce America's at IBM, 483 00:29:30,360 --> 00:29:35,560 Speaker 1: consulting about transforming digital experiences with the power of Salesforce 484 00:29:36,040 --> 00:29:40,520 Speaker 1: and IBM. Smart Talks with IBM is produced by Matt Romano, 485 00:29:40,960 --> 00:29:45,920 Speaker 1: David jaw, Royston Deserve, and Edith Rousselo with Jacob Goldstein. 486 00:29:46,480 --> 00:29:50,720 Speaker 1: Were edited by Sophie crane Are. Engineers are Jason Gambrel, 487 00:29:51,200 --> 00:29:56,680 Speaker 1: Sarah Brugare and Ben Holliday. Theme song by Granmascope. Special 488 00:29:56,680 --> 00:30:00,720 Speaker 1: thanks to Carlie mcglory, Andy Kelly, Kathy cal Hand and 489 00:30:00,840 --> 00:30:03,520 Speaker 1: the eight Bar and IBM teams, as well as the 490 00:30:03,520 --> 00:30:07,280 Speaker 1: Pushkin marketing team. Smart Talks with IBM is a production 491 00:30:07,280 --> 00:30:10,760 Speaker 1: of Pushkin Industries and I Heart Media. To find more 492 00:30:10,800 --> 00:30:15,240 Speaker 1: Pushkin podcasts, listen on the I Heart Radio app, Apple Podcasts, 493 00:30:15,560 --> 00:30:20,360 Speaker 1: or wherever you listen to podcasts. Hi'm Malcolm Gladwell. This 494 00:30:20,440 --> 00:30:29,960 Speaker 1: is a paid advertisement from IBM.