Jonathan Strickland: Get in touch with technology with TechStuff from howstuffworks.com.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with How Stuff Works and iHeartRadio, and I love all things tech. If today's episode sounds a little different, that's because it's a special episode, too. I'm recording it on location in San Francisco, California, so you might occasionally hear some traffic noises, some hotel noises. Maybe you'll hear a bell of the famous San Francisco trolley. Perhaps you'll even hear bagpipers, because we did. But I'm here in San Francisco for a specific reason: IBM invited me to fly out here and attend the Think 2019 conference and really get an up-close-and-personal view of some of the innovations and services the company is rolling out to their clients and their business partners, and I really wanted to share with you my own takeaways from this event.

Now, before I jump into all this: IBM is a business-to-business entity, meaning that if you're an average Jonathan like me, you rarely deal directly with IBM. But the company is one of those leading entities that provides the tech that other companies use in order to do their business. So while it may or may not be obvious, there's a lot of stuff that we encounter in our day-to-day lives that's powered by IBM. This episode is the first of four special episodes about the conference and the technology and innovations that are at the bleeding edge of deployment, and today we're going to focus on artificial intelligence, something that you could argue is almost synonymous with IBM.

You guys know that AI is one of my favorite topics to talk about, and it can be easy to fall into the trap of thinking about AI as some sort of nebulous intelligence living in a machine. But when you strip away the veil of mystery, you'll see that AI is just another part of computer science.
It might rely on one architecture over another, or it might require an artificial neural network approach, depending upon the application, but really it just comes down to a series of special algorithms designed to handle information in a way that allows a computer to make decisions. It's sophisticated and it's fascinating, but it's not magic.

However, you might be forgiven for thinking of it as magic if you happened to witness the exchange between IBM's AI system Project Debater and grand champion debater Harish Natarajan. The debate between man and machine happened a day before the official start of the conference. The two participants were not told their topic until fifteen minutes before the debate was to begin, and only then were they given their stance on that topic. They each had four minutes to establish their positions on the subject. Then, after a short break, they had another four minutes to offer a rebuttal of their opponent's stance, and then, one short break later, they had two minutes to summarize their arguments.

The debate topic turned out to be "preschools should be subsidized." Project Debater, who, by the way, has a gender (Project Debater is a she), was given the pro stance on that particular argument, and Mr. Natarajan got the counter stance, the idea that preschools should not be subsidized.

I am not going to go through a blow-by-blow of the debate. For one thing, you can actually listen to it yourself. Intelligence Squared, which is a show dedicated to civil debate on a wide array of topics, played host to this particular special debate. I urge you to seek out that podcast or a video of the debate if you want to see how it unfolded for yourself, bit by bit. I really just want to talk more about the process that was involved.
So, to debate, no matter what you are, whether you're human or machine, you need to have an understanding of what it is you're either arguing for or against. That's a pretty obvious statement, but I feel like I have to lay it out that way. You need to be able to form an argument, and you have to be able to support that argument logically. You want to build your argument so that one part leads inevitably into the next part and it all supports the stance you have, whether it be for or against a particular proposal. This is a non-trivial task for a human being, and it is an incredible challenge for computers.

Project Debater has about ten billion sentences' worth of data stored in its memory. So when it gets a topic, first it has to scour all of the information in its memory banks and look for relevant information related to that topic. Then it has to go a step further. It can't just pull up any random information about the topic; it has to understand whether the information actually supports its argument. That is, the computer has to make sure it is picking information that is aligned with its debate position and not actually against its debate position.

This falls into the field of natural language processing, and I've talked a lot about this too, but in short, this describes the area of computer science in which we try to find ways for machines to suss out the meaning of actual human language. At the basic level, computers communicate in machine code and we communicate in human languages. Machines don't natively understand human language, just as machine code would appear to be nonsense to us. The journey to creating powerful systems that use natural language processing to figure out the meaning of words, whether they're written or spoken, has been a really long one, and many people have made advancements, sometimes from completely different perspectives. We've gotten a lot better at this in general, but it's still a challenging problem.
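To make that two-step idea concrete (find on-topic material, then keep only what backs your side), here's a deliberately tiny sketch. The corpus, keyword lists, and scoring are all invented for illustration; Project Debater's real pipeline is far more sophisticated than keyword matching.

```python
# Toy sketch of stance-aligned retrieval: first filter for relevance to the
# topic, then keep only sentences whose (crude) stance matches the assigned
# side. Everything here is made up for illustration purposes.

CORPUS = [
    "Subsidized preschool improves long-term outcomes for children.",
    "Preschool subsidies waste public money better spent elsewhere.",
    "Early education gives disadvantaged kids a valuable head start.",
    "The stock market closed higher on Tuesday.",
]

TOPIC_WORDS = {"preschool", "subsidized", "subsidies", "education"}
PRO_WORDS = {"improves", "valuable", "head", "start"}
CON_WORDS = {"waste", "unnecessary", "harmful"}

def is_relevant(sentence):
    # Step 1: crude relevance test based on shared vocabulary with the topic.
    return bool(TOPIC_WORDS & set(sentence.lower().rstrip(".").split()))

def stance_score(sentence):
    # Step 2: crude stance test. Positive means pro, negative means con.
    words = set(sentence.lower().rstrip(".").split())
    return len(words & PRO_WORDS) - len(words & CON_WORDS)

# Project Debater argued pro, so keep only on-topic, pro-leaning sentences.
pro_evidence = [s for s in CORPUS if is_relevant(s) and stance_score(s) > 0]
print(pro_evidence)
```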
So think about Google Search for a second. When you search for a topic, you type your search terms into Google, and then you look at the results. You typically get a pretty wide variety of responses. Some of them are going to be more relevant than others. You might even get a few that aren't relevant at all. Google's algorithms attempt to guess at which responses will be the most relevant based on your search, and sometimes on supplemental information like your search history. But sometimes the results aren't ordered in a way that you would prefer. You might get an okay response at the top and maybe a better or more relevant one two or three spots down. Typically it ends up being on the first page, but sometimes it can even be buried lower down in the search results.

Project Debater can't just do a simple search and return results on key terms, or else it might end up spouting out gibberish. It could string together two or three sentences that contradict each other, and that wouldn't do anyone any good. And that leads me to another point. It's not good enough just to grab relevant information that aligns with the argument's stance. That's necessary, but it's not enough. Those statements have to be ordered properly. You have to build support for your stance. You have to have a logical progression. A good argument needs that, from the opening to the closing, so you need to make sure there's a flow of information. Otherwise all you'll get is a series of relevant points in no particular order, with no transitions from point to point. It would be jarring and it would be ineffective.

Project Debater could also support arguments with evidence, which is kind of cool. Throughout the debate, we heard the system cite various studies and quote experts in the field to provide support for her stance. This was pretty compelling stuff, and this is where Project Debater could be incredibly helpful for people who want to argue for or against, well, anything.
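Before moving on, here's one more toy sketch, this time of the ordering problem I mentioned a moment ago: relevant points only become an argument when they're arranged in a logical progression. The claims, the themes, and the hand-picked narrative order are all invented; a real system would have to discover that structure for itself.

```python
# Toy sketch of argument ordering: relevant points alone aren't an argument.
# Tag each claim with a theme, then sort by a narrative order so the result
# flows from problem to conclusion. All claims and themes are invented.

claims = [
    ("conclusion", "Therefore, subsidies pay for themselves over time."),
    ("problem",    "Many families cannot afford quality preschool."),
    ("evidence",   "Studies link early education to higher graduation rates."),
    ("proposal",   "Subsidies would put quality preschool within reach."),
]

NARRATIVE_ORDER = ["problem", "proposal", "evidence", "conclusion"]

for theme, text in sorted(claims, key=lambda c: NARRATIVE_ORDER.index(c[0])):
    print(f"{theme:>10}: {text}")
```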
Really, it's the one area where I would say Project Debater had an enormous advantage over the human champion. Harish Natarajan understands how to create a logical, persuasive argument and how to find weaknesses in the arguments of opponents, but he can't research a library's worth of information in fifteen minutes in preparation for a debate. Project Debater can.

However, I wouldn't feel too badly for Harish. He held the advantage in lots of other ways. For example, during the debate, he brought up a criticism of Project Debater's argument, pointing out that one of her conclusions was asserted without first establishing the evidence needed to support it. In a debate between human champions, you would likely hear a response to this during the rebuttal phase, perhaps including some of the evidence that had been left out previously. Project Debater didn't really address that criticism.

We also found out at the end of the debate that typically the Intelligence Squared format would include another round, in which the moderator would ask critical questions of each of the participants in order to test their arguments and their logic. This is done in the normal debates on Intelligence Squared, so if you listen to other examples, you'll hear that round. But Project Debater, while really impressive, isn't quite up to the task of handling that sort of response just yet. And since this was really a showcase for the technology, that round was not included in this debate.

I was really impressed by Project Debater, and as Harish pointed out after the exchange, the technology has the potential to really help people get a deeper understanding of complex topics. They can use it to help them support their arguments on any given stance, or, and this is something I think is equally important, they could use Project Debater to produce counterarguments to their own stances.
Then they might either better learn how to argue against those who oppose them, anticipating the arguments that would be put up against a specific stance, or it might actually change their own mind about the entire subject. It might be that you have a preconceived idea of what is right, and then you get the information from Project Debater and you start to question those ideas. You might change your mind. That leads me into the next section: IBM's general philosophy about artificial intelligence. But before we get into that, let's take a quick break.

A common thread that was once in science fiction and now tends to be in today's headlines is the impact that automation and artificial intelligence will have on the workforce. On the most pessimistic side, there's a fear that these technologies are going to eliminate millions of jobs and that we'll be plunged into an economic crisis, perhaps requiring an entire overhaul of how we think about work and money. And there will no doubt be jobs that will become either completely automated or automated to the point that fewer humans will be needed to carry out the same amount of work over time. So there's definitely some validity to that fear. But many entities, IBM among them, say that we're probably not going to see anything quite so dramatic as a jobs apocalypse. Instead, IBM's vision is one in which artificial intelligence acts sort of like a super-smart, super-efficient assistant to aid us in our jobs. The tedious or difficult parts of jobs that humans find troubling could be handled by artificial intelligence, whereas the parts of jobs that are easy for humans but not so easy for machines would still need a person in that position, fulfilling those parts of the job duties. And artificial intelligence will necessitate new positions in order to oversee the systems, to maintain them, and to grow them over time as businesses themselves grow.
I had the opportunity to sit down with Rob Thomas, who is general manager of IBM Data and AI, to talk about this. Now, we're sitting in a lounge in the W Hotel in San Francisco for this interview, so if you hear some ambient noise, that's just the sound of businesspeople doing business in the background.

Jonathan Strickland: I'm sitting here with Rob Thomas, general manager of IBM Data and AI, and today we heard the announcement of Watson Anywhere, and I have to ask you, what does that mean?

Rob Thomas: Jonathan, it's an exciting day for us. Let's start with basics. I like to say there's no AI without IA, meaning information architecture. AI is only as good as the data that you feed it. So that's a problem every company deals with, and you can even see it in your consumer life. If you're using an app over and over again, it starts to know you a little bit. So your AI is only as good as your data. What we realized is companies have a lot of data, but they have data in a lot of different places. It might be in one office location; there might be data in a different office location. There might be data on a public cloud. They might have different cloud providers. We made the decision that we were going to bring the AI to the data and enable that to happen. So Watson Anywhere is about taking the best of what we've built in Watson and saying you can have that wherever you want it, which is normally wherever your data is. And this is going to be significant, because this is what clients have been asking for, and now we're making it really easy for them to consume Watson AI wherever they have data.

Jonathan Strickland: So if I'm understanding this correctly, you can look at this, in a very broad sense, in two very different directions. You can look at it in the sense of: I've got this big company and I have data spread out through multiple locations, and maybe I need to integrate that in meaningful ways.
That's one way. But it may also mean: I have a really large company and I've got offices in other states, or other countries perhaps, where that integration may not be as easy or seamless. There might be specific laws, for example GDPR over in the UK, where I need to be very particular with how I'm handling data in this region, and I might not be applying that somewhere else. In this way, if I'm taking the AI to where the data is, I can handle those different use-case scenarios in the specific ways that are necessary. So it could be either way. It could be integrating stuff, or it could be applying it for specific implementations.

Rob Thomas: Yeah. A lot of companies have different security policies for what they can do with their data. Like you say, some are worried about GDPR; data can't leave a certain country. So we're just saying, take the best of the AI and put it wherever your data is, however you want to do it, which makes it really easy for them to access.

Jonathan Strickland: Which kind of brings up the question: what is Watson? What is AI? Can we talk about that for a minute?

Rob Thomas: Sure. I think there's two worlds of AI right now. One is people that want to build their own AI. So Watson is a way that you can build, run, and manage your AI. We have products called Watson Studio and Watson Machine Learning; that's basically how you build, run, and manage AI. That's what the data scientists of the world do. They want to build something unique for their company. There's another world that says: I don't have the skills to build my own AI; I just want to use AI. We've built some Watson applications. We have Watson Assistant, which is basically a customer service agent encoded in software, where we automate a lot of the decision making to make current customer service representatives a lot more effective in how they can support customers. And we've got another application called Watson Discovery.
Any company with a lot of data in different places wants to discover all of their data, what's in there, looking for the proverbial needle in a haystack. So there's kind of two worlds of AI: "I want to build my own," and then, "build it for me, just help me solve a problem I know I have." That's what Watson is today, what we're doing.

Jonathan Strickland: And with these two approaches, I also like the idea, and you sort of alluded to it, that this AI is really all about augmenting us. Not that you're replacing any sort of human element, but that you're augmenting what we're already doing, making us more effective, more efficient, being able to find more meaning in that data. One of the stories we hear all the time is about this concept of big data, this massive amount of information that we constantly have at our fingertips, but it's so big a problem that it's hard to tackle. This is another approach to doing that in a meaningful way, where you're actually able to create action plans based on all this information. You have to say, well, we've got all this info, what do we do with it? To me, that's a fascinating part of this as well, because obviously one of the boogeymen in tech is this concept of AI. It's usually a misunderstanding of what AI is, and IBM's approach has been: no, this is really about enhancement, not about replacement.

Rob Thomas: I like to say AI is not going to replace managers, but managers that use AI are going to replace managers that don't. It actually gives you a superpower if you're willing to use it. I'll give you an example: Royal Bank of Scotland, a big retail and commercial bank. They're trying to serve all of their retail banking customers. They're using Watson Assistant for customer service. They're getting faster answers. It's still going through their agents, but their agents are using the Watson Assistant to say: do I understand that question?
What are they really looking for? So they're getting faster answers, they're getting better answers, and you get more satisfied clients. So that's why I go back to what I said: managers that use AI have superpowers. And so I encourage everybody to be open to that, because it makes you more effective. It means you have to do less of the boring work. We all have boring work we have to do; you can automate a lot of the boring work. So I think it actually makes jobs a lot more interesting, which is exciting.

Jonathan Strickland: And as an average user, an average person, one of the results is that you just get better results when you're using these different services that incorporate the AI, because you're getting the right answer, and you're getting the right answer faster. You don't have to worry about as much follow-up, so it takes a lot of the frustration out of those interactions between a customer and, say, a customer service representative. I've been on both sides of that. I've been on the side of the customer who's frustrated, trying very hard not to let my frustration spill out in my interaction with the representative. And I'm married to a woman who was a customer service representative, who would come home, and I would hold her for an hour because she had been yelled at for eight hours straight. That is something I think a lot of people lose in this too, because when we have these discussions, we're frequently talking about the enterprise level, and the average person says, well, how does that affect me? It affects them because this ends up being incorporated into applications that are forward-facing for customers in some cases. So I'm really excited by this. I think Watson is an incredible platform.
I've had various interactions on various levels with Watson throughout the years, ranging from seeing how it could be used in a customer service aspect to Chef Watson, which was my favorite implementation I've ever seen, just to get weird, fun recipes generated by Watson. Last night I saw the Debater presentation, which was fascinating: seeing this platform put together a cohesive, coherent argument from lots of different points of data. That is a phenomenal achievement; people don't realize how incredibly difficult that is. What excites you most about where we are with AI and where you see us going in the future?

Rob Thomas: Let me give you an example that I think hopefully everybody can relate to, which will kind of bring to life how AI is not just impacting businesses but individuals. AMC Networks is a big client of ours, so I'm sure some of your listeners have seen some of their TV shows. Breaking Bad was a popular one, but they've got many. And AMC Networks' challenge was: how do we understand what viewers are responding to, what are they liking? Can we adjust plot lines based on what we're hearing and what they're liking? Their business is also around advertising, so can we give advertisers an idea of when to engage with the users and how to engage with the users? They're using our technology on cloud, where they federate data coming from set-top boxes, companies like Nielsen, third-party data. They bring that behind their firewall so they can manage that data. And then they're using things like Watson Studio to build models and say, hey, you should go do this kind of thing, to an advertiser, or even reaching out directly to a consumer, saying, we thought this show might interest you. So AI is behind the scenes actually everywhere, and I think sometimes you only notice it when you have an aha moment where you're like, wow, that felt magical.
It's actually not magic; AI is just computer science. But it is impacting every individual, whether they know it or not, today.

Jonathan Strickland: That's incredible. I only have one last question for you, which is: are there any questions I should have asked you?

Rob Thomas: I wish you would ask me, why IBM, and how is IBM relevant to this space going forward?

Jonathan Strickland: I'm sorry, I did have one more question. Why IBM, and why is IBM relevant in this space going forward?

Rob Thomas: Great question. IBM has an amazing history, over a hundred years old, and I think we've always been a steward of responsibility and integrity. When you work with IBM, you know what you're going to get, which is: you're going to be satisfied. This whole area of data and AI makes some people a little squeamish. They're worried about lack of transparency. They're worried about, is my data being shared? The best part of working with IBM and being a partner of IBM is that you know your data is safe, you know your models are safe, and you know IBM is not sharing this with anybody else. You know that we will be the stewards of responsibility in AI. That's why last year we came out with explainability and bias detection for AI. I think we're the first company to do that, because there's a lot of things that can go wrong in the world of machine learning or AI unless you're thinking about societal impacts, human impacts, and we spend a lot of time on that at IBM. So that's why I'm very optimistic.

Jonathan Strickland: Absolutely. And yes, you definitely don't want something like artificial intelligence to become a black-box technology where you have no idea how it's making its decisions behind the scenes, because obviously that breeds mistrust and unease. So I'm very happy to hear that as well. I remember at the Think conference last year, I was at those presentations, and I was really impressed by the discussions about bias and transparency as well. It's something that a lot of people have been arguing for, and to see a leader in the space take that very seriously is a great relief, and it gives me a lot of optimism about the future.
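To give you a flavor of what "bias detection" can mean in practice, here's a toy example of one widely used fairness check, the disparate impact ratio. This is my own generic sketch with made-up numbers, not the method inside any IBM product.

```python
# Disparate impact ratio: the rate of favorable model outcomes for an
# unprivileged group divided by the rate for a privileged group. Values
# near 1.0 suggest parity; a common rule of thumb flags anything below
# 0.8. All decisions below are invented for illustration.

def favorable_rate(outcomes):
    # outcomes: 1 = favorable decision (say, a loan approval), 0 = not
    return sum(outcomes) / len(outcomes)

privileged_group   = [1, 1, 0, 1, 1, 0, 1, 1]
unprivileged_group = [1, 0, 0, 1, 0, 0, 1, 0]

ratio = favorable_rate(unprivileged_group) / favorable_rate(privileged_group)
print(f"Disparate impact ratio: {ratio:.2f}")  # 0.50 here: a red flag
```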
Mr. Thomas, thank you so much for taking the time to talk with me and my listeners. I really appreciate it.

Rob Thomas: Great being here. Thank you. Appreciate it.

Jonathan Strickland: Now, up to this point, I haven't really mentioned Watson, but it's a good time to remind ourselves that Watson is an artificial intelligence platform. IBM made a really big splash back in 2011 when Watson competed on Jeopardy against two humans who were former long-reigning champions, and you can see in Watson some of the same concepts that emerged, more evolved, in Project Debater. The system can parse language, including more subtle stuff like wordplay, figure out what is meant by that language, and then evaluate what the proper response should be. If the evaluation meets a certain threshold of confidence, then Watson will submit it; otherwise, it kind of keeps its electronic trap shut.

Now, in real-world deployments, Watson rarely has quite so difficult a task to perform as competing on Jeopardy, which is all general knowledge. Typically, Watson is working within a fairly well-defined set of parameters for its implementation. For example, a car insurance company using Watson to help with customer interactions wouldn't have to worry about someone asking what the capital of Belgium is or what's the best barbecue restaurant in Atlanta. So by showing off Watson's potential on the grand stage of Jeopardy, IBM was able to lay the foundation for a pretty convincing sales pitch: yes, Watson could do all these amazing things, but imagine what it can do when it focuses on a very particular industry, such as healthcare. That's really where IBM's focus has been. I have more to say, but first let's take another quick break.
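That confidence-threshold idea is simple enough to sketch in a few lines. Here's a minimal illustration with invented candidates and scores; Watson's actual question-answering pipeline combined many evidence features, not a single number like this.

```python
# Toy sketch of confidence-gated answering: score candidate answers and
# only "buzz in" if the best one clears a threshold. The candidates and
# confidence values are made up for illustration.

CONFIDENCE_THRESHOLD = 0.80

def answer(candidates):
    # candidates: mapping of candidate answer -> confidence in [0, 1]
    best, confidence = max(candidates.items(), key=lambda kv: kv[1])
    if confidence >= CONFIDENCE_THRESHOLD:
        return best    # confident enough to respond
    return None        # otherwise, keep the electronic trap shut

print(answer({"Brussels": 0.93, "Bruges": 0.41}))   # Brussels
print(answer({"Brussels": 0.55, "Antwerp": 0.52}))  # None
```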
Another interesting thing that Mr. Thomas brought up was the idea of taking artificial intelligence to the data, as opposed to the other way around, and I think that's a pretty smart move. There have been so many high-profile, high-impact data breaches over the last few years that I imagine most companies are pretty reluctant to move mission-critical information if they don't have to. The potential for something to go wrong, for some bad actor to find a vulnerability and exploit it and thus get access to private information, or, perhaps worse, for the process itself to go wrong and accidentally dump information into the public sphere without any need for outside interference, is enough to make any company decline incorporating AI if it means porting data over to where the AI is. So making Watson available for companies to run on their own private cloud, or on premises ("on-prem," as they say here at Think 2019), or in the public cloud is a huge deal. It removes that barrier to entry. Now companies that are interested in using AI with their services can do it without the worry of having to move their own data around.

In her keynote speech kicking off the IBM Think 2019 conference, Ginni Rometty, the CEO, president, and chair of the board for IBM, spoke about this. She outlined two general approaches to incorporating AI into services. One is what she would call the outside-in approach. This is where companies take their pre-existing applications and add a layer of artificial intelligence on top of those applications in order to make them work better and more efficiently. This is an approach companies might take if they lack the expertise or time to build out new apps entirely. But some companies might opt to do the reverse, the inside-out approach. In other words, they create all-new applications and processes that incorporate AI from the beginning, to try and maximize the value of having the artificial intelligence involved.
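Here's a minimal sketch of what "outside-in" might look like in code: leave the existing application function untouched and wrap an AI layer around it. Everything here, from the ticket handler to the trivial keyword rule standing in for a trained model, is invented for illustration.

```python
# Toy sketch of the outside-in approach: wrap existing application logic
# with an added AI layer instead of rewriting the app. The "model" here is
# a stand-in keyword rule; in practice it would be a trained classifier.

def handle_ticket(text):
    # Pre-existing application logic, left untouched.
    return f"Ticket logged: {text}"

def with_ai_triage(handler):
    # The added layer: triage each request before the old code runs.
    def wrapped(text):
        urgent = "outage" in text.lower()   # stand-in for a real model
        tag = "[URGENT] " if urgent else "[routine] "
        return handler(tag + text)
    return wrapped

handle_ticket = with_ai_triage(handle_ticket)  # wrap, don't rewrite
print(handle_ticket("Regional outage is affecting checkout"))
```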
So what the heck does all that mean? How does that impact us as average people? Well, largely it means the services we use, such as mobile apps or computer software, will work better, become more sophisticated, and incorporate more features. And this will become more important as companies continue to grow and place data in different data centers and clouds. As that happens, it becomes increasingly challenging to manage all that information, to coordinate between those centers of data, and to pull together meaningful results.

It helps if I give you an example. Let's imagine that you have downloaded a travel app, and it's all to help you plan and book a trip you want to take. Maybe you're traveling to another country in six months, so you're planning well in advance. The app helps you by consulting many different sources of information. It prices out flights through various airlines to help you find one that fits your budget and your schedule. It gives you information on hotels at your destination that have availability. It provides a list of possible activities you might want to do while you're there. Maybe it includes a calendar that's populated with cultural events going on at the location while you're there on vacation. There are options for restaurant recommendations, information on average weather during that time of year (so you know what sort of clothes you're going to need to pack), and maybe more pieces. Maybe there are things that will populate over time, so that each time you go back to consult your plan, it's updated with the latest information.

These various pieces of information don't all live on one server somewhere connected to that app. There's no business out there that has all of this information stored on some magical computer. Instead, the information is coming from numerous sources and organized in a meaningful way for your mobile device's interface.
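As a rough sketch of that idea, here's a tiny aggregator that consults several independent sources and merges them into one plan. Every source and data point below is invented; a real app would be calling separate flight, hotel, and weather services over the network.

```python
# Toy sketch of multi-source aggregation: the itinerary is assembled from
# several independent sources rather than one magical server. All the
# sources and values below are made up for illustration.

def fetch_flights():
    return {"flights": ["ATL->SFO $280", "ATL->SFO $310"]}

def fetch_hotels():
    return {"hotels": ["W Hotel (2 rooms left)", "Budget Inn"]}

def fetch_weather():
    return {"weather": "mild, pack a light jacket"}

def build_itinerary(sources):
    plan = {}
    for fetch in sources:       # consult each source of information...
        plan.update(fetch())    # ...and merge it into one seamless plan
    return plan

print(build_itinerary([fetch_flights, fetch_hotels, fetch_weather]))
```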
The goal of an AI approach is to have an automated system in place that's able to do this kind of stuff quickly and without error, so that the end user (you, in other words) ends up with a seamless and helpful experience, and gets the information you need, when you need it and where you need it. And this is harder than it sounds. The general message I've been hearing at Think 2019 is that it's not a question of if companies should start employing these sorts of AI approaches in their processes, but rather when they should; that if you don't do this, you're not going to be able to keep up with the demands of business and growth.

So it sounds like we've got a future of artificial intelligence assistants ahead of us, and I think that's pretty fascinating, especially when we think of it in the context of augmenting what we humans can already do, not replacing what we can do. That's a pretty cool message, and one that I really found inspiring while I was at Think 2019. I'm going to have a lot more episodes coming out in the near future about some of the other things I'm looking into while I'm at the conference, including some more interviews with some really interesting people, so make sure you stay tuned and check out those episodes when they publish. They'll be coming out very soon.

If you guys have any suggestions for future episodes of TechStuff, whether it's about a specific company or a specific technology, or maybe you have follow-up questions about some of the stuff I'm talking about this week, make sure you reach out and let me know. The email address for the show is techstuff@howstuffworks.com. You can also visit our website at techstuffpodcast.com.
There you're going to find an archive of all of our past shows, as well as links to how to connect with us on social media and to our merchandise store. And that's it for now, but I'll be back again in just a short while to talk more about the incredible stuff I'm seeing here at Think 2019. Thank you very much, IBM, and I will talk to you again real soon.

For more on this and thousands of other topics, visit howstuffworks.com.