Get in touch with technology with TechStuff from HowStuffWorks.com.

Hey there, welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer at HowStuffWorks and I love all things tech. And yeah, I am at the IBM Think 2018 conference, which is why this sounds a little different than normal. I am in a hotel room over at the Excalibur Casino, and I wanted to talk a little bit about what I saw, some of the talks I went to, and the interesting things I learned.

Now, one thing to say is that the Think conference is all about IBM and IBM's partners and customers. And unlike a lot of companies that we deal with on a day-to-day basis, IBM doesn't really have consumer-facing businesses. In other words, it's not like you go to the store and buy IBM stuff. IBM mostly makes things for other companies, and as such, we don't necessarily encounter it directly. We encounter IBM's products because they are inside other things that we are using. So it's interesting to go to these events and hear these talks, because a lot of it is stuff that is very much relevant for business leaders, or for IT professionals, or for infrastructure engineers, that kind of thing, but less so for the general public unless you step back a little bit. Even so, there were some really interesting talks about where the future is headed as far as very big, broad technologies, and I thought the best way to tackle this would be to talk about the trends that have been identified and the predictions that have been made about these kinds of tech, because those are the sorts of things that are going to affect us moving forward, us being, you know, the average person as opposed to people who are running a tech company.
One of the things that they talked 36 00:02:17,040 --> 00:02:21,680 Speaker 1: about UH both at the the keynote speech that was 37 00:02:21,919 --> 00:02:24,640 Speaker 1: technically the very first big keynote speech that was a 38 00:02:24,800 --> 00:02:29,080 Speaker 1: Jenny Romti. Jenny Rometti is the CEO of IBM. She 39 00:02:29,680 --> 00:02:34,000 Speaker 1: got up and spoke very directly to IBMS partners and customers. 40 00:02:34,600 --> 00:02:39,160 Speaker 1: She talked about how there are different laws that we 41 00:02:39,240 --> 00:02:43,400 Speaker 1: have created more like observations really that UM that have 42 00:02:44,120 --> 00:02:47,519 Speaker 1: described the way technology has developed over the years. Now. 43 00:02:47,680 --> 00:02:50,520 Speaker 1: The most famous one is one I've talked about numerous 44 00:02:50,560 --> 00:02:55,119 Speaker 1: times on this show. That would be Moore's law, Moore's law, 45 00:02:55,240 --> 00:02:57,800 Speaker 1: which was proposed by Gordon Moore. Of course he didn't 46 00:02:57,880 --> 00:03:00,240 Speaker 1: call it Moore's law. He just made an observe. Sian 47 00:03:01,200 --> 00:03:06,160 Speaker 1: was about how every eighteen months or so year and 48 00:03:06,160 --> 00:03:09,840 Speaker 1: a half to two years, the number of discrete components 49 00:03:10,440 --> 00:03:15,000 Speaker 1: meaning transistors at that time on a microchip we're doubling. 50 00:03:15,720 --> 00:03:22,120 Speaker 1: And this observation wasn't about necessarily our technological capabilities, like 51 00:03:22,160 --> 00:03:25,239 Speaker 1: the ability to make things that small. It was more 52 00:03:25,280 --> 00:03:30,639 Speaker 1: about the fact that economics demanded that this was the case, 53 00:03:30,720 --> 00:03:33,840 Speaker 1: that there was enough of a demand two in to 54 00:03:34,000 --> 00:03:39,520 Speaker 1: give an incentive to manufacturing facilities that made these microchips 55 00:03:39,600 --> 00:03:43,280 Speaker 1: to try and make ever smaller components to make more 56 00:03:43,320 --> 00:03:48,040 Speaker 1: powerful processors. So, in other words, it wasn't so much 57 00:03:48,080 --> 00:03:52,400 Speaker 1: that we had this these egghead scientists locked in the 58 00:03:52,480 --> 00:03:55,720 Speaker 1: laboratory coming up with new ways to make transistors smaller. 59 00:03:56,000 --> 00:04:00,320 Speaker 1: It was more like we had money in wheelbarrows out side, 60 00:04:00,800 --> 00:04:03,520 Speaker 1: and we can only get that money if we made 61 00:04:03,560 --> 00:04:08,240 Speaker 1: smaller transistors. And so it was really an economic driven law. 62 00:04:08,400 --> 00:04:11,680 Speaker 1: But the effect that we have on us, it doesn't 63 00:04:11,680 --> 00:04:13,800 Speaker 1: really matter. The economic part we can kind of ignore. 64 00:04:13,880 --> 00:04:16,880 Speaker 1: What we look at is the fact that our processing 65 00:04:16,960 --> 00:04:21,360 Speaker 1: power effectively doubles every eighteen months or so. So every 66 00:04:21,440 --> 00:04:24,640 Speaker 1: year and a half to two years, the machines were 67 00:04:24,760 --> 00:04:28,440 Speaker 1: using are twice as powerful as the ones that were 68 00:04:28,760 --> 00:04:31,960 Speaker 1: two years ago. Uh, And that's kind of cool. 
It means that we keep getting these incredibly sophisticated machines on a regular basis, and a lot of the technology sector's businesses depend upon the continuation of Moore's law.

Later on, I was at a talk with Dr. Michio Kaku, who is a famous physicist and futurist. He talked a little bit about the end of the era of Moore's law. He did not give a specific prediction as to when it would end, but he did say that, based purely on physics alone, it will end. What he meant by that is, Moore's law depends on us shrinking these components down more and more. Once you get to the point where the quantum world comes into play, this gets really tricky, and I've talked about this before: if you were to create logic gates that are so thin that an electron could potentially exist on the other side of a gate, then sometimes an electron is going to be on the other side of that gate, sort of as if it had tunneled through, except it had not physically tunneled through the wall. It's just that it had the probability of potentially being on the other side of that wall. And as long as there's a probability, it means that sometimes it does happen, even though in the classical world we would say, well, there's a barrier there, you can't just go through. The electron didn't go through the barrier; it just appeared on the other side because there was a chance it could. And if there's a chance, then sometimes it does happen. Well, even beyond that, even if you say we'll keep figuring out ways to counteract this quantum effect, so that we can keep having microprocessors that are accurate even with quantum tunneling being an issue, you eventually get down to the point where you're at the atomic scale, meaning the components you're creating are made out of atoms themselves.
At that stage, it would be really difficult to counteract those quantum effects, and you would have to abandon this particular approach to computer science and computer architecture, or else it would just collapse in on itself. So Moore's law, while it has been incredibly important ever since the transistor was invented, and continues to be incredibly important right now, only represents the first kind of wave of laws.

The next law they talked about was one they called Metcalfe's law. Metcalfe's law is actually a pretty commonly cited law, just not necessarily among regular folks like me and you. Metcalfe's law is about the value of a network. So how do you measure how valuable a network is? Like, if you look at a network of devices, and then you look at a different network of devices, how could you say which one is, quote unquote, worth more? Metcalfe's law gives you that measurement. It states that the value of a telecommunications network is proportional to the square of the number of connected nodes in the system. So, however many nodes are there, and a node can be any connected device: it could be a computer, it could be a smartphone, it could be a tablet, it could be a game console. Those nodes collectively end up determining the value of the telecommunications network when you square the number of those nodes. It's those interconnections that make the network valuable. This is incredibly important, again, in the world of business, less so probably for me and you.
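As a back-of-the-envelope illustration (mine, not IBM's), here is Metcalfe's law applied to two hypothetical networks. The value units are arbitrary, since the law only speaks to proportionality:

```python
# Metcalfe's law: network value scales with the square of the node count.
# The constant of proportionality k is arbitrary here; only ratios matter.

def metcalfe_value(nodes: int, k: float = 1.0) -> float:
    return k * nodes ** 2

small_network = metcalfe_value(1_000)    # e.g., one office campus
large_network = metcalfe_value(10_000)   # e.g., a regional carrier

# 10x the nodes yields 100x the value under this model.
print(large_network / small_network)  # 100.0
```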
The third law they were proposing would be what they were cheekily referring to as Watson's law. Watson, of course, is not just an artificially intelligent platform for IBM and for IBM's customers and partners; Watson also refers to Thomas J. Watson, the founder of IBM. Watson's law would be about how the amount of data in a system can be leveraged to get knowledge out of that data; sort of, as data grows exponentially, our ability to leverage knowledge from that data grows exponentially.

So what the heck does that mean? Well, think of data as just points of information that are not necessarily connected to one another. They're not necessarily structured. This would be as if I recorded a podcast and I just started to say random words into the microphone, and I did that for forty-five minutes to an hour. And okay, smart alecks, you might think that's how I do it now, but you're just mean, you meanie heads. That's not how I do it. I actually think this stuff out, and I structure my data so that I create a foundation and then I build upon it. That's a very easy way to get knowledge, right? You have the structured format; you can digest it, you can synthesize it, you can then use it yourself. But what if the data is unstructured, and the data is about a lot of seemingly unconnected things, and it's spread across multiple types of files? Let's say that you've got an enormous folder, and that folder contains files that are videos, documents, presentations, spreadsheets, all these different things that at a casual glance don't have any connectivity to them. How can you make that useful, so that you can actually leverage that data and do stuff with it? That's kind of what IBM was focusing on, and that's really where they were talking about Watson quite a lot. A lot of people think of Watson as the supercomputer that played on Jeopardy, which is not accurate. Watson is not a supercomputer. The machine that ran Watson was just a machine. It was not the entity itself.
If you want to get a little metaphysical with this, you could think about a human being and ask, well, what is the human? Is the human being the body, the physical form? Or is it the mind: the person, the personality, the emotions, the memories, the things that inhabit the body and also control the body? Is that the person? And you might argue, well, it's actually the collective, it's the body and the mind, and I think that's a valid argument. You could also argue that Watson ultimately is a platform and the physical machine that runs that platform, and I probably wouldn't argue with you too much there either, except I'd say that the platform is more important than anything else in this particular case. And by platform I really just mean the set of rules, the set of algorithms, that Watson uses in order to process information, to look for meaning, to look for results.

So let's take that Jeopardy example. In Jeopardy, Watson played against two former champions, one of whom now records a podcast for HowStuffWorks, so that's kind of awesome. And Watson was playing by looking at a clue ("looking" quote unquote, since the clues were being fed to Watson) and then going through its massive amount of data and trying to use that to figure out what the answer is. And it wasn't just looking at a list of trivia or facts. It's not like it's looking at an enormous table where every cell in that table is filled with a different fact, like "George Washington was the first President of the United States." Instead, it's looking at a massive library of information and pulling bits and pieces of information together to formulate an idea of what the answer is. And if that formulation reaches a certain threshold of confidence, Watson would then ring in and present that answer.
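That ring-in behavior amounts to a confidence threshold over candidate answers. Here is a minimal sketch of the idea, with made-up candidates and scores; the real Watson pipeline scored evidence in a far more sophisticated way:

```python
# Toy version of "ring in only if confident enough."
# The candidate answers and scores are invented for illustration.

CONFIDENCE_THRESHOLD = 0.70

def maybe_answer(candidates: dict[str, float]) -> str | None:
    """Return the best answer if it clears the threshold, else stay silent."""
    best_answer, best_score = max(candidates.items(), key=lambda kv: kv[1])
    return best_answer if best_score >= CONFIDENCE_THRESHOLD else None

scores = {"Who is George Washington?": 0.91, "Who is John Adams?": 0.06}
print(maybe_answer(scores))  # Who is George Washington?
```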
So it's not that it's looking at, you know, a very long trivia book. It's looking at all this information and drawing conclusions from it in a way similar to how a human being would; not completely analogous, but similar. And so, using Watson, you could leverage your unstructured data. You put Watson to work at this, and Watson would start to look for meaningful connections between data points and pull relevant information for any given query. So then Watson becomes an agent that you can interact with. And this agent's job is kind of like a reference librarian's: to go to the massive amount of information that's at its disposal and return to you the relevant points of information.

This is not that different from the way people were thinking about Web 3.0 when that was a big discussion. You may remember that people talked about how, right now, if you use a search engine, typically the way it works is you type something into the search engine and it pulls up a list of websites that may or may not have what you're looking for on them. So you might be looking for, let's say, a history of the Crusades, and you type that into the search engine and it pulls up for you a bunch of different sites written by different people. Some of them might be very easy to read and understand. Some of them might be less easy to read, but they might be more accurate and more unbiased with the information. You don't necessarily know that at the top; you have to go through and read all of that yourself. But the Web 3.0 search engines (this was something Wolfram Alpha was trying to be) would pull the relevant information, not websites, but the relevant information from those websites, and present it to you. And that way you could look over the important bits of information, you skip over everything else, and you're given the correct context.
In theory, you could even have an agent like this that could learn about you and your learning styles, and thus present the information to you in a way that is most helpful to you. So there's a very big difference between the way we do searches now and the way this proposed method would work. And that's kind of what Watson is doing. So you've got this user-facing aspect of Watson. It's kind of like a chat bot, and you can send that chat bot requests, and then the chat bot will try and pull the information for you, or you can use it to generate reports. Let's say that you are a business owner and you want to look at some information that's going to pull things from presentations, predictions, results. Maybe you've got an end-of-the-quarter report. Maybe you want to take a look at information from reports from your supply chain, all this kind of complicated stuff, and Watson could go out, curate, and present this information in a way that has meaning to you, where you can understand what's going on and you can draw conclusions.
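In practice, that kind of agent usually sits behind a simple request/response interface. The sketch below is hypothetical: the endpoint, payload fields, and response shape are all invented for illustration, and this is not IBM's actual Watson API:

```python
# Hypothetical shape of "ask the agent for a curated report."
# Endpoint and fields are invented; a real integration would use
# the vendor's documented SDK and authentication.
import requests

def request_report(question: str) -> dict:
    response = requests.post(
        "https://example.com/agent/query",  # placeholder URL
        json={"query": question, "sources": ["presentations", "supply_chain"]},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()  # e.g., {"summary": ..., "highlights": [...]}

report = request_report("How did Q4 shipments compare to forecast?")
```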
This actually was a pretty interesting concept to me. I mean, I've seen some implementations of Watson that do this, and they do it in such a seemingly simple way that it's deceptive. You start to forget that there is a very powerful computer algorithm controlling all of this, because the implementation itself might be pretty straightforward. For an example, I went to the Weather Company last year, and while I was there, I had a chance to talk to a team that was using Watson in a lot of different implementations. You know, they were using it as the basis of a customer service platform to respond to requests. And when you first look at that, it looks deceptively simple. You're asking, well, what's the weather going to be like, and you get results. That doesn't seem like it's that hard. You would figure that, oh, well, they're just gonna pull whatever the record is for my location for tomorrow and present it to me. But a lot more could be going on behind the scenes, and I think that's part of the problem that IBM has been dealing with, and kind of one of the reasons why they've made such a big deal of it at this conference: the perception of what Watson is may be a little too narrow, a little too focused on small aspects of what Watson does, and it ignores the big picture. So they've definitely doubled down on that.

I went to a talk called Journey to AI that was really all about this, and they talked all about the different variations of artificial intelligence. One of the things they mentioned was the very different views of what AI is. For example, you've got simple AI. Simple AI would include some of the stuff I talked about in a previous episode, the little aspects of intelligence that are very, very narrow, just a slice of the pie of intelligence, but they do represent what intelligence is in just a very specific application. So image recognition is an example of that, or voice recognition, or even natural language processing as part of that. These are all aspects of intelligence. You would not call a machine that lacks one of these things truly intelligent, but you also wouldn't call a machine that only has one of these things truly intelligent. So if I have a smartphone, and the smartphone is able to recognize images, so I point my smartphone at something and it even labels what that thing is (maybe it says, oh, that's a specific make and model of car, or maybe it says that building is a historic landmark, or this park is going to have a concert the next day, or something along those lines), that's cool. That image recognition is really cool, but I wouldn't call my smartphone intelligent.
Similarly, if my smartphone happens to have one of those digital assistants on it, and it does (I've got an Android phone, so I've got the Google Assistant on there), I can talk to that and it can retrieve information for me. It can do tasks for me. I can use it to make calls, I can use it to send text messages, or I can use it to search for information on my phone or on the internet. I still wouldn't call my phone intelligent. It has an aspect of intelligence. Similarly, if I had a supercomputer that could listen to voice commands, respond in natural language, and do these other things, but it couldn't do any image recognition, I would notice that lack, and I wouldn't call that intelligent.

On the other side of the scale, you have general AI, where, you know, the classic image of this is you've got a big machine that can do general thinking, like thinking that's analogous to human thinking. It can process information, it can draw conclusions, it can synthesize data, it can innovate. It may even be self-aware, although whether or not self-awareness is directly tied to intelligence is a matter of philosophical debate. Talking about general AI, I mean, that's a hard, hard goal to hit. We honestly don't know what it will take to get there. It may be that we are thirty years away from having a true general AI. It may be much longer than that; it may be a century away, or it may even be impossible for us to do based upon our technological abilities right now. Most technologists think that it is attainable, but they don't know exactly what it's going to take to get there. So there's some argument about the timeline. But there are a lot of interesting things that can happen between those simple versions of AI and that crazy general AI that, you know, science fiction writers write about and warn us about.
And that's where this ability to deal with unstructured data comes in, and designing AI is part of that problem. But as they mentioned in multiple presentations here at IBM, it's not just building the artificial intelligence to do this that's a challenge. It's also incorporating that artificial intelligence into existing work practices, because, as most businesses have existed for a while now, it's not like you can just slot AI in, necessarily. It's not like a module you plug in and everything works properly. You might have to reevaluate and redesign work processes in order to make this happen. And again, this gets a little dry and technical if you're not really into the business side of things. But when you start thinking about it, you realize, yeah, it's not enough to just build a tool. You have to figure out the best way to use that tool with respect to the things you're already trying to do.

They started talking about impedance match. The engineers were chatting all about the impedance match between man and machine: getting machines to process human language and commands and to return information that would be useful to humans, and eventually getting rid of that boundary between man and machine, so that decisions can be made together and implemented together. So this gets into that concept of augmented intelligence. It's not that we are trying to create a supercomputer that is incredibly intelligent and then reference that supercomputer as if it were an oracle or a deity. Instead, they're talking about creating machines that would work right alongside people. The machines could help fill in the gaps that would be there because of the human failings that are in all of us, and humans could provide all the bits that machines are not good at, and together we could be better. And we have to get to a point where we trust the machines as an assistant, and the machines have to, quote unquote, trust us as teachers.
By trust us, they don't necessarily mean that the machines are going to be harboring doubts, but rather that humans are the ones designing these machines, and we have to make certain that we do so in a way that is responsible, that is ethical, that is inclusive. Otherwise we end up with bad machines, and it's not that the machines themselves were inherently wicked, but rather that they were poorly designed. I've got more to say about the Journey to AI presentation at IBM Think, but before I go into it, let's take a quick break to thank our sponsor.

The folks over at IBM are arguing that every single industry across the world is going to be affected by this sort of transformation of data and knowledge. They started referencing things like retail optimization, or the oil industry, or automotive industries, shipping. All of these things, they said, are going to transform dramatically over the next few years due to this kind of technology. And they talked about how the one field you can look at right now that is undergoing such a transformation is healthcare. Healthcare is transforming because we are seeing not just advanced tools come into hospitals and doctors' offices, but also these programs like Watson, where a doctor can actually turn to Watson as a colleague, almost like someone who can provide more information, a second opinion, if you will. In fact, IBM brought up some representatives from the American Cancer Society and some very prestigious cancer research hospitals to talk about this, and about how cancer is a really, really difficult problem. It is a complicated disease. Really, when you think about it, cancer is a family of diseases. It's not just a single illness, but rather a whole suite of illnesses. There are hundreds of different types of cancer.
Now, to make it more complicated, there are different methods for diagnosing and treating all these different types of cancer, and that obviously means that you have to be very careful, when you're an oncologist, a cancer specialist, to correctly identify, diagnose, and treat specific types of cancer, because a treatment for one type may not be effective for a different type. And not every place in the world has access to incredibly gifted, educated oncologists. If you happen to be fortunate enough to live in a major city in a well-developed nation, then you may live close to a teaching hospital, in which case you have access to incredible specialists who have dedicated their lives to learning about and fighting cancer. But if you live in a small town and you don't have that access, then your options are severely limited.

Well, IBM and Watson: one of the first problems they were looking at tackling, outside of, you know, developing the platform itself, was using Watson to help doctors treat cancer. And the way Watson works, the way it's effective, is that you have to feed it information. Without the data, Watson is useless. Watson is good at analyzing data, curating data, and producing results, but in order to do that, you have to give it data. So what IBM did was reach out to the American Cancer Society and talk with them about feeding Watson data about cancer. The American Cancer Society had millions of data sets and clinical records that they used to help train Watson to understand how the diagnosis and treatment processes for different types of cancer actually went. So this was like Watson getting a crash course in oncology. And from that information, which is constantly being refreshed with new research, new experiments, and new treatments that also then go to Watson, Watson is able to look at a huge set of data points and look at the overall effectiveness of any given diagnosis method or treatment.
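Conceptually, that "second opinion" boils down to aggregating outcomes across a much larger pool of cases than any one practice sees, and then ranking the options. A toy sketch of that aggregation, with entirely made-up records and none of the clinical nuance a real system would need:

```python
# Toy aggregation: rank treatments for one cancer type by observed
# success rate across pooled case records. The records are invented.
from collections import defaultdict

records = [
    {"treatment": "protocol_a", "success": True},
    {"treatment": "protocol_a", "success": False},
    {"treatment": "protocol_b", "success": True},
    {"treatment": "protocol_b", "success": True},
    {"treatment": "protocol_b", "success": True},
]

totals = defaultdict(lambda: [0, 0])  # treatment -> [successes, cases]
for record in records:
    totals[record["treatment"]][1] += 1
    totals[record["treatment"]][0] += record["success"]

ranked = sorted(totals.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True)
for treatment, (wins, cases) in ranked:
    print(f"{treatment}: {wins}/{cases} successful")
# protocol_b: 3/3 successful
# protocol_a: 1/2 successful
```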
So, in other words, you might have conducted a series of experiments and determined that one particular approach is the most effective, and that's why it's your go-to approach for that type of cancer. Watson, however, can look across a wider set of data points, not just from your experiments and your work and your research, but from everyone else's that has been part of the American Cancer Society's work. And then Watson can say, you know, yeah, that method, out of all the ones you've tried, has worked best for you, but there's this other methodology that is even more effective, that you have not yet tried, that you didn't even know about. But because I have access to all the information, which is far, far greater than what any human can navigate, I can tell you that, based upon the success rate of all those cases, this is something you should try. And thus Watson becomes that cancer specialist who can provide a second opinion.

This is a very powerful tool, something that can legitimately save lives, and it is of real consequence to those of us in the audience who are not just trying to create a business, or I shouldn't say just, but who are trying to create a business or figure out how to streamline our back-end processes as we try to do whatever it is we do. This is life and death for millions of people around the world. It's a really interesting case study, too. So far, Watson is being used in more than two hundred hospitals across the world. More than ten thousand patients are able to take advantage of this. Really, it's the physicians who are using Watson to kind of guide themselves and get that second opinion, which may or may not confirm what the original physician had concluded, to help refine approaches, and to help give options to patients, which obviously is also really important.
And when you consider that this year alone, one point seven million Americans will be diagnosed with cancer, you realize this is a very big deal. And of course, that's just the United States; obviously, global numbers will be much higher. And again, if you happen to live in a country like the United States and you're near a teaching hospital, you might have access to people who are the leading practitioners, the leading thinkers, the leading researchers in cancer. But if you live in a developing nation where you have a much worse ratio of doctors to patients, then you would really want to have access to this deep level of expertise. That's the whole concept.

So the folks up on stage, the representatives from Memorial Sloan Kettering, which is a cancer treatment center, and also of the American Cancer Society, were citing some really interesting statistics. In the United States, where we have a lot of oncologists, a lot of cancer specialists, on average every oncologist has about one hundred patients, which, you know, is a lot of patients. But if you think about it, you realize, well, that might be manageable for a single oncologist. In other parts of the world, though, you look at the number of oncologists versus the number of people who are dealing with cancer, and it becomes more like ten thousand patients to one oncologist. At that scale, it is impossible, no matter how gifted and intelligent and educated you are, to handle that enormous amount of work without help. And so again, that was where they were citing the use of Watson as a way to help offload some of this very difficult work that the oncologists do, and to get guidance from expertise from around the world. And again, this is not Watson coming up with new treatments. This is an artificially intelligent platform, by a very narrow definition of AI, looking at an enormous data set that was generated by humans, by human beings.
So we're not saying that there's a computer doctor out there that's better than human doctors, that it's smarter than we are. It's more like saying we have the world's best librarian, one that is looking at the massive collected knowledge base on a very specific subject and returning the results that are relevant to any given query, to help with human decisions. So that's where that augmented intelligence comes in. It's not that you've got a robo doctor; it's that you've got a robo reference librarian who is able to reference all the human doctors and see what has worked the best. That's a good way of looking at Watson in general when you want to understand what it does and what it could do in lots of different contexts. It's, again, something that could help with handling any large set of data points. It wouldn't have to be medical, although that's an easy way to understand how it could be put to effective use.

Another possible use of Watson would be for the purposes of augmented reality, where you are using something like a smartphone, let's say, to take images of whatever it is you're looking at, and you're asking Watson to give you guidance on how to deal with the situation. So imagine that you are an auto mechanic and you have a vehicle come in that is not frequently found in your area, so you haven't had a lot of experience working on it. You know, you have good working knowledge of automobiles in general, but you don't know the particulars of this specific make and model. And you lift up the hood and you're looking at the engine, and you're looking at different parts, and you see one particular part that you believe is the problem. So you take a photo of it, and then you have a Watson assistant that's working with you on an app that's specifically written for your line of work. So, in other words, Watson is really just looking at a data set that is relevant to auto mechanics.
It's not looking at all the information across the Internet or anything like that. This is a specific implementation of the platform. And then Watson references its information, returns the results to you, and explains what that part is, what some of the common problems are, and, basically, what the specific problem is that you have encountered and how you address it. Do you have repairs you can make? Do you need to replace the part? If you do need to replace the part, where would you get it? How long will it take to get there? Essentially, all the information you need as a mechanic in order to fix the problem and also to alert your customer: hey, here's what's going on, here's how much it's gonna cost, here's how long it's gonna take. And you can even answer why. You could find out where the delays are. If it's gonna be something that's gonna take, like, two weeks, you can say why: well, because here's the obscure part that I need to order, and here's the really complicated supply chain of how it's going to have to get to me, and I can't speed that up because I have no control over it. If you're able to actually explain that to the customer, then you can, you know, maybe take some of the heat off. And you could also probably say, hey, next time buy a car that's not so, you know, exotic, something that I can work on. No, no, don't victim blame. That's not cool. But you could at least explain the context of what's happening.
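The flow described here is basically image classification followed by a parts-database lookup. A hedged sketch of that pipeline follows; classify_part and the parts catalog below are hypothetical stand-ins, not a real Watson Visual Recognition call:

```python
# Hypothetical mechanic's-assistant pipeline: identify a part from a
# photo, then look up known issues and lead times. All names invented.

PARTS_CATALOG = {
    "turbo_wastegate_actuator": {
        "common_problems": ["stuck valve", "worn diaphragm"],
        "lead_time_days": 14,  # obscure part, slow supply chain
    },
}

def classify_part(photo_bytes: bytes) -> str:
    """Stand-in for an image-recognition service call."""
    return "turbo_wastegate_actuator"

def diagnose(photo_bytes: bytes) -> dict:
    part = classify_part(photo_bytes)
    return {"part": part, **PARTS_CATALOG[part]}

print(diagnose(b"...photo data..."))
```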
A good example 576 00:37:17,320 --> 00:37:20,520 Speaker 1: of this would be Facebook, which of course is 577 00:37:20,560 --> 00:37:23,920 Speaker 1: going through a massive scandal right now due to a 578 00:37:23,960 --> 00:37:26,719 Speaker 1: company that collected data and then tried to leverage it 579 00:37:26,800 --> 00:37:32,000 Speaker 1: in a way that was unethical at best. So Facebook 580 00:37:32,239 --> 00:37:37,400 Speaker 1: has more than a billion users, and people use Facebook 581 00:37:37,400 --> 00:37:40,160 Speaker 1: a lot. People who are using Facebook a ton are 582 00:37:40,200 --> 00:37:43,799 Speaker 1: sharing a lot of information about themselves, either directly or indirectly. 583 00:37:44,400 --> 00:37:47,040 Speaker 1: So you have this massive amount of data that Facebook 584 00:37:47,160 --> 00:37:50,840 Speaker 1: is collecting and sitting on top of, and using 585 00:37:50,960 --> 00:37:55,680 Speaker 1: an API platform like Watson to 586 00:37:56,000 --> 00:37:58,680 Speaker 1: go through all that data and pull meaningful information from 587 00:37:58,719 --> 00:38:05,120 Speaker 1: it could create some really powerful strategies. You could 588 00:38:05,120 --> 00:38:08,400 Speaker 1: figure out trends and be able to leverage them, and 589 00:38:08,440 --> 00:38:11,160 Speaker 1: you could use them in ways that were maybe helpful 590 00:38:11,360 --> 00:38:15,920 Speaker 1: or maybe exploitative, probably the second. But you would have 591 00:38:15,920 --> 00:38:18,080 Speaker 1: a huge amount of data. That's really the point I'm 592 00:38:18,080 --> 00:38:21,479 Speaker 1: getting at: because you've got an engaged user base 593 00:38:21,600 --> 00:38:25,960 Speaker 1: that is enthusiastically handing information over, you would have an 594 00:38:26,080 --> 00:38:29,200 Speaker 1: enormous data set. But you could also use a tool 595 00:38:29,239 --> 00:38:32,840 Speaker 1: like Watson for internal processes. Like, let's say that you 596 00:38:32,920 --> 00:38:35,799 Speaker 1: are a company, let's say that you're part of 597 00:38:36,160 --> 00:38:38,320 Speaker 1: a shipping company. So you need to be able to 598 00:38:38,400 --> 00:38:43,440 Speaker 1: keep track of all the suppliers, the destinations, the 599 00:38:44,320 --> 00:38:46,920 Speaker 1: way that you're actually moving product from point A to 600 00:38:46,960 --> 00:38:50,240 Speaker 1: point B. It's a lot of moving parts, a lot of logistics. 601 00:38:50,280 --> 00:38:53,319 Speaker 1: But on the whole, if you look at all 602 00:38:53,360 --> 00:38:55,759 Speaker 1: the data and you were to say, like, let's fill 603 00:38:55,880 --> 00:39:00,480 Speaker 1: up two containers with raw information, it would 604 00:39:00,520 --> 00:39:04,719 Speaker 1: be a fraction of the size of something like Facebook. Like, yeah, 605 00:39:04,760 --> 00:39:06,640 Speaker 1: there are a lot of data points and it's complicated. 606 00:39:06,680 --> 00:39:11,279 Speaker 1: It's too complicated for humans to navigate easily. But it's 607 00:39:11,320 --> 00:39:15,400 Speaker 1: not like it's the huge amount of data that's generated 608 00:39:15,440 --> 00:39:19,000 Speaker 1: on a daily basis from Facebook. Watson still, however, has 609 00:39:19,040 --> 00:39:23,720 Speaker 1: the capability of learning even from smaller data sets.
So again, 610 00:39:23,760 --> 00:39:27,280 Speaker 1: this was IBM talking to their partners and their customers saying, Hey, 611 00:39:28,040 --> 00:39:30,319 Speaker 1: I know that we're talking about using Watson for these 612 00:39:30,400 --> 00:39:35,640 Speaker 1: really, really big ideas and these really world changing applications 613 00:39:35,680 --> 00:39:39,120 Speaker 1: that are relying upon millions and millions of records, but 614 00:39:40,080 --> 00:39:42,319 Speaker 1: Watson could also work for you. That was kind of 615 00:39:42,320 --> 00:39:45,399 Speaker 1: the message, and it was a very 616 00:39:45,440 --> 00:39:47,920 Speaker 1: compelling one. They brought up several people 617 00:39:48,280 --> 00:39:51,760 Speaker 1: to talk about how this has been used. For example, 618 00:39:51,800 --> 00:39:56,120 Speaker 1: they brought up the CEO of Orange Bank. Orange is 619 00:39:56,280 --> 00:40:00,239 Speaker 1: a telecommunications company, and the telecommunications company decided 620 00:40:00,400 --> 00:40:03,480 Speaker 1: that they were going to create a financial institution as well, 621 00:40:03,600 --> 00:40:07,719 Speaker 1: so an actual bank. And the bank had decided 622 00:40:07,760 --> 00:40:09,480 Speaker 1: that one of the things they wanted to do was 623 00:40:09,600 --> 00:40:14,120 Speaker 1: create an interface for their customers that would make 624 00:40:14,160 --> 00:40:19,520 Speaker 1: it very easy to deal with routine sorts of problems 625 00:40:19,520 --> 00:40:24,239 Speaker 1: and questions and provide information without the need 626 00:40:24,360 --> 00:40:29,560 Speaker 1: to refer that customer up to a human customer service representative, 627 00:40:29,920 --> 00:40:31,759 Speaker 1: which is a delicate thing to do. You want to 628 00:40:31,800 --> 00:40:34,440 Speaker 1: make sure that you are serving your customers properly. You 629 00:40:34,440 --> 00:40:36,160 Speaker 1: don't want to turn them off. You don't want them 630 00:40:36,239 --> 00:40:39,440 Speaker 1: to log in, see a chat bot, and say, oh, well, 631 00:40:39,520 --> 00:40:42,040 Speaker 1: no one cares about me. They just put me in 632 00:40:42,120 --> 00:40:45,440 Speaker 1: touch with a robot. But at the same time, 633 00:40:45,480 --> 00:40:48,600 Speaker 1: you don't want to have, you know, 634 00:40:48,640 --> 00:40:52,719 Speaker 1: customer service representatives answering the same mundane questions over and 635 00:40:52,760 --> 00:40:55,320 Speaker 1: over again. That makes it hard to have an engaged 636 00:40:55,480 --> 00:40:59,719 Speaker 1: and happy workforce. So there's a delicate balance here. 637 00:41:00,000 --> 00:41:04,080 Speaker 1: What Orange decided to do was create a virtual advisor. 638 00:41:04,560 --> 00:41:08,239 Speaker 1: They named the virtual advisor Djingo, that's D J I N G, 639 00:41:08,680 --> 00:41:14,319 Speaker 1: O, and Djingo uses Watson as the foundation 640 00:41:14,440 --> 00:41:17,399 Speaker 1: for what it does.
And as the CEO explained, it's 641 00:41:17,440 --> 00:41:21,239 Speaker 1: the customer's first point of contact for the bank, and 642 00:41:21,360 --> 00:41:24,640 Speaker 1: Djingo can respond to a lot of different common queries, 643 00:41:25,200 --> 00:41:29,479 Speaker 1: and they could be very general ones that are sort 644 00:41:29,480 --> 00:41:31,880 Speaker 1: of bank wide kinds of questions, or they could be 645 00:41:32,000 --> 00:41:35,960 Speaker 1: very specific to the individual. And they said that Djingo 646 00:41:36,320 --> 00:41:39,560 Speaker 1: is the most effective agent they've seen, and that Djingo 647 00:41:39,600 --> 00:41:42,520 Speaker 1: also never has to take a break. Djingo can work twenty 648 00:41:43,200 --> 00:41:46,799 Speaker 1: four seven and is never tired and can respond to most 649 00:41:46,800 --> 00:41:49,520 Speaker 1: requests without the need to funnel customers to other agents. 650 00:41:50,280 --> 00:41:54,680 Speaker 1: So this was an example of an industry that has 651 00:41:55,280 --> 00:41:59,200 Speaker 1: a relatively small data set compared to something like Facebook, 652 00:41:59,600 --> 00:42:02,279 Speaker 1: and a bank, even with a lot of customers, is not 653 00:42:02,320 --> 00:42:04,839 Speaker 1: going to be dealing with the same volume of information 654 00:42:05,320 --> 00:42:08,319 Speaker 1: as a social media network would. What else can we 655 00:42:08,400 --> 00:42:12,680 Speaker 1: expect when AI starts to insinuate its way into our 656 00:42:12,760 --> 00:42:15,520 Speaker 1: daily lives? Well, I'll tell you about it in just 657 00:42:15,560 --> 00:42:17,960 Speaker 1: a minute, but first let's take a quick break to 658 00:42:18,120 --> 00:42:28,399 Speaker 1: thank our sponsor. IBM also chatted about how AI could 659 00:42:28,600 --> 00:42:32,440 Speaker 1: help out in the field of human resources. HR 660 00:42:32,480 --> 00:42:35,760 Speaker 1: is another one of those departments in most companies 661 00:42:35,800 --> 00:42:37,840 Speaker 1: that has to field a lot of the same questions 662 00:42:37,880 --> 00:42:40,399 Speaker 1: over and over, and it may be that there are 663 00:42:40,440 --> 00:42:44,000 Speaker 1: lots of different policies that the HR representative has to 664 00:42:44,040 --> 00:42:47,840 Speaker 1: go through to find the relevant information. And while the 665 00:42:47,960 --> 00:42:52,840 Speaker 1: HR representative might have access to all that, he or 666 00:42:52,920 --> 00:42:56,359 Speaker 1: she may not automatically know the answer, and so it 667 00:42:56,360 --> 00:43:00,320 Speaker 1: takes time and effort to hunt down the answers 668 00:43:00,520 --> 00:43:05,120 Speaker 1: that employees might need. For HR professionals, IBM 669 00:43:05,160 --> 00:43:08,120 Speaker 1: also kind of mentioned that Watson would be an 670 00:43:08,160 --> 00:43:10,759 Speaker 1: ideal tool for that as well. So if you need 671 00:43:10,840 --> 00:43:17,000 Speaker 1: to ask about specific forms or policies or compensation packages, 672 00:43:17,040 --> 00:43:19,000 Speaker 1: all the sorts of things that HR folks have to 673 00:43:19,040 --> 00:43:24,080 Speaker 1: deal with, you could have an artificially intelligent platform do 674 00:43:24,160 --> 00:43:27,920 Speaker 1: that on your behalf, which was also kind of interesting.
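The Djingo and HR examples share one pattern: match a routine question against known intents, answer it automatically, and hand off to a human when confidence is low. Here's a minimal sketch of that pattern; the intents, canned answers, and threshold are all made-up assumptions, and a real assistant platform would use a trained intent classifier rather than keyword overlap.

```python
import re

# Minimal sketch of the "virtual advisor" pattern: answer routine
# questions automatically, escalate to a human when unsure.
# Intents, answers, and the threshold below are illustrative only.

INTENTS = {
    "check_balance": (
        {"balance", "account", "much"},
        "You can see your current balance under Accounts > Overview.",
    ),
    "card_lost": (
        {"lost", "stolen", "card", "block"},
        "I've blocked your card. A replacement ships in 3-5 days.",
    ),
}

CONFIDENCE_THRESHOLD = 0.5  # arbitrary cutoff for this sketch

def respond(message: str) -> str:
    words = set(re.findall(r"[a-z]+", message.lower()))
    # crude keyword-overlap score; a production bot uses a trained model
    best_intent, best_score = None, 0.0
    for intent, (keywords, _answer) in INTENTS.items():
        score = len(words & keywords) / len(keywords)
        if score > best_score:
            best_intent, best_score = intent, score
    if best_intent is not None and best_score >= CONFIDENCE_THRESHOLD:
        return INTENTS[best_intent][1]
    # low confidence: funnel the customer to a human representative
    return "Let me connect you with one of our advisors."

print(respond("I lost my card, please block it"))    # handled by the bot
print(respond("Explain derivative pricing models"))  # escalated to a human
```

The design point is the fallback: the bot only answers when it's reasonably sure, which is how you thread that needle between mundane repetition for the staff and the "they just put me in touch with a robot" feeling for the customer.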
675 00:43:28,239 --> 00:43:30,960 Speaker 1: So there were several other folks that they brought up 676 00:43:30,960 --> 00:43:35,880 Speaker 1: on stage to chat about, you know, their experiences implementing 677 00:43:36,520 --> 00:43:41,640 Speaker 1: Watson in different ways. It was very much all about, 678 00:43:41,680 --> 00:43:46,359 Speaker 1: here's what this API is really for and how 679 00:43:46,480 --> 00:43:48,880 Speaker 1: you might use it. And, you know, it was trying to 680 00:43:48,920 --> 00:43:54,000 Speaker 1: get away from Watson as the computer program that 681 00:43:54,320 --> 00:43:59,160 Speaker 1: won on Jeopardy, or Watson as this quirky platform that 682 00:43:59,200 --> 00:44:03,200 Speaker 1: could come up with dynamically created recipes based upon the 683 00:44:03,320 --> 00:44:06,560 Speaker 1: ingredients you fed to it. The whole idea was to 684 00:44:07,160 --> 00:44:13,120 Speaker 1: create something that would have multiple use cases on multiple scales, 685 00:44:13,719 --> 00:44:16,040 Speaker 1: and I found it helpful to get 686 00:44:16,080 --> 00:44:19,000 Speaker 1: a better grip on exactly what Watson is and is not. 687 00:44:20,080 --> 00:44:23,520 Speaker 1: It was a fascinating discussion. We saw a lot 688 00:44:23,560 --> 00:44:27,319 Speaker 1: of interesting people. We saw the CEO of Nvidia come 689 00:44:27,320 --> 00:44:33,200 Speaker 1: out and talk about partnering with IBM to pair GPUs 690 00:44:33,280 --> 00:44:37,759 Speaker 1: and CPUs together to create the most powerful machines that 691 00:44:37,800 --> 00:44:41,359 Speaker 1: are able to process enormous amounts of information in a 692 00:44:41,480 --> 00:44:45,440 Speaker 1: very short amount of time. They talked about how 693 00:44:45,480 --> 00:44:49,080 Speaker 1: this is the sort of technology that's powering the 694 00:44:49,120 --> 00:44:54,359 Speaker 1: next generation of machines like autonomous cars. They even 695 00:44:54,400 --> 00:44:59,120 Speaker 1: acknowledged the fact that this is still a young field, 696 00:44:59,280 --> 00:45:03,400 Speaker 1: and acknowledged the tragic accident that happened in 697 00:45:03,440 --> 00:45:09,160 Speaker 1: Arizona when an autonomous SUV that belonged 698 00:45:09,160 --> 00:45:12,360 Speaker 1: to Uber struck and killed a pedestrian as she was 699 00:45:12,440 --> 00:45:16,640 Speaker 1: walking her bicycle across the street. They took some time 700 00:45:16,680 --> 00:45:19,960 Speaker 1: to actually talk about this and say, this is a 701 00:45:20,000 --> 00:45:24,160 Speaker 1: horrible tragedy and nothing should distract us from the fact 702 00:45:24,200 --> 00:45:26,960 Speaker 1: that, you know, this person passed away and her 703 00:45:27,040 --> 00:45:30,239 Speaker 1: family is dealing with the aftermath of that, and 704 00:45:30,239 --> 00:45:35,160 Speaker 1: it's terrible. And it also forces us to acknowledge that 705 00:45:36,280 --> 00:45:40,520 Speaker 1: these things we're working on are life and death situations. 706 00:45:40,560 --> 00:45:44,319 Speaker 1: They are not trivial. It's 707 00:45:44,360 --> 00:45:47,239 Speaker 1: not just an engineering problem; it's not just a kind 708 00:45:47,239 --> 00:45:51,319 Speaker 1: of hypothetical situation.
These are technologies that could 709 00:45:51,320 --> 00:45:55,520 Speaker 1: potentially save or end lives if the technology is implemented 710 00:45:55,960 --> 00:45:58,759 Speaker 1: one way or another, so it behooves us to be 711 00:45:58,840 --> 00:46:04,680 Speaker 1: extremely careful to figure out how to do it properly. 712 00:46:04,719 --> 00:46:09,439 Speaker 1: The CEO of Nvidia also talked about just how complicated 713 00:46:09,440 --> 00:46:13,279 Speaker 1: this whole process is for vehicles and mentioned that, 714 00:46:13,760 --> 00:46:15,840 Speaker 1: you know, some people might think that a car is 715 00:46:15,880 --> 00:46:18,920 Speaker 1: just sort of processing one big stream of data and 716 00:46:19,239 --> 00:46:23,040 Speaker 1: making decisions on how to proceed based on that, because 717 00:46:23,080 --> 00:46:24,960 Speaker 1: that's kind of how humans do it, right? Like, we 718 00:46:25,560 --> 00:46:28,360 Speaker 1: perceive stuff and then we have to respond to it, 719 00:46:28,400 --> 00:46:30,920 Speaker 1: we have to react to it. But machines do this 720 00:46:30,960 --> 00:46:34,760 Speaker 1: in a different way. They're collecting different individual streams 721 00:46:34,800 --> 00:46:36,800 Speaker 1: of data, and each of those streams needs to be 722 00:46:36,840 --> 00:46:41,440 Speaker 1: analyzed and processed, and then the collective information needs to 723 00:46:41,440 --> 00:46:44,239 Speaker 1: be analyzed and processed so that the right reaction can 724 00:46:44,320 --> 00:46:47,120 Speaker 1: take place. So it's almost like you can think 725 00:46:47,160 --> 00:46:52,280 Speaker 1: of each sensor as sending its information to a centralized location, 726 00:46:52,880 --> 00:46:56,920 Speaker 1: and then all of those collective information streams from all 727 00:46:56,960 --> 00:47:02,120 Speaker 1: of those sensors have to be synthesized and analyzed, and 728 00:47:02,160 --> 00:47:05,359 Speaker 1: then the reaction has to take place. So it makes 729 00:47:05,400 --> 00:47:08,880 Speaker 1: it sound way more complicated than you might originally imagine; 730 00:47:08,920 --> 00:47:11,759 Speaker 1: I certainly felt that way. We got to watch a 731 00:47:11,840 --> 00:47:17,120 Speaker 1: video of an eight minute drive of an autonomous 732 00:47:17,120 --> 00:47:20,279 Speaker 1: car down country roads in New Jersey, showing how it 733 00:47:20,280 --> 00:47:24,279 Speaker 1: would navigate down the roads, even properly navigating when there 734 00:47:24,280 --> 00:47:27,919 Speaker 1: were no road signs available, making certain that the car 735 00:47:28,360 --> 00:47:30,919 Speaker 1: was behaving the way it was supposed to. And as 736 00:47:31,040 --> 00:47:33,600 Speaker 1: they were pointing out, even in this scenario, where 737 00:47:34,120 --> 00:47:37,520 Speaker 1: it was nice weather and it was during the daytime, 738 00:47:37,560 --> 00:47:41,759 Speaker 1: even in that scenario, it's a complicated thing to make 739 00:47:41,760 --> 00:47:45,800 Speaker 1: a machine do that properly. And then you start imagining 740 00:47:45,840 --> 00:47:50,120 Speaker 1: all the different additional complications that could arise, like bad 741 00:47:50,200 --> 00:47:55,600 Speaker 1: weather or night driving, or heavier traffic, or even 742 00:47:55,719 --> 00:47:59,400 Speaker 1: things like wildlife running across the street.
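To picture that per-stream-then-synthesize pipeline, here's a rough sketch of the pattern as I understood the description: each sensor's data gets processed independently, and only then does a central step fuse the results into one decision. The sensors, confidence numbers, and decision rule below are simplified assumptions, nothing like a production autonomy stack.

```python
# Sketch of the sensor-fusion pattern described above: each stream is
# analyzed on its own, then the collective results are synthesized
# before the car reacts. All values here are simplified assumptions.

from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str
    obstacle_ahead: bool
    confidence: float

def process_camera(frame) -> Detection:
    # stand-in for a vision model scanning the frame for obstacles
    return Detection("camera", obstacle_ahead=True, confidence=0.7)

def process_lidar(point_cloud) -> Detection:
    # stand-in for point-cloud clustering / object detection
    return Detection("lidar", obstacle_ahead=True, confidence=0.9)

def process_radar(returns) -> Detection:
    return Detection("radar", obstacle_ahead=False, confidence=0.4)

def fuse(detections: list[Detection]) -> str:
    """Central synthesis step: weigh every stream's verdict together."""
    agree = sum(d.confidence for d in detections if d.obstacle_ahead)
    total = sum(d.confidence for d in detections)
    return "BRAKE" if agree / total > 0.5 else "CONTINUE"

streams = [process_camera(None), process_lidar(None), process_radar(None)]
print(fuse(streams))  # camera + lidar outvote radar -> "BRAKE"
```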
We realized this 743 00:47:59,520 --> 00:48:04,640 Speaker 1: is a lot more difficult than just sensing a potential 744 00:48:04,960 --> 00:48:07,880 Speaker 1: obstacle on the road and taking the right course of 745 00:48:07,920 --> 00:48:11,160 Speaker 1: action to avoid hitting it. In fact, according to the CEO, 746 00:48:11,280 --> 00:48:14,560 Speaker 1: every car needs about a hundred servers 747 00:48:15,080 --> 00:48:19,719 Speaker 1: to process all the information. And they were using 748 00:48:19,719 --> 00:48:22,239 Speaker 1: a fleet of around a hundred cars, or two 749 00:48:22,280 --> 00:48:24,280 Speaker 1: hundred cars, so they had a thousand to two thousand 750 00:48:24,280 --> 00:48:27,520 Speaker 1: servers dedicated just to processing information in order to develop 751 00:48:27,600 --> 00:48:30,440 Speaker 1: this technology in the first place. So it becomes an 752 00:48:30,440 --> 00:48:34,600 Speaker 1: incredibly difficult thing to do well. That was kind of 753 00:48:34,640 --> 00:48:40,000 Speaker 1: the overall story of the journey to AI: this 754 00:48:40,080 --> 00:48:46,120 Speaker 1: discussion of being in this middle period between developing 755 00:48:46,160 --> 00:48:51,040 Speaker 1: these very hyper focused tools in artificial intelligence and the 756 00:48:51,080 --> 00:48:54,719 Speaker 1: goal of getting to general artificial intelligence. The idea of 757 00:48:55,200 --> 00:49:01,239 Speaker 1: using AI as kind of an assistant for performing very 758 00:49:01,280 --> 00:49:06,760 Speaker 1: complicated tasks, complicated from a computational standpoint, also complicated 759 00:49:07,200 --> 00:49:12,440 Speaker 1: just from how much data is there. Again, if 760 00:49:12,480 --> 00:49:14,960 Speaker 1: you put a human being in charge of 761 00:49:15,560 --> 00:49:17,960 Speaker 1: going through all that information to find the most relevant 762 00:49:18,480 --> 00:49:22,720 Speaker 1: and useful information, it would take hours or days or years, 763 00:49:22,760 --> 00:49:28,160 Speaker 1: depending upon the data set, whereas an artificially intelligent, properly designed 764 00:49:28,480 --> 00:49:30,760 Speaker 1: program can do it in a fraction of that time, 765 00:49:31,120 --> 00:49:35,520 Speaker 1: and do it dynamically, request after request after request, and 766 00:49:35,600 --> 00:49:39,680 Speaker 1: can continuously update its answers based upon fresh information coming 767 00:49:39,680 --> 00:49:42,680 Speaker 1: into the data set. I found it really interesting and 768 00:49:42,719 --> 00:49:44,240 Speaker 1: it gives me a lot of hope for the future 769 00:49:44,520 --> 00:49:48,799 Speaker 1: for various implementations of this type of technology, whether it's 770 00:49:48,840 --> 00:49:53,000 Speaker 1: Watson or some comparable technology. I really think it's going 771 00:49:53,040 --> 00:49:56,239 Speaker 1: to be interesting for all sorts of different applications, some 772 00:49:56,360 --> 00:49:59,800 Speaker 1: of which we as consumers will interact with directly, whether 773 00:49:59,800 --> 00:50:04,239 Speaker 1: it's a customer service agent or maybe a personal assistant, 774 00:50:04,440 --> 00:50:07,520 Speaker 1: something that gets to know us and our routines.
We're 775 00:50:07,520 --> 00:50:09,759 Speaker 1: starting to see that a little bit in some of 776 00:50:09,760 --> 00:50:14,440 Speaker 1: the personal assistants like Google Home, Siri, Alexa, that 777 00:50:14,520 --> 00:50:17,680 Speaker 1: kind of thing. You see a little bit there, but 778 00:50:18,000 --> 00:50:23,600 Speaker 1: it'll continue to grow more sophisticated and more proactive, to 779 00:50:23,640 --> 00:50:26,800 Speaker 1: the point where it's almost like 780 00:50:27,320 --> 00:50:30,839 Speaker 1: having an AI life coach right at 781 00:50:30,880 --> 00:50:34,719 Speaker 1: your disposal. So I found it all very fascinating and 782 00:50:34,760 --> 00:50:37,799 Speaker 1: I hope to learn a lot more about lots of 783 00:50:37,800 --> 00:50:41,200 Speaker 1: different topics while I'm here at the THINK conference. I 784 00:50:41,239 --> 00:50:44,480 Speaker 1: can't wait to chat with you guys more about quantum computing. 785 00:50:44,520 --> 00:50:47,080 Speaker 1: I actually got to see a model of what 786 00:50:47,120 --> 00:50:50,120 Speaker 1: a quantum computer looks like, and boy howdy, it does 787 00:50:50,160 --> 00:50:53,960 Speaker 1: not look like a normal computer. But I'll definitely do 788 00:50:54,040 --> 00:50:57,000 Speaker 1: an episode about that to talk more about what quantum 789 00:50:57,040 --> 00:51:00,480 Speaker 1: computers are, how they work, why they are important, and 790 00:51:00,520 --> 00:51:03,120 Speaker 1: where we might be going with them, and maybe talk 791 00:51:03,120 --> 00:51:05,520 Speaker 1: a little bit more about some of the stuff 792 00:51:05,640 --> 00:51:07,840 Speaker 1: Dr. Michio Kaku said, maybe some of the stuff that 793 00:51:07,920 --> 00:51:10,400 Speaker 1: Neil deGrasse Tyson said. I went to his talk as well, 794 00:51:11,040 --> 00:51:14,319 Speaker 1: and they were both very fascinating. They weren't quite 795 00:51:14,400 --> 00:51:17,200 Speaker 1: tech oriented enough for me to do a full 796 00:51:17,280 --> 00:51:20,160 Speaker 1: episode recapping them, but I might touch 797 00:51:20,320 --> 00:51:23,160 Speaker 1: on some of the themes they talked about and their 798 00:51:23,239 --> 00:51:27,600 Speaker 1: meaning to me as just a person who loves tech 799 00:51:27,719 --> 00:51:30,600 Speaker 1: and the tech sector in general, because they both gave 800 00:51:30,680 --> 00:51:34,320 Speaker 1: really engaging presentations. If you guys have suggestions for future 801 00:51:34,360 --> 00:51:37,880 Speaker 1: episodes of tech Stuff, whether it is a technology, a company, 802 00:51:38,080 --> 00:51:40,400 Speaker 1: a person, maybe there's someone you want me to interview, 803 00:51:41,560 --> 00:51:43,880 Speaker 1: let me know. Send me a message. The email address 804 00:51:43,920 --> 00:51:46,320 Speaker 1: for the show is tech Stuff at how stuff works 805 00:51:46,360 --> 00:51:48,719 Speaker 1: dot com, or you can drop me a line on 806 00:51:48,760 --> 00:51:51,400 Speaker 1: Facebook or Twitter. The handle for both of those is tech 807 00:51:51,440 --> 00:51:55,920 Speaker 1: Stuff H S W. Remember you can follow us on Instagram. 808 00:51:55,960 --> 00:51:59,960 Speaker 1: That account is always showing interesting behind the scenes information, 809 00:52:00,120 --> 00:52:02,400 Speaker 1: so make sure you go check that out. And on 810 00:52:02,440 --> 00:52:07,640 Speaker 1: Wednesdays and Fridays typically I record live.
I stream my 811 00:52:07,760 --> 00:52:11,799 Speaker 1: recording sessions on twitch dot tv slash tech Stuff, so 812 00:52:11,840 --> 00:52:14,239 Speaker 1: you can come and watch me record one of these episodes. 813 00:52:14,280 --> 00:52:16,120 Speaker 1: There's a chat room there. You can jump in there 814 00:52:16,120 --> 00:52:19,480 Speaker 1: and chat with me live as I'm recording, although I 815 00:52:19,520 --> 00:52:22,880 Speaker 1: don't respond until I hit a break, because otherwise I 816 00:52:22,920 --> 00:52:26,520 Speaker 1: find it too distracting and I ramble, and that does 817 00:52:26,560 --> 00:52:29,839 Speaker 1: not make for good podcasting. But please come on by, 818 00:52:29,960 --> 00:52:32,239 Speaker 1: say hello. I would love to see you there, and 819 00:52:32,280 --> 00:52:41,799 Speaker 1: I'll talk to you again really soon. For more on 820 00:52:41,840 --> 00:52:44,320 Speaker 1: this and thousands of other topics, visit how stuff 821 00:52:44,320 --> 00:52:54,800 Speaker 1: Works dot com