Speaker 1: Get in touch with technology with TechStuff, from HowStuffWorks.com.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with HowStuffWorks and I love all things tech. This week, I've got a series of special shows, in addition to our normal episodes, that I am happy to bring to you. You may notice that things sound a little different than they usually do. That is because I am not currently in the massive, wonderful, underground, super secret studio at HowStuffWorks. Instead, I am on the road. I'm actually in Las Vegas, Nevada, where I am attending the inaugural IBM Think Conference. It's taking place between March nineteenth and March twenty-second, and it's kind of a big industry conference for people who are in the tech space, who work with computers and software and hardware; kind of a place for them to meet, to network, and to learn a lot more about some bleeding edge technologies. They have lots of different activities going on all week, and I am going to be here attending some of the events, talking to people, getting more information, and recording special episodes for you, my beloved listeners. So I hope you enjoy this special series, and I can't wait to really dive in and learn more about these topics. It looks really interesting.

Speaker 1: Now, one thing I will say before I jump in and give you kind of an overview of what to expect this week is that it's crazy busy, y'all. I mean, the conference has got so many people attending. I haven't seen any numbers about how many people are actually at this thing, but it's taking place at Mandalay Bay, which is one of the resorts and casinos here in Las Vegas, and it has a very large convention center. You may have heard me do several episodes about CES over the years; during CES, this is the place where the press events happen before the show floor opens at the Las Vegas Convention Center.
Speaker 1: At Mandalay Bay, you would have all the different rooms booked up for the various press conferences that companies would use to unveil their latest and greatest products coming out over the next year or two. Well, at IBM Think, that's what is being used as the meeting space and the keynote space, the presentation space, for all sorts of different IT and computer topics and activities. It's kind of overwhelming, to be honest. In fact, I am now kind of unwinding in my hotel room before I have to go back out and jump right back into it. I'm going to go to a session where they're going to talk about five big topics in science and technology, so I'm really looking forward to it. There are some very interesting people who are here presenting, and I can't wait to learn more.

Speaker 1: Some of the areas that we're going to talk about this week will include blockchain. Now, if you've been listening to TechStuff over the past few months, you've probably heard a recent episode I did about blockchain. Blockchain is the technology that underlies cryptocurrencies like bitcoin, and in fact, I would argue bitcoin is probably the most famous use of blockchain technology. But blockchain is not only good for bitcoins and cryptocurrencies; that's one use for it, but not the only one. In fact, a lot of people argue that blockchain is going to be the next evolution of the web. So you might remember people were talking about Web 2.0 more than a decade ago, and blockchain would be like the next step. So what does that even mean? Well, Web 1.0 was the sort of websites you first saw, primarily when the web first launched. Those would be websites that were pretty much static. They did not change, they were not really interactive. They did not have any capacity to have users add to or change things in any meaningful way.
Speaker 1: And so it was kind of like looking at a newspaper or a magazine, something that's in a fixed format that doesn't have anything really special about it. Right? There's nothing that gives you any sort of sense that your presence matters, or that it's even registering at all, unless there was one of those little helpful counters, as was often the case on early websites, that would tell you how many visitors had been to that website. Beyond that, there really wasn't anything that indicated that you mattered at all. So that was Web 1.0, and a lot of those websites ended up kind of just being momentary. They didn't stick around, because there was no point in going back and visiting after you'd seen them once. They weren't ever going to change. If they did change, you didn't necessarily know about it, and so you would just have to go back and visit and see if anything had changed since the last time you were there. It was kind of inconvenient.

Speaker 1: Web 2.0, arguably, was when websites started to incorporate interactive features, where users could come into the website and things would change; they'd be dynamic. Sometimes this meant that there were animated aspects to the web page. Sometimes it just meant that there was a way for users to leave feedback. For example, some people point to Amazon as being an early Web 2.0 style web page, because you could leave user reviews on the site, so you could actually impact what happened with products on Amazon.com. And then, of course, later on, Amazon was also incorporating things like recommendation engines that would try to engage users and encourage them to spend more money and to buy more things, and that was also an indication of Web 2.0. It was something that was more than just visit a website, read some information, and leave.
Speaker 1: Blockchain is supposedly going to be the next step, and really it comes down to the very nature of blockchain itself, which is a peer-to-peer technology within a network. So you're not necessarily talking about something Internet-wide; it's a network within the Internet. Now, it might be a network that spans multiple networks, because it's all peers that are connecting to one another and using a method that creates blocks of data, thus the block in the blockchain. Each block is sequentially added to a chain of other blocks, and it all dates back to the very first block in the chain, and there's information within each block that can be traced all the way back to that first one. The transactions that happen within a block are recorded within that block, and as other computers inside the peer-to-peer network verify and validate those transactions, whatever they may be, they then are able to create the next block in the blockchain. With cryptocurrency, you get a reward for this; that's what's called mining. It's when, in bitcoin, your computer participates in solving a particularly tough math problem, essentially, and if you do, that ends up being the validation for previous transactions. The transactions all are codified as a block added to the end of the chain, and everyone in the peer-to-peer network has access to a ledger that is updated across the entire network. So the ledger is the full record of all transactions dating back to the very first one.

Speaker 1: There are going to be very deep dives on blockchain technology here this week, as people talk about the different ways to use it beyond cryptocurrencies, and how it could serve as a backbone to future web interactions. Honestly, I'm very eager to hear more about that, because while I've just described kind of a very high level view of what blockchain is, I am really curious about how this is going to be used to actually form the spine of the web in the future. What is it going to be, besides a way of keeping track of transactions, whether it's currencies or property or anything along those lines? What can it do beyond that? I honestly don't fully understand that, so I look forward to learning more while I'm here.
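To make that chain-of-blocks idea a bit more concrete, here is a minimal sketch in Python. It is my own toy illustration, not how bitcoin or any production system is implemented (real blockchains add proof of work, peer-to-peer gossip, and much more); it just shows blocks that each carry the hash of the previous block, so every block traces back to the first one and tampering is detectable.

```python
import hashlib
import json
import time

def block_hash(block):
    # Hash the block's full contents (including the previous block's hash),
    # so changing anything in an earlier block changes every later hash.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(transactions, prev_hash):
    return {
        "timestamp": time.time(),
        "transactions": transactions,  # the records stored in this block
        "prev_hash": prev_hash,        # the link back toward the first block
    }

# The shared ledger each peer keeps a copy of: an ordered list of blocks.
ledger = [make_block(["genesis"], prev_hash="0" * 64)]

def append_block(ledger, transactions):
    ledger.append(make_block(transactions, prev_hash=block_hash(ledger[-1])))

append_block(ledger, ["Alice pays Bob 5", "Bob pays Carol 2"])
append_block(ledger, ["Carol pays Dave 1"])

# Any peer can verify the chain: each block must reference the hash
# of the block before it, all the way back to the genesis block.
for earlier, later in zip(ledger, ledger[1:]):
    assert later["prev_hash"] == block_hash(earlier)
print("chain verified, length:", len(ledger))
```

Because every peer can recompute those hashes for itself, a tampered copy of the ledger is immediately detectable, which is what makes a shared record with no central authority workable.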
Speaker 1: That's just one of the aspects that will be covered during this conference. Another one, a big one, is artificial intelligence. Now, I've talked about that a lot on TechStuff as well. Artificial intelligence: a lot of people, when they hear that, I think they immediately go to the idea of a machine that can quote unquote think like a person. It is mimicking the way brains think, or it somehow possesses human-like intelligence, or perhaps even superhuman intelligence, where it's able to think even better than humans can. But artificial intelligence is a much broader category than that. That would be a very specific, niche definition of what artificial intelligence is. Artificial intelligence encapsulates multiple disciplines, and it also involves lots and lots of different parts of intelligence, not just the cognition that we think of. Think of things like image recognition with computers. Teaching a computer to identify a specific kind of thing, and to extrapolate from that knowledge, is tricky. You might be able to teach a computer, for example, that a mug is a mug, like a coffee mug. You've got a red coffee mug that's got sort of a curved shape; it's not, you know, just straight up and down, it kind of bowls out a little bit. And you take images of it from a certain angle with certain lighting, and you feed them to a computer, and you have your computer break down the image so it can identify where all the borders are: what is the mug, versus what is the background, versus what is the platform it's sitting on? And essentially, you teach the computer: hey, this is a mug.
Speaker 1: Computers don't magically then understand that all other containers that have that same basic sort of shape are mugs. If you showed another picture of that same mug with different lighting, the computer might not be able to tell you that that's a mug. Or if it's at a different angle, maybe you turned it so that the handle is facing a different way, and then you showed the computer the image, it may not be able to figure out that that is also a mug. If it's a different color, if it's a different shape, all of these things are tricky for machines. As human beings, we can see an example or two of something and then we're pretty good. We then can say, all right, now I get the general idea of the things that constitute a mug. And when I encounter something else, even if it is of a different shape, a different color, different lighting, on a different surface, maybe it's even got a different type of liquid inside of it, I can still figure out that that thing is a mug. Because I can extrapolate. I can take what I've learned and extend it beyond just the few instances that I have encountered. And then if I walk into a mug store, I don't look at just one and say, that's a mug, but I have no idea what the rest of these things are. I can actually say, oh, these are all different types of mugs in different sizes and shapes and colors. Computers are not good at that, generally speaking. This is one of the reasons why you may have heard about that story of a computer being fed thousands upon thousands of images of cat pictures so that the computer could learn what a cat is. Because without all of that, without this huge body of examples, the computer simply can't extrapolate and figure out what is and is not a cat. And even after all of those images were fed to the computer, it still had some issues. It wasn't like it had magically understood what a cat was. It did not have an innate grasp of cat-ness, and by that I mean the qualities that make a cat, not a character from The Hunger Games.
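To see how brittle that kind of "recognition" can be, here is a deliberately dumb sketch of my own (not anything IBM presented): a nearest-neighbor classifier that has memorized one brightly lit image of a mug, and that mislabels the very same mug the moment the lighting changes, because all it compares is raw pixel values.

```python
import numpy as np

# Tiny stand-in "images": 4x4 grayscale patches, pixel values 0..255.
mug_bright = np.full((4, 4), 200.0)   # a mug photographed in bright light
bowl_bright = np.full((4, 4), 170.0)  # a different object under the same light
mug_dark = mug_bright * 0.5           # the SAME mug, just dimmer lighting

# "Training": memorize one labeled example per class.
training = {"mug": mug_bright, "bowl": bowl_bright}

def classify(image):
    # Nearest neighbor on raw pixel distance; no notion of shape or lighting.
    return min(training, key=lambda label: np.linalg.norm(training[label] - image))

print(classify(mug_bright))  # "mug": it has memorized this exact image
print(classify(mug_dark))    # "bowl": dimming moved the pixels closer to the bowl
```

A person looking at the dim photo still sees the same mug; the program only sees numbers that drifted, which is part of why training these systems takes huge, varied sets of examples.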
Speaker 1: So artificial intelligence, again, is very much a multidisciplinary thing, image recognition being just one tiny part of it. There are numerous other aspects to artificial intelligence, including things like voice recognition and natural language processing. Natural language processing, of course, is where a computer starts to learn what we mean when we say things a certain way. Now, again, just like with our ability to extrapolate based upon the limited examples we have seen of any one object in image recognition, with natural language we can have different ways of saying the same thing and still mean the same thing. So I might say it's raining, or it's pouring outside, or it's coming down like cats and dogs. Those are all different ways of me saying that water is falling from the sky in the form of precipitation, and you will get wet if you walk outside. And people who have had just a limited amount of exposure to these sorts of ideas can pick up on that, but computers, again, do not, unless you expressly tell them so. The way we say things, the word order we choose, the syntax, the accent we may speak with (the dialect, if you prefer), the emphasis we place on different words, the speed at which we speak: all of these different factors make it very challenging for computers to understand us. That being said, voice recognition and natural language processing have come a long way in a short amount of time. Over the last couple of decades, they have really advanced quite a bit, which is why we see all of these digital personal assistants that we can talk to, like Siri and Alexa and Google Assistant, and they have become pretty good at understanding us. They're not perfect, but they're pretty good at figuring out what we want, even if we say things in different ways. But that's another aspect of artificial intelligence.
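As a toy illustration of the phrasing problem (again, my own sketch, not an IBM example): a program that matches utterances word for word treats all of those paraphrases as unrelated, and even a crude word-overlap score barely helps with an idiom like "cats and dogs."

```python
# Three ways of saying the same thing, plus one unrelated sentence.
paraphrases = [
    "it's raining",
    "it's pouring outside",
    "it's coming down like cats and dogs",
]
unrelated = "the library closes at nine"

# Word-for-word lookup: the way a naive program "understands" a request.
known_intents = {"it's raining": "weather:rain"}
for sentence in paraphrases:
    print(sentence, "->", known_intents.get(sentence, "unknown"))
# Only the literal phrase "it's raining" is recognized.

def overlap(a, b):
    # A crude similarity score: fraction of shared words. No syntax,
    # no emphasis, no accent, no idioms; just vocabulary in common.
    wa, wb = set(a.split()), set(b.split())
    return len(wa & wb) / len(wa | wb)

for sentence in paraphrases + [unrelated]:
    print(sentence, "->", round(overlap(sentence, "it's pouring rain outside"), 2))
# The idiom scores barely above the unrelated sentence, which is why real
# natural language processing has to learn meaning, not just shared words.
```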
Speaker 1: All of these sorts of things are the kinds of topics that will be talked about during the IBM Think Conference. And again, they're going to go into much greater detail and talk about real world applications for AI, not just theory, not just how far we've come, but how we can put it to use. There's more to come about the IBM Think Conference, but before I get much further into it, let's take a quick break to thank our sponsor.

Speaker 1: IBM Watson. You may have heard of it as the computer that defeated former Jeopardy champions. Well, it's much more than that, although that definitely was a big publicity stunt. You could argue it was the equivalent of when Deep Blue went up against Kasparov in the chess competitions. But IBM's Watson is sort of in that natural language processing space: understanding what people are saying and being able to pull data based on that. That is sort of an example of what it does. So in Jeopardy, it was really put to the test, because with Jeopardy, you don't just get clues and then have to come up with the right answer. Sometimes those clues are puns or rhymes, or they are very circumspect; it's very circuitous, the kind of logic you have to use to figure out what the actual answer is, in the form of a question. And Watson did really well with it. You may remember that the way Watson worked was that it had a massive database. It was not connected to the Internet for the purposes of Jeopardy; it just had a big database of information, and it would process whatever the clue was. It would reference its database, and it would look for potential answers and assign each potential answer a probability, sort of a confidence probability. If Watson's confidence probability was high enough, and as I recall it was somewhere in the eightieth percentile range, like it had to be at least eighty percent sure that that was the right answer, it would then put that forward as its answer in Jeopardy. If none of the answers it came up with met that level of confidence, Watson would not answer. It would not attempt to see if perhaps it could take a wild guess at the right answer. And it turned out to work: Watson ended up winning that game of Jeopardy.
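Here is a minimal sketch of that answer-or-abstain logic, with made-up candidate answers and scores. The eighty percent threshold is just how I recall it, so treat the exact number as an assumption.

```python
CONFIDENCE_THRESHOLD = 0.80  # assumed value, per my recollection above

def choose_answer(candidates):
    """candidates: (answer, confidence) pairs from some earlier scoring step."""
    best_answer, best_confidence = max(candidates, key=lambda pair: pair[1])
    if best_confidence >= CONFIDENCE_THRESHOLD:
        return best_answer  # confident enough: buzz in with this answer
    return None             # otherwise abstain rather than take a wild guess

# Hypothetical scored candidates for two different clues:
print(choose_answer([("What is Toronto?", 0.32), ("What is Chicago?", 0.91)]))
# -> What is Chicago?
print(choose_answer([("What is Toronto?", 0.32), ("What is Chicago?", 0.55)]))
# -> None (no candidate clears the bar, so stay silent)
```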
Speaker 1: But of course, Watson isn't just about showing up former champions. It's about all sorts of stuff. IBM's early applications of Watson included incorporating it into medical practice, so that it could help doctors diagnose and treat patients. It was not replacing doctors. It was not meant to be a robo doctor that would treat you based on your symptoms, after which you just go to the robo pharmacist. Rather, it was acting as an assistant, to help either confirm what a doctor was thinking or perhaps narrow down some options for what the cause of any particular symptoms might be. And it has seen a lot of use in that field. It's gone well beyond that, though. Some of the other applications have been a little silly. You may remember, if you listened to the Forward Thinking podcast, that we did an episode about Chef Watson. That was an application IBM created where Watson would design a meal for you based off of ingredients you told it were at your disposal. So you might say, hey, I have chicken, and I've got rosemary, and I have some potatoes, and I've got some rice, and I've got some green beans; what can I make with these? And it would say, all right, well, let me come up with a recipe for you. And it would generate a recipe, and it would probably include some ingredients that you might not have on hand, stuff that you would have to go shop for. But the interesting thing was that it was dynamically creating these recipes. It wasn't accessing a database of recipes and pulling from the ones that included the ingredients you mentioned.
Speaker 1: Instead, it would look at how flavors had been combined in the past by looking at a huge library of recipes that had been fed to Chef Watson. So just imagine a library's worth of cookbooks fed to this computer, and the computer says: well, based upon what I can see from all these recipes, rosemary and chicken do pair well together, so we're going to make a recipe that uses those two ingredients. And based upon all the different ways I've seen to prepare chicken, I think a baked chicken dish would be the perfect one to go with, and so on and so forth. It's not pulling a recipe; it was creating one based upon its knowledge. It's like a chef, not a cook. This did not always necessarily work out great. If you listen to that episode of Forward Thinking, you'll hear some of the experiences we had as we were testing out Chef Watson. I remember a cauliflower fricassee recipe in which the cauliflower was listed as optional, which I thought was a bit unusual. Also, the other interesting thing was that you could feed the exact same ingredients into Chef Watson multiple times and you would get different results each time. Again, because the recipes are created dynamically, it's not looking at a recipe it has already written; it's writing a new one for you each time you ask. It's kind of interesting.

Speaker 1: There was also an example recently where Watson was being used as a platform for the Weather Company: if you used the Weather Company's app, Watson was helping power that. So it's kind of like an API. Different developers can use Watson as a platform to build different applications, and it all depends on what your application needs whether or not Watson is a good fit. It doesn't necessarily mean that every single application is going to benefit from using Watson. If you want to create a game that's like Angry Birds, Watson may not be of much use. But if it's anything where you have someone asking questions or asking for data, then Watson might end up being helpful.
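To give a flavor of that "learn pairings, then compose" idea, here is a toy sketch of my own; Chef Watson's actual system was far more sophisticated. It counts which ingredients co-occur across a handful of example recipes, then greedily assembles a new combination instead of returning a stored recipe.

```python
from collections import Counter
from itertools import combinations
import random

# A tiny stand-in for a library of cookbooks: each recipe is a set of ingredients.
recipes = [
    {"chicken", "rosemary", "potatoes"},
    {"chicken", "rosemary", "rice"},
    {"chicken", "green beans", "garlic"},
    {"potatoes", "rosemary", "garlic"},
]

# Learn which pairs of ingredients have appeared together, and how often.
pairings = Counter()
for recipe in recipes:
    for a, b in combinations(sorted(recipe), 2):
        pairings[(a, b)] += 1

def propose_dish(on_hand, rng):
    # Compose a NEW combination: start from a random ingredient you have,
    # then greedily add whatever pairs most strongly with the dish so far.
    dish = {rng.choice(sorted(on_hand))}
    while len(dish) < 3:
        best = max(
            (i for i in on_hand if i not in dish),
            key=lambda i: sum(pairings[tuple(sorted((i, d)))] for d in dish),
        )
        dish.add(best)
    return dish

rng = random.Random()
print(propose_dish({"chicken", "rosemary", "rice", "green beans"}, rng))
# Different runs can start from different ingredients, so, like Chef Watson,
# the same pantry can yield different suggestions each time.
```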
Speaker 1: And so, obviously, there are a lot of different roundtable discussions and breakout sessions all about IBM Watson and how to use it.

Speaker 1: One thing that I think is going to be particularly interesting, and a very sensitive subject given extremely recent news, is the idea of self-driving cars. There are some sessions that are supposed to be about self-driving cars. The reason I say sensitive is because on the day I'm recording this, March nineteenth, two thousand eighteen, there was also very tragic news out of Arizona. That news was that Uber, which has a fleet of self-driving vehicles it has been testing in different markets, including in Arizona, had one of those self-driving vehicles, an SUV, end up striking and killing a pedestrian, Elaine Herzberg. She was trying to cross the street; she was walking a bicycle across the street in Tempe, Arizona, on the night of March eighteenth, two thousand eighteen, when she was struck by this SUV. And the SUV was in autonomous mode. There was a driver in the vehicle, because while Uber has been testing this autonomous vehicle technology, they have been asked to keep an operator behind the wheel of the car to take over just in case something goes wrong and the vehicle is unable to cope with it. As of the recording of this podcast, I'm not sure exactly what happened, whether or not the driver was aware of what was about to happen or was unable to respond in time; I don't know what the sequence of events was, specifically in regards to that operator. But the tragic story is that this autonomous car, this self-driving car, struck and killed a pedestrian. And as far as I can determine, it is the first fatality due to an autonomous car. Obviously not the first accident.
Speaker 1: There have been a few others, and a few that have been attributed specifically to the autonomous cars and not to human drivers. But this is a terrible, terrible story, and obviously it's going to affect the conversations that go on here. So I hope to attend some of those sessions and hear what experts in the field have to say about this and what the best course of action is. Obviously, you want to be respectful of the victim and her family, and you want to be realistic. You do not want to dismiss this; obviously that would be horrible, it would be unthinkable. But how do you move forward when there's so much momentum, technologically speaking, behind the movement to go to autonomous cars? I want to find out what people's thoughts are on this. It may very well be that there aren't any prepared sessions to cover this, because the news is so recent, but I'm sure there will be questions about it. There's a bit more I expect to see here at the IBM Think Conference, and I'll tell you about it in just a minute, but I've got to take a quick break to thank our sponsor.

Speaker 1: Another area that is going to get a lot of coverage here at the conference is all about cloud and data issues. And oh boy, there's a lot to talk about there too, because again, recent news has been pretty rough in regards to data mining. So I guess I should talk quickly about what this is in the first place. Cloud computing, in case you were not aware, is this model of computing in which you have powerful computers connected to a network that are doing computation for you, or more specifically, for your computer. It might be just computation, it might be just storage, it might be both. But the idea is that instead of your computer doing all the work, a computer on a network is doing all the work for you and sending you the results. So your computer is just receiving some information; it's not having to crunch any numbers.
Speaker 1: This could be really useful if you wanted to do something that was well beyond your computer's processing abilities. It's also a great way to distribute work across a network of computers instead of depending on just one processor. Even if it's a massive, multi-core processor, it can still be more efficient to distribute that workload across a network of computers. An example of this would be the many "at home" projects, like SETI at Home. These are projects where computers in a centralized location receive massive amounts of data. So with SETI, it's data about radio frequencies, radio waves, and it's generally mostly noise. The vast majority of that information is mostly noise, but there could be some signal in that noise. That's the whole point of SETI at Home: to look for any potential signal that could have originated away from us, something extraterrestrial in nature; aliens, in other words. But to do that, you have to sift through an awful lot of radio frequencies that either came from Earth and just bounced around, meaning we're just picking up stuff that we sent out, or came from naturally occurring phenomena like pulsars or some other celestial body. And to do all of that would take a regular computer way too much time. We're constantly gathering more of this information, so you would fall behind very quickly, and you would never be able to catch up, because every time you solve a little bit of the problem, you're getting a hundred times more information every minute. You would never be able to keep up with it. SETI at Home divides that data into chunks and sends it out across its network to people's computers, and their computers will work on parts of those problems while the processor would otherwise be idle. So let's say that you've got your computer on, but your CPU isn't working at capacity.
Speaker 1: Well, if you had one of these programs on it, you could dedicate a lot of that unused CPU processing power to solving these problems. You would only be solving a teeny tiny fraction of the overall work, and other computers would be working on the same stuff, sending all that data back to the main center of computers, which would verify the results and then continue dividing up the job and sending it out to other computers. This is kind of a method of grid computing, or cloud computing.

Speaker 1: Cloud storage is very similar. You've probably used it; if you haven't used it on your computer, you've definitely used it on your smartphone. This is where you store information on servers that belong to someone else. So you might have photo albums that are sitting on someone else's computer; by someone else, I usually mean a corporation like Apple or Google or Facebook or something like that. You have instances of those images, perhaps, on your smartphone, but they also exist on other computers. That's cloud storage, and it's very useful if you want to be able to store more stuff than what your device can hold. That's fantastic; it's great to be able to turn to that. But it's also somewhat limited, because someone else has your file, your work, your images, and that means that if they change their policies, you may no longer have access to it, or you may not have full control over it. You may have surrendered control over the things that you generated to the entity that is now storing them. You might be compromising your own privacy. It is a tricky situation. It's got a lot of factors to it, and it's a big, big deal here at IBM Think.
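Before moving on, here is a minimal sketch of the work-unit pattern behind projects like SETI at Home, as described a little earlier. It's my own toy illustration; real volunteer-computing systems are far more involved. A coordinator splits a big pile of samples into chunks, each "volunteer" scans its chunk for values above a threshold (a stand-in for candidate signals), and the coordinator double-checks results before combining them.

```python
import random

THRESHOLD = 0.9999  # our stand-in for "looks like a signal, not noise"

def split_into_chunks(samples, chunk_size):
    # The coordinator's job: divide the data into independent work units.
    return [samples[i:i + chunk_size] for i in range(0, len(samples), chunk_size)]

def volunteer_work(chunk):
    # What each home computer runs on its little piece of the data.
    return [x for x in chunk if x > THRESHOLD]

random.seed(42)
samples = [random.random() for _ in range(100_000)]  # mostly noise
work_units = split_into_chunks(samples, chunk_size=10_000)

# Send each unit to two volunteers and keep the result only if they agree,
# a simple stand-in for the verification step described above.
candidates = []
for unit in work_units:
    first, second = volunteer_work(unit), volunteer_work(unit)
    if first == second:
        candidates.extend(first)

print(f"{len(work_units)} work units, {len(candidates)} candidate signals")
```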
Speaker 1: Recently, there was a big news story, and I'm going to do a full episode about this later, but there was a big news story about a company called Cambridge Analytica, which used an enormous amount of data that it mined, primarily from Facebook, in order to influence elections: to get information about voters and potential voters, and to help push them in a specific direction when it came to elections. It is an enormous story, and a developing scandal, really. And because of that story, I feel like that's going to end up generating some questions here at the conference as well, not just about the viability of cloud services and data mining, but about the ethics of it. What is ethical, and what is not? How should we codify that? How should we define those ethics, and how do we hold ourselves accountable to ethical standards, to make sure that the technologies we have at our disposal are used in a responsible manner? Because some would argue that so far, that has not happened; that we have had multiple instances of violations of privacy and security. Another example of that sort of thing is all the different data breaches we have seen over the years, where companies have not done a good job of protecting customer data. And since that data is very much important to us as individuals, this is a big concern. In fact, that's another area at IBM Think: it's all about data security. How do we keep that data safe? How do we protect against cyber attackers? I'm sure I will see a lot of information about that.

Speaker 1: Another big area of discussion at IBM Think is all about infrastructure. How do you incorporate this technology into existing infrastructures? How do you design new infrastructures with this technology? And that could be anything. It could be anything from the infrastructure of a building, to a city, to a country, to all sorts of stuff.
Speaker 1: But there are a lot of interesting discussions on the schedule about the Internet of Things, about integrating this technology into buildings and cities and making it a seamless part of the infrastructure; not an overlay, but an integral part. So when we talk about things like smart homes and smart cities, that's what this is all about. How can this technology actually improve things? Not just be a gimmick, but be something that ends up becoming absolutely necessary, so seamlessly entwined with our infrastructure that we can't imagine our lives without it moving forward. Obviously, that also raises other questions, mostly pertaining to things that I've already talked about, like privacy and security. How can you make sure that once you have this infrastructure, it's safe from bad actors who would perhaps try to damage it or otherwise leverage that information in malicious ways? So there's a lot to talk about there. The Internet of Things has brought up a lot of interesting questions about security. You may remember when we had Shannon Morse on the show; she talked a bit about this, about how there are a lot of companies out there that are springing to market with these Internet of Things products that maybe haven't been fully fleshed out, especially when security comes into play.

Speaker 1: And there's also quantum computing. That's another discussion that's going on here at IBM Think; there's talk about quantum computing emerging from labs and going into practical use. Quantum computers are interesting things. They make use of qubits. Qubits are quantum bits. A bit, obviously, for those of you who have been listening, you know all about this. Bits are the basic units of information. They can either be a zero or a one, which you can think of as a no or a yes, or an off and an on, and by chaining bits together, you can represent all sorts of different types of information. Ultimately, computers are processing information in bits.
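As a quick worked example of chaining bits together (ordinary classical bits, nothing quantum yet): the eight yes-or-no values 01000001, read as a binary number, are one standard way a computer represents the letter A.

```python
bits = "01000001"               # eight bits chained together into one byte

value = int(bits, 2)            # read the chain as a binary number: 64 + 1
print(value)                    # 65

print(chr(value))               # "A": 65 is the standard character code for A

print(format(ord("A"), "08b"))  # and back again: A -> 01000001
```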
Speaker 1: A qubit, a quantum bit, can be in superposition, which means it can inhabit all possible states, which means it can be both a zero and a one, and technically everything in between, simultaneously. Now, that does not necessarily mean anything for every single type of application, but for certain types of computational work, that would make it much easier to process information rapidly. Specifically, for anything that uses parallel processing, qubits would be pretty good. Not all computational problems would benefit from quantum computing, but for the ones that would, the processing would take a fraction of the amount of time, I mean a fraction of a fraction of the amount of time, that a classical computer would take. One of the big things that qubits could do is make decryption really easy, which is kind of terrifying, because encryption is how we keep a lot of data safe. Basically, the way your base level encryption works is that you take a really, really, really big prime number, meaning that it's a number that is only divisible by one and itself. It doesn't have any other factors; you cannot find anything else to divide it by and get a whole integer. So, for example, the number five: you can't divide five by anything other than one and itself and get a whole integer. Except instead of using the number five, you would use a number that is maybe, you know, hundreds of digits long, but still a prime number. Then you take another incredibly long prime number, a really, really, really long one, and you multiply both of those together, and then you have a product. That product ends up being the crux of your encryption.
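Here is that construction in miniature, with comically small primes standing in for ones that are hundreds of digits long. As in RSA-style schemes, the product can be shared openly while the two primes stay secret.

```python
p = 101    # secret prime (real systems use primes hundreds of digits long)
q = 103    # second secret prime
n = p * q  # the public product at the crux of the encryption
print(n)   # 10403: easy to publish, hard (at real sizes) to factor
```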
If someone tries to break 571 00:37:14,400 --> 00:37:16,560 Speaker 1: your encryption, they can look at that product and they 572 00:37:16,600 --> 00:37:19,400 Speaker 1: see what the product is, but they don't know which 573 00:37:19,400 --> 00:37:23,360 Speaker 1: two numbers you used to multiply together to get that product, 574 00:37:23,920 --> 00:37:26,439 Speaker 1: so they have to start trying to figure it out, 575 00:37:26,440 --> 00:37:28,800 Speaker 1: and they do this by going through all the known 576 00:37:28,840 --> 00:37:33,520 Speaker 1: prime numbers to see if they can divide the product 577 00:37:33,600 --> 00:37:36,600 Speaker 1: by that prime number, and if so, whether the other 578 00:37:36,760 --> 00:37:41,200 Speaker 1: factor is also a prime number. This can take a really 579 00:37:41,200 --> 00:37:44,240 Speaker 1: long time. For a classical computer, it takes ages because 580 00:37:44,239 --> 00:37:49,040 Speaker 1: it has to go through every single possible solution before 581 00:37:49,040 --> 00:37:51,200 Speaker 1: it can find the right one, 582 00:37:51,520 --> 00:37:53,799 Speaker 1: or at least every possible one leading up to the 583 00:37:53,880 --> 00:37:56,279 Speaker 1: right one. With a quantum computer, because you can have 584 00:37:56,640 --> 00:38:01,360 Speaker 1: qubits in superposition, they can process this information much more quickly. 585 00:38:01,400 --> 00:38:07,960 Speaker 1: It's like they're trying numerous solutions simultaneously, because 586 00:38:08,120 --> 00:38:11,040 Speaker 1: all the qubits can be both zero and one at 587 00:38:11,040 --> 00:38:13,360 Speaker 1: the same time. So as long as you have enough 588 00:38:13,480 --> 00:38:17,279 Speaker 1: qubits to be able to process the request you 589 00:38:17,360 --> 00:38:21,320 Speaker 1: have, in theory, you could do that kind of computational 590 00:38:21,440 --> 00:38:26,719 Speaker 1: problem in an instant as opposed to perhaps years or 591 00:38:26,800 --> 00:38:31,640 Speaker 1: decades or centuries, depending upon the complexity of the computational problem. Now, again, 592 00:38:31,960 --> 00:38:35,840 Speaker 1: that's only for a specific set of computational problems. For 593 00:38:36,040 --> 00:38:39,560 Speaker 1: that set, quantum computers will be amazing. But if you 594 00:38:39,600 --> 00:38:44,680 Speaker 1: wanted to play a game on a quantum computer, it 595 00:38:44,719 --> 00:38:47,879 Speaker 1: wouldn't necessarily run any better. In fact, it would probably run 596 00:38:47,920 --> 00:38:50,520 Speaker 1: worse than on a classical computer, because you have to 597 00:38:50,560 --> 00:38:55,520 Speaker 1: have enough qubits to at least equal what the classical 598 00:38:55,560 --> 00:39:00,800 Speaker 1: computer could do. Qubits are also very tricky. Keeping bits 599 00:39:00,880 --> 00:39:05,000 Speaker 1: in superposition, keeping anything in a quantum state, is tricky 600 00:39:05,040 --> 00:39:09,839 Speaker 1: because the slightest thing can cause it to decohere, 601 00:39:09,920 --> 00:39:11,920 Speaker 1: for the whole system to sort of fall apart and 602 00:39:11,960 --> 00:39:15,000 Speaker 1: then just become a classical computer. And since most quantum 603 00:39:15,000 --> 00:39:20,319 Speaker 1: computers have a relatively small number of qubits, they end 604 00:39:20,400 --> 00:39:24,040 Speaker 1: up becoming very dumb computers.
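To see why that brute-force search drags on a classical machine, here is a minimal trial-division sketch, again an added illustration using the toy numbers from above:

```python
# Minimal sketch of the brute-force attack described above: walk through
# candidate factors until one divides the public product evenly.

def factor(n):
    candidate = 2
    while candidate * candidate <= n:
        if n % candidate == 0:
            return candidate, n // candidate  # the two secret primes
        candidate += 1
    return None  # no factor found: n itself is prime

print(factor(3233))  # (61, 53) -- instant for a toy number, but the number
                     # of candidates grows about tenfold for every two digits
                     # added to the product, which is why real keys take ages
```

The quantum speedup the episode is gesturing at comes from Shor's algorithm, which, on a large enough error-corrected quantum computer, would factor numbers like these in polynomial rather than exponential time.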
If you were to disturb 605 00:39:24,360 --> 00:39:29,240 Speaker 1: your typical quantum computer and it reverted to classical computer status, 606 00:39:29,280 --> 00:39:32,760 Speaker 1: it would probably be less powerful than your average smartwatch. 607 00:39:33,640 --> 00:39:35,880 Speaker 1: But there's gonna be a lot of discussion here at 608 00:39:35,920 --> 00:39:39,480 Speaker 1: IBM Think about quantum computing and how it will start 609 00:39:39,520 --> 00:39:42,520 Speaker 1: to become a practical thing and not just something that's 610 00:39:42,560 --> 00:39:46,840 Speaker 1: been worked on in laboratories and research facilities. There are 611 00:39:46,840 --> 00:39:50,560 Speaker 1: a lot of interesting speakers here as well. Obviously IBM 612 00:39:50,760 --> 00:39:53,400 Speaker 1: has a lot of their experts here on things like 613 00:39:53,520 --> 00:39:59,560 Speaker 1: cognitive computing, machine learning, artificial intelligence, virtual reality, and augmented reality; 614 00:39:59,640 --> 00:40:04,000 Speaker 1: both of those last two subjects are also represented here at the conference. 615 00:40:04,040 --> 00:40:07,799 Speaker 1: They're going to be doing breakout sessions all week long, 616 00:40:07,840 --> 00:40:09,800 Speaker 1: and I hope to talk to some of them this week. 617 00:40:10,320 --> 00:40:14,720 Speaker 1: But there are also representatives from other companies who are taking 618 00:40:14,719 --> 00:40:19,560 Speaker 1: on sessions, folks from the likes of American Airlines or Nvidia, 619 00:40:20,000 --> 00:40:24,680 Speaker 1: or even companies like Ticketmaster. There are also some celebrities here, 620 00:40:24,840 --> 00:40:29,560 Speaker 1: some people who are famous for their commentary on science 621 00:40:29,560 --> 00:40:33,880 Speaker 1: and their contributions to science: astrophysicist Neil deGrasse Tyson 622 00:40:34,000 --> 00:40:38,279 Speaker 1: is here, futurist Michio Kaku is here. So they will 623 00:40:38,360 --> 00:40:42,960 Speaker 1: be giving keynote presentations as well on various subjects. 624 00:40:43,160 --> 00:40:46,440 Speaker 1: I can't wait to hear some of those. I probably 625 00:40:46,480 --> 00:40:49,600 Speaker 1: won't be able to talk to them because they're booked 626 00:40:49,640 --> 00:40:51,640 Speaker 1: pretty solid, but I do hope to talk to at 627 00:40:51,719 --> 00:40:54,480 Speaker 1: least some of the experts in these various fields and 628 00:40:54,520 --> 00:40:57,719 Speaker 1: get their insight on everything from what they think is 629 00:40:57,760 --> 00:41:00,880 Speaker 1: cool about their area of study, to what some of 630 00:41:00,920 --> 00:41:04,000 Speaker 1: the most recent developments are that have them excited, to how 631 00:41:04,000 --> 00:41:06,680 Speaker 1: they got into their field in the first place, and, 632 00:41:06,719 --> 00:41:10,560 Speaker 1: if someone else is interested in that field, what should 633 00:41:10,600 --> 00:41:12,799 Speaker 1: they do, how should they pursue it? I want to 634 00:41:12,840 --> 00:41:15,279 Speaker 1: ask all those sorts of questions. So I hope to 635 00:41:15,440 --> 00:41:19,479 Speaker 1: present to you guys several bonus episodes this week, all 636 00:41:19,520 --> 00:41:22,400 Speaker 1: about this conference and the people I talk to and 637 00:41:22,440 --> 00:41:27,640 Speaker 1: the things I encounter and learn, and hopefully that will 638 00:41:27,680 --> 00:41:30,279 Speaker 1: all be useful to you guys and you'll enjoy it.
639 00:41:30,840 --> 00:41:34,319 Speaker 1: And the regular episodes will also publish, so we're 640 00:41:34,360 --> 00:41:37,080 Speaker 1: going to have a whole bunch of tech Stuff episodes in 641 00:41:37,120 --> 00:41:40,600 Speaker 1: a short amount of time. But I also hope to 642 00:41:40,680 --> 00:41:42,600 Speaker 1: do more of these in the future, where I go 643 00:41:42,760 --> 00:41:48,520 Speaker 1: to certain events and create special episodes just for those experiences, 644 00:41:48,880 --> 00:41:52,440 Speaker 1: so I can bring you some more current events and 645 00:41:52,440 --> 00:41:57,240 Speaker 1: cool news on top of the normal tech Stuff episodes. 646 00:41:57,280 --> 00:42:01,360 Speaker 1: So the show is not changing. We haven't completely revamped it. 647 00:42:01,840 --> 00:42:04,279 Speaker 1: This is just sort of a mini series of specialness, 648 00:42:04,800 --> 00:42:07,480 Speaker 1: so you're just getting more of what you love, I 649 00:42:07,520 --> 00:42:12,719 Speaker 1: hope. Now I'm going to focus really heavily on IBM Think 650 00:42:12,800 --> 00:42:15,160 Speaker 1: this week, but I'll be back in the office 651 00:42:15,200 --> 00:42:18,759 Speaker 1: next week doing my regular tech Stuff shtick, which means 652 00:42:18,840 --> 00:42:20,600 Speaker 1: I need to hear from you guys. If you have 653 00:42:20,680 --> 00:42:24,160 Speaker 1: suggestions for topics that you really want to hear more about, 654 00:42:24,800 --> 00:42:27,000 Speaker 1: send me a message. You can get in touch with 655 00:42:27,040 --> 00:42:30,680 Speaker 1: me via email. The address is tech Stuff at how 656 00:42:30,719 --> 00:42:33,200 Speaker 1: stuff works dot com, or you can drop me a 657 00:42:33,239 --> 00:42:35,960 Speaker 1: line on Facebook or Twitter. The handle for both of 658 00:42:36,000 --> 00:42:40,160 Speaker 1: those is tech Stuff HSW. We've got an Instagram account. 659 00:42:40,200 --> 00:42:42,479 Speaker 1: You should be following that, because all sorts of cool 660 00:42:42,520 --> 00:42:45,040 Speaker 1: behind-the-scenes information gets posted to it all the time. 661 00:42:45,880 --> 00:42:48,880 Speaker 1: And if you want to see me record live, although 662 00:42:49,000 --> 00:42:51,960 Speaker 1: not this week, but on normal weeks, you can go 663 00:42:52,040 --> 00:42:55,600 Speaker 1: to twitch dot tv slash tech Stuff. I record on 664 00:42:55,640 --> 00:43:00,200 Speaker 1: Wednesdays and Fridays. I stream my recording sessions live, so 665 00:43:00,280 --> 00:43:02,799 Speaker 1: you can watch as I record an episode of tech 666 00:43:02,840 --> 00:43:06,400 Speaker 1: Stuff and watch as I make silly mistakes and have 667 00:43:06,520 --> 00:43:08,719 Speaker 1: to stop myself and go back and fix them, and 668 00:43:08,840 --> 00:43:10,640 Speaker 1: you can even chat with me. There's a chat room 669 00:43:10,680 --> 00:43:13,319 Speaker 1: in there, and I welcome all people who want to 670 00:43:13,400 --> 00:43:16,759 Speaker 1: chat and tell me about their favorite episodes of tech 671 00:43:16,800 --> 00:43:19,279 Speaker 1: Stuff or things they would like me to cover. I 672 00:43:19,400 --> 00:43:21,719 Speaker 1: welcome that. I hope to see you in there, and 673 00:43:21,760 --> 00:43:30,719 Speaker 1: I will talk to you again really soon. For more 674 00:43:30,760 --> 00:43:33,040 Speaker 1: on this and thousands of other topics, visit how 675 00:43:33,080 --> 00:43:43,840 Speaker 1: stuff works dot com