Speaker 1: Get in touch with technology with TechStuff from HowStuffWorks.com.

Hey there, and welcome to TechStuff. I'm Jonathan Strickland. I'm an executive producer at HowStuffWorks, and I love all things tech. And I am continuing a special series of podcasts recorded on location in Las Vegas, Nevada during the two thousand eighteen IBM Think conference. So if you think this doesn't sound like a normal episode, that's why. I am literally by myself in a hotel room. I've got an air conditioner blasting at me not far away, because I am in Las Vegas, a desert. There's no producer here, there's no studio here. It's just me and a recorder and a microphone and a song in my heart and a dream in my brain, and the AC just shut off, so it'll be a little more quiet now.

But the Think conference is a pretty big deal. It's thousands upon thousands of people all gathered together, learning about top topics in computing, getting education and workshops involving all sorts of different things. And last night I went to a science slam at the conference, where we were treated to five different presentations by scientists who are related to IBM Research, and we got to learn about some breakthrough science and some scientific work that was pretty, pretty interesting. And I wanted to kind of report on that, what I saw and the sort of implications that the presentations have for us, as you know, human being type folks.

Well, the presentation was opened by the Director of IBM Research, Arvind Krishna, who came up on stage and sort of set the groundwork, explaining in brief what the five different science breakthroughs were going to be about. There was a lot of overlap between them, so expect that when I get into the nitty and the gritty. And he then introduced the actual MC of the evening, Jamie Garcia. Now, Jamie Garcia is a polymer chemist.
She works on lots of different projects, including ways to figure out how to break down long-chain polymers so that you can get rid of plastics that would otherwise pollute the environment. So, important work. It was pretty interesting hearing her talk. She talked about how her specific work is very tricky, because if you want to create computer models of long-chain polymers, it requires a lot of processing power. The bigger the molecule is, the more processing power it needs to simulate the interaction of molecules properly, because the computer is trying to keep track of all the little subatomic particles. All those little electrons have to be modeled and simulated, and as you add atoms to your molecule, creating these long chains, it creates exponentially more work for your computer. So she has a vested interest, based upon what she studies, in the advancement of computer sciences.

Then she began to introduce the actual presenters, and each presenter had about five minutes to present his or her work and explain what they were there to talk about. And it was really interesting, because there was a good variety. There were two men and three women who each got up on stage, and they all were trying to explain to us the importance of their various projects, what stage they were in, and why we should be interested. And they all had very different styles, and it was very interesting to see these different approaches to presenting this information to a general audience, because, you know, a lot of people in the crowd were not scientists or data analysts or anything like that. It is a challenging thing to have to communicate science to an audience where you're not entirely certain what their background happens to be, but they all did a really good job.
The first one to get up was a guy named Andreas Kind, who is an expert on cryptography and blockchain. And I mentioned a bit in the preview episode, the kind of overview episode, that blockchain was going to be one of the topics that I was going to look into here. And I've got a little bit better grip on what some aspects of blockchain are all about, and it really blew my mind.

So Kind came out and mentioned that almost everything that we encounter, whether it's a product like a designer bag or shoes or car parts, or even drugs, like medical drugs, has been copied at some point. And he cited a number that counterfeit goods were valued at one point eight trillion dollars. So you're looking at a massive problem across multiple industries, where you could encounter copies of stuff being passed off as if they were the real thing. And in some cases, that just ends up being kind of a slap in the face to rich people who are buying luxury goods. You know, it might be hard for you to feel any kind of sympathy for that, because who's going to feel bad about a billionaire walking into a place and buying, you know, some sort of ridiculously expensive wallet, and then finding out three months later that it was actually a cheap knockoff being passed off as the real thing? Most of us probably wouldn't lose very much sleep about that. Although that is a legitimate problem, and obviously anyone who has built their business on creating luxury goods of any kind has a really, really powerful interest in this. They don't want their products to be copied, because that obviously undervalues what they do. If no one trusts that the thing you're selling is actually from you, then you're not going to sell very much of it. But there are other issues as well that affect, you know, the common folk like myself, where even if we didn't feel sympathy for the rich people who were buying these luxury goods, we may feel very strongly about these other cases.
So Andreas Kind gave another example about car parts. Specifically he was talking about brakes, but it really could be any car part. He said that in certain parts of the world, depending upon where you are, you might discover that up to forty percent of auto parts in the aftermarket are copies. They are not the legitimate part that they are, you know, claimed to be. So that could be a huge issue. Perhaps you're getting new brakes put into your vehicle because your old brakes are wearing out. You go to an auto mechanic shop, you pay a certain amount of money to have brand new brakes put in, but you may not be absolutely certain that the brakes being put in are the real deal, that they were actually made by the company that you were told made them. It may be that they were cheap knockoffs, and thus they may not perform at the right level, and they might fail more readily than a real set of brakes from the actual manufacturer would. That could be life and death, and that obviously is something that affects anybody. It's not just the people who have money to burn, it's all of us who may rely upon this sort of thing.

And even if you aren't someone who drives, and you're not too concerned about auto parts and whether or not they're the real thing, or maybe you think, you know, I'll risk it, I'll be careful, and maybe my skill as a driver will make up for any shortcoming in the parts there might be, it's still not that easy to just kind of walk away from this. As Kind pointed out, another big issue is medical drugs. He cited a case of a blood-thinning drug and some researchers who discovered that some instances of this blood-thinning drug contained significant amounts of filler material. In other words, there were instances of this drug where it had been replaced with non-active ingredients or even harmful ones, and this obviously can cause medical problems.
It can lead to serious issues, potentially even the death of the person who is taking the drug, and that of course is absolutely unacceptable. And again, depending on what country you are in, you may not be certain about the origin of the medication that you're getting. In certain countries, you know, if you're going to a reputable doctor and a reputable pharmacist, you're in pretty good shape. You're reasonably certain that the medication you are taking is in fact what you were told it would be. But in other countries, where the medical industry isn't as well established and maybe not as well funded, you can't necessarily be certain. You may be taking drugs that were bought at a discount, and it turns out that a lot of them just don't have the active ingredients they're supposed to have, and it's not going to be an effective treatment for you. So all of these different examples were problems that Kind was pointing out, problems that he said blockchain could help solve. And that raised a question. Guys, there's still more to come about the Science Slam, but before we jump into the next segment, let's take another quick break to thank our sponsor.

Blockchain is a digital construct, right? Blockchain is math and code. It's something that exists on computers, but not, quote unquote, in the real world. There's no physical presence of blockchain. With cryptocurrency, it's easy to understand, more or less at a high level, why blockchain is effective, because every single transaction that you make with cryptocurrency becomes part of this blockchain that you cannot alter. At least, you cannot reasonably alter it without having to rebuild the entire blockchain from that point of transaction moving forward. And since the blockchain is constantly getting added to, essentially every ten minutes in the case of bitcoin,
your computer is never going to be fast and strong enough to build back that chain and beat out the rest of the system so that you can alter a transaction and make it look legitimate. So anytime you buy something with bitcoin, anytime you transfer bitcoin to someone else, that transaction becomes part of a block of transactions that the overall system will validate. Once validated, it joins the chain, and every future transaction that is made will contain a record that includes your transaction. This ledger of transactions is shared across the entire network of computers that are participating in that blockchain. So if you have ten thousand computers in this network, all ten thousand can see that ledger, and they can see the history of transactions, including the one that you just did. So it makes it practically impossible to alter things. It's a very reliable record, and it makes it very difficult for you to claim that you didn't actually spend that bitcoin, that you in fact still have that bitcoin and could spend it on something else. The ledger will say otherwise, and that will mean that you will not have a legitimate argument.

That is fine for bitcoin transactions, but how can you use that same technology to ensure that real physical things are what they say they are? Because if you could tag a physical object in such a way that it was related to a point of data within a blockchain, then you could trace the movement of that physical component as it moved through a system. As Kind pointed out, manufacturing in the twenty-first century is really, really complicated, because you have lots of different entities that contribute to making all sorts of stuff, whether it's drugs or electronics or car parts. You might have factories and manufacturing facilities spread around the world that are all contributing toward this. Even if you're talking about the food supply, you might be able to go to your local grocery store.
Let's say that you live someplace like San Francisco, California. You might be able to go to a market in San Francisco and pick up food that was originally grown in Japan. Well, you don't know the pathway that food took from the point of origin where it was grown or caught. Let's say it's fish. You don't know where that fish was caught, you don't know who that fish was sold to, you don't know where that fish was processed. All you know is where you bought the fish. That's your last point of contact. You don't know anything else about it. But with blockchain, if you were able to somehow physically link the fish into a blockchain operation, you could actually look at that chain of events and trace back every single point of contact that fish had, all the way back to where it was caught. And you would be able to verify that, in fact, everything was safe and healthy, that it didn't pass through any hands where there might have been contamination, or at least it decreases that risk. And if there were contamination, you would be able to trace exactly where that happened, because you would be able to look at the chain of events and say, well, according to this, it must have happened at this processing facility, which would make it much easier to do inspections to make certain that everything is on the up and up. But how do you actually tag that? How do you have some sort of record between the physical object and the digital record? Because if you don't have some way of verifying that the physical object you're looking at is in fact the same one that's in the digital record, you can't be certain that the blockchain is an effective list of transactions for that specific physical object.
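To make the chain part of that concrete, here's a minimal sketch in Python of the append-only, hash-chained ledger idea described above. The field names and the supply-chain events are my own illustration, not bitcoin's block format or IBM's implementation, and a real blockchain adds validation and distribution across the network on top of this.

```python
import hashlib
import json

def block_hash(block):
    # Hash a block's contents deterministically.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append(chain, record):
    # Each new block stores the hash of the previous block.
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "record": record})

def verify(chain):
    # Recompute every link; tampering breaks the chain from that point on.
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

ledger = []
append(ledger, "caught: tuna, vessel 7")         # hypothetical supply-chain events
append(ledger, "sold to: processor A")
append(ledger, "shipped to: market, San Francisco")
print(verify(ledger))                            # True

ledger[0]["record"] = "caught: tuna, vessel 9"   # rewrite history...
print(verify(ledger))                            # ...False: the links no longer match
```

The point is those last two lines: changing any old record invalidates every hash after it, which is why rewriting history means rebuilding the entire chain from that point forward.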
If I'm looking at a pair of shoes, and the blockchain tells me that the pair of shoes is absolutely legitimate because I can trace everything back, but it turns out the blockchain is a record of a different pair of shoes, I don't know that, because there's nothing tying the physical object in my hands to this digital record. It doesn't do me any good. So here's the problem: how do you link those two? Andreas Kind talked about a way of doing that, using something called crypto anchors. These are anchor points on physical objects that can link them to digital ledgers like blockchain. The whole idea is that you have to have some sort of encryption, physical encryption, that you can attach to those physical objects in some way that would allow you to verify that, in fact, the chain of events is exactly what the digital record says it is. But what does that even mean? What sort of physical crypto anchor could you come up with? He gave a couple of different examples, and one of them I thought was incredibly interesting. He talked about a malaria test. Now, if you are in another country and you need to take a malaria test, you've been bitten by a mosquito, obviously you want to make sure that you are testing negative for malaria. You also want to be certain that that test is legitimate, that it came from a trusted source. So he showed off a type of malaria test, this piece of paper that had very tiny little colorful dots on one part of the paper, and those colorful dots represented a code. So first you would look at those colorful dots and make certain that it was the right pattern, the right colors, and you would have a blockchain record of this malaria test that would correspond to the code that you're looking at, the physical code.
So as long as the physical code on the malaria test matched the one that was on the blockchain, you then could feel confident that, in fact, this test is a legitimate one. But, you say, what about counterfeiting? What would happen if someone was able to look at the blockchain and see what the code was supposed to be? They take some regular old paper that doesn't have a malaria test on it at all, then they very, very carefully place those dots in the right configuration, in the right set of colors. Well, the secret to this is that the thing you see is only half of the actual code. When you expose that piece of paper to liquid, the ink dots, which are on little bitty micropillars, change: some of them wash away, some of them are revealed, and you have a new code there in their place. In this case, with the malaria test, you would actually use a serum to do this. So only with contact with the legitimate serum would this series of dots change, and they would change to a new code. And that new code is also tied to the blockchain, so you would have a before and an after. So after adding serum to this malaria test, you could then verify that, in fact, the malaria test is a legitimate one from a legitimate source. Then you can actually use the malaria test on a patient. So that is sort of the way to confound counterfeiters. If you think about it, it's not that different from a special watermark on a unit of currency, or one of those elements where you have a transparent panel inside a bill or some other unit of currency. It's one of those things that is meant to make counterfeiting much more difficult.
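Here's a small sketch, again in Python, of that two-stage check as I understood it. The registry dictionary stands in for the blockchain record, and the dot patterns are just strings; the actual encoding wasn't described in that level of detail, so treat all the specifics here as assumptions.

```python
import hashlib

def h(code):
    # Store only hashes of the codes, so the registry itself reveals nothing.
    return hashlib.sha256(code.encode()).hexdigest()

# What the manufacturer might register for one specific kit (hypothetical values).
registry = {"kit-001": {"dry": h("red,blue,green,red"), "wet": h("blue,green")}}

def verify_kit(kit_id, dry_code, wet_code):
    record = registry.get(kit_id)
    if record is None:
        return False                  # no such kit on the ledger
    if h(dry_code) != record["dry"]:
        return False                  # visible dots don't match the record
    # A counterfeiter who copied only the visible dots fails this second stage,
    # because the hidden code only appears on contact with the real serum.
    return h(wet_code) == record["wet"]

print(verify_kit("kit-001", "red,blue,green,red", "blue,green"))  # True
print(verify_kit("kit-001", "red,blue,green,red", "red,red"))     # False: fake paper
```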
So by having this dual layer of code that only gets revealed if you add serum to the malaria test, you have helped ensure that the test you have in your possession is in fact an authentic one, and it is linked back to this instance of the blockchain. And by having this unique kind of code for every single malaria test, you make it incredibly difficult for anyone to make a seemingly legitimate counterfeit. And I thought that was really neat, this idea of using these physical components, something that you could then link to a digital record. It's really, really clever.

Now, Kind also talked very briefly, because he was coming toward the end of his presentation at this point, about the use of microchips. I was going to say microelectronics, and I guess technically it kind of is, but I mean they're super, super small. We're talking about chips that are about the size of a grain of rice that have more than a million discrete components on them, so essentially like a million transistors on this little rice-sized chip, or even the size of a grain of salt. These chips can still have very sophisticated components on them, including things like motion sensors or a transmitter. And the idea is that these particular chips will cost very little, less than ten cents to create a single chip, and that they are able to monitor and analyze and communicate and even act on information. And these little tiny chips would be able to analyze products and measure them in such a way as to make absolutely certain that they are in fact legitimate. They would concentrate on what are considered to be unique identifiers for whatever the product is, whether it's the specific shape or the chemical makeup or any other sort of physical property that is considered to be unique to that product, such that it would be almost impossible, if not outright impossible, for someone to duplicate it exactly.
You would have the chip programmed in such a way as to detect that quality, and if the quality is not present, then obviously the chip would indicate that. And the chip would be the thing that's tied to the blockchain. So as long as the chip is active on this product, whatever it may be, and it's verifying that the product is in fact authentic, you can record where the chip was from one point in the supply chain to the next, all the way to its end point, whether that's a consumer or a doctor or a manufacturing facility, whatever it might be. You could then look at the blockchain and verify that, in fact, all the right steps were taken, it went to all the right points of contact, all the right processes were performed on this product, and the chip is still verifying that the product in your hands is in fact authentic. Then you have this ability to say, yes, this is the real thing. And at least according to Kind, using this you could end up cutting counterfeiting in half within the next couple of years, and moving forward you could reduce it even further. So this again would help people be certain that the thing they got was completely legitimate, and they wouldn't have to worry about whether or not the product they had was going to fail on them, or be harmful to them, or otherwise not be what they thought they were getting.

So that was number one of five. I still have some more people to talk about, more ideas, more cool notions that were brought up at the IBM Research Science Slam at Think. But first, let's take a quick break to thank our sponsor.

The second person to come up was a PhD student named Cecilia Boschini, and Cecilia talked about lattice-based cryptography. I loved Cecilia's presentation.
This young woman got up on stage and began to speak about post-quantum cryptography, lattice-based encryption systems, and she made it funny and engaging, which is kind of a magic power. She talked about how, once you learn the fundamentals of mathematics, you can build all sorts of things, and that when she was a child, she fell in love with math. She thought math was fun. She liked coming up with problems and solving them. She talked about a teacher who would use a story of a fictional old lady who had an electric wheelchair, and the fictional old lady kept on getting into various situations, and the math problems were all related to figuring out how this nice old lady in the electric wheelchair could get out of her various predicaments. So it was this sort of real-world scenario to frame all these very ethereal math problems. And Cecilia loved it. So she went on to look into math further and study it as her primary area of focus.

She also talked about things like number theory, where you look at integers and prime numbers and their properties. And you may remember, in the preview episode, I talked about how our current state of cryptography is very much reliant on things like prime numbers and factoring. So to give you a quick refresher on what I said: if you're looking at modern-day cryptography, modern-day encryption strategies, typically what you do is you take an extremely large prime number. That's a number that's only divisible by one and itself. It doesn't have any other factors, so you can't divide it by two or four or seven or anything like that. But you take a really, really big prime number, I'm talking about a number that's hundreds of digits long, an enormous number, and then you multiply it by another equally huge prime number, and then you get a product.
You get something that is the product of these two enormous prime numbers, and that becomes sort of your key to encrypting everything. And encryption obviously takes other steps past this. You've got your encryption key, but then you've got to keep on going and actually encrypt everything. But the point being, someone could get hold of that encryption key, but they don't know the secrets of how things were encrypted, because they don't have the actual prime numbers that you used to create the key. In order to figure out what those prime numbers are, you would have to set a computer program to taking that enormous product and dividing it by various large prime numbers, making certain that the number you get back from dividing by a prime number is a whole integer that is also a prime number. And this whole process is very, very difficult for a classical computer. It can take an incredibly long time to solve. We're talking years or decades or even longer to solve some of these really hard problems, because computers are just going to do it sequentially. They're just going to keep trying. They might try one prime number and the result they get back is not a whole integer. Well, that result has to get tossed out, because it cannot be an answer to your question. Or the computer might take this huge number, divide it by a prime number, and get a second whole integer. Everything is cool there, but then the second whole integer turns out not to be a prime number, which the computer also has to test, right? It has to make sure that the result is also a prime number. If the result is not a prime number, then that result is invalid and the computer has to keep going. So this is a very, very long process once you get to truly large numbers.
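To show that asymmetry at toy scale, here's a short Python sketch. The primes below are seven digits, far smaller than the hundreds-of-digits ones real systems use, so the specific numbers are purely illustrative; even so, building the key is a single multiplication, while undoing it by trial division already takes on the order of a million divisions.

```python
import time

def is_prime(n):
    # Simple trial division, fine at toy scale.
    if n < 2:
        return False
    f = 2
    while f * f <= n:
        if n % f == 0:
            return False
        f += 1
    return True

def factor_semiprime(n):
    # Brute force: try every candidate divisor until the prime pair appears.
    f = 2
    while f * f <= n:
        if n % f == 0 and is_prime(f) and is_prime(n // f):
            return f, n // f
        f += 1
    return None

p, q = 1_000_003, 1_000_033        # small primes; real keys use far larger ones
key = p * q                        # creating the key: one instant multiplication

start = time.time()
print(factor_semiprime(key))       # (1000003, 1000033), after ~a million trials
print(f"{time.time() - start:.2f} seconds to undo one multiplication")
```

Every extra digit in the primes multiplies the number of trials, which is the gap the whole scheme leans on.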
However, we don't know for certain that this particular process is truly a hard problem. There's no mathematical proof that it is a hard problem. There's circumstantial evidence that it's a very hard problem, because computers are not very good at solving it, but that alone is not a mathematical proof. Mathematical proofs and evidence are two different things. I actually listened to a gentleman talk with this young woman yesterday, and I think the word proof was causing him to hang up a bit, because he wasn't thinking mathematical proof. He was thinking proof as in, hey, I have evidence here, I have proof that this is a hard problem. It's not exactly the same thing. They're related, but not the same thing.

But she explained that quantum computers use qubits, and if you have a sufficient number of qubits, if you have a quantum computer that's powerful enough, it could take minutes to solve a problem like this that would take a classical computer years and years and years to solve. Like I mentioned in the preview episode, quantum computers are not necessarily great for all computational problems, but for a certain set of computational problems they would be far more efficient, assuming you had a powerful enough quantum computer. Now, she did say there is some good news. We don't have to worry about our encryption schemes falling apart overnight. It took years and years to create a quantum computer with just a single qubit, and these days you're talking about quantum computers that are only at around the fifty-qubit range, and you would need many, many more qubits in order to be a serious threat to cryptography.
Before we get to that point, there are going to be more years of work and development to make quantum computers reliable and make sure they don't just decohere easily. Decoherence is when the quantum states all begin to collapse in on themselves. If that happens, then a quantum computer turns into a classical computer, and then you're just back to where you were before. You have a computer that's going to go through the old process of trying to solve this problem. Quantum computers sort of solve problems in parallel. They kind of test all possible answers simultaneously, because qubits can act in superposition and be zero, one, and technically all values in between. And because of that, you can run multiple problems, with each qubit acting as either a zero or a one, which means you get every single potential combination, depending on how many qubits you have at your disposal. Right, so if you have fifty qubits, that's the equivalent of fifty bits, except the bits can be both zero and one at the same time. You can run any problem that would require fifty bits or fewer, and it will run all the potential solutions simultaneously, then analyze those results, and it assigns sort of a level of confidence to each of those results at the end of the process. And usually whichever result has the highest level of confidence is most likely the correct one. It sounds like I'm dancing around things, but that's because quantum is weird. Quantum does not answer in certainties. Quantum answers in probabilities. So it may say, I'm this confident that this particular result is in fact the solution to your problem, and you have to decide what your threshold is, what the cutoff for certainty is that you need before you go forward and say, all right, this is our result.
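Here's a tiny classical simulation of that bookkeeping, a sketch of the state-space-and-probabilities idea rather than any real quantum algorithm: n qubits are described by 2^n amplitudes, measuring returns one outcome with probability equal to the squared amplitude, and confidence comes from repeating the run and keeping the most frequent answer. The "amplified" index is a made-up stand-in for what a real algorithm's interference steps would do.

```python
import numpy as np

n = 4                                  # 4 qubits -> 16 amplitudes; 50 -> 2**50
state = np.ones(2**n, dtype=complex)   # start in an equal superposition
state[5] = 5.0                         # pretend the circuit amplified answer 5,
                                       # standing in for a real algorithm's
                                       # interference steps (an assumption here)
state /= np.linalg.norm(state)         # renormalize so probabilities sum to 1

probs = np.abs(state) ** 2             # probability of measuring each bit pattern
samples = np.random.choice(2**n, size=1000, p=probs)   # 1000 repeated runs
counts = np.bincount(samples, minlength=2**n)

best = counts.argmax()
print(f"tracking {2**n} amplitudes for {n} qubits")
print(f"most frequent outcome: |{best:0{n}b}> ({counts[best] / 1000:.0%} of runs)")
```

The readout is exactly the probabilistic answer described above: you don't get a certainty, you get a result that shows up some percentage of the time, and you decide whether that's confident enough.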
But if you do have a quantum computer with a sufficient number of qubits and you're presented with encrypted information, then you could potentially do the equivalent of a brute-force attack, figuring out what numbers that encryption scheme uses, because the quantum computer can run all those calculations I talked about earlier essentially simultaneously and then give you the most likely answer. And then you would have essentially the keys to the kingdom. It's kind of like someone finding the perfect way to create a skeleton key that fits every single lock that's ever been made. Once you do that, locks are useless, because if someone gets hold of one of those skeleton keys, they can get access to everything. So this is why people talk about quantum computers as being the end of our current encryption strategy, because it won't do you any good. If someone actually has access to a quantum computer, they can decrypt anything that's out there that's using that particular approach.

So she said, well, eventually we're going to get there. We're not there yet. We're at fifty qubits. And in order to keep a quantum computer stable, you have to keep it at an incredibly low temperature to maintain that quantum state, and you cannot interfere with that machine at all, because if you do, then the coherence dissipates and you're back to having a less powerful classical computer. But eventually we're going to get to a point where they are going to be powerful enough, so we have to start thinking about what the next evolution in cryptography must be in order to protect our data from the quantum world. And this brings us to the lattice-based approach. I had no idea this was a thing, and it blew my mind when she talked about it. Classic cryptography, we're using large prime numbers. With lattice-based approaches, you create a plotting system. Imagine a two-dimensional grid, so just grid paper.
You've got that grid in front of you, and you take a point, you mark a point on that grid, and the lattice problem requires that you find the points in the grid that are closest to that fixed point you've chosen. That fixed point you've chosen is called the origin. And so your job as a computer is, all right, let me find the points that are closest to the origin, and that will be the basis of my cryptography. If it's two-dimensional, you could do it yourself. You can just look at a piece of grid paper, you see a point that's on there already, you can quickly identify which points are the closest to that central point, and you would have the answer right in front of you. It's not a difficult problem. What Cecilia said was, imagine that you don't use just two dimensions. Imagine you use more than two dimensions. Imagine you use one hundred dimensions. Here's the thing: we can't really imagine that. We live in a world of three dimensions that we observe directly, four if you argue about time being a dimension, although you cannot physically see time, you see the effects of time through causality. But if you were to add extra dimensions, which you can do computationally even though we as human beings are not capable of really imagining it, it makes this problem far, far, far more difficult to solve, perhaps too difficult even for quantum computers to solve in an arbitrarily easy fashion. So if you do make such a problem, a lattice-based problem, where you know the answer and the person that you're sending information to also knows the answer, but no one else knows the answer, then even if they have a quantum computer, they can't just brute-force figure out the solution to your cryptography strategy, because it's too difficult to map out.
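To get a feel for why the number of dimensions matters, here's a brute-force sketch of that closest-point search in Python. This is just the naive search blowing up, not an actual lattice-based encryption scheme, and the basis and coefficient ranges are arbitrary choices for illustration.

```python
import itertools
import numpy as np

def closest_lattice_point(basis, target, coeff_range=range(-2, 3)):
    # Try every small integer combination of the basis vectors and keep the
    # one nearest the target. The search space is len(coeff_range) ** dim.
    best, best_dist = None, float("inf")
    for coeffs in itertools.product(coeff_range, repeat=len(basis)):
        point = np.array(coeffs) @ basis
        dist = np.linalg.norm(point - target)
        if dist < best_dist:
            best, best_dist = point, dist
    return best, best_dist

dim = 2                                   # easy to eyeball on grid paper
basis = np.eye(dim)                       # the plain integer grid
target = np.array([0.7, 2.2])
print(closest_lattice_point(basis, target))   # nearest point (1, 2), about 0.36 away

# In two dimensions this checks 5**2 = 25 candidates. At one hundred dimensions
# the same search would need 5**100 candidates, and no known shortcut, quantum
# or classical, collapses that.
```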
There's no 570 00:35:47,239 --> 00:35:51,880 Speaker 1: mathematical proof that can simplify this, no sort of 571 00:35:51,960 --> 00:35:57,080 Speaker 1: shortcut to the destination, and so a lattice based 572 00:35:57,280 --> 00:36:02,560 Speaker 1: strategy for cryptography might be the future of cryptography 573 00:36:02,560 --> 00:36:06,600 Speaker 1: in general. I think that's amazing, and honestly I still 574 00:36:06,640 --> 00:36:10,320 Speaker 1: only have kind of a semi basic grasp on the concept. 575 00:36:10,480 --> 00:36:15,920 Speaker 1: But the presentation was fantastic and I highly recommend you 576 00:36:15,960 --> 00:36:18,640 Speaker 1: look into it if this sort of stuff sounds interesting 577 00:36:18,680 --> 00:36:24,440 Speaker 1: to you. Cecilia made mathematics sound really entertaining and fun. 578 00:36:25,120 --> 00:36:28,720 Speaker 1: I liked her approach. I don't think I would 579 00:36:28,719 --> 00:36:30,760 Speaker 1: find it nearly as fun. I think I would quickly 580 00:36:30,920 --> 00:36:34,560 Speaker 1: run up against the very limits, no pun intended for 581 00:36:34,600 --> 00:36:37,680 Speaker 1: all my calculus fans out there, the very limits 582 00:36:37,680 --> 00:36:40,799 Speaker 1: of my understanding of mathematics, and then I would get frustrated, 583 00:36:41,200 --> 00:36:47,000 Speaker 1: because I loved math up through trigonometry, and then once 584 00:36:47,040 --> 00:36:50,000 Speaker 1: I got to calculus, I hit a wall 585 00:36:50,640 --> 00:36:53,640 Speaker 1: and I just suddenly could not get my mind wrapped 586 00:36:53,640 --> 00:36:57,440 Speaker 1: around those concepts. So I have a huge admiration for 587 00:36:57,480 --> 00:37:00,200 Speaker 1: people who not only understand the concepts, but 588 00:37:00,239 --> 00:37:06,080 Speaker 1: are capable of communicating not just the general way that 589 00:37:06,640 --> 00:37:09,840 Speaker 1: this sort of stuff works, but also their own interest 590 00:37:09,920 --> 00:37:13,520 Speaker 1: and enthusiasm for the subject. Alright, guys, I recorded the 591 00:37:13,640 --> 00:37:16,520 Speaker 1: full episode about the Science Slam, and it turned out 592 00:37:16,520 --> 00:37:19,919 Speaker 1: to be a little long. So rather than do a 593 00:37:19,960 --> 00:37:23,400 Speaker 1: full epic length episode about the Science Slam, I figured 594 00:37:23,400 --> 00:37:26,360 Speaker 1: I'd end this episode here and we'll pick up in 595 00:37:26,400 --> 00:37:29,160 Speaker 1: the next one to talk about the other presenters who 596 00:37:29,280 --> 00:37:33,160 Speaker 1: took the stage at the two thousand eighteen IBM Think 597 00:37:33,239 --> 00:37:36,880 Speaker 1: Conference to chat about the science they've done and the 598 00:37:36,920 --> 00:37:39,360 Speaker 1: work that they're looking at and how that has the 599 00:37:39,400 --> 00:37:44,240 Speaker 1: potential to really change our world.
I am incredibly thankful 600 00:37:44,280 --> 00:37:47,120 Speaker 1: that I got a chance to see these brilliant people 601 00:37:47,239 --> 00:37:50,719 Speaker 1: speak and to get some inspiration from them, because 602 00:37:50,760 --> 00:37:53,920 Speaker 1: it was really cool to hear about stuff that usually 603 00:37:53,920 --> 00:37:56,640 Speaker 1: I just read a press release about, and to 604 00:37:56,719 --> 00:37:59,160 Speaker 1: actually hear the people who are doing the research talk 605 00:37:59,200 --> 00:38:02,960 Speaker 1: about it and kind of convey their excitement about 606 00:38:03,040 --> 00:38:07,280 Speaker 1: their fields of study was really invigorating. So make certain 607 00:38:07,320 --> 00:38:10,879 Speaker 1: you tune into that next episode to hear the conclusion 608 00:38:11,400 --> 00:38:14,600 Speaker 1: of the science slam and what else I learned, because 609 00:38:14,600 --> 00:38:16,640 Speaker 1: it was really cool stuff and I hope you guys 610 00:38:16,680 --> 00:38:19,960 Speaker 1: find it interesting too. If you guys have suggestions for 611 00:38:20,200 --> 00:38:22,720 Speaker 1: topics I should tackle in future episodes of tech Stuff, 612 00:38:22,760 --> 00:38:24,200 Speaker 1: you know, once I get through with this little mini 613 00:38:24,200 --> 00:38:26,920 Speaker 1: series over at the Think Conference, I'm gonna be right 614 00:38:26,960 --> 00:38:30,360 Speaker 1: back to my normal recording schedule, you know, 615 00:38:30,440 --> 00:38:33,200 Speaker 1: good old reliable Jonathan talking about technology. If you have 616 00:38:33,239 --> 00:38:35,280 Speaker 1: anything that you would like me to talk about, maybe 617 00:38:35,280 --> 00:38:38,840 Speaker 1: there's a particular company or a technology or a person 618 00:38:39,320 --> 00:38:41,719 Speaker 1: that you think I should profile, or maybe there's someone 619 00:38:41,880 --> 00:38:44,160 Speaker 1: I should interview or have on as a guest host. 620 00:38:44,520 --> 00:38:46,400 Speaker 1: Get in touch with me and let me know what you think. 621 00:38:46,600 --> 00:38:50,320 Speaker 1: The address for the show is tech Stuff at how 622 00:38:50,400 --> 00:38:53,120 Speaker 1: stuff works dot com, or you can always drop me 623 00:38:53,160 --> 00:38:55,640 Speaker 1: a line on Twitter or Facebook. The handle for both 624 00:38:55,680 --> 00:38:59,799 Speaker 1: of those is tech Stuff h s W. Don't forget 625 00:38:59,880 --> 00:39:02,040 Speaker 1: to head over to Instagram and check out tech Stuff over there. 626 00:39:02,040 --> 00:39:04,520 Speaker 1: We've got lots of behind the scenes photos and other 627 00:39:05,000 --> 00:39:08,959 Speaker 1: goodies for you to look at. And also I live 628 00:39:09,000 --> 00:39:12,000 Speaker 1: stream when I'm recording in the normal studio on Wednesdays 629 00:39:12,000 --> 00:39:14,280 Speaker 1: and Fridays. If you would like to see me record 630 00:39:14,320 --> 00:39:17,720 Speaker 1: this show live, you can go to twitch dot tv 631 00:39:17,880 --> 00:39:21,120 Speaker 1: slash tech Stuff on a Wednesday or Friday. The schedule 632 00:39:21,239 --> 00:39:24,120 Speaker 1: is up on the page there, and you can watch 633 00:39:24,239 --> 00:39:27,279 Speaker 1: as I record these shows, as I stumble my way 634 00:39:27,320 --> 00:39:30,840 Speaker 1: through mistakes, as I restart, and that actually doesn't happen that often.
635 00:39:30,880 --> 00:39:33,840 Speaker 1: Mostly, what you hear is what I record, 636 00:39:34,480 --> 00:39:36,359 Speaker 1: and you can chat with me. There's a chat room 637 00:39:36,360 --> 00:39:38,399 Speaker 1: in there, and whenever I get to a break, I'm 638 00:39:38,440 --> 00:39:40,960 Speaker 1: happy to chat with all the people in there. Sometimes 639 00:39:40,960 --> 00:39:44,160 Speaker 1: you guys help shape the show and I really appreciate that. 640 00:39:44,280 --> 00:39:46,439 Speaker 1: So join in. Come on over to twitch dot tv 641 00:39:46,520 --> 00:39:49,040 Speaker 1: slash tech Stuff, be part of the group. I look 642 00:39:49,080 --> 00:39:51,160 Speaker 1: forward to seeing you and I'll talk to you again 643 00:39:51,760 --> 00:40:00,400 Speaker 1: really soon. For more on this and thousands of other topics, 644 00:40:00,640 --> 00:40:07,319 Speaker 1: visit how stuff works dot com.