Get in touch with technology with TechStuff from HowStuffWorks.com.

Hey there, and welcome to TechStuff. I'm Jonathan Strickland. I'm an executive producer with HowStuffWorks, and I love all things tech. I'm continuing the special series coming to you from Las Vegas, Nevada, at the IBM Think conference. In our last episode, I talked about attending the IBM Research Science Slam, in which several people who have been doing incredible research in different fields, and using technology in interesting ways, took the stage to talk about their work. It was a fantastic night. I really enjoyed myself, and I'm very thankful that I got to attend. I also got a really comfortable seat right up front, because I happened to figure out where the main doors were before they opened. That's something I'm really proud of, even though really anyone could have done it, but I managed to get a nice, big, comfy seat. I chatted a little bit about the first presenters, but we've got a couple more to talk about today, so we're going to transition into that episode and what else I saw while I was at the IBM Research Science Slam. Hope you enjoy.

The next presenter was Tom Zimmerman, who Ms. Garcia referred to as MacGyver, saying that he could take any two objects and turn them into kind of a microprocessor. He was credited as a human/machine devices and paradigms scientist, which I honestly did not know was a thing before last night, so this was a treat for me. He was very expressive, very funny, probably the most humorous of all the presenters who came up there. He talked about how he made sort of a private project for himself and how that turned into something that could be much greater. And he did this by taking an image sensor, essentially the same sort of sensor that you could find in a smartphone for the camera.
He took the image sensor and a couple of LEDs, put them into sort of a waterproof container, and created a basic 3D microscope. He used Python, the programming language, to create a method to plot out where things were within a 3D space, so not just the x and y coordinates but also the z coordinate, if you think of the x, y, and z axes. He was able to plot all of that out, so not just how high up or how low down in a picture something is, but how close or far away that thing is. He used it to look at the life forms within a single drop of water, and he found it really fascinating to watch all these tiny little microscopic life forms moving around and to be able to plot where they were and track their progress.
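To make that plotting step a little more concrete, here is a minimal sketch of the kind of 3D plotting he described. This is an illustration added here, not Zimmerman's actual code, and the trajectories are random placeholders standing in for positions tracked by the microscope.

```python
# Minimal sketch: plot tracked organism positions in 3D with matplotlib.
# "tracks" holds (x, y, z) positions per organism over time; real coordinates
# would come from the image-sensor microscope, not from random numbers.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
tracks = {f"organism_{i}": np.cumsum(rng.normal(size=(50, 3)), axis=0)
          for i in range(3)}  # fake random-walk trajectories as placeholders

fig = plt.figure()
ax = fig.add_subplot(projection="3d")  # x and y for position in the frame, z for depth
for name, path in tracks.items():
    ax.plot(path[:, 0], path[:, 1], path[:, 2], label=name)
ax.set_xlabel("x")
ax.set_ylabel("y")
ax.set_zlabel("z (depth)")
ax.legend()
plt.show()
```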
And this got him interested in the subject of plankton. Now, as Mr. Zimmerman pointed out, plankton are incredibly important to our world. Without plankton, we would find it very, very difficult to exist. Plankton produce two thirds of the oxygen that we breathe. You know, plants take carbon dioxide and convert it into oxygen, then we breathe that oxygen and exhale carbon dioxide; we're all part of that cycle. Well, plankton are responsible for two thirds of that oxygen. So while you might think of all the big forests out there as being really important carbon sinks, and they are, don't get me wrong, we don't want to cut those down, plankton are even more important. They are huge carbon sinks; they sequester carbon from the ecosystem, which means they can also counteract that effect. Obviously, if we keep dumping carbon into the ecosystem, that contributes to climate change; it contributes to the greenhouse effect, which some people would say is all about global warming. As it turns out, the climate on Earth is more complicated than just warming or cooling. That's why a lot of people now call it climate change rather than global warming.

But controlling the amount of carbon that we introduce into the ecosystem is incredibly important, and plankton are really good at soaking up carbon. Here's the problem: we're actually dumping more carbon into the environment than the plankton can easily absorb, and plankton are dying as a result. So we're killing off the life form that is responsible for producing most of the oxygen we breathe. Not only that, but plankton are also at, or very close to, the very bottom of the food chain in the oceans, so they serve as a food source for just about every species of baby fish out there. So if the plankton die off, then the food supply for those fish dies off, then the fish die off, then the predators that eat those fish die off, and you start to see the food chain collapse in on itself.

Obviously, this is a really bad thing, but plankton are also a tricky thing to study, because typically the way scientists would study plankton is to go out into the field, and by the field I mean the ocean, take a big net, trawl the ocean, and pull up some plankton. They would put this in jars with preservatives, which would kill the plankton, and then they would come back to the lab and study the plankton under a microscope. Zimmerman equated this to someone whose job it is to analyze sports. They're following a football team, and the way they figure out how well the football team performs is that they're allowed to go on the football team's bus after a game and take pictures of the football players as they sleep, and then try to analyze how well they played the game based upon those pictures of sleeping people. He said that's kind of the equivalent of what scientists have to do with plankton if they're only able to study them after they've been preserved and are therefore no longer alive.
You can only gather so much information about them that way, and it's not terribly useful. It would be better if you could study plankton within their own ecosystem. So he came back to this microscope he had been playing with, this idea he had created, and he said again that the basic parts were all pretty easy to get. You could have an image sensor from a smartphone, a couple of LEDs, and a waterproof container. You could program some software, some artificial intelligence software, using chips that were developed for smart cameras, the ones meant to do things like image recognition and face recognition, you know, the sort of basic artificial intelligence that I talked about in my preview episode. But then reprogram it, retrain the neural network, to recognize plankton and to track their behaviors.

By looking at those behaviors, you can learn more about the plankton, like how plankton eat other things. He gave an example of plankton that like a particular type of algae, and occasionally this plankton will eat a different type of algae that is capable of creating a toxin, and that toxin more or less makes the plankton drunk. As Zimmerman explained, the plankton on its own would dart all over the place and be able to elude predators, because it's able to move around quite a bit. But when it gets drunk, when it eats this particular type of algae, it will just swim in a straight line. That's kind of how an intoxicated plankton moves around in its environment. And if it moves in a straight line, it makes it very easy for predators to eat that plankton.
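As a rough illustration of the kind of behavior tracking he described, here is a small sketch, added here as an example and entirely hypothetical, that takes a tracked path and scores how straight it is. A darting organism would score well below one, while one swimming in a straight line would score close to one.

```python
import numpy as np

def straightness(path: np.ndarray) -> float:
    """Ratio of net displacement to total path length (1.0 = perfectly straight).

    path: array of shape (n_frames, 2) holding tracked x, y positions.
    """
    steps = np.diff(path, axis=0)
    total_length = np.linalg.norm(steps, axis=1).sum()
    net_displacement = np.linalg.norm(path[-1] - path[0])
    return float(net_displacement / total_length) if total_length > 0 else 0.0

rng = np.random.default_rng(1)
darting = np.cumsum(rng.normal(size=(100, 2)), axis=0)                  # erratic random walk
straight = np.column_stack([np.arange(100.0), np.arange(100.0) * 0.1])  # near-straight swim

print("darting:", round(straightness(darting), 2))    # well below 1
print("straight:", round(straightness(straight), 2))  # essentially 1
```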
Well, as predators eat the plankton, that plankton is less capable of eating algae, obviously, because the population of the plankton starts to drop. The algae then has fewer predators of its own, its population begins to grow, and you get algae blooms, which can be a big problem. So it's better if you're able to monitor the plankton, monitor what's happening in the ecosystem, and perhaps intervene if things are not going well. But you can only intervene if you have all the information. You can only act in a meaningful and helpful way if you know what's going on; without that information, you may do more harm than good. So Zimmerman's point was that we now have the capability of making these tools to gather the information we need to make more responsible choices. It was a really fascinating way of putting technology in the role of an effective tool for a really difficult problem.

I have more to say about the Science Slam over at the Think 2018 conference, but before I jump into the next little segment, I'd like to take a quick break to thank our sponsor.

The next person to take the stage was Francesca Rossi. Francesca Rossi's area of expertise was something that I thought was truly interesting: artificial intelligence ethics. She talked about AI in the way that a lot of people at IBM like to talk about AI. They don't necessarily talk about artificial intelligence; they talk about augmented intelligence. In other words, these are the devices and the programs, the software or the firmware, that help us make decisions. They don't necessarily make all the decisions for us. They aren't thinking, they aren't having conversations with us. They are guiding us as we try to make decisions, and then we use that information as a tool to help us in our tasks. So how do we build machines that help people make smarter and more grounded decisions? Artificial intelligence can help solve some of the world's most difficult problems. Rossi talked about how she's been working in the AI field for decades and how the conversation has gradually shifted during her time studying AI. She said that early on, when she was studying AI, the conversations were all about how can we make it smarter?
How can we make these artificially intelligent programs faster, more capable, able to make decisions more reliably? How do we do that? So the focus was just on performance. It had nothing to do with the quality of those decisions, or the impact those decisions might have on other people, but rather just: can we make a machine that's able to do this task better than the ones we have right now? Back then, it was just computer scientists who were having this conversation. These days, she said, there's a huge number of disciplines that all get together to talk about these sorts of things, not only computer scientists but also philosophers, lawyers, economists, policymakers, people who have recognized that machines not only have the capability of making decisions quickly, but that those decisions can have a real effect, positive or negative, on actual human beings in the real world, and that there has to be some sort of ethical approach to the development of AI if we want AI to actually benefit humanity.

She mentioned several of the big problems. Transparency is a huge issue. How do you know how the AI arrived at its decision? You want a transparent way of communicating that. Without that, you just have a black box: something that has taken data and produced a result, and you have no idea how it went from A to B. And without knowing that, you don't know if the decision is a good one. So as AI gets more complex and starts to make more complicated decisions, if you don't have transparency, it's like you're consulting a mysterious oracle, and you don't really know if the oracle has his or her act together or is just making stuff up. So transparency is very important.
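As a toy illustration of the difference between a black box and a model whose decision process you can actually inspect, here is a small sketch, added as an example rather than anything Rossi showed. It trains a simple decision tree and then prints out the learned if/then rules, so a human can follow how an input leads to a prediction; the iris dataset is just placeholder data.

```python
# Minimal sketch: a model whose decision process can be printed and inspected,
# as opposed to a black box that only emits answers.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

data = load_iris()
model = DecisionTreeClassifier(max_depth=3, random_state=0)
model.fit(data.data, data.target)

# export_text lists the learned decision rules, showing exactly which feature
# thresholds the model checks on the way from input to prediction.
print(export_text(model, feature_names=list(data.feature_names)))
```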
Explainability is also very important. Can you explain how the machine came to its conclusions, not just the pathway it took, but how it decided that one set of factors was more important than another set of factors? And she also talked about bias, and in fact most of her conversation was about bias. Bias is prejudice; it can be positive or negative with regard to any particular set of data points. Bias is something that we humans have; it's a quality we possess. We do develop biases for different things. We could have a positive experience with a particular thing. Let's take roller coasters, for example. Let's say that you ride your very first roller coaster when you're a little kid, and it's a wonderful ride, you love it, and then you have sort of a bias toward roller coasters because you love that feeling you had. Or let's say the opposite happened. You ride your first roller coaster and it rattles you around a lot, makes you feel sick, and you get off that ride and your decision-making process tells you, hey, this is not for me. Roller coasters are bad. They are not well-designed rides. They hurt, they make me feel sick, I don't like them, they scare me. I'm never riding a roller coaster again. You've created a bias based on that experience, and it may very well be that your bias plays out properly; it could be that the experience really does tell you how you're going to react every single time you encounter this particular thing. In some cases that's not a bad thing, and Rossi actually talked about that, about how bias is not inherently bad. But when it comes to things like judging people, obviously that's much more problematic. If you go to a different culture and you encounter something that upsets you, you might end up developing a bias against anyone who comes from that culture. That's not necessarily representative, and it's not fair.
It can mean that you 241 00:14:38,160 --> 00:14:41,640 Speaker 1: then treat an entire group of people unfairly based upon 242 00:14:41,720 --> 00:14:46,120 Speaker 1: this bias. And that's where the scary part comes in 243 00:14:46,200 --> 00:14:49,360 Speaker 1: with AI, because while AI is going to follow very 244 00:14:49,400 --> 00:14:52,760 Speaker 1: specific rules that are set out based upon the AIS programming, 245 00:14:53,320 --> 00:14:55,880 Speaker 1: and AI can still be biased. Now, that doesn't mean 246 00:14:55,880 --> 00:14:59,840 Speaker 1: the AI is developing opinions of its own about people. 247 00:15:00,240 --> 00:15:03,440 Speaker 1: It means that the AI is referencing the data sets 248 00:15:03,440 --> 00:15:06,200 Speaker 1: that were fed to it, and data sets are created 249 00:15:06,240 --> 00:15:09,120 Speaker 1: by human beings. If the human beings who create the 250 00:15:09,240 --> 00:15:14,800 Speaker 1: data sets failed to include enough diversity, enough representation in 251 00:15:14,880 --> 00:15:18,440 Speaker 1: that data set, then the people who are not represented 252 00:15:18,680 --> 00:15:23,040 Speaker 1: can be affected negatively. And there are great examples of 253 00:15:23,080 --> 00:15:24,960 Speaker 1: this out in the world that you can actually see 254 00:15:25,080 --> 00:15:29,240 Speaker 1: things that have shown that there are problematic implementations of 255 00:15:29,320 --> 00:15:34,640 Speaker 1: artificial intelligence that do in fact indicate a bias is present. Uh. 256 00:15:34,680 --> 00:15:40,640 Speaker 1: There were stories about facial recognition technology that worked fine 257 00:15:40,800 --> 00:15:43,320 Speaker 1: if you happen to be a white person, but if 258 00:15:43,320 --> 00:15:47,960 Speaker 1: you were of any other race, particularly if you were uh, 259 00:15:48,000 --> 00:15:51,480 Speaker 1: if you were black, then it wasn't working properly. It 260 00:15:51,600 --> 00:15:55,880 Speaker 1: wasn't detecting people properly. Well. That indicates that perhaps the 261 00:15:56,000 --> 00:15:59,520 Speaker 1: data set that was used to train that artificial intelligence 262 00:15:59,640 --> 00:16:03,560 Speaker 1: at a lack of representation of people of different races 263 00:16:03,560 --> 00:16:06,760 Speaker 1: than than just white people. And it's not necessarily that 264 00:16:06,840 --> 00:16:11,200 Speaker 1: it was uh planned that way, or that the people 265 00:16:11,200 --> 00:16:14,880 Speaker 1: who were designing it were specifically excluding an entire group. 266 00:16:15,160 --> 00:16:18,120 Speaker 1: There might have been no malicious intent whatsoever. But that 267 00:16:18,160 --> 00:16:21,000 Speaker 1: doesn't really matter if there was malicious intent from the 268 00:16:21,040 --> 00:16:25,600 Speaker 1: beginning or not. The effect is the same whether it 269 00:16:25,680 --> 00:16:29,600 Speaker 1: was intended to exclude a group or just accidentally excluded 270 00:16:29,640 --> 00:16:32,320 Speaker 1: a group because the person who is designing the system 271 00:16:32,480 --> 00:16:36,920 Speaker 1: didn't belong to that group. That lack of diversity creates 272 00:16:36,920 --> 00:16:40,640 Speaker 1: a bias, and that bias has the potential to negatively 273 00:16:40,720 --> 00:16:44,840 Speaker 1: impact an entire population of people as a result, this 274 00:16:44,920 --> 00:16:47,120 Speaker 1: is not a good thing. 
You want to have AI that is as unbiased as it can possibly be. Now, Rossi argues that in the long run, over years and years and years, we will have an explosion in AI across multiple disciplines, multiple industries, and I completely agree; that is exactly what we're already seeing. We're seeing AI being developed in all sorts of different ways. And she also argues that the ones that will stick around, the ones we will rely upon, will ultimately be the ones that do not have bias. We will realize that those are the ones that are valuable, and we will abandon all the AI constructs that contain bias. But that's the long run. In the mid term, we're going to have problems. We're going to have AI that, because of a lack of diversity in their data sets, are not going to be able to handle real-world situations, and that's going to have real-world impact on people. So she said it's absolutely imperative that we have these ethical discussions now and start consciously developing AI with an attempt to avoid introducing bias. In order to do that, you have to create multidisciplinary, multi-stakeholder, multicultural teams to develop that artificial intelligence. You have to have this representation and this diversity from the ground level on up as you are creating this AI, and only then can you be reasonably certain that you have the representation you need to avoid bias. At that point, you would have an AI that, no matter what it was intended to do, will be considered much more trustworthy and beneficial, not just smart, not just efficient.

And so I found this to be a really fascinating presentation, again, to think about how our way of thinking about AI has changed so dramatically over the last couple of decades, and how we've shifted from "how can we make this machine think?" to "how can we make sure that this machine is performing in a way that is not inherently unfair to any particular group of people?"
And obviously, in today's environment, as we get more and more sensitive to this sort of thing, we have whole sections of the world where people are becoming more xenophobic and more isolationist. They are banding together with people they identify with, the people that they feel represent who they are, and they are more readily excluding people who don't fit that group. That's a dangerous way of thinking; it's a dangerous approach. In some cases it's necessary: if you are part of a very small population, if you are a minority within a population that far outnumbers you, then you might band together with other people who share the cultural or ethnic identities that you have as a way of protecting yourself, which is completely understandable if you are vastly outnumbered by others. That's a self-preservation technique. But then you also have the flip side of it, where you have the majority. If they're doing it, then they are more likely to create situations that are disadvantageous or oppressive to those minorities. So it's really important, moving forward, that we try to break through that, that we try to embrace this sort of multicultural, diverse, representative approach so that we don't create inherently unfair, divisive scenarios, whether it's with technology or anything else, honestly.

So I really liked this presentation. I have a feeling that not everybody would, because some people feel very strongly about this sort of xenophobic kind of philosophy. They probably don't even think of themselves as xenophobic; in fact, I would be shocked if they did. But yeah, I thought that this was a really valuable talk. We're in the home stretch, we're almost done, but there's still some more to talk about that I saw over at this Science Slam, this incredible evening of science and technology and just geeking out like crazy.
Before we conclude, let's take another quick break to thank our sponsor.

The last person to get up and speak was Talia Gershon. She got up to talk about quantum computing and AI challenges, so again, there's some overlap here with some of the previous discussions. She was talking specifically about that example I gave earlier with Ms. Garcia, the polymer chemist, and talked about how the difficulty of using a computer to accurately simulate the bonding of large molecules grows exponentially as you grow the size of the molecule itself. So if you add more atoms to a molecule, the amount of computing power it takes to simulate and model that molecule grows dramatically, to the point where even the most powerful supercomputer would find the problem so difficult that it would take ages to create a simulation. Even if you were creating a simulation that was supposed to represent a microsecond of time, it might take days, or even weeks, to create that simulation. So you're spending weeks of real time to simulate a microsecond of simulated time. Obviously, this is not an efficient way to go about things. Quantum computing, she argued, could help solve this.
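To give a rough sense of that scaling problem, here is a small back-of-the-envelope sketch, added as an illustration rather than taken from the talk, of how the memory needed just to store the full quantum state of a system grows as you add two-level components: each additional one doubles the size of the state vector.

```python
# Rough illustration: a full quantum state vector for n two-level systems (qubits)
# needs 2**n complex amplitudes, at roughly 16 bytes apiece (complex128).
for n in (10, 20, 30, 40, 50):
    amplitudes = 2 ** n
    bytes_needed = amplitudes * 16
    print(f"{n:2d} qubits -> {amplitudes:>16,d} amplitudes (~{bytes_needed / 1e9:,.1f} GB)")

# Somewhere around 50 components, the state no longer fits in any classical
# machine's memory, which is the kind of wall classical simulation runs into.
```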
She asked the audience, who here has heard buzz about quantum computing? About two thirds of the hands in the audience went up. Then she said, who here feels like they have a really strong grip on what quantum computing is? And then there were maybe a dozen hands still up. I wrote "How Quantum Computing Works," and I did not raise my hand, because while I did a lot of research into quantum computing, I felt like my understanding of it is still at a very, very basic level, largely because there comes a point with quantum mechanics and quantum computers where my understanding hits a wall. Rather than feeling like I really have a grip on what is happening, I'm just communicating what smarter people are telling me quantum computing is all about, but I don't feel like I have a real grasp of it. However, as I say that, I also remember distinctly that when I first started studying quantum computing, I was also looking into string theory, and I watched a documentary in which a leading physicist, a leading expert on string theory, said: I sometimes get asked, at the end of the day, when it all boils down, do I really, really understand the science I'm talking about? And my answer has to be, not really. There comes a point where, mathematically, I can see what's supposed to be happening, but there's a barrier between the mathematics and my actual human understanding. I found some relief in that.

Talia Gershon got up and talked about this sort of thing. She talked about how quantum computers encode information into complex quantum states, then run quantum processes on those states, use a method to measure the final state that results from those quantum calculations, and then record a result, which doesn't necessarily clear things up very much for us. In fact, she was doing this to comedic effect, saying that's how you'd put it at the most basic level, and that's still really complicated. She argues that quantum computing is an interdisciplinary problem, that it requires lots of people working in lots of fields in very specialized ways to make quantum computing possible. You need quantum physicists, experts on quantum mechanics, to handle the quantum information aspect, and they use the language of linear algebra to write out their work. But then you would need computer scientists to take that linear algebra and translate it into a language that computers can use to actually run processes. So you have to take the formulas created by the quantum scientists and give them to computer scientists, who can then transform that into information that computers can actually use.
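Here is a tiny, purely illustrative sketch of that encode, process, measure loop, added as an example and not anything IBM showed. It builds a two-qubit state vector with plain linear algebra, applies a couple of standard gates, and then samples measurement outcomes from the resulting probabilities.

```python
import numpy as np

# Encode: start in the |00> basis state (4 complex amplitudes for 2 qubits).
state = np.zeros(4, dtype=complex)
state[0] = 1.0

# Process: standard gates written as matrices (the linear algebra mentioned above).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard on a single qubit
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

state = np.kron(H, I) @ state   # put the first qubit into superposition
state = CNOT @ state            # entangle the two qubits (a Bell state)

# Measure: outcome probabilities are the squared amplitudes; sample a few shots.
probs = np.abs(state) ** 2
shots = np.random.default_rng(0).choice(["00", "01", "10", "11"], size=10, p=probs)
print(probs)   # roughly [0.5, 0, 0, 0.5]
print(shots)   # only "00" and "11" ever show up
```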
Then you would also need people who are materials science experts to actually create the physical quantum computer. You would have to have device manufacturing experts to take the designs that the materials scientists had created and make them a real thing. You'd have to have physicists testing all of this to make sure it was actually working within the realm of quantum mechanics. You'd have to have electrical controls expertise to be able to create the quantum circuitry. You'd have to have advanced cryogenics to keep the quantum computer cold enough to operate. So she was arguing that all of these things require people with very deep expertise in very specific fields, which makes quantum computing particularly difficult. You can't just have, you know, a small team of experts work together. It used to be, way back in the day, when you talk about things like the dawn of personal computers in the nineteen seventies, that you could have a person or a couple of people put together all the different components and design and produce a computer. Think about Apple, with Jobs and Wozniak working out of a garage, actually designing and building the first Apple computer. That was possible back then. But with quantum computers, you're talking about elements that require such deep knowledge that you have to have an entire fleet of experts across multiple disciplines in order to build an effective quantum computer. To take that and then build it into a scalable technology is going to require a lot of breakthroughs. Obviously, you can't just ramp that up. You can't just say, well, now we've designed this quantum computer, let's create an assembly line and churn them out and sell them. It's a huge undertaking. And she talked about a phrase that one of her colleagues would use consistently whenever anyone was working on a quantum computer design, which was: you're thinking too classically.
You're limiting yourself to thinking in the old classical physics and classical computing approach to the way we do things, which works fine if you're working on classical systems, but quantum systems require thinking outside of that. It's a stretch; you have to go beyond what we typically think about as human beings, because the quantum world is not something that we can observe in our day-to-day lives. We observe the classical universe; that's what our senses are capable of picking up. When you get to things that belong to the quantum realm, they don't make sense to us, largely because we can't observe them, and because we can't observe them, they don't seem to be part of our reality. So take things that we understand: for instance, if I walk up to a wall and I keep walking, I'm gonna slam into that wall. I'm not just gonna pass through it; I'm gonna hit it, it's gonna hurt, it's gonna stop me. But in the quantum world, you can have a field, and anywhere within that field you could potentially exist. So imagine that, instead of having a physical location you could identify with something like GPS coordinates, it's more like you have a big, sort of nebulous circle. Within that circle, you could be at any of those points at any given moment, and if you were to take a snapshot of a moment, then yes, you would appear at a very specific point within that circle. But if you took a different snapshot at a different moment, you would be in a totally different part of that circle. Now, in this world, if I were to approach a wall, at some point my circle overlaps the wall and extends onto the other side. So part of my circle is still on the side of the wall that I was on originally, and part of my circle now overlaps the other side of the wall.
You take a snapshot, and sometimes that snapshot is gonna show me on the other side of that wall, even though I didn't actually pass through it. I didn't walk through the wall; I just appeared on the other side because my circle overlapped it. That circle represents the probabilities that I could be at any of those points at any given time, and as long as there is some probability, that means that at some point I will be in that part of the circle. Now, this actually exists in the quantum world. It exists in our microprocessors. It's called electron tunneling, or quantum tunneling, and this is where you have these gates, these logic gates, that, because of the materials used and because of their thinness, are so thin that when an electron comes up to the gate, there's a possibility that the electron will actually end up on the opposite side of the gate, not on the side it's supposed to be on, and that creates electron leakage. This is a bad thing for electronics, because electronics is all about the controlled pathway of electrons. If electrons can sometimes bypass a gate that is supposed to stop them, if the electron just passes right through because its field overlaps where the gate is, then you get errors, you get mistakes. So that is a real-world example of how these quantum effects can create problems, but we don't observe them directly, because they happen on a scale that's far too small for us to see. I found it really interesting to think about that as well, about how this strange world, which doesn't seem to behave according to the physics we have observed, can still have real impacts on us. Obviously, using this sort of world to create electronics that we can then use to do all sorts of stuff is pretty complicated.
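As a rough illustration of why thinner barriers leak more, here is a small sketch, added as an example and not from the talk, using the standard textbook approximation that the chance of an electron tunneling through a barrier falls off exponentially with the barrier's thickness. The specific energies and widths are made-up numbers purely for demonstration.

```python
import math

# Textbook-style estimate: tunneling probability T ~ exp(-2 * kappa * width),
# where kappa depends on how far the electron's energy sits below the barrier.
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837015e-31   # electron mass, kg
EV = 1.602176634e-19     # joules per electronvolt

def tunneling_probability(barrier_ev: float, energy_ev: float, width_nm: float) -> float:
    kappa = math.sqrt(2 * M_E * (barrier_ev - energy_ev) * EV) / HBAR
    return math.exp(-2 * kappa * width_nm * 1e-9)

# Illustrative numbers only: a 1 eV barrier and a 0.5 eV electron.
for width in (3.0, 2.0, 1.0, 0.5):
    p = tunneling_probability(1.0, 0.5, width)
    print(f"barrier width {width:.1f} nm -> tunneling probability ~ {p:.2e}")

# As the barrier gets thinner, the leakage probability climbs steeply, which is
# why very small transistor gates start letting electrons slip through.
```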
So how do we fix this? Talia Gershon said that one thing we need to do is start in the classroom. We need to teach people how to think outside the classical system. I certainly would have benefited from this. When I was a kid, I didn't have a whole lot of exposure to quantum physics and quantum mechanics. When I was going to school, I had a little bit, but just enough to really confuse me, not enough to get a basic understanding and a feel for things. She said that within five years, you're going to see physics departments, computer science departments, electrical engineering departments, and mechanical engineering departments all talking about quantum effects and quantum mechanics, quantum computing, quantum states, and that there will actually be classes on things like designing quantum circuitry and quantum programming. And when we see that, we're gonna see a huge development in this space, because people who otherwise had to forge a pathway toward quantum computing will have the torch carried forward by people who were trained on this from the get-go, and who therefore have the benefit of the previous pioneers' knowledge and can take it much further and develop the technology to a point where it is really, really useful for all of us.

That was the final presenter that night, and one of the last messages that IBM Research gave that evening was that effective science communication is more critical today than ever before. Science communication is a tricky, difficult thing. We're talking about concepts that are very hard for some people to understand, because they've had limited exposure to those ideas, and the ideas are counterintuitive in many cases. So you have to be really good at communicating this to people so they understand not just what is going on, but why it's important.
And another really critical element is public engagement in science, creating a conversation around science, to not just educate the public but invite the public to take part in these conversations, because you get more representation that way, and more ideas, and more challenges to your notions, which are equally important. That way, everyone can take part in making the decisions that will create these technologies and ultimately help us move forward. I found that to be really inspiring as well.

Well, that wraps up the Science Slam, the IBM Research session that happened on March nineteenth, 2018. I look forward to attending the conference today. We're getting close to six a.m., which means pretty soon I'm gonna go hit the gym, and then I'll go to the conference and see what else I can find and who I can talk to. I hope to record a whole bunch more special episodes for you in this miniseries about Think 2018, talking about cutting-edge technologies and getting insight into some of the most complicated and fascinating aspects of science and technology, and where these might be taking us. I hope you're enjoying the miniseries so far, and I look forward to including more of these.

If you guys have suggestions for future episodes of TechStuff, I highly recommend that you get in touch with me and let me know. My email address is techstuff@howstuffworks.com, or you can drop me a line on Facebook or Twitter; the handle for both of those is TechStuffHSW. You can follow us on Instagram and see lots of behind-the-scenes goodies that way. And make sure you tune in to twitch.tv/techstuff; on a typical week, on Wednesdays and Fridays, I record live, I livestream TechStuff, and you can check it out. Just go to twitch.tv/techstuff. There's a chat room there; you can participate and say hi to me.
I always love seeing people there, and I'll talk to you again really soon.

For more on this and thousands of other topics, visit howstuffworks.com.