Speaker 1: Get in touch with technology with TechStuff from HowStuffWorks.com.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer at HowStuffWorks and I love all things tech. And here is another special episode brought to you from the IBM Think 2018 conference. I've been going to this all week in Las Vegas, Nevada. It is Thursday, March 22 as I record this episode, and you probably heard me speak a little bit about a lot of different topics that I've heard about here at the Think 2018 conference, and pretty much wrapped everything up. I am technically attending the conference today. However, I'm also checking out of my hotel room today, which means I will not have a dedicated place to sit down and record. So I might record another episode after this, depending upon what I encounter today, but that might have to wait until I get back to the studio, because I may not be able to find a corner to sit down and record that is suitable for such a thing.

However, I do have that opportunity right now, and I figured I would take this chance to kind of talk to you about a few different speakers I had the chance to witness and listen to while I was here at Think 2018. These were several talks given by various experts, including a few who have achieved celebrity status beyond just the tech conference level. And I don't think I could talk about any of these presentations as a full episode without doing a ton more research to flesh it out, to give background, because these presentations were each about thirty to forty minutes long. And so, you know, to make what usually ends up being about a forty-five-minute podcast about a thirty-minute talk, you can't do that without padding it, which I'm not gonna do. That's an insult to you guys.
You deserve better than that. The alternative is doing a lot of background work to flesh things out and provide context, and while I could, in theory, do that, there's only so many hours in the day. So what we're gonna do instead is I'm going to tell you about three of the speakers I've seen and the stuff that they covered, and that will be each of the three segments of this podcast. And I think it's pretty interesting. They were a fascinating group of people.

And up first was someone who is kind of a rock star in science. That would be Neil deGrasse Tyson. I'm sure most of you have heard about Neil deGrasse Tyson. Maybe you've watched television programs that he's hosted, or heard him chime in on scientific discussions in various venues, whether it's online video, radio shows, TV, all sorts of stuff. He's of course also made cameo appearances, or at least a character that is Neil deGrasse Tyson made cameo appearances, in Epic Rap Battles of History. It wasn't actually Neil deGrasse Tyson; it was obviously an actor playing him. But he's not just an accomplished scientist. He's a famous science communicator. So he uses his knowledge and humor and other social strategies to talk about science with the general public, to explain it and to get enthusiasm behind it.

He presented a talk here at Think 2018 called The Knowledge of Nature and the Nature of Knowledge. Now, his presentation didn't speak directly to the theme of technology, but much of what he had to say was relevant to tech, particularly the way people who work in tech communicate their work to the public. And he began by talking about what we are able to observe directly using our five primary senses, those of course being sight, hearing, smell, taste, and touch. Now, he acknowledged that we actually have other senses on top of these five. You could argue a whole list of different senses, like the ability to sense pain, which is related to, but not completely encompassed by, touch.
But these are generally agreed upon as the five basic senses, and they obviously have limits. So let's take an example. Our sight is limited. Not only can we not focus to an infinite distance; eventually the focal point will exceed what even the keenest-eyed person can see. We can also only see a band of the spectrum of light. We call it the visible spectrum. Obviously, that's all the light that human beings are capable of perceiving with their sight. But there's obviously light outside of that spectrum. There's ultraviolet light, there's infrared light, and we know it exists, but we cannot directly observe it with our eyes. And beyond that there are other electromagnetic frequencies that we know exist, but again we cannot observe them using only our five senses.

There are similar cases for our other senses. Like, there are limits to what we can hear. We can hear a certain spectrum of sound frequencies. Typically we call it 20 hertz to 20,000 hertz and anything in between; that's the average range of human hearing. And of course that range decreases as we get older. As we get older, our ability to perceive those upper-level frequencies, the ones up at the 20,000-hertz level, decreases over time. So our range narrows as we age, and we know, obviously, that there are frequencies outside of that. So he was pointing out that we have limitations. It's not that our senses are total crap. He was pointing out that there's way more out there in the universe than we can observe directly, and that we know this through science and technology. We cannot directly experience it as human beings in ways that we can make sense of using the five basic senses, but we've learned a lot about it using science and technology. And you can even think of technology as applied science, because without science, there's really no technology. With these scientific tools, we can then extend our understanding of the universe.
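To put rough numbers on those sensory windows, here's a quick sketch. This is my own toy illustration, not something from Tyson's talk, and the cutoffs are just the usual textbook approximations:

```python
# Toy illustration of how narrow our sensory windows are, compared with
# the full ranges physics describes. Figures are textbook approximations.

def describe_sound(freq_hz: float) -> str:
    """Classify a frequency against typical human hearing (~20 Hz to 20 kHz)."""
    if freq_hz < 20:
        return "infrasound: real, but below what we can hear"
    if freq_hz > 20_000:
        return "ultrasound: real, but above what we can hear"
    return "audible to a typical human ear"

def describe_light(wavelength_nm: float) -> str:
    """Classify light against the visible band (~380 to 750 nm)."""
    if wavelength_nm < 380:
        return "ultraviolet or shorter: invisible to us"
    if wavelength_nm > 750:
        return "infrared or longer: invisible to us"
    return "visible light"

for f in (5, 440, 18_000, 40_000):            # frequencies in hertz
    print(f"{f} Hz -> {describe_sound(f)}")
for w in (250, 550, 1_000):                   # wavelengths in nanometers
    print(f"{w} nm -> {describe_light(w)}")
```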
We can make observations, we can develop hypotheses, we can test those hypotheses to see if they have merit, and we can have other people test those hypotheses. We can even have them use different methods to test those hypotheses, and as long as those results keep coming back the same, then we can be reasonably confident that we've hit on something. This is the basis of the scientific method. In fact, Neil deGrasse Tyson took some time to talk about the scientific method and stress the importance of testing ideas and having others test those same ideas, both with an approach similar to the one you took when you first tested it and with other, entirely different approaches, to make sure that they come to the same result. And if everyone gets more or less the same answer through their various tests, and those tests are considered to be well designed, you can feel reasonably sure that the idea you had, the observation you made, the conclusion you came to, is as close to an actual truth as we can reasonably get. And therefore we just kind of say it's true, because, I mean, otherwise you're being really, really particular. And if you argue that we can never know the absolute truth, you might be right, but it's also kind of impractical. So at some point you just say, all right, this is good enough; this is close enough to truth for us to call it that.

He was also careful to differentiate the concept of proof from evidence. He talked about proofs in mathematical terms and said no good scientist would ever use the word proof when talking about his or her work. You can find evidence to support your claims, but not proof. Proofs are for math, not for science. This goes back to something else I had mentioned and that I had witnessed earlier in the conference, where Cecilia Boschini was talking about lattice-based cryptography and mathematical proofs, and the person she was talking to was interpreting proof as evidence.
So he was saying, isn't the fact that computers have had so much trouble with factoring large numbers proof that it is a hard problem? And so they were both using a word, but they were using it in different ways, and there was a disconnect there. There was a miscommunication going on, and I didn't want to butt in, because I didn't want to be a smarty-pants. But it was interesting to see this breakdown in communication, because they were using two separate sets of terminology that share a common lexicon; the words are exactly the same, but the meanings are different, and that's the problem.

So then Neil deGrasse Tyson went on to talk about the difference between accuracy and precision. He used a simple question, and that question is: what time is it? His first answer was, it's 2018, which is accurate, or at least it's accurate as of the recording of this podcast; I'm recording it on March 22, 2018. But even though the answer is accurate, it's not very precise. It doesn't go down to a useful level of precision. Like, if I ask you what time it is because I need to figure out if I'm going to be late to a meeting I'm rushing off to, and you just give me the year, that doesn't give me enough information to know whether I need to take a car or if I can take a bike ride to get there. However, you could also err the other way, where you have a lot of precision but not a lot of accuracy. So his second answer was, oh, it's seven hours, forty-four minutes, twenty-two seconds, and thirty-seven milliseconds. Now, that answer is precise, but it could also be wrong. What if it's actually forty-five minutes? Then it's off by a whole minute. Yeah, you had this huge level of precision, but your accuracy was off.
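To make that concrete, here's a little sketch of the what-time-is-it example. This is my own toy code, not anything Tyson showed; it just reports the same moment at different levels of precision:

```python
# The same moment reported at different precisions: the year is accurate
# but useless for catching a meeting; the millisecond answer is stale
# before you finish saying it out loud.
import time
from datetime import datetime

t0 = datetime.now()
print("Accurate, imprecise :", t0.strftime("%Y"))               # e.g. "2018"
print("Useful middle ground:", t0.strftime("%H:%M"))            # e.g. "07:44"
print("Very precise        :", t0.strftime("%H:%M:%S.%f")[:-3]) # to the millisecond

time.sleep(0.05)  # even a 50 ms pause makes the millisecond answer wrong
drift_ms = (datetime.now() - t0).total_seconds() * 1000
print(f"...and it drifted {drift_ms:.0f} ms while we were talking about it")
```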
Moreover, because you went to such a level of precision, down to the millisecond, you cannot possibly be accurate, because by the time you finish telling someone what time it is, it will be a different time. And so when was it forty-four minutes, twenty-two seconds, and thirty-seven milliseconds? Was it when you started telling the person what time it was? Was it at the end? Precision without accuracy is not very useful either; that was his point. And this was to say that accuracy and precision are both important, and the level of precision you should use when you're communicating something to others depends upon what you're trying to accomplish. You could always get more precise, but sometimes that reaches a level that is no longer of any real use.

And I can identify with this idea. See, when I construct an episode of TechStuff, when I'm researching and writing up my notes, I frequently have to figure out: where should I start? Where's my starting point for the episode, based upon whatever topic it is that I'm covering? Now, rarely do I start right at the quote-unquote beginning of a technology's creation. And that's because it's usually helpful to understand things that were happening before someone, or more likely a string of someones, invented the technology. Without that understanding of what preceded the invention, you don't really have enough context to understand what is going on and why it's important.

Neil deGrasse Tyson went on to use an example of his own as he spoke about the Earth's orbit. The shorthand description of the Earth's orbit around the Sun, something you might have learned in elementary school, is that it's an elliptical orbit. But that's not entirely accurate. First of all, the Earth's orbit is only a slight deviation from a circle. You could argue that it's closer to being a circular orbit than it is an elliptical orbit. But despite this, many textbooks exaggerate the path that the Earth's orbit takes to show that it's not a perfect circle.
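Just how slight is that deviation? Earth's orbital eccentricity is about 0.0167, a well-measured value, and a quick back-of-the-envelope calculation (mine, not Tyson's) shows how close to a circle that really is:

```python
# How far Earth's orbital ellipse deviates from a circle.
import math

e = 0.0167                   # Earth's orbital eccentricity
ratio = math.sqrt(1 - e**2)  # semi-minor axis divided by semi-major axis
print(f"b/a = {ratio:.5f}")  # ~0.99986: within about 0.014% of a perfect circle
```

Drawn to scale, you couldn't tell it from a circle by eye, which is exactly why the textbooks exaggerate it.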
But even if you were to say it's more circular than elliptical, that's not entirely precise, because the Earth's Moon affects the Earth's orbital path. Now, the Moon does not actually revolve around the Earth. Instead, the Earth and the Moon are in a dance together. They rotate around a common center of gravity between the two. This is true for any two objects that are revolving around one another; they revolve around a common center of gravity. Now, as it happens, because of the masses of the two bodies, the Earth and the Moon, that center of gravity happens to be located below the Earth's surface. But it's not at the center of the Earth. If it were at the center of the Earth, then you could say, well, the Moon essentially revolves around the Earth. Instead, it's about a thousand miles below the Earth's surface, on whichever side the Moon is on at that time. So, in other words, the center of gravity moves around the Earth's interior as the Moon continues through its orbit, which has the effect of creating a little wobble in the Earth's orbit. So you can really think of the Earth's orbit around the Sun as a bit of a circle that's made up of a kind of squiggly line, and that squiggly line is the little jiggle that the Earth has because of the Moon's pull on the Earth. But even that is not truly precise, because the Sun in our Solar System is orbiting around the galactic center of the Milky Way galaxy, and it's not in the same plane as Earth's orbit.
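As a quick aside, that "about a thousand miles" figure falls right out of the mass ratio of the two bodies. Here's the arithmetic, my own back-of-the-envelope using standard values, not numbers from the talk:

```python
# Two bodies orbit their common center of mass (the barycenter). Its
# distance from Earth's center is d = D * m_moon / (m_earth + m_moon).
m_earth = 5.972e24   # kg
m_moon  = 7.342e22   # kg
D       = 384_400    # km, mean Earth-Moon distance
r_earth = 6_371      # km, mean Earth radius

d = D * m_moon / (m_earth + m_moon)   # about 4,670 km from Earth's center
depth = r_earth - d                   # about 1,700 km below the surface
print(f"{d:.0f} km from the center, "
      f"{depth:.0f} km (~{depth * 0.6214:.0f} miles) underground")
```

Roughly a thousand miles down, just as he said. Anyway, back to the galactic picture.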
So while from one perspective you could say the Earth has a wobbly circular orbit, if you were looking at the Sun directly in front of it as it was moving through its galactic orbital path, you would see that the planets of our Solar System are moving around in what amounts to a spiral. Because take any arbitrary starting position of the Earth with respect to its position relative to the Sun, and you say, all right, this is day zero, and we're gonna go three hundred sixty-five days around so that we return to our point of origin. The thing is, you'll never return to that exact fixed point in space, because the Solar System itself is rotating around this galactic center in the Milky Way. So while you will return to your same relative position in reference to the Sun, you won't be in the same fixed point in space, because the whole system has been moving this entire time. So you can't really even say it's a circle. It's more like a spiral when you take your reference point as the galaxy as opposed to just the Solar System.

Again, the whole purpose of this was to give an example to say we have to make choices when we communicate information to others. At what point do we need to draw the line and say, okay, that's enough detail; anything more than that is either confusing or boring, or it's both? And then we make those determinations and hope that we can inspire people to look further into subjects later on.

Neil deGrasse Tyson's presentation concluded with a discussion about how to convey information to your audience in a way that matters to them. He showed examples of science incorporated into pop culture. He specifically played a clip from the movie Frozen, and then he danced around on stage as Elsa sang "frozen fractals all around," because the word fractal was included in a Disney film and sung by a Disney princess. And he said, this is amazing progress.
He said you need to find out what your audience cares about and then find ways to use that as a means of explaining scientific or technological ideas to them. And he used football as his own example, showing off a series of fairly recent tweets he had made during football games to get across scientific principles. Now, those tweets sparked a lot of conversation and humor and jokes among his followers and other people who saw those messages, and it meant that, for a short while at least, people were talking about actual science.

For people in the technology space, that could be a very valuable lesson, because technology gets super complicated. And not only can the actual mechanics of technology get complex, the way it works and the intricacies of how it works; the way we describe technology can also be really confusing, because we use a lot of jargon, a lot of terminology that is not in the common parlance, and we use a lot of shorthand to communicate complicated ideas. To a newcomer, this comes across as incredibly intimidating, very dense, and difficult to understand. Before you can even get to the point where you comprehend the actual concepts that underlie a technology, you first have to get a grip on the language that's being used. Then you have to struggle with that language to understand what is actually happening with any technological product or service. And if technologists can use analogies or stories or examples to explain their work, to explain what it is they're doing and the things they work on, then the general public may end up having a more enthusiastic response, because they'll have a touchstone to understand what exactly is going on. Now, that's easier said than done, because you've gotta walk a line. You don't want to simplify matters to the point where people are going to misunderstand what you're saying.
So with some subjects, like artificial intelligence or quantum computing, going too broad or too simple will make people think that the technology is akin to magic, because you've abstracted it and simplified it so much that it just sounds like it does things that nothing could truly possibly do, or you give them the idea that this technology is capable of doing things that we really can't do yet. So, like the science lesson about the Earth's orbit, there's a balance we have to achieve when we're explaining these things to someone new to the subject matter.

And that's true across every industry. It's not just science or technology. I mean, there are plenty of discussions about business, and particularly finance. Law is another example. Whenever I read any stories about finance or law or business, I get to a point where I realize I've read a paragraph and I have absolutely no idea what that paragraph meant. I can reread it and say to myself, individually I understand what all of these words mean, but collectively I don't get what they're trying to say. And you know, that comes with the territory of having specialization in any field. Eventually you get to a point where the language you use is extremely efficient. If you're talking to someone else who shares your knowledge, you can have very deep, very quick conversations based upon this shared language. But if you try to communicate with anyone outside of that, you get the deer-in-the-headlights look. Or at least that's what I get whenever I try to talk to anyone about finance or law or anything like that. So that's why I try to structure episodes the way I do. It's not that I don't have faith in my audience. I have a great audience, a really smart and engaging audience. It's that I never know when it might be someone's first introduction to that specific subject matter, and I want to make sure that I can build a foundation before diving more deeply into any given topic.
When we come back from the break, I'll tell you about the presentation a famous futurist gave at IBM Think. But first let's take a break and thank our sponsor.

The next famous person I saw give a presentation at Think 2018 was Dr. Michio Kaku. Dr. Kaku is a theoretical physicist and a professor at City College of New York. He's also an author and a futurist; that would be someone who predicts how technology and advancements in science will change our world in the decades to come. Very close to my heart, since I used to host a show called Forward Thinking. In fact, I often would read about Michio Kaku's work as a host of Forward Thinking; I would refer to it as part of the research I did for lots of different topics. He's a frequent guest or host on television series and radio programs, and he talks about his work and predictions regularly.

He has specifically studied quantum mechanics and string theory. Now, I've talked an awful lot about quantum mechanics already in this mini-series, so I'm not going to rehash that. But what the heck is string theory? Well, string theory is a family of theories that attempt to describe why the fundamental forces and particles found in nature behave the way they do. To go into the topic in depth would require a whole suite of episodes. And there are multiple types of string theory; there's not just one string theory. There are competing theories. I would rapidly find myself out of my depth if I tried to tackle the entire field, or even summarize it in a way that was at least semi-comprehensive. But I can give a very super-high-level idea. In general, these theories describe forces and particles as one-dimensional strings that vibrate in a way that gives those forces and particles their properties. Some of them are closed-loop strings that are like rubber bands. Dr. Kaku has talked about supersymmetric string theory in particular.
That's one of the flavors of string theory, and it's really fascinating stuff that, at least I'm told, makes sense from a mathematical perspective. Now, I say I'm told because the math is far too complex for me to understand, so I certainly can't look at it and say, oh yeah, that holds up. But as of right now, and for the foreseeable future, we have no way to directly observe and test string theory, which has led some people to say it's not so much a scientific theory as it is a philosophy, because you cannot test or observe it. But we'll set that whole argument aside.

Dr. Kaku, who took the main stage here at IBM, focused largely on where he thinks humanity will be headed in the next few decades, with space missions in particular. He pointed out that today's computers still depend upon silicon chips and Moore's law, but he also talked about the ultimate limits of that technology. And as I mentioned in this series, you can only shrink components down so far using traditional silicon chip technology. Once you shrink below certain thresholds, quantum effects begin to override your classical computer design, and electrons no longer follow the pathways you've made for them. And our electronics are dependent upon electrons doing what we want them to do. At the quantum level, electrons do lots of strange things, things we can't necessarily control or predict, and that means electronics and computers built on those components cannot really work properly. They will become unreliable; they'll become error-prone. We can't keep going this route indefinitely. We will eventually hit that fundamental physical limit, based upon how we build things today. So something has to change. Now, Dr. Kaku said if we cling to silicon as the basis for our technology moving forward, then eventually Silicon Valley could become a new rust belt. We will hit that ultimate limit of what we're capable of doing using that technology, and we will progress no further.
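To put a very rough timescale on that wall, here's a back-of-the-envelope sketch. These are my numbers, not Dr. Kaku's: I'm assuming the classic Moore's-law cadence of transistor density doubling every two years, which works out to feature sizes halving every four:

```python
# When does "just shrink it again" run into individual atoms?
feature_nm = 14.0   # a leading-edge process node around 2018 (assumption)
atom_nm = 0.2       # rough spacing of silicon atoms in the crystal lattice
year = 2018

while feature_nm > atom_nm:
    feature_nm /= 2  # density doubling every ~2 years halves lengths every ~4
    year += 4
    print(f"{year}: features around {feature_nm:.2f} nm")

print(f"Somewhere around {year}, a 'feature' is a single atom.")
```

Quantum effects ruin the party well before you literally reach one atom, of course, but the point stands: the runway is finite.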
So we'll need to shift to a new paradigm. Dr. Kaku mentioned several technologies he felt were going to be instrumental in pushing us past the end of the silicon chip age and Moore's law, and he talked about the four great technological revolutions. The first of those would be the development of the steam engine, which led to an incredibly productive period and the Industrial Revolution. Understanding and leveraging the laws of thermodynamics allowed us to create new ways to do work and travel. We were able to create engines that could harness the power of thermodynamics and translate it into work. About eight decades after the invention of the steam engine, physicists were getting a much better understanding of some other scientific principles, like magnetism and electricity, which led to inventions like radio and then television. Decades after that, engineers and scientists invented the transistor, the third of the technological revolutions. The transistor made microcomputers possible, as well as other devices like lasers, and it allowed us to go to space. It created the space industry. In fact, the space industry pretty much provided the incentive for the development of transistors and miniaturization. We had a goal: we wanted to send people to the Moon, and that meant we were going to have to make some big advancements in science and technology to make that possible. Space, as in the space inside of a capsule, is at a premium. You want to make sure things are lightweight, you want to make sure things are compact, and that meant we couldn't rely upon the massive circuitry of the past. We had to find ways to miniaturize things and make them more practical for applications like space travel. And Michio Kaku is very excited about space travel.
He's very excited about the fact that we are dedicating ourselves to going back to the Moon, first with an unmanned probe that should land on the Moon in 2019, then eventually with a base in orbit around the Moon, which will act as sort of a launching ground for missions toward Mars. And he talked about how that's very important, and he also mentioned Elon Musk's follow-up to the Falcon Heavy rocket, which is called the BFR. B stands for big, R stands for rocket, and F, as Dr. Kaku says, stands for, well, let your imagination take over.

The Fourth Revolution is the one that's happening right now, Dr. Kaku says. It involves stuff like biotechnology, nanotechnology, artificial intelligence, and related technologies. It's also about computing models like neural networks, which mimic the neural structure of brains. A neural network is made up of nodes that behave like neurons, like the neural cells we have in our brains. Those nodes have interconnections to other artificial neurons, and processing information with a neural network allows you to do some interesting things, like machine learning. Computers relying on an artificial neural network can learn from data, and they can get more proficient at handling that information as they get more experience, so they improve themselves over time. The most efficient pathways start to win out over the least efficient pathways, and you can teach computers. Now, I've talked about this before, about teaching computers how to recognize what a cat is, for example, which is a somewhat trivial version of this, but neural networks are being used across multiple industries to solve very difficult problems. He also mentioned artificial intelligence, which will augment our abilities to make decisions and take actions, and he stressed that he felt AI was not poised to replace human beings, but to help them.
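Going back to that neural network description for a second, here's a minimal sketch of the idea: nodes, connections, and connections that strengthen with experience. This is my own toy example, not anything from the presentation: a tiny two-layer network learning the XOR function.

```python
# A minimal neural network: 2 inputs -> 4 hidden "neurons" -> 1 output.
# Training nudges the connection weights so useful pathways strengthen.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR: the target outputs

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)     # input-to-hidden connections
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)     # hidden-to-output connections
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):
    h = sigmoid(X @ W1 + b1)       # hidden-layer activations
    out = sigmoid(h @ W2 + b2)     # the network's current guesses
    # Backpropagation: push every weight downhill on the error.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * (h.T @ d_out); b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * (X.T @ d_h);   b1 -= 0.5 * d_h.sum(axis=0)

print(out.round(2).ravel())   # heads toward [0, 1, 1, 0] as it learns
```

Recognizing a cat works the same way in spirit, just with millions of weights instead of a dozen or so.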
He laid out the two big arguments that tend to be presented about AI, sort of the Zuckerberg argument versus the Elon Musk argument. The Zuckerberg argument is: AI is going to be a huge help. It's going to benefit us. It's going to let us do what we want to do better and faster. It's going to create opportunities that do not exist right now, that we can't even predict because we aren't in that age yet. Musk says it's going to lead to killer robots and we're all gonna die. Now, I am oversimplifying on both sides, more on Musk than on Zuckerberg, for comedic effect. Musk says AI could potentially pose an existential threat to the human race, saying that if you were to create AI and you don't have good controls in place, then it could be the end of humanity. It could eventually decide that we are bad and that we need to be stopped.

Well, Kaku seems to fall more on Zuckerberg's side than Musk's side on this. He said we should always be aware of how we implement AI and how we design it, but that we shouldn't worry in the near term about AI going all Terminator on us and wiping us out. As we develop AI and as it becomes more sophisticated, we can talk about how to build in fail-safes that act on the AI as sort of a limiter so that it doesn't cause harm. The whole field of thought is fascinating to me, and it gets way more complicated than just "put a chip in the computer's brain so that it turns off if it thinks bad thoughts." But I'm gonna leave that whole discussion for another show, because I think that's something that needs to be an actual discussion.
I need to get people on the show so we can all talk about these implications and kind of argue it all out, not to have a fight, but rather to just kind of see all sides of the issue. And of course, he also brought up quantum computing, because everything at IBM Think 2018 seems to be about quantum computing at some point, and how that could be a huge help for human efforts moving forward. Though out of all these topics, he was really most interested in discussing neural networks and artificial intelligence, I would say. But much of his presentation was in fact about space travel.

Dr. Kaku stressed that we need a healthy and robust space program, not just so that we can learn more about our Solar System and beyond, though that is very important, but also for the actual survival of the human race long term. Because it's only a matter of time before we have a cataclysmic event on the Earth, sort of like a giant meteor or a comet that collides with the Earth. That will happen eventually. It might not happen for thousands of years, but it will happen. It's statistically a certainty that at some point it will happen. It happened before and it will happen again, and that could be an extinction-level event. So we need a robust space program so that we can do stuff like detect such threats decades before they are imminent and then take measures to deflect those threats so that we are actually safe. It would also help ensure the survival of humans if we spread out a bit, if we started to colonize other planets. Spreading ourselves out improves the odds that the human race survives if something goes wrong in any one given location, and at that point we have an insurance policy against extinction events.

Dr. Kaku touched on augmented reality as well. He talked about glasses and headsets that allow you to see information overlaid on top of the world around you.
He also talked about voice recognition tools that let you speak directly to this technology, calling up information you need for any given situation. You could have specific data sets and use those data sets for particular use cases. People who have specialized trades, for example, like electrical engineers or heart surgeons, could have very carefully customized data sets that aid them in their specific occupations. He also talked about how in the future we'll have contact lenses that can do this, this idea of having augmented-reality contacts. You put the contact in and it acts like a computer screen. It can overlay digital information on top of the view you have of the world around you in real time, and it will tell you all about the stuff that's around you. You can learn about, let's say, a building that you're looking at, or it can help explain a scientific principle that you are encountering. Maybe you're at a talk like the one Dr. Kaku was giving, and your augmented-reality contact lens is giving you illustrations that augment that talk and help you better understand what he's trying to express. Or you might be at a party, and your contact lens is telling you which people at the party are important, so you know who to suck up to. That was actually Dr. Kaku's example. He said this will be a great tool if you're ever at a cocktail party and you want to know who's important so that you can sidle up to them and kiss their butt. I thought it was a clever and funny example. Dr. Kaku, by the way, is a very funny person. Hearing him make jokes as he's talking about futurism was very refreshing.
564 00:32:25,840 --> 00:32:28,880 Speaker 1: Dr Kaku also talked about how display technology has 565 00:32:28,920 --> 00:32:31,280 Speaker 1: become far more advanced and we have displays now that 566 00:32:31,320 --> 00:32:34,640 Speaker 1: are flexible and that are paper thin, with things 567 00:32:34,640 --> 00:32:37,120 Speaker 1: like OLED displays, and in the future we'll have 568 00:32:37,160 --> 00:32:40,040 Speaker 1: digitized paper everywhere, so you could have a sheet of 569 00:32:40,040 --> 00:32:42,560 Speaker 1: paper that could be literally anything because it can be 570 00:32:42,560 --> 00:32:44,920 Speaker 1: a display. It's not really paper, it's a display that 571 00:32:44,960 --> 00:32:49,320 Speaker 1: could show any potential information you need. You 572 00:32:49,320 --> 00:32:52,080 Speaker 1: could have a wall coated in digitized paper, making it 573 00:32:52,160 --> 00:32:55,560 Speaker 1: digitized wallpaper, and that wall becomes a computing surface that 574 00:32:55,640 --> 00:32:58,640 Speaker 1: you can use. So let's say you've got this wallpaper display. 575 00:32:58,760 --> 00:33:00,520 Speaker 1: You could walk up to the wall, you could 576 00:33:00,520 --> 00:33:02,520 Speaker 1: talk to it. You could pull up information about any 577 00:33:02,560 --> 00:33:05,400 Speaker 1: given subject, and you could interact with an AI chatbot 578 00:33:05,920 --> 00:33:08,400 Speaker 1: to answer questions or to help make decisions, kind of 579 00:33:08,400 --> 00:33:11,959 Speaker 1: like Jarvis in Iron Man. Dr Kaku gave an 580 00:33:11,960 --> 00:33:15,320 Speaker 1: example of someone waking up at night and they have 581 00:33:15,440 --> 00:33:19,800 Speaker 1: chest pains and they start talking to the wall, kind 582 00:33:19,800 --> 00:33:22,360 Speaker 1: of having a robo doctor right there to find out 583 00:33:22,680 --> 00:33:26,480 Speaker 1: if the symptoms they are suffering from indicate something serious 584 00:33:26,520 --> 00:33:29,000 Speaker 1: like a heart attack, and if so, the digital assistant 585 00:33:29,360 --> 00:33:33,320 Speaker 1: helps by reaching out to emergency medical personnel and getting 586 00:33:33,320 --> 00:33:36,960 Speaker 1: things ready so that the person in 587 00:33:37,040 --> 00:33:40,200 Speaker 1: question can get medical attention, or it tells them if it's something more 588 00:33:40,280 --> 00:33:44,520 Speaker 1: benign, like just indigestion. Dr Kaku also talked about 589 00:33:44,560 --> 00:33:48,760 Speaker 1: trying to foil aging, both through genetic approaches and through 590 00:33:48,800 --> 00:33:51,920 Speaker 1: digital ones. He talked about identifying the genes that are 591 00:33:51,960 --> 00:33:55,200 Speaker 1: responsible for aging and finding a way to deactivate them 592 00:33:55,480 --> 00:33:58,840 Speaker 1: so that the aging process stops once you reach adulthood. 593 00:33:59,160 --> 00:34:01,720 Speaker 1: From an external point of view, you would always appear 594 00:34:01,760 --> 00:34:05,240 Speaker 1: to be, say, thirty years old. However, Dr Kaku 595 00:34:05,320 --> 00:34:07,880 Speaker 1: also acknowledged that this wouldn't necessarily mean you 596 00:34:07,920 --> 00:34:10,480 Speaker 1: never die of old age, because your organs could continue 597 00:34:10,520 --> 00:34:14,680 Speaker 1: to deteriorate just from use over years and years and years.
598 00:34:14,800 --> 00:34:17,520 Speaker 1: You might live much longer, but eventually stuff would start 599 00:34:17,560 --> 00:34:19,920 Speaker 1: to wear out, and that would require us to find 600 00:34:20,000 --> 00:34:23,000 Speaker 1: ways to build new organs. Now we've started to do 601 00:34:23,040 --> 00:34:25,719 Speaker 1: that already, but we've got other major organs that we 602 00:34:25,760 --> 00:34:29,160 Speaker 1: haven't yet been able to build, like a liver. But 603 00:34:29,280 --> 00:34:33,760 Speaker 1: that capability might preserve someone's life well beyond today's lifespans, perhaps 604 00:34:33,760 --> 00:34:38,120 Speaker 1: even indefinitely. And then he mentioned digital immortality. This is 605 00:34:38,120 --> 00:34:41,280 Speaker 1: a very different approach. This is where you somehow capture 606 00:34:41,320 --> 00:34:43,920 Speaker 1: all the information that makes a person who he or 607 00:34:44,000 --> 00:34:46,960 Speaker 1: she is. Think of it like a three D scan, 608 00:34:47,239 --> 00:34:50,359 Speaker 1: except for who you are, not what you look like: 609 00:34:51,080 --> 00:34:53,600 Speaker 1: all of your memories and your thoughts and your emotional 610 00:34:53,640 --> 00:34:56,560 Speaker 1: responses to anything, the way you think, the way you 611 00:34:56,600 --> 00:34:59,400 Speaker 1: come up with ideas, all of that would end up 612 00:34:59,440 --> 00:35:01,759 Speaker 1: being captured, and you'd end up with a simulation 613 00:35:02,440 --> 00:35:06,040 Speaker 1: of you for all practical purposes, and it would behave 614 00:35:06,200 --> 00:35:09,520 Speaker 1: as you would in any given situation, assuming that the 615 00:35:09,640 --> 00:35:13,120 Speaker 1: simulation was as close to perfect as possible. Now, would 616 00:35:13,160 --> 00:35:16,799 Speaker 1: you then say that that construct is in fact you? Well, 617 00:35:16,840 --> 00:35:19,600 Speaker 1: if you think of yourself in terms of your fleshy existence, no, 618 00:35:19,920 --> 00:35:22,440 Speaker 1: you wouldn't. The version that you would be looking at 619 00:35:22,920 --> 00:35:26,080 Speaker 1: would be a simulation, and you, however, would eventually age 620 00:35:26,120 --> 00:35:29,120 Speaker 1: and die. But if you think of yourself as your 621 00:35:29,160 --> 00:35:33,560 Speaker 1: collection of experiences and your thoughts and your emotions, then 622 00:35:33,640 --> 00:35:37,080 Speaker 1: maybe you might say the simulation is you, but it's 623 00:35:37,080 --> 00:35:40,840 Speaker 1: another instance of you. Now, that doesn't solve the problem 624 00:35:41,000 --> 00:35:43,839 Speaker 1: that your own experience of being you would someday come 625 00:35:43,880 --> 00:35:46,840 Speaker 1: to an end, and a simulated jerk face version of 626 00:35:46,880 --> 00:35:50,000 Speaker 1: you would keep on going as if no one cared, 627 00:35:50,280 --> 00:35:53,960 Speaker 1: as long as no one unplugged them. Jerk face.
Then 628 00:35:54,080 --> 00:35:58,120 Speaker 1: Dr Kaku said something I found really fascinating, and I 629 00:35:58,160 --> 00:36:00,879 Speaker 1: hadn't really considered before. He said that if you could 630 00:36:00,880 --> 00:36:05,640 Speaker 1: digitize a person, boil their essence down into pure information, 631 00:36:06,160 --> 00:36:09,640 Speaker 1: and then you encode that information using a laser beam, 632 00:36:09,960 --> 00:36:12,880 Speaker 1: you could fire that laser beam at distant targets like 633 00:36:13,000 --> 00:36:16,080 Speaker 1: other planets in our Solar system, or even beyond. Let's 634 00:36:16,080 --> 00:36:20,160 Speaker 1: say you shoot it toward Alpha Centauri. Within about four years, 635 00:36:20,200 --> 00:36:24,520 Speaker 1: you would get there. You could have minds travel vast 636 00:36:24,840 --> 00:36:28,719 Speaker 1: distances at the speed of light. Matter can't travel at 637 00:36:28,760 --> 00:36:31,879 Speaker 1: the speed of light, but information can. Of course, I'm 638 00:36:31,880 --> 00:36:34,680 Speaker 1: not sure what you could do once you've got to 639 00:36:34,719 --> 00:36:37,400 Speaker 1: where you were going, because if you're information encoded on 640 00:36:37,440 --> 00:36:40,480 Speaker 1: a medium, I can't see any way you could meaningfully 641 00:36:40,600 --> 00:36:43,880 Speaker 1: do anything at all or perceive anything. You would just 642 00:36:43,960 --> 00:36:46,640 Speaker 1: have information shooting out there, kind of like how 643 00:36:46,760 --> 00:36:50,160 Speaker 1: radio waves can radiate out into space. That doesn't necessarily 644 00:36:50,160 --> 00:36:52,959 Speaker 1: mean we can do anything with them. A program isn't 645 00:36:53,080 --> 00:36:56,279 Speaker 1: running if you don't execute it, right? Like if you 646 00:36:56,320 --> 00:36:59,600 Speaker 1: have a computer game on a disk. Imagine an old 647 00:36:59,640 --> 00:37:02,120 Speaker 1: disc you've got, you put it in your computer, and 648 00:37:02,160 --> 00:37:04,480 Speaker 1: you don't run the program. You just put it in 649 00:37:04,480 --> 00:37:07,399 Speaker 1: the computer, but the computer is not engaging the disk 650 00:37:07,440 --> 00:37:09,520 Speaker 1: at all. It's not like the game is actually running 651 00:37:09,560 --> 00:37:11,640 Speaker 1: on your machine. It's not like it's generating any sort 652 00:37:11,680 --> 00:37:15,399 Speaker 1: of activity. But still, this is an intriguing idea. It 653 00:37:15,440 --> 00:37:18,319 Speaker 1: could be a new form of space exploration, assuming we 654 00:37:18,360 --> 00:37:21,160 Speaker 1: figure out how to make that information useful, so that it 655 00:37:21,239 --> 00:37:24,480 Speaker 1: can either communicate back to us about what it has discovered, 656 00:37:24,800 --> 00:37:27,440 Speaker 1: or it can somehow go about and take actions of 657 00:37:27,480 --> 00:37:30,680 Speaker 1: its own across the universe. I just don't know how 658 00:37:30,680 --> 00:37:34,480 Speaker 1: that would work.
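To put rough numbers on that laser-porting idea, here is a quick back-of-the-envelope sketch in Python. It is not from the talk itself; the target list and distances are just commonly quoted approximate values. It also shows why the round trip matters for the remote-control scenario that comes up a little later.

```python
# Back-of-the-envelope light-travel times for laser-encoded information.
# Illustrative only: the target distances are approximate published values.

LIGHT_YEAR_KM = 9.4607e12          # kilometers in one light year
SPEED_OF_LIGHT_KM_S = 299_792.458  # speed of light in km per second
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def one_way_delay_years(distance_ly: float) -> float:
    """Years for light (or laser-encoded data) to cross a distance in light years."""
    seconds = distance_ly * LIGHT_YEAR_KM / SPEED_OF_LIGHT_KM_S
    return seconds / SECONDS_PER_YEAR  # by definition, this works out to distance_ly

for name, ly in {"Alpha Centauri": 4.37, "Proxima Centauri": 4.24}.items():
    t = one_way_delay_years(ly)
    print(f"{name}: one-way {t:.2f} years, command round trip {2 * t:.2f} years")
```

For Alpha Centauri that works out to about 4.4 years one way, which is where the "within about four years" figure comes from, and nearly nine years before anything could be heard back.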
Dr Kaku's final section was about brain 659 00:37:34,560 --> 00:37:37,759 Speaker 1: computer interfaces. Now, these are the technologies that allow us 660 00:37:37,760 --> 00:37:41,880 Speaker 1: to communicate directly with technologies through thought. It's kind of 661 00:37:41,920 --> 00:37:45,799 Speaker 1: a technological telepathy. It's still a very early field, but 662 00:37:45,840 --> 00:37:48,600 Speaker 1: there are examples out there and have been for a 663 00:37:48,600 --> 00:37:52,439 Speaker 1: decade or more. These are not perfect methods, and each 664 00:37:52,480 --> 00:37:56,319 Speaker 1: method has advantages and disadvantages. The best way to get 665 00:37:56,440 --> 00:38:00,200 Speaker 1: high resolution data from a person that is consistently reliable, 666 00:38:00,360 --> 00:38:03,200 Speaker 1: or at least more reliable than the alternatives, is to 667 00:38:03,280 --> 00:38:08,359 Speaker 1: embed sensors directly onto that person's brain. So it requires 668 00:38:08,520 --> 00:38:11,600 Speaker 1: brain surgery. Then you have the problem of figuring out 669 00:38:11,600 --> 00:38:14,680 Speaker 1: how you send signals from the sensors you've implanted 670 00:38:14,680 --> 00:38:17,720 Speaker 1: in the brain to the target technologies. Now, you could 671 00:38:18,400 --> 00:38:22,560 Speaker 1: include wires or an antenna that protrudes from the brain 672 00:38:22,680 --> 00:38:26,200 Speaker 1: through the skull. But this also presents a problem because 673 00:38:26,239 --> 00:38:29,280 Speaker 1: you've created the potential for contaminants to infect the brain. 674 00:38:29,480 --> 00:38:33,320 Speaker 1: Anytime you've got something that's breaking that blood brain barrier, 675 00:38:33,640 --> 00:38:36,920 Speaker 1: that's bad news. So you have to be super careful 676 00:38:36,960 --> 00:38:39,239 Speaker 1: with this. And the skull is pretty thick. It's thick 677 00:38:39,360 --> 00:38:42,000 Speaker 1: enough to make it tricky to send signals through the skull. 678 00:38:42,120 --> 00:38:46,239 Speaker 1: So a wireless solution, while it might be possible, is 679 00:38:46,320 --> 00:38:50,040 Speaker 1: not ideal. It might end up being hard to detect 680 00:38:50,280 --> 00:38:54,160 Speaker 1: legitimate signals. Or you could go the non invasive route 681 00:38:54,239 --> 00:38:56,440 Speaker 1: and use an EEG headset to pick 682 00:38:56,520 --> 00:38:59,920 Speaker 1: up brain waves. Now those don't require surgery, but sometimes 683 00:39:00,080 --> 00:39:03,360 Speaker 1: they're not terribly accurate. They can pick up noise that 684 00:39:03,480 --> 00:39:07,040 Speaker 1: looks like brain waves but is actually just interference. 685 00:39:07,120 --> 00:39:09,840 Speaker 1: There can be false positives, or the headset can fail to 686 00:39:10,000 --> 00:39:13,960 Speaker 1: register legitimate commands as something that's meaningful. So you might 687 00:39:14,000 --> 00:39:18,920 Speaker 1: be thinking, now type the word dog, but nothing's 688 00:39:18,960 --> 00:39:21,400 Speaker 1: happening because it hasn't been picked up as a legitimate command. 689 00:39:22,120 --> 00:39:24,520 Speaker 1: So this doesn't require a surgeon to operate on you, 690 00:39:24,560 --> 00:39:26,600 Speaker 1: but it also does not give you as accurate a 691 00:39:26,640 --> 00:39:30,640 Speaker 1: response as an intracranial approach would.
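Those trade-offs are easy to see even in a toy version of EEG command detection. Below is a minimal Python sketch, using entirely synthetic signals and invented numbers rather than anything from the talks, that thresholds alpha-band power to decide whether a "command" happened. Run it and you typically get both false positives and missed commands, exactly the failure modes just described.

```python
# Toy sketch of EEG "command" detection on synthetic data. A command is a
# weak 10 Hz (alpha-band) burst buried in noise, and detection is a simple
# threshold on alpha-band power. All numbers here are invented.
import numpy as np

FS = 256                                   # samples per second
rng = np.random.default_rng(42)

def fake_eeg(has_command: bool) -> np.ndarray:
    """One second of synthetic EEG: background noise, plus a weak 10 Hz burst."""
    t = np.arange(FS) / FS
    x = rng.normal(scale=1.0, size=FS)     # background activity / interference
    if has_command:
        x += 0.4 * np.sin(2 * np.pi * 10 * t)
    return x

def alpha_power(x: np.ndarray) -> float:
    """Average spectral power between 8 and 12 Hz."""
    p = np.abs(np.fft.rfft(x)) ** 2
    f = np.fft.rfftfreq(x.size, d=1 / FS)
    return p[(f >= 8) & (f <= 12)].mean()

# Calibrate a per-user threshold from "rest" recordings; since brain waves
# differ between people, this step has to be done per individual.
rest = [alpha_power(fake_eeg(False)) for _ in range(200)]
threshold = np.mean(rest) + 3 * np.std(rest)

# Evaluate: interference sometimes crosses the threshold (false positive),
# and a real command sometimes fails to (missed command).
labels = rng.random(500) < 0.5
hits = [alpha_power(fake_eeg(bool(c))) > threshold for c in labels]
false_pos = sum(h and not c for h, c in zip(hits, labels))
missed = sum(c and not h for h, c in zip(hits, labels))
print(f"false positives: {false_pos}, missed commands: {missed}")
```

A real system would use better features, classifiers, and artifact rejection, but the underlying tension between sensitivity and false alarms is the same.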
However, we do have 692 00:39:30,719 --> 00:39:33,800 Speaker 1: some examples of these technologies that allow people who otherwise 693 00:39:33,840 --> 00:39:37,719 Speaker 1: couldn't control things with their bodies to be able to 694 00:39:37,719 --> 00:39:40,359 Speaker 1: have control over things. And a lot of it comes 695 00:39:40,360 --> 00:39:44,279 Speaker 1: down to communication, right? Using a computer interface to communicate with 696 00:39:44,360 --> 00:39:47,800 Speaker 1: other people: you think, and then the computer either speaks 697 00:39:47,920 --> 00:39:50,960 Speaker 1: or types things out on your behalf, and it opens 698 00:39:51,000 --> 00:39:53,160 Speaker 1: up a channel of communication where otherwise you would be 699 00:39:53,200 --> 00:39:57,000 Speaker 1: unable to do so. Or we can pair that type 700 00:39:57,040 --> 00:40:00,439 Speaker 1: of technology with something like an exoskeleton and give someone 701 00:40:00,440 --> 00:40:03,799 Speaker 1: who has lost mobility the chance to regain it. They 702 00:40:03,920 --> 00:40:08,160 Speaker 1: can use their mind to send signals to the exoskeleton 703 00:40:08,280 --> 00:40:11,279 Speaker 1: and use that to move them around. One day, we 704 00:40:11,360 --> 00:40:14,920 Speaker 1: might extend this so that we have robotic bodies that 705 00:40:15,000 --> 00:40:18,640 Speaker 1: we can control remotely using our minds, giving us telepresence 706 00:40:18,920 --> 00:40:22,920 Speaker 1: in distant locations. Of course, the further out we go, 707 00:40:23,080 --> 00:40:25,200 Speaker 1: the more we'll have to deal with the limitation of 708 00:40:25,239 --> 00:40:28,160 Speaker 1: remote control. Because even if we had a perfect system 709 00:40:28,280 --> 00:40:31,440 Speaker 1: that let us communicate with a robot that's light years 710 00:40:31,480 --> 00:40:33,920 Speaker 1: away from us, so that we could send signals to it 711 00:40:33,960 --> 00:40:36,480 Speaker 1: and receive signals back from it somehow, there 712 00:40:36,480 --> 00:40:40,120 Speaker 1: would still be tremendous lag, because information is still only 713 00:40:40,120 --> 00:40:42,439 Speaker 1: going to travel at the speed of light. So if 714 00:40:42,440 --> 00:40:44,640 Speaker 1: a robot is light years away, that means it literally 715 00:40:44,640 --> 00:40:47,640 Speaker 1: takes years for a command you send to be received 716 00:40:47,640 --> 00:40:50,759 Speaker 1: by the robot, and years again for the robot to 717 00:40:50,840 --> 00:40:53,960 Speaker 1: send the results back to you. So it would be 718 00:40:54,000 --> 00:40:57,759 Speaker 1: like the longest game of mail chess, where you're 719 00:40:57,800 --> 00:41:00,600 Speaker 1: mailing your moves back and forth to your opponent over and over again, 720 00:41:00,640 --> 00:41:03,920 Speaker 1: but still super cool. Dr Kaku ended his talk by 721 00:41:03,920 --> 00:41:08,239 Speaker 1: reminding us that without pursuit of these new forms of technology, 722 00:41:08,280 --> 00:41:11,960 Speaker 1: we could find ourselves clinging to diminishing returns with classical computing, 723 00:41:12,120 --> 00:41:14,840 Speaker 1: and that was a really interesting presentation. I do have 724 00:41:14,920 --> 00:41:17,600 Speaker 1: one more speaker to talk about, but first let's take 725 00:41:17,640 --> 00:41:27,919 Speaker 1: another quick break to thank our sponsor. The last person 726 00:41:28,000 --> 00:41:29,520 Speaker 1: I want to talk about in this episode is a 727 00:41:29,560 --> 00:41:32,840 Speaker 1: remarkable young man, and his name is Tanmay Bakshi. 728 00:41:33,200 --> 00:41:36,279 Speaker 1: He's a programmer. He's an expert on deep learning and 729 00:41:36,400 --> 00:41:42,680 Speaker 1: artificial intelligence, and he's fourteen freaking years old.
Bakshi's 730 00:42:42,719 --> 00:42:45,560 Speaker 1: father is a computer programmer, and Tanmay began learning 731 00:42:45,560 --> 00:42:48,319 Speaker 1: about coding and programming when he was just five years old, 732 00:42:48,360 --> 00:42:51,400 Speaker 1: and he loved it, and so he studied it intently. 733 00:42:51,680 --> 00:42:54,120 Speaker 1: When he was seven, he started up a YouTube channel 734 00:42:54,160 --> 00:42:57,320 Speaker 1: and he began to upload videos about how to code 735 00:42:57,719 --> 00:43:00,759 Speaker 1: and develop for the web. He started developing for 736 00:43:00,880 --> 00:43:04,120 Speaker 1: iOS when he was eight, and Apple published his first 737 00:43:04,160 --> 00:43:07,040 Speaker 1: app when he was nine. That was an app, by 738 00:43:07,080 --> 00:43:10,960 Speaker 1: the way, that taught multiplication to kids. When he was eleven, 739 00:43:11,480 --> 00:43:14,640 Speaker 1: he caught the attention of IBM. See, Bakshi saw 740 00:43:14,680 --> 00:43:17,719 Speaker 1: a video about IBM Watson and how that platform was 741 00:43:17,719 --> 00:43:21,279 Speaker 1: advancing artificial intelligence, and he found it absolutely fascinating. So 742 00:43:21,320 --> 00:43:23,799 Speaker 1: he also started to work with an alpha build of 743 00:43:23,840 --> 00:43:26,520 Speaker 1: a tool that IBM had developed that was meant to 744 00:43:26,520 --> 00:43:30,960 Speaker 1: convert documents from one format into other formats. Bakshi 745 00:43:31,080 --> 00:43:33,920 Speaker 1: discovered a bug in the code, and he sent a 746 00:43:33,920 --> 00:43:36,440 Speaker 1: bug report to the development team to let them know 747 00:43:36,520 --> 00:43:39,680 Speaker 1: about it. Some IBM developers ended up reaching out to 748 00:43:39,760 --> 00:43:42,919 Speaker 1: Bakshi in response, and a new partnership began to form. 749 00:43:43,600 --> 00:43:46,480 Speaker 1: Bakshi, by the way, isn't truly an IBM partner, 750 00:43:46,680 --> 00:43:49,600 Speaker 1: because he does not get compensated. He doesn't get paid 751 00:43:49,640 --> 00:43:51,880 Speaker 1: for this, but he does work with IBM on a 752 00:43:51,920 --> 00:43:56,480 Speaker 1: lot of projects. Right now, he's creating novel approaches 753 00:43:56,520 --> 00:43:59,880 Speaker 1: to computing and he's finding new ways to leverage data, 754 00:44:00,200 --> 00:44:04,200 Speaker 1: and since his communications with IBM, he has ended up 755 00:44:04,200 --> 00:44:06,840 Speaker 1: going on tours around the world to talk about the 756 00:44:06,880 --> 00:44:10,200 Speaker 1: next advancements in data processing and data analysis, as well 757 00:44:10,239 --> 00:44:14,439 Speaker 1: as artificial intelligence and brain computer interfaces and the work 758 00:44:14,440 --> 00:44:17,880 Speaker 1: he's doing in those fields. He decided to talk about 759 00:44:17,960 --> 00:44:20,319 Speaker 1: three of the projects that he was interested in, all 760 00:44:20,400 --> 00:44:23,560 Speaker 1: three relating to healthcare, and he said that healthcare is 761 00:44:23,560 --> 00:44:26,920 Speaker 1: one of those fields where deep learning and big data 762 00:44:27,080 --> 00:44:30,719 Speaker 1: are really important because there's tons and tons and tons 763 00:44:30,800 --> 00:44:35,080 Speaker 1: of information in the healthcare sphere and it's useful information 764 00:44:35,360 --> 00:44:39,759 Speaker 1: in theory. In practice, however, it can often not be useful.
Now, 765 00:44:39,760 --> 00:44:41,799 Speaker 1: the reason I say that is, imagine that you are 766 00:44:41,840 --> 00:44:45,040 Speaker 1: a doctor. Some of you probably are doctors. For the 767 00:44:45,080 --> 00:44:47,319 Speaker 1: rest of you, imagine that you are a doctor, and 768 00:44:47,400 --> 00:44:50,080 Speaker 1: you're a good doctor. You're good at what 769 00:44:50,120 --> 00:44:53,680 Speaker 1: you do. You understand your field. However, there are always 770 00:44:53,719 --> 00:44:59,400 Speaker 1: more advancements being made at the pioneering edge of medicine 771 00:44:59,800 --> 00:45:02,480 Speaker 1: that you may or may not be familiar with. So 772 00:45:02,640 --> 00:45:05,520 Speaker 1: part of your job isn't just treating your patients. It's 773 00:45:05,680 --> 00:45:09,759 Speaker 1: learning more about developments in medicine. So you have to 774 00:45:09,800 --> 00:45:12,040 Speaker 1: go and seek out that information. You have to learn 775 00:45:12,080 --> 00:45:14,319 Speaker 1: about it, you have to comprehend it. Then you have 776 00:45:14,360 --> 00:45:16,799 Speaker 1: to put it into practice if it makes sense 777 00:45:17,040 --> 00:45:21,160 Speaker 1: in your role as a doctor. And while you're doing this, 778 00:45:21,239 --> 00:45:24,080 Speaker 1: while you're learning about the latest and greatest stuff, pioneers 779 00:45:24,080 --> 00:45:28,080 Speaker 1: are finding even more breakthroughs in medicine. And you have 780 00:45:28,120 --> 00:45:30,640 Speaker 1: to keep doing this. You have to keep learning. It's 781 00:45:30,640 --> 00:45:34,880 Speaker 1: a never ending process. And if you are a doctor 782 00:45:34,960 --> 00:45:38,239 Speaker 1: working in a large city that 783 00:45:38,280 --> 00:45:41,719 Speaker 1: has access to a lot of cutting edge 784 00:45:41,760 --> 00:45:46,280 Speaker 1: research centers, then you have those assets at your disposal. 785 00:45:46,360 --> 00:45:49,520 Speaker 1: You might be able to refer a 786 00:45:49,560 --> 00:45:53,040 Speaker 1: patient to one of those specialists who can then use 787 00:45:53,120 --> 00:45:56,400 Speaker 1: their expertise and the latest and greatest information to 788 00:45:56,480 --> 00:45:59,719 Speaker 1: help that patient. But the further out you are from 789 00:45:59,760 --> 00:46:03,680 Speaker 1: those centers, the less of that support you're going to get, 790 00:46:03,800 --> 00:46:05,520 Speaker 1: and you'll have to do more and more of this 791 00:46:05,640 --> 00:46:07,480 Speaker 1: on your own, which leaves it up to you to 792 00:46:07,600 --> 00:46:11,359 Speaker 1: learn everything, and that is just not humanly possible. That's 793 00:46:11,400 --> 00:46:15,520 Speaker 1: where deep learning and big data come in. So with 794 00:46:15,600 --> 00:46:19,160 Speaker 1: deep learning, you can design computer algorithms that are looking 795 00:46:19,440 --> 00:46:25,000 Speaker 1: deeply into the results of any given scenario. So let's 796 00:46:25,200 --> 00:46:29,560 Speaker 1: take drugs as an example. 797 00:46:30,320 --> 00:46:34,000 Speaker 1: Clinical trials are great. Clinical trials are how we determine 798 00:46:34,120 --> 00:46:38,279 Speaker 1: whether or not a drug is effective, whether or not 799 00:46:38,560 --> 00:46:44,880 Speaker 1: it poses the potential to give patients adverse side effects.
800 00:45:45,320 --> 00:45:47,920 Speaker 1: These are really important things, and all drugs have to 801 00:45:47,960 --> 00:45:52,000 Speaker 1: go through lengthy clinical trials before they can be used 802 00:45:52,040 --> 00:45:58,560 Speaker 1: in medical applications. However, clinical trials are not perfect because 803 00:45:58,800 --> 00:46:04,000 Speaker 1: it is impossible to have a truly representative population of 804 00:46:04,280 --> 00:46:07,880 Speaker 1: volunteers in these clinical trials. You're never going to get 805 00:46:08,640 --> 00:46:17,520 Speaker 1: a full demographically representative population of people testing out your drug. Eventually, 806 00:46:17,719 --> 00:46:22,560 Speaker 1: you're going to get approval. Let's say that the drug 807 00:46:22,600 --> 00:46:28,120 Speaker 1: is effective and doesn't show any indications of truly 808 00:46:28,160 --> 00:46:30,719 Speaker 1: adverse side effects. You get it approved, it goes out 809 00:46:30,760 --> 00:46:33,000 Speaker 1: into the market. Suddenly, it is now out in the 810 00:46:33,040 --> 00:46:36,840 Speaker 1: world where more than seven billion people live. And because 811 00:46:36,880 --> 00:46:39,720 Speaker 1: more than seven billion people live there, there are all sorts 812 00:46:39,719 --> 00:46:44,520 Speaker 1: of different body types, body chemistries, potential drug interactions going 813 00:46:44,520 --> 00:46:47,200 Speaker 1: on, because people could be taking other drugs that could 814 00:46:47,840 --> 00:46:51,720 Speaker 1: change the way your new drug interacts with their bodies. 815 00:46:52,320 --> 00:46:56,960 Speaker 1: And suddenly you've got things that are happening where you 816 00:46:57,280 --> 00:47:00,120 Speaker 1: aren't sure if your drug is responsible for maybe some 817 00:47:00,160 --> 00:47:02,600 Speaker 1: adverse side effects or not. In fact, you can't even 818 00:47:02,600 --> 00:47:05,080 Speaker 1: really be sure in clinical trials. If you observe it 819 00:47:05,160 --> 00:47:07,600 Speaker 1: frequently enough, you could say, well, chances are the drug 820 00:47:07,680 --> 00:47:10,479 Speaker 1: is causing these side effects, but you can't just say 821 00:47:10,480 --> 00:47:14,520 Speaker 1: that based upon one instance, because there are too many 822 00:47:14,600 --> 00:47:17,800 Speaker 1: unknown variables. It's that way with humans. We humans 823 00:47:17,800 --> 00:47:21,040 Speaker 1: are complicated people. Maybe the person who was 824 00:47:21,040 --> 00:47:23,400 Speaker 1: in the clinical trial had taken some aspirin that morning 825 00:47:23,440 --> 00:47:26,319 Speaker 1: and the aspirin interacted with the drug in a way 826 00:47:26,719 --> 00:47:30,680 Speaker 1: that was unexpected, and thus the person in the trial 827 00:47:30,760 --> 00:47:34,160 Speaker 1: had an adverse side effect. But if they hadn't taken 828 00:47:34,200 --> 00:47:36,279 Speaker 1: the aspirin, they would have been fine. There are 829 00:47:36,280 --> 00:47:40,040 Speaker 1: all these unknowns that can take place during clinical trials.
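A little arithmetic shows just how easily a trial can miss a rare side effect. This is a back-of-the-envelope illustration with invented numbers, not anything from Bakshi's talk: if a side effect has true incidence p, the chance that a trial with n participants sees zero cases is (1 - p) to the power n.

```python
# How easily can a clinical trial miss a rare side effect? If the true
# incidence is p, the chance that a trial with n participants observes zero
# cases is (1 - p)**n. All numbers below are hypothetical.

def prob_trial_sees_nothing(p: float, n: int) -> float:
    """Probability that none of n participants shows a side effect of incidence p."""
    return (1 - p) ** n

incidence = 1 / 10_000     # assumed: one affected person per 10,000 users
trial_size = 3_000         # assumed: a fairly large trial
population = 5_000_000     # assumed: people taking the drug post-approval

print(f"P(trial sees zero cases) = {prob_trial_sees_nothing(incidence, trial_size):.2f}")
print(f"Expected affected users post-approval: {incidence * population:,.0f}")
```

With these made-up numbers, the trial has roughly a 74 percent chance of never seeing a single case, while about 500 people would be affected once the drug is out in the world, which is exactly the gap that post-market monitoring is meant to close.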
830 00:47:40,120 --> 00:47:43,640 Speaker 1: So what Bakshi was arguing is that if you use 831 00:47:43,719 --> 00:47:48,360 Speaker 1: deep learning, you can look across the results of real 832 00:47:48,760 --> 00:47:52,879 Speaker 1: life cases of people taking drugs and reporting any sort 833 00:47:52,880 --> 00:47:55,480 Speaker 1: of side effects they might have felt, and use that 834 00:47:55,600 --> 00:47:58,200 Speaker 1: to start to draw more conclusions. You could even do 835 00:47:58,239 --> 00:48:00,680 Speaker 1: this by looking at social media. So in other words, 836 00:48:00,719 --> 00:48:04,880 Speaker 1: you could look at instances of a drug being mentioned 837 00:48:04,920 --> 00:48:09,480 Speaker 1: in social media, look at any possible descriptions of 838 00:48:09,520 --> 00:48:13,120 Speaker 1: how a person felt after they took that drug, 839 00:48:13,280 --> 00:48:17,839 Speaker 1: mine the entire sphere of social media for instances like 840 00:48:17,920 --> 00:48:20,640 Speaker 1: that, and add that into a database. Now, this would 841 00:48:20,640 --> 00:48:23,120 Speaker 1: be way too much for any human to handle, but 842 00:48:23,160 --> 00:48:26,000 Speaker 1: if you put it into the realm of computers, they 843 00:48:26,000 --> 00:48:29,600 Speaker 1: could look around, look for data points, try and see 844 00:48:29,800 --> 00:48:31,799 Speaker 1: how many of those data points there are. Is there 845 00:48:32,000 --> 00:48:35,760 Speaker 1: a significant number? Is there enough to suggest that perhaps 846 00:48:35,760 --> 00:48:38,560 Speaker 1: there is something actually going on here, or is it 847 00:48:39,040 --> 00:48:44,120 Speaker 1: an outlier that isn't indicative of anything meaningful? Computers can 848 00:48:44,160 --> 00:48:46,840 Speaker 1: help do this, and that can help people start to 849 00:48:47,840 --> 00:48:51,320 Speaker 1: consider what drugs they want to prescribe to patients, people 850 00:48:51,360 --> 00:48:54,480 Speaker 1: being doctors here. So a doctor could look at their patient; 851 00:48:54,680 --> 00:48:57,120 Speaker 1: let's say the doctor is looking at her patient 852 00:48:57,160 --> 00:49:01,160 Speaker 1: and she says, well, I want to prescribe medication 853 00:49:01,200 --> 00:49:04,320 Speaker 1: to treat you, and I've got a couple of different options. 854 00:49:04,480 --> 00:49:09,480 Speaker 1: So based upon who you are, your body type, your 855 00:49:09,520 --> 00:49:13,200 Speaker 1: body chemistry, the other medications you might be on, I'm 856 00:49:13,239 --> 00:49:16,560 Speaker 1: going to take a look at this information that has 857 00:49:16,600 --> 00:49:23,320 Speaker 1: been curated by algorithms that have mined this huge data 858 00:49:23,360 --> 00:49:26,439 Speaker 1: set and brought back all these results, and based upon 859 00:49:26,440 --> 00:49:31,000 Speaker 1: this information, I am more likely to prescribe drug B 860 00:49:31,640 --> 00:49:35,680 Speaker 1: to you because, from what I see here, it 861 00:49:35,760 --> 00:49:38,879 Speaker 1: is the least likely to cause any really bad side 862 00:49:38,920 --> 00:49:42,280 Speaker 1: effects and the most likely to be efficacious in treating 863 00:49:42,480 --> 00:49:47,120 Speaker 1: your disease or whatever.
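Here is a deliberately naive Python sketch of that mining idea, just to make the mechanics concrete. This is not Bakshi's actual pipeline: the drug name, side-effect terms, baseline rate, and threshold are all invented, and a real system would use trained language models over enormous data sets rather than keyword matching over a handful of posts.

```python
# Naive sketch: scan posts for co-mentions of a drug and side-effect terms,
# then ask whether the rate stands out from a baseline. Everything here is
# invented for illustration.
import math

SIDE_EFFECT_TERMS = {"dizzy", "nausea", "rash", "headache", "palpitations"}

def co_mention_rate(posts: list[str], drug: str) -> tuple[int, int]:
    """Count posts mentioning the drug, and those also naming a side effect."""
    mentions = [p.lower() for p in posts if drug.lower() in p.lower()]
    hits = sum(any(term in p for term in SIDE_EFFECT_TERMS) for p in mentions)
    return hits, len(mentions)

def looks_significant(hits: int, n: int, baseline: float = 0.02) -> bool:
    """Crude one-sided z-test: is the observed rate well above the rate at
    which people mention these symptoms anyway? (A real analysis would need
    far more data and care than this.)"""
    if n == 0:
        return False
    se = math.sqrt(baseline * (1 - baseline) / n)
    return (hits / n - baseline) / se > 3   # require roughly 3 sigma to flag

posts = [
    "Started drugB last week, feeling dizzy every morning",
    "drugB works great for me",
    "Anyone else get a rash on drugB?",
    "Loving the new coffee place downtown",
]
hits, n = co_mention_rate(posts, "drugB")
print(hits, n, looks_significant(hits, n))
```

The shape of the problem is the point: count candidate data points, compare against how often people mention those symptoms anyway, and only flag a drug when the signal stands well clear of the noise.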
And that was one of the 864 00:49:47,160 --> 00:49:51,720 Speaker 1: three ways that Tanmay Bakshi was citing as important 865 00:49:51,840 --> 00:49:55,719 Speaker 1: ways to use deep learning: really just kind 866 00:49:55,719 --> 00:50:02,360 Speaker 1: of finding ways to avoid negative drug interactions. 867 00:50:02,880 --> 00:50:07,600 Speaker 1: This idea of leveraging information that is out there that 868 00:50:07,640 --> 00:50:11,560 Speaker 1: otherwise is, you know, useful, but it's unstructured 869 00:50:11,760 --> 00:50:15,280 Speaker 1: and it's not being harnessed in any meaningful way. Changing 870 00:50:15,320 --> 00:50:18,720 Speaker 1: that by using a neural network and using open source 871 00:50:18,800 --> 00:50:23,920 Speaker 1: databases like the FDA's adverse drug event database and social media to 872 00:50:23,920 --> 00:50:27,600 Speaker 1: try and predict if any one particular person will suffer 873 00:50:27,640 --> 00:50:32,520 Speaker 1: negative events from any particular drug. I found that really fascinating. 874 00:50:33,120 --> 00:50:37,240 Speaker 1: The idea of not just using the official databases, the 875 00:50:37,400 --> 00:50:42,200 Speaker 1: medical resources that have accumulated over the course of decades 876 00:50:42,360 --> 00:50:48,359 Speaker 1: of work, but also the more anecdotal evidence, and I 877 00:50:48,480 --> 00:50:50,680 Speaker 1: hesitate to use those two words together, but the more 878 00:50:50,719 --> 00:50:54,600 Speaker 1: anecdotal accounts, let's say, of people who are using those drugs, 879 00:50:54,640 --> 00:50:58,279 Speaker 1: who are reporting, you know, just casually on what it 880 00:50:58,400 --> 00:51:02,080 Speaker 1: is they're experiencing, and using that information to help kind 881 00:51:02,080 --> 00:51:05,640 Speaker 1: of create a much larger, more informal clinical trial in 882 00:51:05,680 --> 00:51:10,000 Speaker 1: the real world that can end up having real impact 883 00:51:10,320 --> 00:51:12,879 Speaker 1: on how those particular drugs 884 00:51:12,920 --> 00:51:15,239 Speaker 1: are used in the future. I used the future a 885 00:51:15,239 --> 00:51:17,399 Speaker 1: couple of times there, but that's just because I'm thinking 886 00:51:17,400 --> 00:51:21,319 Speaker 1: about it so much. Then Bakshi transitioned to talk 887 00:51:21,360 --> 00:51:24,160 Speaker 1: about another project he's been working on that is called 888 00:51:24,280 --> 00:51:28,400 Speaker 1: the Cognitive Story, and this is a story about augmenting 889 00:51:28,400 --> 00:51:31,640 Speaker 1: people's lives with the power of cognitive computing and artificial 890 00:51:31,680 --> 00:51:34,640 Speaker 1: intelligence and brain computer interfaces. So this kind of ties 891 00:51:34,680 --> 00:51:38,600 Speaker 1: into what Dr Michio Kaku had talked about earlier during the conference. 892 00:51:39,200 --> 00:51:43,520 Speaker 1: In this particular case, Bakshi 893 00:51:43,560 --> 00:51:47,960 Speaker 1: was talking about a woman named Boo, and the 894 00:51:48,200 --> 00:51:54,239 Speaker 1: woman has Rett syndrome, which has made her almost completely 895 00:51:54,280 --> 00:51:59,680 Speaker 1: non communicative.
She's quadriplegic and almost incapable of communicating except 896 00:51:59,680 --> 00:52:02,799 Speaker 1: to those who know her best, who can interpret 897 00:52:03,400 --> 00:52:07,080 Speaker 1: what Boo is trying to communicate, and they can therefore 898 00:52:08,640 --> 00:52:12,640 Speaker 1: help her throughout the day. But anyone who doesn't know 899 00:52:12,760 --> 00:52:19,640 Speaker 1: Boo with that level of familiarity is not going to 900 00:52:19,640 --> 00:52:22,440 Speaker 1: be able to understand what she's trying to communicate at 901 00:52:22,440 --> 00:52:26,440 Speaker 1: any given time. And so Bakshi and the team he's 902 00:52:26,480 --> 00:52:29,640 Speaker 1: working with have been trying to develop a tool that 903 00:52:29,680 --> 00:52:33,840 Speaker 1: allows Boo to communicate with other people, and it requires 904 00:52:33,880 --> 00:52:36,920 Speaker 1: several steps. The first of those steps is finding a 905 00:52:36,960 --> 00:52:41,600 Speaker 1: way to interpret Boo's brain waves so that Boo doesn't 906 00:52:41,640 --> 00:52:46,760 Speaker 1: have to struggle; she just thinks, and then those thoughts 907 00:52:46,760 --> 00:52:50,680 Speaker 1: can be communicated to the outside world. And so 908 00:52:50,800 --> 00:52:53,560 Speaker 1: that would require doing what I had talked about previously, 909 00:52:53,600 --> 00:52:56,120 Speaker 1: finding that brain computer interface. In this case, they went 910 00:52:56,200 --> 00:52:59,200 Speaker 1: with the EEG model, the noninvasive model. They had 911 00:52:59,200 --> 00:53:03,359 Speaker 1: to 3D print a specific headset for Boo so 912 00:53:03,440 --> 00:53:07,360 Speaker 1: that it could comfortably sit on her head. Rett syndrome 913 00:53:07,360 --> 00:53:10,600 Speaker 1: means that she's very, very sensitive, and you can't have 914 00:53:10,719 --> 00:53:13,840 Speaker 1: something heavy or bulky on her because it would 915 00:53:14,160 --> 00:53:16,719 Speaker 1: cause so much discomfort that it would not be of 916 00:53:16,760 --> 00:53:19,960 Speaker 1: any real help. It would actually hurt her physically. 917 00:53:20,280 --> 00:53:23,399 Speaker 1: And then you have to train the technology. It's 918 00:53:23,400 --> 00:53:25,840 Speaker 1: not enough just to be able to detect brain waves. 919 00:53:26,560 --> 00:53:30,200 Speaker 1: It's not like all humans have brain waves that follow 920 00:53:30,280 --> 00:53:33,399 Speaker 1: the exact same patterns, where if I think yes 921 00:53:33,520 --> 00:53:36,880 Speaker 1: or no, and you think yes or no, it's exactly 922 00:53:36,920 --> 00:53:39,279 Speaker 1: the same in my brain as it is 923 00:53:39,320 --> 00:53:41,880 Speaker 1: in your brain. So they have to train it with 924 00:53:41,960 --> 00:53:44,960 Speaker 1: Boo's own brain waves. And that also means they have 925 00:53:45,000 --> 00:53:50,000 Speaker 1: to rely heavily upon Boo's mother for the interpretation of 926 00:53:50,080 --> 00:53:55,520 Speaker 1: Boo's communications, because again, without knowing Boo at that level 927 00:53:55,520 --> 00:54:01,000 Speaker 1: of familiarity, they can't know what any particular brain wave represents. 928 00:54:01,000 --> 00:54:03,600 Speaker 1: They can detect the brain waves, but detecting them is 929 00:54:03,600 --> 00:54:06,520 Speaker 1: one thing. Knowing what they mean is something else.
So 930 00:54:06,840 --> 00:54:10,120 Speaker 1: they work with Boo's mother, who has been given the 931 00:54:10,120 --> 00:54:14,719 Speaker 1: title of intimate interpreter, to interpret what those brain waves mean. 932 00:54:15,360 --> 00:54:18,880 Speaker 1: And right now, they're able to kind of go 933 00:54:18,960 --> 00:54:21,400 Speaker 1: with a binary approach, a yes or no, but they 934 00:54:21,440 --> 00:54:25,120 Speaker 1: want to extend that so that Boo can communicate more 935 00:54:25,200 --> 00:54:28,480 Speaker 1: complicated thoughts and not just respond to a series of 936 00:54:28,560 --> 00:54:33,520 Speaker 1: yes or no questions to narrow down what it is she wants. 937 00:54:33,560 --> 00:54:37,240 Speaker 1: This is going to require lots and lots of data gathering. 938 00:54:37,280 --> 00:54:39,959 Speaker 1: You have to do a lot of sessions with Boo 939 00:54:40,040 --> 00:54:45,239 Speaker 1: wearing the EEG headset, thinking things, measuring them, interpreting 940 00:54:45,320 --> 00:54:49,200 Speaker 1: what those measurements mean, and then mapping that to 941 00:54:50,120 --> 00:54:53,919 Speaker 1: the system. They want to use neural networks and deep 942 00:54:54,040 --> 00:54:59,719 Speaker 1: learning to detect which of those signals represent actual commands 943 00:55:00,080 --> 00:55:04,360 Speaker 1: and which ones might be noise or interference or just a 944 00:55:04,440 --> 00:55:08,120 Speaker 1: random little blip from the brain. And you have to 945 00:55:08,160 --> 00:55:11,440 Speaker 1: differentiate those if you want it to be a meaningful experience, 946 00:55:11,480 --> 00:55:15,960 Speaker 1: if you want to make sure that the thing you 947 00:55:16,000 --> 00:55:19,960 Speaker 1: are recording is in fact an actual command and not 948 00:55:20,160 --> 00:55:26,400 Speaker 1: just random information, because otherwise, obviously, you would be acting upon 949 00:55:26,600 --> 00:55:29,799 Speaker 1: blips that don't indicate any 950 00:55:29,840 --> 00:55:35,160 Speaker 1: actual request or question or response. You have to cut 951 00:55:35,160 --> 00:55:39,000 Speaker 1: out all that noise. Bakshi also gave us 952 00:55:39,000 --> 00:55:41,960 Speaker 1: an example: he showed us a black screen, and on that 953 00:55:42,000 --> 00:55:45,279 Speaker 1: black screen, occasionally a red square would appear. And Bakshi 954 00:55:45,400 --> 00:55:48,040 Speaker 1: said, well, when that screen is black and 955 00:55:48,040 --> 00:55:50,520 Speaker 1: you're just concentrating on it, eventually your mind kind of 956 00:55:50,560 --> 00:55:54,040 Speaker 1: goes to rest and your alpha waves will start to spike, 957 00:55:54,440 --> 00:55:57,080 Speaker 1: because those are the brain waves that are associated 958 00:55:57,120 --> 00:56:01,640 Speaker 1: with relaxation and, you know, just relaxed focus. 959 00:56:02,200 --> 00:56:04,880 Speaker 1: And then when the red square appears, you would expect 960 00:56:04,920 --> 00:56:10,279 Speaker 1: to see those alpha waves drop, because you were shown something, 961 00:56:10,280 --> 00:56:13,600 Speaker 1: you were stimulated by the red square. You had 962 00:56:13,640 --> 00:56:16,399 Speaker 1: stimuli there and your brain reacted. And Bakshi said, 963 00:56:17,320 --> 00:56:20,080 Speaker 1: the thing is, sometimes you'll get spikes even when there's 964 00:56:20,120 --> 00:56:22,600 Speaker 1: no red square.
So that's the trick: figuring out 965 00:56:22,680 --> 00:56:28,640 Speaker 1: how to eliminate those outliers, those false hits, and 966 00:56:28,719 --> 00:56:32,720 Speaker 1: make sure that the information you're gathering is actually real stuff.
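One standard way to deal with exactly that problem is to average the signal across many repeated presentations of the stimulus, a textbook technique from event-related work, though not necessarily what the Cognitive Story team actually uses. Random blips average away, while a response that is genuinely time-locked to the red square survives. A minimal sketch with purely synthetic data:

```python
# Separating a genuine stimulus response from random blips by averaging over
# repeated trials. Synthetic data only; all numbers are invented.
import numpy as np

FS = 128                      # samples per second
WINDOW = FS                   # one second recorded after each stimulus
rng = np.random.default_rng(7)

def trial(stimulus: bool) -> np.ndarray:
    """One second of fake alpha-band power, with a random spike either way."""
    x = rng.normal(0.0, 1.0, WINDOW)
    x[rng.integers(0, WINDOW)] += rng.uniform(0, 4)  # random blip, stimulus or not
    if stimulus:
        x[WINDOW // 4:] -= 1.5                       # alpha power drops after the square
    return x

stim = np.mean([trial(True) for _ in range(100)], axis=0)
rest = np.mean([trial(False) for _ in range(100)], axis=0)

# After averaging 100 trials, the stimulus-locked drop stands out clearly,
# while the random spikes have shrunk toward zero.
print("rest mean (late window): %.2f" % rest[WINDOW // 4:].mean())
print("stim mean (late window): %.2f" % stim[WINDOW // 4:].mean())
```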
967 00:56:33,560 --> 00:56:38,359 Speaker 1: The last thing that Bakshi touched upon was in 968 00:56:38,680 --> 00:56:43,200 Speaker 1: the area of mental health. He's in a collaboration with 969 00:56:43,239 --> 00:56:47,200 Speaker 1: several other people to create an early warning system for depression. 970 00:56:47,680 --> 00:56:51,680 Speaker 1: He started by looking at depression in teens, and also 971 00:56:51,719 --> 00:56:54,640 Speaker 1: wants to create a therapist chatbot that would help teens deal 972 00:56:54,680 --> 00:56:58,440 Speaker 1: with depression and prevent teen suicide. He cited a very 973 00:56:58,480 --> 00:57:02,560 Speaker 1: troubling statistic and said that in Australia, more than forty percent 974 00:57:03,600 --> 00:57:09,440 Speaker 1: of teenagers calling helplines per year don't even get another 975 00:57:09,480 --> 00:57:13,080 Speaker 1: person on the phone. The call goes unanswered, which means 976 00:57:13,080 --> 00:57:15,160 Speaker 1: there are thousands of kids who need help but they 977 00:57:15,160 --> 00:57:17,720 Speaker 1: are not getting it, because the helpline can't afford to 978 00:57:17,760 --> 00:57:21,080 Speaker 1: pay enough staff to staff all the phones all the time. 979 00:57:21,760 --> 00:57:25,360 Speaker 1: In the United States, many of the teens who committed suicide 980 00:57:25,960 --> 00:57:30,720 Speaker 1: were found later to have given off signs or patterns, 981 00:57:30,880 --> 00:57:34,120 Speaker 1: but humans aren't always good at picking up on that. 982 00:57:34,280 --> 00:57:37,600 Speaker 1: Like in hindsight, we could say, oh, there was this sign, 983 00:57:37,680 --> 00:57:41,000 Speaker 1: this tragic sign that I did not notice, and if 984 00:57:41,040 --> 00:57:43,160 Speaker 1: I had noticed, maybe I could have 985 00:57:43,200 --> 00:57:46,960 Speaker 1: done something. This is a horrible tragedy for everybody. Obviously, 986 00:57:47,040 --> 00:57:50,320 Speaker 1: there's a kid who's lost his or her life, 987 00:57:50,840 --> 00:57:53,880 Speaker 1: and then of course the people around that kid, who 988 00:57:53,920 --> 00:57:57,480 Speaker 1: may blame themselves for not having picked up on subtle 989 00:57:57,560 --> 00:58:00,240 Speaker 1: signs that we humans just typically are not very good 990 00:58:00,440 --> 00:58:05,600 Speaker 1: at noticing. Tanmay talked about creating algorithms that can 991 00:58:05,640 --> 00:58:10,160 Speaker 1: mine social media, again looking for these signs, looking for 992 00:58:10,560 --> 00:58:15,720 Speaker 1: warning flags that someone may be going through depression. Maybe 993 00:58:15,800 --> 00:58:20,760 Speaker 1: they are experiencing anxiety and stress, and maybe they 994 00:58:20,840 --> 00:58:26,200 Speaker 1: are entertaining thoughts of self harm or suicide. And 995 00:58:26,280 --> 00:58:28,600 Speaker 1: there are a lot of different indicators that could potentially 996 00:58:28,720 --> 00:58:32,760 Speaker 1: lead someone to have that sort of concern about a person. 997 00:58:33,240 --> 00:58:36,000 Speaker 1: So the idea would be to design these algorithms that 998 00:58:36,480 --> 00:58:40,600 Speaker 1: could send an alert to someone who could step in 999 00:58:40,680 --> 00:58:45,000 Speaker 1: and intervene and perhaps give help to someone who needs 1000 00:58:45,040 --> 00:58:50,080 Speaker 1: it, because the indicators all show that that person is 1001 00:58:50,120 --> 00:58:52,840 Speaker 1: on a really bad pathway. And also this idea of 1002 00:58:52,840 --> 00:58:57,280 Speaker 1: a therapist chatbot is really intriguing. The idea 1003 00:58:57,400 --> 00:59:00,560 Speaker 1: of a chatbot that is responsive in a way 1004 00:59:00,800 --> 00:59:03,600 Speaker 1: that's meaningful, so it doesn't feel like you're just talking to 1005 00:59:03,680 --> 00:59:08,120 Speaker 1: a robot, that you're talking to an uncaring entity that's 1006 00:59:08,200 --> 00:59:13,320 Speaker 1: just asking very tough or very, you know, dry 1007 00:59:13,400 --> 00:59:16,200 Speaker 1: questions that wouldn't be helpful at all. It has to 1008 00:59:16,200 --> 00:59:19,680 Speaker 1: be something that seems to evoke empathy, that seems to 1009 00:59:19,680 --> 00:59:25,000 Speaker 1: be listening and caring about the person. And the 1010 00:59:25,120 --> 00:59:27,800 Speaker 1: nice thing is that chatbots, of course, never get tired, 1011 00:59:27,960 --> 00:59:29,960 Speaker 1: they never have to go on break, they can always 1012 00:59:29,960 --> 00:59:34,320 Speaker 1: be available. They can always provide some guidance and perhaps 1013 00:59:34,840 --> 00:59:39,000 Speaker 1: lead someone to resources that might give them more help 1014 00:59:39,040 --> 00:59:42,600 Speaker 1: and support when they need it most. And 1015 00:59:42,640 --> 00:59:45,680 Speaker 1: they talked about how they had rolled this out for teenagers, 1016 00:59:45,680 --> 00:59:48,840 Speaker 1: but now they're looking at using it for veterans, 1017 00:59:48,960 --> 00:59:51,800 Speaker 1: for people who have served in the armed forces and 1018 00:59:51,920 --> 00:59:58,160 Speaker 1: may be dealing with depression and other related problems and 1019 00:59:58,760 --> 01:00:01,880 Speaker 1: feel like they don't have the resources around them to 1020 01:00:01,920 --> 01:00:05,560 Speaker 1: help them cope with these issues. So I 1021 01:00:05,560 --> 01:00:07,800 Speaker 1: found that really interesting, this idea of using not 1022 01:00:07,840 --> 01:00:11,400 Speaker 1: just very powerful technologies to do cool stuff, but very 1023 01:00:11,440 --> 01:00:16,280 Speaker 1: compassionate uses of technology.
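To make the warning-flags idea concrete, here is a highly simplified Python sketch. Every indicator term and threshold below is invented for illustration; a real system would rely on trained models, clinical guidance, and human review, and would route any alert to trained responders rather than acting on keyword matches alone.

```python
# Toy illustration of an "early warning" scorer: tally posts against a few
# indicator categories and alert a human when the score crosses a threshold.
# All terms and thresholds are invented; this is not a deployable system.

INDICATORS = {
    "hopelessness": {"hopeless", "pointless", "no way out"},
    "isolation": {"alone", "nobody", "no one cares"},
    "self_harm": {"hurt myself", "disappear", "end it"},
}

def risk_score(post: str) -> int:
    """Count how many indicator categories a single post touches."""
    text = post.lower()
    return sum(any(term in text for term in terms)
               for terms in INDICATORS.values())

def should_alert(recent_posts: list[str], threshold: int = 4) -> bool:
    """Alert when the summed score over recent posts crosses a threshold."""
    return sum(risk_score(p) for p in recent_posts) >= threshold

posts = [
    "feeling hopeless again, like nobody would notice if I disappear",
    "everything is pointless lately",
]
print(should_alert(posts))   # True: a human responder should check in
```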
So that's all three of the 1024 01:00:16,320 --> 01:00:19,960 Speaker 1: presenters that I wanted to cover for this episode. Like 1025 01:00:20,000 --> 01:00:22,640 Speaker 1: I said before, I've got one more full day here 1026 01:00:22,680 --> 01:00:25,120 Speaker 1: at Think two thousand eighteen. You never know what I might encounter. 1027 01:00:25,200 --> 01:00:27,560 Speaker 1: I might end up having enough to talk about 1028 01:00:27,760 --> 01:00:32,360 Speaker 1: yet another episode. We will see. I can't guarantee 1029 01:00:32,400 --> 01:00:34,560 Speaker 1: that that will happen, but I'm going to explore, see 1030 01:00:34,560 --> 01:00:36,479 Speaker 1: who I can talk to, see what I can find out, 1031 01:00:37,120 --> 01:00:39,080 Speaker 1: and so maybe we'll get one more out of these. 1032 01:00:39,120 --> 01:00:42,240 Speaker 1: But if not, it has been a pleasure attending this conference. 1033 01:00:42,240 --> 01:00:44,120 Speaker 1: I got to talk to a lot of really smart 1034 01:00:44,160 --> 01:00:47,280 Speaker 1: people and learn a lot about these topics that I've 1035 01:00:47,280 --> 01:00:50,560 Speaker 1: always been interested in and had some understanding of, but 1036 01:00:50,640 --> 01:00:53,400 Speaker 1: this time I felt like I was being absolutely immersed 1037 01:00:53,560 --> 01:00:57,480 Speaker 1: in the ideas, and it was really, really fascinating. It's 1038 01:00:57,520 --> 01:01:02,200 Speaker 1: also mentally exhausting. I don't know how everyone is holding 1039 01:01:02,280 --> 01:01:06,280 Speaker 1: up so well, because I feel like my brain is 1040 01:01:06,320 --> 01:01:10,120 Speaker 1: slowly turning into sludge being exposed to all this really, 1041 01:01:10,200 --> 01:01:15,760 Speaker 1: really hyper detailed, complicated information. I hope you guys have 1042 01:01:15,880 --> 01:01:18,640 Speaker 1: enjoyed this mini series. I hope to do more of these, 1043 01:01:18,720 --> 01:01:23,280 Speaker 1: not just for tech conferences, but for related events, things that 1044 01:01:23,360 --> 01:01:26,680 Speaker 1: touch upon technology, obviously, because it is tech stuff. But 1045 01:01:26,760 --> 01:01:28,320 Speaker 1: I hope to go to more of these sorts of 1046 01:01:28,360 --> 01:01:31,720 Speaker 1: things and record special episodes that, again, 1047 01:01:31,720 --> 01:01:34,920 Speaker 1: don't replace our normal episodes of tech stuff, they just 1048 01:01:35,120 --> 01:01:38,520 Speaker 1: augment them. It's like that augmented intelligence thing. If you 1049 01:01:38,560 --> 01:01:42,000 Speaker 1: guys have suggestions for future episodes of tech stuff, maybe 1050 01:01:42,040 --> 01:01:44,960 Speaker 1: there's a particular technology I need to cover, a person 1051 01:01:45,000 --> 01:01:48,760 Speaker 1: who's important in technology, a company that's instrumental in tech. 1052 01:01:49,080 --> 01:01:51,880 Speaker 1: Maybe there's a conference or an event that you think 1053 01:01:51,920 --> 01:01:55,000 Speaker 1: I absolutely need to go to because you've always wanted 1054 01:01:55,040 --> 01:01:57,600 Speaker 1: to learn more about it and you think it would 1055 01:01:57,640 --> 01:02:00,600 Speaker 1: be ideal for tech stuff to attend. Let me know, 1056 01:02:00,800 --> 01:02:03,000 Speaker 1: send me a message. The email address for the show 1057 01:02:03,120 --> 01:02:05,920 Speaker 1: is tech stuff at how stuff works dot com, or 1058 01:02:06,000 --> 01:02:09,200 Speaker 1: drop me a line on Twitter or Facebook; sometimes 1059 01:02:09,240 --> 01:02:10,760 Speaker 1: that is the fastest way to get in touch with me. 1060 01:02:11,200 --> 01:02:14,160 Speaker 1: The handle for both of those is tech 1061 01:02:14,200 --> 01:02:17,680 Speaker 1: stuff h s w. Remember, we've got an Instagram account. You can 1062 01:02:17,720 --> 01:02:20,000 Speaker 1: go follow that and see behind the scenes stuff and 1063 01:02:20,120 --> 01:02:25,080 Speaker 1: other interesting tidbits.
And every week on Wednesdays and Fridays, 1064 01:02:25,120 --> 01:02:28,160 Speaker 1: I record the normal episodes of tech Stuff and I 1065 01:02:28,240 --> 01:02:30,760 Speaker 1: usually stream it live on twitch tv, so you can 1066 01:02:30,760 --> 01:02:33,080 Speaker 1: go to twitch dot tv slash tech stuff. You can 1067 01:02:33,080 --> 01:02:37,360 Speaker 1: watch me record episodes live. That means I make mistakes, 1068 01:02:37,360 --> 01:02:40,480 Speaker 1: and you can watch me do it and stumble over myself. 1069 01:02:40,560 --> 01:02:42,320 Speaker 1: There's also a chat room there so you can chat 1070 01:02:42,360 --> 01:02:44,560 Speaker 1: with me, and during breaks, I do like to chat 1071 01:02:44,600 --> 01:02:46,400 Speaker 1: with everybody in the chat room, see what's going on, 1072 01:02:46,840 --> 01:02:49,080 Speaker 1: see if there's anything that they want me to cover 1073 01:02:49,120 --> 01:02:52,360 Speaker 1: in specific, or in particular I guess I should say. 1074 01:02:52,560 --> 01:02:55,439 Speaker 1: And it's always a joy. So please come on by, 1075 01:02:55,720 --> 01:02:58,000 Speaker 1: jump in the chat room, introduce yourself. I'd love to 1076 01:02:58,000 --> 01:03:00,720 Speaker 1: see you there, and I'll talk to you again really soon. 1077 01:03:07,280 --> 01:03:09,680 Speaker 1: For more on this and thousands of other topics, 1078 01:03:09,720 --> 01:03:20,760 Speaker 1: visit how stuff works dot com