Speaker 1: Get in touch with technology with TechStuff, from HowStuffWorks.com.

Speaker 1: Hey there, everybody, and welcome to TechStuff. I am your host, Jonathan Strickland, executive vampire here at HowStuffWorks. Happy Halloween, if you're listening to this on Halloween. Otherwise, that was the worst possible introduction I could have given for this episode, and I should feel bad about it, but I don't. Nope. But in celebration of Halloween, we are going to take a look at spooky tech, or, you know, scary tech, or at least some technology that could at least be kind of creepy. See, here's the problem I'm running into, y'all: I've already done an episode about tech in haunted house attractions, so that's finished. I already did one on ghost hunting technology years ago, where I gave a skeptical view of what all that was about. And sooner or later, after you do a few years of a technology podcast, you start to run out of the fun stuff like that. So for this episode, I turned to the HowStuffWorks website and pulled up a classic article, which is also always a dangerous thing when you're talking technology. This article is titled "10 Scary Modern Technologies." It was co-written by Dave Roos and my former editor and TechStuff co-host Chris Pollette. So get ready for some tech that goes bump in the night, I guess. And actually, to be more serious, the technology I'm listing here does not fall into the supernatural or ghostly categories at all. The technology represents stuff that could be unsettling, or at worst could cause massive, enormous problems due to overreach or unintended consequences. So there are some serious entries in here, even though I'm having a little bit of fun with the presentation. So let us get started.

Speaker 1: Number ten on the list: Holosonics and the Audio Spotlight system. This is more in the creepy, unsettling vein: technology that focuses sound. That's what this is all about.
Speaker 1: It's focusing sound into a narrow beam, which provides the opportunity for ultra-targeted sound for stuff like advertising. It's not necessarily creepy, but not necessarily welcome either. But imagine that you're walking through a store, and just as you're passing in front of a certain store item, let's say it's Cookie Crisp cereal, you start hearing a voice whispering to you: hey, that Cookie Crisp looks pretty good. It might seem really weird, this little disembodied voice, and as soon as you get past a certain point, you can't hear it anymore. You just have this experience: as you're going through the store, you keep hearing these targeted sounds in very narrow spots, and as soon as you're out of those spots, you can't hear them. That sounds pretty creepy, right? Well, how does it work? According to the company, the secret is in the size of the sound waves compared to the size of the sound source. The company states that if you have a sound source that is much larger than the comparable size of the sound waves it's producing, you get more directionality out of your sound, so you can focus it more like a beam. So if you were to create loudspeakers that are much, much, much larger than the sound waves they're producing, and sound waves can measure from a few inches up to several feet in size, you could direct those sound waves in a beam, more like a directional beam, rather than having them propagate outward in all directions equally. But that would not work very well for something like targeted advertising, because you would have to have these enormous speakers perched behind the Cheerios, and that would be very distracting. So this company has gone with an approach that has these devices producing ultrasonic beams of sound. Ultrasonic sound waves are very, very tiny. They're also normally imperceptible, at least directly; they are imperceptible to us. We cannot hear at those frequencies. They do have a very strong directionality as a result of the way they are generated.
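To put some rough numbers on that size comparison, here is a small back-of-the-envelope sketch in Python. This is purely an illustration, not anything from the article or from Holosonics; it just works out the basic relation that wavelength equals the speed of sound divided by frequency, which is why audible waves span inches to many feet while an ultrasonic carrier is tiny compared with even a palm-sized emitter.

# Illustrative only: acoustic wavelength = speed of sound / frequency.
# Audible tones are inches to feet long; a 40 kHz ultrasonic carrier is
# well under a centimeter, so a small emitter is many wavelengths across
# and can radiate a tight, flashlight-like beam.

SPEED_OF_SOUND_M_PER_S = 343.0  # in air, at roughly room temperature

def wavelength_m(frequency_hz: float) -> float:
    """Acoustic wavelength in meters for a given frequency."""
    return SPEED_OF_SOUND_M_PER_S / frequency_hz

examples = [
    ("low audible tone, 100 Hz", 100.0),
    ("mid audible tone, 1 kHz", 1_000.0),
    ("high audible tone, 10 kHz", 10_000.0),
    ("ultrasonic carrier, 40 kHz", 40_000.0),
]

for label, freq in examples:
    lam = wavelength_m(freq)
    print(f"{label:27s} wavelength ~ {lam:.3f} m ({lam * 39.37:.1f} in)")

Run it and the audible examples come out between roughly an inch and eleven feet across, while the forty-kilohertz carrier is around a third of an inch, which lines up with the few-inches-to-several-feet range mentioned above.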
Speaker 1: So, according to the company, as the ultrasonic beam travels through the air, the inherent properties of the air cause the ultrasound to change shape in a predictable way. This gives rise to frequency components in the audible band, which can be accurately predicted and therefore precisely controlled. By generating the correct ultrasonic signal, the company says, you can create within the air itself any sound desired. So, in other words, it's the interaction of these ultrasonic frequencies with the air itself that causes the ultrasonic frequencies to change, and that then produces the audible frequencies. And because this is all very controllable in a predictable environment, you can produce whatever sounds you want. This is based off the work of an inventor named Woody Norris. He demonstrated this technology at a TED talk in the early two thousands. Rather than generating the audible sound on the face of the speaker as a traditional speaker would, where the speaker is moving a diaphragm in and out and pushing air around, the ultrasonic emitter gets a column of the air itself to act like a speaker, which is pretty nifty. I'm sorry, I meant spooky.

Speaker 1: Number nine: DNA hacking. So back in two thousand three, scientists finished mapping out the human genome. But that was obviously just the beginning; that's just a long list of base pairs. Next came the work to examine the genome closely and determine which base pairs, in the more than three billion pairs that make up the human genome, are responsible for different stuff, like our likelihood of developing a disease, for example. Because if you could determine that, and if you could determine a way of changing that part of the DNA so that it's eliminated, maybe you could make someone not develop that disease. You could potentially wipe out certain diseases. And beyond that, what about hacking the human genome so that we can create designer human beings, human beings who have traits that we consider to be superior?
Speaker 1: Now that's the stuff of lots of science fiction and horror cautionary tales, stuff like Gattaca, asking: yeah, but who gets to decide what is superior? And what happens to people who aren't able to take advantage of that technology, and what happens to the people who do take advantage of that technology? There are a lot of ethical questions around it. However, beyond that, there are other scary things to consider. DNA hacking could allow for individually focused biological warfare. So imagine that you're living in a world where someone can get access to your DNA. Maybe they get a skin sample or some hair or something; they're able to get something from you that has traces of your DNA in it, and they're able to analyze your DNA, see what makes you you, and look for any vulnerabilities. Maybe they find out that you have a predisposed weakness to a particular type of virus, and then they engineer a virus to attack you specifically. That was actually the premise of a two thousand twelve article in The Atlantic, and that article hypothesized a future in which the President of the United States could be targeted and assassinated through the use of a particularly nasty virus that was tailor-made for the president. Yikes. Or imagine more than that: maybe a widespread plague, a la Stephen King's The Stand, engineered through a deep understanding of DNA paired with a really crappy containment strategy. So how realistic is all of that? Well, in the short term, it's probably not terribly realistic. It's certainly possible, but not necessarily plausible. It does require a deep understanding of DNA and a means to manipulate it easily in order to pull it off. We've seen some advances in those areas. There are a lot more ways that we can manipulate DNA than there used to be, but it's not exactly easy to do. It's easier, but that's a matter of degrees.
Speaker 1: However, you can make a convincing argument that it would not require too deep an understanding to cause real harm unintentionally, and that would be a very difficult argument to counter. And while DNA hacking could produce all sorts of different futuristic results, we do already have a culture of biohackers, also sometimes known as grinders, who have taken body alteration to new places. These folks are not working on a DNA level; they're not changing themselves fundamentally in that way. Instead, they're doing alterations, you know, upgrades. So one example would be that there are people who have chosen to have small magnets implanted under the skin of their fingertips. So, these little nubby magnets sticking out on their fingertips. Kind of weird, right? Well, what's the purpose of that? Well, because of the effects of magnetism and electromagnetism, they would actually be able to sense when they were near magnetic or electromagnetic fields. So it'd be kind of like Spider-Man's spidey sense: they'd feel tugging on their fingertips as they passed through a field. So instead of detecting danger like Spider-Man would, you'd be able to tell, like, whether or not electricity was flowing through a conductor, for example, because you could bring your fingers close to that conductor. Maybe it's some wires. You bring your fingers close, and if you start feeling the pulses from an alternating magnetic field, a fluctuating magnetic field, that would tell you: oh, electricity, alternating current, is flowing through here, because I can feel it in my fingertips. That would be something you normally would not be able to feel. So you kind of have a sixth sense due to having these magnets implanted in your fingertips. And people have actually done this. However, a word of caution: from what I understand, these operations are pretty painful. They usually are not done with anesthetic.
Speaker 1: There's not really a place you can go that's medically, you know, licensed to do this, because it's not a standard medical procedure. As far as I know, there are no accredited medical facilities that do it. I think any doctor who did practice this would run the danger of being barred from practice. So you're usually talking about entrepreneurial body modification specialists, you know, like piercers, who will do this kind of operation. And anytime you're talking about introducing something foreign to the body, you're also raising other risks, like infection. So you've got to be super duper careful about that kind of thing. In other words, you're not gonna find me putting magnets under my fingertips anytime soon.

Speaker 1: Number eight on the list: cyber war. Oh boy. Well, the scenario the article specifically lays out is an all-out cyber warfare attack where one nation targets another nation's infrastructure, like its power grid or water system. And sure enough, the U.S. Department of Homeland Security has reported that there's lots of evidence to show that various hackers have infiltrated critical infrastructure with exactly those kinds of tactics, like Russian hackers getting into things like power plants and gas pipelines. And that's terrifying. And before that report had even made the news, and that was in twenty eighteen, security experts had for years been warning that Chinese hackers have been infiltrating American infrastructure; CNN reported on that back in two thousand fourteen. So this is not exactly a new story. It's a continuing story that continues to be really problematic and concerning. So the infiltration part is a reality. We know for a fact hackers from other countries have infiltrated various systems; there have been traces found of their activities, so we know that there have at least been people snooping around our systems. Whether or not they've installed anything to help shut stuff down is another matter, but they've definitely been there, and they appear to have been there from places like Russia and China.
Speaker 1: That does not automatically mean that the infiltration was state directed, in other words, that it was a government-backed project, but the general consensus seems to be that these were likely the activities of a state-backed group of hackers. What about shutting everything down? Is that realistic? Well, it could be more challenging to do, but not necessarily impossible. Many utilities have started installing self-healing systems. A self-healing system isn't quite as cool as it sounds; it's not like it's Wolverine in infrastructure form. But it does involve having a system that, when it detects problems, automatically tries to reroute services to get around those problems. So with a power grid, it might be that if a smart grid system detects there's a short for some reason, or a break in connectivity at some point, it may try to reroute power to work around that as much as possible. That could help confound a cyber warfare attack a little bit. At least it might mitigate the impact of an attack, though preventing one entirely, maybe not. But then there are also attacks on other systems, not just power grids and water and gas, which are all scary enough. There's the evidence that showed Russian hackers were targeting election systems in the United States leading up to the twenty sixteen elections. I talked about this in a recent episode of TechStuff, so I'm not gonna go all the way through it again. But the really insidious thing about those attacks is they don't even have to be super successful to be effective. If you can sow doubt in the minds of a nation's citizens as to the validity of any given election, you have undermined the very foundation of that nation's government. A government that doesn't have the confidence of its population is on shaky ground and has to move more and more towards totalitarianism in order to maintain power.
Speaker 1: If you don't have any confidence that your system works, then you don't have any confidence in your government at all. So cyber war is something that is continuing right now. It is actually happening; it is already in place. And obviously I've given examples of how the US has been the target of cyber warfare, but don't forget the US has engaged in it too. The United States has not been just the poor victim in all these cases; the United States has certainly played a hand in cyber warfare activities. One example would be Stuxnet, the computer virus that was designed to sabotage uranium enrichment facilities in Iran. So this is not something that everyone else is doing and the United States is the victim. This is something that everyone is doing as much as they can, and stepping it up as much as they can. Well, we have a lot more scary technology to talk about, but I need to have a sip of tea to comfort myself. Let's take a quick break to thank our sponsor.

Speaker 1: Number seven is the technological singularity. Now, out of all the science fiction ideas I find particularly interesting, this one ranks near the top. The singularity refers to a general concept that could be brought about in several different ways, but from a very high level the idea goes something like this. Imagine that technology has advanced to the point that the newest stuff coming out is already designing the next generation of stuff. And imagine that the gaps between these generations are getting smaller and smaller, and eventually you reach a point where the present is defined by constant change, and that's the only way you can define it, because things change so quickly it is almost impossible to describe the present in any coherent way. That's how quickly everything is evolving. The thing that would fuel this would be the emergence of superhuman intelligence, in most of the scenarios that involve the technological singularity.
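As a side note on that gaps-getting-smaller idea, here is a tiny illustrative calculation, not from the article: if every technology generation arrives in half the time of the one before it, the total time for endlessly many generations still adds up to a finite number, which is the sense in which people talk about a singularity, a point the old curve of progress piles up against. In most tellings, the thing that keeps the gaps shrinking like that is the emergence of superhuman intelligence.

# Illustrative only: if generation n takes half as long to arrive as
# generation n - 1, the waiting times form a geometric series,
# 10 + 5 + 2.5 + ..., which sums to a finite 20 years. The numbers below
# creep toward 20 and never pass it, so arbitrarily many generations all
# land before a fixed date. That limit is the cartoon picture of a
# technological singularity.

def years_until_generation(n: int, first_gap_years: float = 10.0) -> float:
    """Total years elapsed after n generations, each gap half the previous one."""
    return sum(first_gap_years * 0.5 ** k for k in range(n))

for n in (1, 5, 10, 30):
    print(f"{n:2d} generations arrive within {years_until_generation(n):.4f} years")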
Speaker 1: However, that would not necessarily require just, you know, a computer AI. That's one possible version: pure artificial intelligence that has superhuman capabilities in processing information. This is your basic Deep Thought from The Hitchhiker's Guide to the Galaxy, or Skynet from The Terminator. That would be a scenario in which we humans have created an AI so powerful we are unable to control it, and then it goes on to redefine our world in ways that we could not anticipate, because we cannot operate on the same level as this superhuman intelligence. But that is not necessarily the only pathway to the technological singularity. Another way might be that humans find a way to boost our own intelligence, and thus we evolve beyond what we traditionally think of as being human. We might do this through a deeper understanding of biology; we could boost our intelligence that way, going back to the concept of DNA hacking and things related to that. Or it might involve using technology to create cyborg-like beings, where we merge with technology on some level, and with tech and biology working together, we boost our intelligence to new levels and achieve superhuman intelligence that way. So, bottom line, is this a possibility? Well, it beats me. But there are a lot of super smart people who are on either side of this issue. Some people say the singularity is essentially a foregone conclusion; it will happen, and the only question is when it will happen. But there are other people who say there might be some fundamental barriers that we're not likely to get over, and those barriers will block the singularity from ever happening. One of the frequent criticisms of various singularity scenarios is that a lot of it rests on the belief that we're gonna see progress continue at a pace that's similar to what Moore's law has observed with computer processing power.
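Just to make concrete what that Moore's-law-style pace implies, here is another quick illustrative calculation, again not from the article: with a doubling every couple of years, a capability grows by a factor of about a thousand over twenty years, which is the kind of runaway curve singularity arguments tend to lean on.

# Illustrative only: if a capability doubles every doubling_period_years,
# after a given number of years it has grown by 2 ** (years / doubling_period_years).

def growth_factor(years: float, doubling_period_years: float = 2.0) -> float:
    """How many times larger something gets if it doubles every fixed period."""
    return 2.0 ** (years / doubling_period_years)

for years in (10, 20, 40):
    print(f"after {years:2d} years: roughly {growth_factor(years):,.0f}x")

# Prints roughly 32x after 10 years, about 1,024x after 20, and over a
# million-fold after 40, assuming the doubling never slows down.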
Speaker 1: And the thing is, that pace may not be realistic or sustainable, or even applicable to some technologies. Moore's law applies to the processing power of computers, generally speaking, but it may not apply to other elements that would be necessary to bring about superhuman intelligence, because processing power by itself is not intelligence. You also have to have the software side; you've got a lot of other pieces that have to be in place. However, if it is possible, it could very well mean the end of the human race as we know it today. Now, that doesn't necessarily mean it's the end of humanity entirely. It just may mean that humanity will transition into something different. So it could be a new beginning; it's not necessarily the end of everything. But still, spooky.

Speaker 1: Number six: Google Glass. I mean, that was the number in the article. Google Glass, you remember Google Glass, the augmented reality glasses? Back when Chris and Dave were working on this article, Google Glass was still a real thing. It was poised to become an actual consumer product outside of the relatively small sample of bleeding-edge adopters, a.k.a. Glassholes, as we were sometimes called. And I say we because I was one of them: I had a pair of Google Glass. The glasses were part augmented reality headset, part user interface for the world around you. They included a camera, which could pull in information, and a Bluetooth chip so the glasses could communicate with a paired mobile device, and through that mobile device the glasses could also pull in information like GPS coordinates. So these glasses, while giving you potentially incredible access to information about the world around you, could also gather information about the world around you for the benefit of Google. And suddenly this company could potentially access information from cameras mounted on faces all over the world. And the glasses also had microphones, because, you know, you could use voice commands to make your glasses do stuff.
Speaker 1: But that also meant that, in theory, Google could listen in as well, not just see everything but hear things, which raises some big privacy concerns, not just for the people wearing the glasses, but for everyone around those people. And Google makes money with information, so you would effectively be generating product for Google to sell by wearing a pair of those glasses and walking around everywhere. Google would be at the head of a big surveillance state, far more invasive than a network of closed-circuit cameras, if such technology were used unethically. And while Google Glass is now far more limited in its rollout, you know, you only see it in a few industries at this point, the company Google is still very much in the business of knowing where its customers are. In August two thousand eighteen, numerous tech journals reported on a study that was conducted by Douglas C. Schmidt of Vanderbilt University. That study said that a stationary Android phone running Chrome in the background would ping Google servers with location data three hundred forty times in a twenty-four-hour period. Even if you turned the location history feature off on the phone, the phones were still sending location data back to Google, according to the study. Google, by the way, has disputed the findings of the study. Then there are the numerous personal assistant devices that are out there, including Google Home, that are also always listening for commands. And of course that's just Google. There are other companies out there, like Apple and Amazon, that also have technologies similar to these. All of these could be monitoring users and sending data back so that the companies might later exploit that information for profit, usually to sell you stuff, to advertise directly to you. But that's still pretty creepy, right? Even if the companies are not actively exploiting that information, the fact that the data could be transmitted and recorded at all is problematic.

Speaker 1: Though, again, I should say the companies generally say they do not record user data in that way. So that's a relief, right?

Speaker 1: Number five: drones. Drones are legit creepy. Many drones have cameras mounted on them. That does allow potential filmmakers unprecedented access and capabilities. Now a low-budget film can have the equivalent of an expensive crane shot at a fraction of what it would cost to rent and operate a film crane with all the associated personnel, the safety features, all that kind of stuff. You could reduce all that down to an operator and a drone, and it would be much less expensive. But it also means a drone operator who's using one of these devices could use it to do stuff like peeping. Super darn creepy, to be spying on neighbors and stuff. And that's just the consumer technology version of drones, which is already troubling. But then you have to remember there are also tons of military-grade drones, and they're being used to do everything from surveillance work to active strikes on military targets: weaponized drones. These drones may be semi-autonomous or completely under the control of an operator who's potentially hundreds of miles away. They greatly extend the surveillance capabilities of various government agencies and divisions, from military to law enforcement. In the United States, Congress passed a bill in two thousand twelve giving the Federal Aviation Administration, or FAA, the authority to draw up rules for commercial and police drones in US airspace. The FAA hasn't been super fast to share that information, and that prompted the Electronic Frontier Foundation, or EFF, to sue the FAA under the Freedom of Information Act to at least share a list of the public entities and private drone manufacturers that applied to fly drones in the United States, as well as thousands of pages related to license applications. But the FAA didn't, you know, explain how those entities were planning on using the drones. So that's a problem.

Speaker 1: Number four: 3D printers. Well, I just recently talked about MakerBot, and maybe you think the scariest thing about 3D printers is that they could lead to minor burns as you try to deal with melting plastic. But in fact there are other things to worry about as well, like fake ATM facades. See, back in two thousand eleven, some thieves used a 3D printer to create a false front for an ATM terminal, and they installed a skimmer on some ATMs. So unsuspecting customers would come up, and it would look like a real ATM front; you couldn't necessarily tell immediately that there was a false front fitted over it. So they would put their ATM card into this, they would then type in their PIN, and meanwhile the skimmer was actually scanning the data on the card and recording it along with the PIN, allowing the thieves to steal more than four hundred thousand dollars in the process. And, you know, all it took was a 3D printer to create that convincing ATM facade. In two thousand thirteen, a guy named Cody Wilson made headlines when he published files for his 3D printed Liberator handgun, which fired .380 ammunition. That scared a ton of people, because it means that anyone who had access to a 3D printer and the appropriate materials could have the opportunity to make an untraceable weapon. There'd be no background check required, because you just print the thing out. And it also raised other possible problems. If the plastic were not of sufficiently good quality, it could mean that the printed gun would not contain the explosive reaction properly when you fire the bullet, so it could end up breaking apart in the person's hand, causing injury to the person holding the gun.
So even if the person 434 00:26:42,480 --> 00:26:44,680 Speaker 1: who had printed the three D gun did so as 435 00:26:44,760 --> 00:26:47,640 Speaker 1: kind of just a proof of concept and they're firing at, 436 00:26:47,840 --> 00:26:52,760 Speaker 1: you know, just a paper target, there's the possibility that 437 00:26:52,760 --> 00:26:57,800 Speaker 1: the gun itself could explode or or fracture as part 438 00:26:57,800 --> 00:27:00,880 Speaker 1: of this if it weren't made out of sufficiently strong material, 439 00:27:01,240 --> 00:27:04,359 Speaker 1: and severe injury could follow. Cody Wilson, I should add, 440 00:27:04,359 --> 00:27:07,080 Speaker 1: has recently been in the news again. He resigned his 441 00:27:07,200 --> 00:27:10,679 Speaker 1: role as director of Defense Distributed, the company that he 442 00:27:10,800 --> 00:27:13,199 Speaker 1: used to promote the design and distribution of three D 443 00:27:13,280 --> 00:27:18,399 Speaker 1: printaple gun files. It's unrelated to the gun side of things, 444 00:27:18,960 --> 00:27:23,439 Speaker 1: so that organization still exists and continues to push Wilson's 445 00:27:23,560 --> 00:27:27,879 Speaker 1: vision even without Wilson at the helm. I've got some 446 00:27:27,920 --> 00:27:31,200 Speaker 1: more spooky things to talk about, but I grew tired, 447 00:27:31,560 --> 00:27:34,080 Speaker 1: so I need the goal and drink from blood, and 448 00:27:34,160 --> 00:27:38,640 Speaker 1: by blood I mean Earl great t I'll be right 449 00:27:38,640 --> 00:27:48,879 Speaker 1: back if through this word from our sponsors number three 450 00:27:49,640 --> 00:27:55,280 Speaker 1: driverless cars, this is a h's still a big worry. UM. 451 00:27:55,320 --> 00:27:58,720 Speaker 1: Back when Chris and Dave we're writing this driverless cars, 452 00:27:58,760 --> 00:28:03,600 Speaker 1: we're still kind of out of the very very limited 453 00:28:03,600 --> 00:28:08,040 Speaker 1: testing situations. Google was the best known version. But now 454 00:28:08,320 --> 00:28:13,719 Speaker 1: we actually have some at least some rudimentary UM automated 455 00:28:13,880 --> 00:28:17,960 Speaker 1: car systems out there in the real world, like Tesla's autopilot, 456 00:28:18,640 --> 00:28:24,800 Speaker 1: which has contributed to some notable accidents, including a fatality. UM. 457 00:28:25,000 --> 00:28:29,159 Speaker 1: Tesla has said that this system was not meant to 458 00:28:29,240 --> 00:28:34,560 Speaker 1: be used as h and a driverless car solution, but 459 00:28:34,720 --> 00:28:37,680 Speaker 1: people have still done it because you hear a word 460 00:28:37,720 --> 00:28:41,160 Speaker 1: like autopilot and you want to test it out, I guess. Um, 461 00:28:41,280 --> 00:28:45,160 Speaker 1: So that has been an issue. It has raised concerns 462 00:28:45,200 --> 00:28:50,200 Speaker 1: that perhaps this autonomous car technology is nowhere near ready 463 00:28:50,320 --> 00:28:53,400 Speaker 1: for full rollout, which I think most companies that are 464 00:28:53,440 --> 00:28:56,920 Speaker 1: working on the technology would say is correct. They're still 465 00:28:56,920 --> 00:29:01,320 Speaker 1: working on the tech to make it a reality. Um. 
466 00:29:01,360 --> 00:29:03,959 Speaker 1: There have been a lot of stuff, a lot of 467 00:29:04,120 --> 00:29:10,320 Speaker 1: stories published about various problems, philosophical problems that you need 468 00:29:10,360 --> 00:29:14,760 Speaker 1: to resolve in order to make a consistent and predictable 469 00:29:14,760 --> 00:29:19,240 Speaker 1: autonomous car solution, one of them being the infamous trolley problem. 470 00:29:19,480 --> 00:29:22,640 Speaker 1: The basic version of the trolley problem is that you've 471 00:29:22,640 --> 00:29:26,560 Speaker 1: got our out of control trolley. It's going down some 472 00:29:26,680 --> 00:29:31,280 Speaker 1: tracks and there's a switch that will allow you to 473 00:29:31,400 --> 00:29:35,240 Speaker 1: change the pathway of the trolley. And if the trolley 474 00:29:35,320 --> 00:29:39,320 Speaker 1: continues where it's path where it's going now, um, it's 475 00:29:39,320 --> 00:29:42,320 Speaker 1: going to collide with a group of people. If you 476 00:29:42,360 --> 00:29:44,960 Speaker 1: throw the switch, it will change the direction of the 477 00:29:45,000 --> 00:29:48,680 Speaker 1: trolley and it will collide with one person. Do you 478 00:29:48,720 --> 00:29:52,320 Speaker 1: throw the switch? If you do nothing, then maybe you 479 00:29:52,440 --> 00:29:55,800 Speaker 1: feel like I'm not involved. Therefore, my decision did not 480 00:29:56,880 --> 00:29:59,880 Speaker 1: affect anybody that it just played out the way it 481 00:29:59,920 --> 00:30:02,520 Speaker 1: was gonna play out. If I throw the switch, is 482 00:30:02,600 --> 00:30:05,040 Speaker 1: that I am actively condemning that other person to death? 483 00:30:05,080 --> 00:30:08,320 Speaker 1: Other people would say no, by not choosing, you've actively 484 00:30:08,960 --> 00:30:12,440 Speaker 1: chosen to condemn the first group to death. Uh. There 485 00:30:12,440 --> 00:30:14,760 Speaker 1: are variations of this problem. Maybe you say, all right, 486 00:30:14,800 --> 00:30:18,640 Speaker 1: well you've got a choice. You can uh not throw 487 00:30:18,680 --> 00:30:21,880 Speaker 1: a switch, which means the out of control trolley will 488 00:30:21,920 --> 00:30:25,440 Speaker 1: eventually come to a stop, but everyone in the trolley 489 00:30:25,560 --> 00:30:27,680 Speaker 1: is going to die as a result of this accident. 490 00:30:28,080 --> 00:30:30,200 Speaker 1: Or you can throw the switch and the trolley will 491 00:30:30,280 --> 00:30:33,160 Speaker 1: hit somebody, but the trolley will stop and everyone who's 492 00:30:33,160 --> 00:30:36,480 Speaker 1: in the trolley will survive. So either you actively kill 493 00:30:36,600 --> 00:30:38,920 Speaker 1: someone but everyone in the trolley lives, or you don't 494 00:30:38,920 --> 00:30:40,960 Speaker 1: do anything and everyone in the trolley dies, but the 495 00:30:41,120 --> 00:30:44,160 Speaker 1: person who is just innocently crossing the pathway they live. 
496 00:30:44,880 --> 00:30:48,240 Speaker 1: These sort of ethical problems are things that people talk 497 00:30:48,280 --> 00:30:52,200 Speaker 1: about and debate amongst themselves, but it turns out to 498 00:30:52,240 --> 00:30:56,040 Speaker 1: be an actual practical problem when you're designing autonomous car systems, 499 00:30:56,080 --> 00:30:59,440 Speaker 1: because eventually you have to build in some sort of 500 00:30:59,440 --> 00:31:03,360 Speaker 1: decision making system for a car in the event that 501 00:31:03,960 --> 00:31:09,800 Speaker 1: it it encounters a non avoidable car accident that all 502 00:31:10,000 --> 00:31:13,080 Speaker 1: the problems have aligned in such a way that there 503 00:31:13,200 --> 00:31:17,640 Speaker 1: is no possible outcome in which there isn't a car accident. 504 00:31:17,720 --> 00:31:21,000 Speaker 1: So what does the car do? Does it behave in 505 00:31:21,040 --> 00:31:23,960 Speaker 1: a way that preserves the life of the person writing 506 00:31:23,960 --> 00:31:25,400 Speaker 1: in the car? Does it behave in a way that 507 00:31:25,440 --> 00:31:27,720 Speaker 1: preserves the life of the people in the surrounding area. 508 00:31:28,240 --> 00:31:30,600 Speaker 1: There are a lot of tough questions to answer, So M. I. T. 509 00:31:30,760 --> 00:31:33,640 Speaker 1: Published a paper from a quiz called the Moral Machine. 510 00:31:34,280 --> 00:31:36,880 Speaker 1: This quiz was designed to find out what people thought 511 00:31:37,120 --> 00:31:40,400 Speaker 1: should be given priority in these situations, and it was 512 00:31:40,440 --> 00:31:45,760 Speaker 1: distributed globally across social media platforms. They recorded forty million 513 00:31:45,880 --> 00:31:50,840 Speaker 1: ethical decisions in total. Global preference had certain consistencies. By 514 00:31:50,840 --> 00:31:55,520 Speaker 1: the way, generally speaking, people prefer to spare human lives 515 00:31:55,520 --> 00:31:58,280 Speaker 1: over animal lives. So if you had an option where 516 00:31:58,840 --> 00:32:01,840 Speaker 1: you can make a choice, but this animal will die 517 00:32:01,920 --> 00:32:03,400 Speaker 1: as a result, or you can make a choice or 518 00:32:03,440 --> 00:32:06,120 Speaker 1: that human will die, people would say, well, it's a shame, 519 00:32:06,160 --> 00:32:10,840 Speaker 1: but I'd rather choose where the animal dies. Also, people 520 00:32:11,120 --> 00:32:13,880 Speaker 1: in general would choose to spare more lives rather than 521 00:32:13,960 --> 00:32:16,840 Speaker 1: fewer lives. So in my example with the group of 522 00:32:16,880 --> 00:32:19,760 Speaker 1: people versus the one person, more people would feel comfortable 523 00:32:20,000 --> 00:32:22,240 Speaker 1: with the one person losing their life as opposed to 524 00:32:22,560 --> 00:32:25,880 Speaker 1: the group of people. Also, people in general want to 525 00:32:25,920 --> 00:32:30,200 Speaker 1: spare children's lives more than adult lives, so this really 526 00:32:30,200 --> 00:32:33,640 Speaker 1: just showed people's preferences in ethical decisions. However, the study 527 00:32:33,760 --> 00:32:37,200 Speaker 1: authors stated that experts really should be the ones to 528 00:32:37,240 --> 00:32:40,760 Speaker 1: make the final call when designing these algorithms, that just 529 00:32:40,840 --> 00:32:45,280 Speaker 1: going by public preference alone may not be the best decision. 
However, 530 00:32:45,600 --> 00:32:50,160 Speaker 1: another thing to remember is that in st seven thousand 531 00:32:50,240 --> 00:32:55,080 Speaker 1: people died from car accidents. If driverless cars can reduce 532 00:32:55,160 --> 00:32:59,440 Speaker 1: that number year by year, if we can find a 533 00:32:59,520 --> 00:33:02,920 Speaker 1: way to driverless cars more reliable so that it reduces 534 00:33:03,000 --> 00:33:07,200 Speaker 1: the overall number of fatal accidents, that would be an 535 00:33:07,240 --> 00:33:11,120 Speaker 1: incredible thing and it would definitely be a good argument 536 00:33:11,240 --> 00:33:15,560 Speaker 1: in support for autonomous cars. It is, however, that there 537 00:33:15,560 --> 00:33:18,360 Speaker 1: is possible that there's a psychological barrier of a machine 538 00:33:18,440 --> 00:33:21,480 Speaker 1: quote unquote causing deaths, and that that could be enough 539 00:33:21,520 --> 00:33:24,320 Speaker 1: to screw things up, because while you might be able 540 00:33:24,360 --> 00:33:28,760 Speaker 1: to statistically state fewer people died because there were autonomous cars, 541 00:33:28,800 --> 00:33:31,120 Speaker 1: the fact that the cars were autonomous and then people 542 00:33:31,240 --> 00:33:35,880 Speaker 1: died has a psychological effect. You're thinking, oh, it's a 543 00:33:35,920 --> 00:33:40,400 Speaker 1: machine killing a person. Number two geo engineering. So this 544 00:33:40,520 --> 00:33:43,120 Speaker 1: is the use of science and technology to you know, 545 00:33:43,160 --> 00:33:46,040 Speaker 1: quote unquote hack the planet. This is one of those 546 00:33:46,080 --> 00:33:49,479 Speaker 1: ideas that's meant to help counteract problems like climate change. 547 00:33:49,920 --> 00:33:53,800 Speaker 1: So the scientific consensus tells us, yes, there is climate change, 548 00:33:54,080 --> 00:33:57,240 Speaker 1: and yes it is largely due to human causes, chiefly 549 00:33:57,320 --> 00:33:59,720 Speaker 1: the increase of CEO two in the atmosphere along with 550 00:33:59,720 --> 00:34:04,600 Speaker 1: other greenhouse gasses. So by creating technologies designed to capture 551 00:34:04,680 --> 00:34:07,480 Speaker 1: and sequester carbon dioxide so that it's not in the 552 00:34:07,520 --> 00:34:13,600 Speaker 1: atmosphere anymore, we could help slow or maybe even stop 553 00:34:13,800 --> 00:34:17,719 Speaker 1: the process of climate change. But the thing is, we 554 00:34:17,800 --> 00:34:21,799 Speaker 1: don't know for sure that some of these proposals would 555 00:34:21,840 --> 00:34:26,439 Speaker 1: work or what the other consequences of those actions might be. 556 00:34:27,000 --> 00:34:29,680 Speaker 1: One of the uh, well, some of the proposed methods 557 00:34:29,719 --> 00:34:33,120 Speaker 1: would definitely have nasty consequences. We know that. So, for example, 558 00:34:33,400 --> 00:34:37,920 Speaker 1: one possibility would be, let's put some more iron in 559 00:34:37,960 --> 00:34:41,320 Speaker 1: the oceans in order to spur algae blooms to soak 560 00:34:41,440 --> 00:34:46,000 Speaker 1: up carbon dioxide, which that could help you could actually 561 00:34:46,040 --> 00:34:49,440 Speaker 1: soak up CEO two from the atmosphere. However, that would 562 00:34:49,480 --> 00:34:54,080 Speaker 1: also have a huge negative impact on the ocean itself. 
563 00:34:54,080 --> 00:34:57,279 Speaker 1: You would create dead zones in the ocean because of 564 00:34:57,320 --> 00:35:00,319 Speaker 1: this algae bloom, and that means messing up a really 565 00:35:00,360 --> 00:35:03,960 Speaker 1: complicated ecosystem that lots of life forms depend upon, including 566 00:35:04,000 --> 00:35:06,960 Speaker 1: life forms that aren't, you know, directly in the ocean, 567 00:35:07,200 --> 00:35:11,480 Speaker 1: so you would have a ripple effect, fittingly enough, since 568 00:35:11,480 --> 00:35:14,000 Speaker 1: we're talking about water. You could have die-offs happening in 569 00:35:14,040 --> 00:35:18,160 Speaker 1: other places in the world as a result of this, 570 00:35:18,760 --> 00:35:21,920 Speaker 1: and not even places where there are oceans, 571 00:35:21,960 --> 00:35:27,760 Speaker 1: so unintended consequences could be really, really nasty. Another tactic 572 00:35:28,239 --> 00:35:32,000 Speaker 1: besides trying to capture CO2 is to find ways 573 00:35:32,040 --> 00:35:34,840 Speaker 1: to reflect more of the Sun's energy back off into 574 00:35:34,840 --> 00:35:37,680 Speaker 1: space without it getting absorbed by the Earth and then 575 00:35:37,960 --> 00:35:42,040 Speaker 1: emitted back into the Earth's atmosphere. The message here is 576 00:35:42,080 --> 00:35:44,759 Speaker 1: that the cure could end up being worse, or at 577 00:35:44,800 --> 00:35:47,880 Speaker 1: least just as bad as the disease, though in a 578 00:35:47,920 --> 00:35:50,600 Speaker 1: different way, and ultimately we'd be taking a lot of 579 00:35:50,600 --> 00:35:54,279 Speaker 1: potentially irreversible actions without a full appreciation of what would 580 00:35:54,360 --> 00:36:01,880 Speaker 1: happen. However, if we employ these responsibly and carefully, 581 00:36:01,960 --> 00:36:05,000 Speaker 1: it's likely that we could use them as part of 582 00:36:05,040 --> 00:36:08,719 Speaker 1: an overall plan to help reduce climate change. Experts warn 583 00:36:08,840 --> 00:36:11,799 Speaker 1: us that these are not magic bullets. These are not 584 00:36:12,320 --> 00:36:17,560 Speaker 1: going to miraculously reverse the course that we've seen over 585 00:36:17,600 --> 00:36:21,719 Speaker 1: the last few decades. They would at best be a 586 00:36:21,840 --> 00:36:28,160 Speaker 1: good additional strategy alongside reducing our carbon dioxide emissions, 587 00:36:28,200 --> 00:36:31,480 Speaker 1: which would be our most important action to take. We 588 00:36:31,560 --> 00:36:34,040 Speaker 1: can't just assume that we're going to come up with 589 00:36:34,080 --> 00:36:38,319 Speaker 1: a technological solution that will allow us to continue to 590 00:36:38,400 --> 00:36:43,440 Speaker 1: behave the way we've been behaving and magically erase the 591 00:36:43,480 --> 00:36:47,920 Speaker 1: consequences of those actions. That's just not a realistic outlook 592 00:36:48,040 --> 00:36:51,400 Speaker 1: at this point. Number one on the list from Chris 593 00:36:51,400 --> 00:36:56,040 Speaker 1: and Dave was internet surveillance, which kind of ties back 594 00:36:56,080 --> 00:36:59,480 Speaker 1: into that Google problem I mentioned earlier, but it goes 595 00:36:59,640 --> 00:37:03,240 Speaker 1: well beyond that. So internet surveillance comes in all sorts 596 00:37:03,239 --> 00:37:06,759 Speaker 1: of forms.
Right, the social media we use, the actions we 597 00:37:06,840 --> 00:37:09,680 Speaker 1: take on there, all of that's getting tracked. 598 00:37:10,160 --> 00:37:12,920 Speaker 1: All of that is going into various categories based 599 00:37:12,960 --> 00:37:17,280 Speaker 1: on our profiles, so that we are encountering the ads 600 00:37:17,320 --> 00:37:20,200 Speaker 1: that are most closely tied to our behavior, so that we 601 00:37:20,239 --> 00:37:23,080 Speaker 1: have the most incentive to click on 602 00:37:23,160 --> 00:37:25,239 Speaker 1: those ads or to act on them in some way, 603 00:37:25,320 --> 00:37:28,960 Speaker 1: thus benefiting the company that makes the social platform as 604 00:37:28,960 --> 00:37:31,240 Speaker 1: well as the company that's doing the advertising, and whatever 605 00:37:31,320 --> 00:37:34,880 Speaker 1: company is ultimately in charge of the product or service 606 00:37:34,920 --> 00:37:38,480 Speaker 1: that is being advertised. So there's that. There are companies 607 00:37:38,520 --> 00:37:42,879 Speaker 1: like Google that are taking this to extremes, tracking all 608 00:37:42,880 --> 00:37:45,040 Speaker 1: sorts of behaviors for all sorts of stuff, so that 609 00:37:45,040 --> 00:37:48,520 Speaker 1: whenever we're doing searches, we're also getting served up search 610 00:37:48,560 --> 00:37:52,359 Speaker 1: results that are catered to us more and more, which 611 00:37:52,640 --> 00:37:55,280 Speaker 1: in some ways is good, you know, you're getting stuff 612 00:37:55,320 --> 00:37:57,560 Speaker 1: that is more relevant to you, and in other ways 613 00:37:57,600 --> 00:38:01,160 Speaker 1: comes across as super creepy, because it means that there's 614 00:38:01,200 --> 00:38:04,960 Speaker 1: this enormous corporation out there that might know you better 615 00:38:05,000 --> 00:38:08,440 Speaker 1: than you know yourself, and that's kind of worrisome. But 616 00:38:08,480 --> 00:38:11,520 Speaker 1: then you have other things like the NSA, 617 00:38:11,760 --> 00:38:15,640 Speaker 1: which, as was famously revealed a few years ago, 618 00:38:15,680 --> 00:38:18,920 Speaker 1: was tapping into all sorts of 619 00:38:18,960 --> 00:38:25,440 Speaker 1: different communication tools to spy on communications between lots 620 00:38:25,480 --> 00:38:29,840 Speaker 1: of different people in an effort to promote national security. 621 00:38:29,880 --> 00:38:33,520 Speaker 1: But you could argue it was also a huge violation 622 00:38:33,560 --> 00:38:37,479 Speaker 1: of people's expectation of privacy, and that it also took 623 00:38:37,520 --> 00:38:43,280 Speaker 1: almost a presumed-guilty approach and applied it to absolutely 624 00:38:43,360 --> 00:38:48,200 Speaker 1: everybody who uses these forms of communication, from cellular 625 00:38:48,239 --> 00:38:52,919 Speaker 1: phones to website traffic to all sorts of stuff. Then 626 00:38:52,960 --> 00:38:58,080 Speaker 1: you have hackers, either state-backed hackers who are 627 00:38:58,120 --> 00:39:03,600 Speaker 1: working on behalf of a government, or independent 628 00:39:03,680 --> 00:39:06,520 Speaker 1: hackers who are just trying to gather as much information 629 00:39:06,520 --> 00:39:09,640 Speaker 1: as possible to exploit it. Maybe that information is your 630 00:39:09,640 --> 00:39:13,279 Speaker 1: bank account, or maybe it's your Social Security number.
It 631 00:39:13,280 --> 00:39:15,839 Speaker 1: could be all sorts of stuff, but ultimately they're doing 632 00:39:15,840 --> 00:39:17,919 Speaker 1: it so they can make money, and they have no 633 00:39:18,040 --> 00:39:22,560 Speaker 1: real concern about what happens to you and your information. 634 00:39:23,040 --> 00:39:27,640 Speaker 1: Information is valuable, whether it's for a government trying 635 00:39:27,680 --> 00:39:32,240 Speaker 1: to protect itself or the citizens that it represents. Ideally, 636 00:39:32,280 --> 00:39:33,919 Speaker 1: we would like to at least see a government try 637 00:39:33,960 --> 00:39:39,520 Speaker 1: to protect the people it represents, though maybe that's being naive. 638 00:39:39,760 --> 00:39:42,879 Speaker 1: Or it's a corporation that's doing 639 00:39:42,880 --> 00:39:45,440 Speaker 1: this in order to make a profit, or a hacker 640 00:39:45,440 --> 00:39:48,080 Speaker 1: that's doing this in order to make a profit in 641 00:39:48,120 --> 00:39:51,920 Speaker 1: an even more unethical way than the corporations are. Either way, information 642 00:39:51,960 --> 00:39:54,960 Speaker 1: is valuable. It's also a good reminder of why you 643 00:39:54,960 --> 00:39:57,920 Speaker 1: should do stuff like use VPNs when you can, so 644 00:39:57,960 --> 00:40:02,359 Speaker 1: that you can protect your activity from prying eyes and 645 00:40:02,520 --> 00:40:06,280 Speaker 1: not have to worry quite so much about your every 646 00:40:06,320 --> 00:40:11,480 Speaker 1: move being spied upon. Don't use public WiFi, especially 647 00:40:11,520 --> 00:40:16,200 Speaker 1: to do anything sensitive. You know, be careful, be responsible 648 00:40:16,840 --> 00:40:22,320 Speaker 1: with your browsing activities so that you limit the ways 649 00:40:22,440 --> 00:40:25,120 Speaker 1: that you can be exploited. I'm not saying don't make 650 00:40:25,239 --> 00:40:28,279 Speaker 1: use of these various platforms and social media. I'd be a 651 00:40:28,360 --> 00:40:30,640 Speaker 1: hypocrite if I did. I use them all the time. 652 00:40:31,640 --> 00:40:36,000 Speaker 1: Just know what you're getting into and be aware of 653 00:40:36,040 --> 00:40:40,000 Speaker 1: what could potentially happen. If you're comfortable with those consequences, 654 00:40:40,719 --> 00:40:42,799 Speaker 1: I say you're all good. If you're not, then you 655 00:40:42,840 --> 00:40:46,040 Speaker 1: need to think about ways you can change your behavior, 656 00:40:46,080 --> 00:40:49,920 Speaker 1: whether it's using VPNs or backing off on using the 657 00:40:49,920 --> 00:40:53,000 Speaker 1: Internet as much, or whatever it may be, so that 658 00:40:53,640 --> 00:40:57,400 Speaker 1: you feel you're using technology to benefit you as opposed 659 00:40:57,440 --> 00:41:01,239 Speaker 1: to having other people receive a benefit because of you. 660 00:41:02,600 --> 00:41:05,400 Speaker 1: Don't be used, in other words. Now, as I say that, 661 00:41:05,920 --> 00:41:07,480 Speaker 1: I think you should also go to t public dot 662 00:41:07,560 --> 00:41:09,880 Speaker 1: com slash tech stuff. We've got a merchandise store. It's awesome. 663 00:41:09,920 --> 00:41:14,640 Speaker 1: You can go buy stuff, and every purchase helps the show. Again, hypocrite, 664 00:41:14,760 --> 00:41:19,359 Speaker 1: I know, but that's why it's spooky.
Hey, guys, if 665 00:41:19,400 --> 00:41:20,879 Speaker 1: you want to get in touch with me and talk 666 00:41:20,880 --> 00:41:25,319 Speaker 1: about how scared you are after this spooky episode, you 667 00:41:25,360 --> 00:41:28,480 Speaker 1: can go over to tech stuff podcast dot com. That's 668 00:41:28,480 --> 00:41:31,919 Speaker 1: our website. You'll find all the contact information there. Check 669 00:41:31,960 --> 00:41:36,880 Speaker 1: it out, and I will haunt you again really soon. 670 00:41:43,080 --> 00:41:45,480 Speaker 1: For more on this and thousands of other topics, 671 00:41:45,520 --> 00:41:56,640 Speaker 1: visit how stuff works dot com