Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you, y'all? I'm working on an episode that's still in progress. I wanted to get something to you today while I still work on that episode, and I thought, well, it's spooky season. I've got quite a few episodes of TechStuff that relate to spooky topics, and I thought this would be a fun one. It originally published on Halloween, October thirty-first, twenty eighteen. It is titled TechStuff's Spooky Halloween Spectacular, and I hope you enjoy as I revisit an article from my old employer, HowStuffWorks: Ten Scary Modern Technologies. Enjoy this classic episode of TechStuff, and I'll have a new one for you on Wednesday.

Speaker 2: Hey there, everybody, and welcome to TechStuff. I am your host, Jonathan Strickland, the executive vampire here at HowStuffWorks. Happy Halloween, if you're listening to this on Halloween. Otherwise, that was the worst possible introduction I could have given for this episode, and I should feel badly about it, but I don't. Nope.

Speaker 1: But in celebration of Halloween, we are going to take a look at spooky tech, or, you know, scary tech, or at least some technology that could at least be kind of creepy. See, here's the problem I'm running into, y'all. I've already done an episode about tech in haunted house attractions, so that's finished. I already did one on ghost hunting technology years ago, where I gave a skeptical view of what all that was about. And sooner or later, after you do a few years of a technology podcast, you start to run out of the fun stuff like that. So for this episode, I turned to the HowStuffWorks website and I pulled up a classic article, which is also always a dangerous thing when you're talking technology. But this article is titled Ten Scary Modern Technologies.
Speaker 1: It was co-written by Dave Roos and my former editor and TechStuff co-host Chris Pollette. So get ready for some tech that goes bump in the night, I guess. And actually, to be more serious, the technology I'm listing here does not fall into the supernatural or ghostly categories at all. The technology represents stuff that could be unsettling, or at worst could cause massive, enormous problems due to overreach or unintended consequences. So there are some serious entries in here, even though I'm having a little bit of fun with the presentation. So let us get started.

Speaker 1: Number ten on the list: Holosonics and the Audio Spotlight system. This is more in the creepy, unsettling vein. Technology that focuses sound, that's what this is all about. It's focusing sound into a narrow beam, which provides the opportunity for ultra-targeted sound for stuff like advertising. It's not necessarily creepy, but it's not necessarily welcome either. But imagine that you're walking through a store, and just as you're passing in front of a certain store item, let's say it's Cookie Crisp cereal, you start hearing a voice whispering to you: hey, that Cookie Crisp looks pretty good. It might seem really weird, this little disembodied voice, and as soon as you get past a certain point, you can't hear it anymore. You just have this experience as you're going through the store: you keep hearing these targeted sounds in very narrow spots, and as soon as you're out of those spots, you can't hear them. That sounds pretty creepy, right? Well, how does it work? According to the company, the secret is in the size of the sound waves compared to the size of the sound source. The company states that if you have a sound source that is much larger than the comparable size of the sound waves it's producing, you get more directionality out of your sound, so you can focus it more like a beam.
Speaker 1: So if you were to create loudspeakers that are much, much, much larger than the sound waves they're producing, and sound waves can measure from a few inches up to several feet in size, you could direct those sound waves in a beam, more like a directional beam, rather than having them propagate outward equally in all directions. But that would not work very well for something like targeted advertising, because you would have to have these enormous speakers perched behind the Cheerios, and that would be very distracting. So this company has gone with an approach that has these devices producing ultrasonic beams of sound. Ultrasonic sound waves are very, very tiny. They're also normally imperceptible, at least directly; they are imperceptible to us. We cannot hear in this frequency range. They do have a very strong directionality as a result of the way they are generated. So, according to the company, as the ultrasonic beam travels through the air, the inherent properties of the air cause the ultrasound to change shape in a predictable way. This gives rise to frequency components in the audible band, which can be accurately predicted and therefore precisely controlled. By generating the correct ultrasonic signal, we can create within the air itself any sound desired. So, in other words, it's the interaction of these ultrasonic frequencies with the air itself that causes the ultrasonic frequencies to change, and then you produce these audible frequencies. And because this is all very controllable in a predictable environment, you can produce whatever sounds you want. This is based off the work of an inventor named Woody Norris. He demonstrated this technology at a TED talk in the early two thousands. Rather than generating the audible sound on the face of the speaker as a traditional speaker would, where the speaker is moving a diaphragm in and out and pushing air around, the ultrasonic emitter creates a column of the air itself to act like a speaker, which is pretty nifty. I'm sorry, I meant spooky.
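To make that "the air demodulates the beam" idea a little more concrete, here is a minimal sketch in Python of the general principle as I understand it: an audible signal is amplitude-modulated onto an ultrasonic carrier, and a simple square-law term stands in for the air's nonlinear response. This is my own toy illustration, not Holosonics' actual signal chain.

```python
import numpy as np

# Toy illustration of a parametric "audio spotlight": an audible tone rides on
# an ultrasonic carrier, and a crude square-law nonlinearity stands in for the
# air, which "demodulates" the beam back into audible sound.

fs = 192_000                               # sample rate high enough for a 40 kHz carrier
t = np.arange(0, 0.05, 1 / fs)             # 50 ms of signal
audio = np.sin(2 * np.pi * 1_000 * t)      # the audible content: a 1 kHz tone
carrier = np.sin(2 * np.pi * 40_000 * t)   # ultrasonic carrier at 40 kHz

modulated = (1 + 0.5 * audio) * carrier    # amplitude modulation: audio rides on the carrier

# Stand-in for the air's nonlinearity: squaring creates sum and difference frequencies.
nonlinear = modulated ** 2

# Keep only the audible band with a simple moving-average low-pass filter.
window = 9
kernel = np.ones(window) / window
recovered = np.convolve(nonlinear - nonlinear.mean(), kernel, mode="same")

# The strongest surviving component is the original 1 kHz tone.
spectrum = np.abs(np.fft.rfft(recovered))
freqs = np.fft.rfftfreq(len(recovered), 1 / fs)
print("strongest recovered frequency:", freqs[spectrum.argmax()], "Hz")
```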
Speaker 1: Number nine: DNA hacking. So back in two thousand and three, scientists finished mapping out the human genome. But that was obviously just the beginning. That's just a long list of base pairs. Next came the work to examine the genome closely and determine which base pairs, in the more than three billion pairs that make up the human genome, are responsible for different stuff, like our possibility of developing a disease, for example. Because if you could determine that, and if you could determine a way of changing that part of the DNA so that it's eliminated, maybe you could make someone not develop that disease. You could potentially wipe out certain diseases. And beyond that, what about hacking the human genome so that we can create designer human beings, human beings who have traits that we consider to be superior? Now that's the stuff of lots of science fiction and horror cautionary tales, stuff like Gattaca, saying, yeah, but who gets to decide what is superior, and what happens to people who aren't able to take advantage of that technology, and what happens to the people who do take advantage of that technology? There are a lot of ethical questions around it. However, beyond that, there are other scary things to consider. DNA hacking could allow for individually focused biological warfare. So imagine that you're living in a world where someone can get access to your DNA. Maybe they get a skin sample or some hair or something. They're able to get something from you that has traces of your DNA in it, and they're able to analyze your DNA and see what makes you you and look for any vulnerabilities. Maybe they find out that you have a predisposed weakness to a particular type of virus, and then they engineer a virus to attack you specifically.
Speaker 1: That was actually the premise of a twenty twelve article in The Atlantic, and that article hypothesized a future in which the President of the United States could be targeted and assassinated through the use of a particularly nasty virus that was tailor-made for the president. Yikes. Or imagine more than that: maybe a widespread plague, like Stephen King's The Stand, engineered through a deep understanding of DNA paired with a really crappy containment strategy. So how realistic is all of that? Well, in the short term, it's probably not terribly realistic. It's certainly possible, but not necessarily plausible. It does require a deep understanding of DNA and a means to manipulate it easily in order to pull it off. We've seen some advances in those areas. There are a lot more ways that we can manipulate DNA than there used to be, but it's not exactly easy to do. It's easier, but that's a matter of degrees. However, you can make a convincing argument that it would not require too deep an understanding to cause real harm unintentionally, and that would be a very difficult argument to counter. And while DNA hacking could produce all sorts of different futuristic results, we do already have a culture of biohackers, also sometimes known as grinders, who have taken body alteration to new places. These folks are not working on a DNA level. They're not changing themselves fundamentally in that way. Instead, they're doing alterations, you know, upgrades. So one example would be, there are people who have chosen to have small magnets implanted under the skin of their fingertips. So there are these little nubby magnets sticking out on their fingertips. It's kind of weird, right? Well, what's the purpose of that? Well, because of the effects of magnetism and electromagnetism, they would actually be able to sense when they were near magnetic or electromagnetic fields. So it'd be kind of like Spider-Man's spidey sense. They'd feel tugging on their fingertips as they pass through it.
Speaker 1: So instead of detecting danger like Spider-Man would, you'd be able to tell, like, whether or not electricity was flowing through a conductor, for example, because you could bring your fingers close to that conductor. Maybe it's some wires. You bring them close, and if you start feeling the pulses from an alternating magnetic field, a fluctuating magnetic field, that would tell you: oh, electricity, alternating current, is flowing through here, because I can feel it in my fingertips. That would be something you normally would not be able to feel, so you kind of have a sixth sense due to having these magnets implanted in your fingertips. And people have actually done this. However, a word of caution: from what I understand, these operations are pretty painful. They usually are not done with anesthetic. There's not really a place you can go that's medically licensed to do this, because it's not a standard medical procedure. As far as I know, there are no accredited medical facilities that do it. I think any doctor who did practice this would be in danger of being barred from practice. So you're usually talking about entrepreneurial body modification specialists, you know, like piercers, who will do this kind of operation. And anytime you're talking about introducing something foreign to the body, you're also raising other risks like infection, so you've got to be super duper careful about that kind of thing. In other words, you're not gonna find me putting magnets under my fingertips anytime soon.

Speaker 1: Number eight on the list: cyber war. Whoo boy.
Speaker 1: Well, the scenario the article specifically lays out is an all-out cyber warfare attack, where one nation targets another nation's infrastructure, like its power grid system or water system. And sure enough, the US Department of Homeland Security has reported that there's lots of evidence to show that various hackers have infiltrated critical infrastructure with such tactics, like Russian hackers getting into things like power plants and gas pipelines, and that's terrifying. And before that report had even made the news, that was in twenty eighteen, security experts had for years been warning that Chinese hackers have been infiltrating American infrastructure. CNN reported on that back in twenty fourteen. So this is not exactly a new story. It's a continuing story that continues to be really problematic and concerning. So the infiltration part is a reality. We know for a fact hackers from other countries have infiltrated various systems. There have been traces found of their activities, so we know that there have at least been people snooping around our systems. Whether or not they've installed anything to help shut stuff down is another matter, but they've definitely been there, and they appear to have been there from places like Russia and China. That does not automatically mean that the infiltration was state directed, in other words, that it was a government-backed project, but the general consensus seems to be that these were likely the activities of state-backed groups of hackers. What about shutting everything down? Is that realistic? Well, it could be more challenging to do, but not necessarily impossible. Many utilities have started installing self-healing systems. A self-healing system isn't quite as cool as it sounds. It's not like it's Wolverine in infrastructure form, but it does involve having a system that, when it detects problems, automatically tries to reroute services to get around those problems.
Speaker 1: So with a power grid, it might be: if a smart power grid system detects that there's a short for some reason, or there's a break in connectivity at some point, it may try to reroute power to work around that as much as possible. That could help confound a cyber warfare attack a little bit. At least it might mitigate the impact of an attack, though preventing one entirely? Maybe not. But then there are also attacks on other systems, not just power grids and water and gas, which are all scary enough. There's the evidence that showed Russian hackers were targeting election systems in the United States leading up to the twenty sixteen elections. I talked about this in a recent episode of TechStuff, so I'm not going to go all the way through it again. But the really insidious thing about those attacks is they don't even have to be super successful to be effective. If you can sow doubt in the minds of a nation's citizens as to the validity of any given election, you have undermined the very foundation of that nation's government. A government that doesn't have the confidence of its population is on shaky ground and has to move more and more towards totalitarianism in order to maintain power. If you don't have any confidence that your system works, then you don't have any confidence in your government at all. So cyber war is something that is continuing right now. It is actually happening. It is already in place. And obviously I've given examples of how the US has been the target of cyber warfare, but don't forget, the US has engaged in it too. The United States has not been just the poor victim in all these cases. The United States has certainly played a hand in cyber warfare activities. One example would be Stuxnet, the computer virus that was designed to sabotage uranium enrichment facilities in Iran. So this is not something that everyone else is doing and the United States is the victim. This is something that everyone is doing as much as they can, and stepping it up as much as they can.
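As an aside on the self-healing grid idea mentioned a moment ago, here is a minimal sketch of the rerouting concept, assuming we model the grid as a simple graph. The node names and topology are made up purely for illustration; this is not any utility's real control software.

```python
from collections import deque

# Toy sketch of "self-healing" rerouting: model the grid as a graph, and when a
# line fails, search for an alternate path so service can be restored.

grid = {                       # adjacency list: which stations connect to which
    "plant": ["sub_a", "sub_b"],
    "sub_a": ["plant", "sub_c"],
    "sub_b": ["plant", "sub_c"],
    "sub_c": ["sub_a", "sub_b", "neighborhood"],
    "neighborhood": ["sub_c"],
}

def find_path(graph, start, goal, failed_lines=frozenset()):
    """Breadth-first search that ignores any line marked as failed."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for nxt in graph[node]:
            line = frozenset((node, nxt))
            if nxt not in seen and line not in failed_lines:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no way to reroute; the outage can't be healed automatically

# Normal routing, then the plant-to-sub_a line "fails" and power is rerouted.
print(find_path(grid, "plant", "neighborhood"))
print(find_path(grid, "plant", "neighborhood",
                failed_lines={frozenset(("plant", "sub_a"))}))
```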
Speaker 1: Well, we have a lot more scary technology to talk about, but I need to have a sip of tea to comfort myself, so let's take a quick break to thank our sponsor.

Speaker 1: Number seven is the technological singularity. Now, out of all the science fiction ideas I find particularly interesting, this one ranks near the top of them. The singularity refers to a general concept that could be brought about in several different ways, but from a very high level, the idea goes something like this. Imagine that technology has advanced to the point that the newest stuff coming out is already designing the next generation of stuff. And imagine that the gaps between these generations are getting smaller and smaller, and eventually you reach a point where the present is defined by constant change, and that's the only way you can define it, because things change so quickly. It is almost impossible to describe the present in any coherent way. That's how quickly everything is evolving. The thing that would fuel this would be the emergence of superhuman intelligence, in most of the scenarios that involve the technological singularity. However, that would not necessarily require just a computer AI. That's one possible version: a pure artificial intelligence that has superhuman capabilities in processing information. This is your basic Deep Thought from The Hitchhiker's Guide to the Galaxy, or Skynet from The Terminator. That would be a scenario in which we humans have created an AI so powerful we are unable to control it, and then it goes on to redefine our world in ways that we could not anticipate, because we cannot operate on the same level as this superhuman intelligence. But that is not necessarily the only pathway to the technological singularity. Another way might be that humans find a way to boost our own intelligence, and thus we evolve beyond what we traditionally think of as being human.
Speaker 1: We might do this through a deeper understanding of biology. We could boost our intelligence that way, going back to the concept of DNA hacking and things related to that. Or it might involve using technology to create cyborg-like beings, where we merge with technology on some level, and with tech and biology working together, we boost our intelligence to a new level and achieve superhuman intelligence that way. So, bottom line, is this a possibility? Well, it beats me. But there are a lot of super smart people who are on either side of this issue. So some people say the singularity is essentially a foregone conclusion. It will happen; the only question is when will it happen. But there are other people who say there might be some fundamental barriers that we're not likely to get over, and those barriers will block the singularity from ever happening. One of the frequent criticisms of various singularity scenarios is that a lot of it rests on the belief that we're going to see progress continue at a pace that's similar to what Moore's law has observed with computer processing power. And the thing is, that pace may not be realistic or sustainable, or even applicable to some technologies. So Moore's law applies to the processing power of computers, generally speaking, but it may not apply to other elements that would be necessary to bring about superhuman intelligence, because processing power by itself is not intelligence. You also have to have the software side. You've got a lot of other pieces that have to be in place. However, if it is possible, it could very well mean the end of the human race as we know it today. Now, that doesn't necessarily mean it's the end of humanity entirely. It just may mean that humanity will transition into something different. So it could be a new beginning. It's not necessarily the end of everything, but still spooky.
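Since the singularity argument leans on Moore's-law-style growth, here is a quick back-of-the-envelope sketch of how aggressively that exponential assumption compounds. It uses the popular "doubling every two years" reading of Moore's law and the Intel 4004's roughly 2,300 transistors as a 1971 starting point; it is my own illustration, not a claim from the article.

```python
# Back-of-the-envelope: compound the "doubling every two years" assumption
# from a 1971-era chip and see how quickly the numbers explode.

transistors = 2_300              # roughly the Intel 4004's transistor count
for year in range(1971, 2021, 2):
    print(year, f"{transistors:,}")
    transistors *= 2             # one doubling per two-year step

# 25 uninterrupted doublings multiply the count by about 33 million, which is
# why any hard physical or economic barrier derails projections built on it.
```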
Speaker 1: Number six: Google Glass. I mean, that was the number on the article: Google Glass. You remember Google Glass, the augmented reality glasses. Back when Chris and Dave were working on this article, Google Glass was still a real thing. It was poised to become an actual consumer product outside of the relatively small sample of bleeding-edge adopters, aka glassholes. We were sometimes called that. I was one of them; I had a pair of Google Glass. The glasses were part augmented reality headset, part user interface for the world around you. They included a camera, which could pull in information, and a Bluetooth chip, so the glasses could communicate with a paired mobile device, and through that mobile device the glasses could also access information like GPS coordinates. So these glasses, while giving you potentially incredible access to information about the world around you, could also gather information about the world around you for the benefit of Google. And suddenly this company could potentially access information from cameras mounted on faces all over the world. And the glasses also had microphones, because, you know, you could use voice commands to make your glasses do stuff. But that also meant that, in theory, Google could listen in as well. Not just see everything, but hear everything, which raises some big privacy concerns, not just for the people wearing the glasses, but for everyone around those people. And Google makes money with information, so you would effectively be generating product for Google to sell by wearing a pair of those glasses and walking around everywhere. Google would be the head of a big surveillance state, far more invasive than a network of closed-circuit cameras, if such technology was used unethically. And while Google Glass is now far more limited in its rollout, you know, you only see it in a few industries at this point, the company, Google, is still very much in the business of knowing where its customers are.
Speaker 1: In August twenty eighteen, numerous tech journals reported on a study that was conducted by Douglas C. Schmidt of Vanderbilt University, which said that a stationary Android phone running Chrome in the background would ping Google servers with location data three hundred and forty times in a twenty-four-hour period. Even if you turned the location history feature off on the phone, the phones were still sending location data back to Google, according to the study. Google, by the way, has disputed the findings of this study. Then there are the numerous personal assistant devices that are out there, including Google Home, that also are always listening for commands. And of course, that's just Google. There are other companies out there, like Apple and Amazon, that also have technologies similar to these. All of these could be monitoring users and sending data back so that the companies might later exploit that information for profit, usually to sell you stuff, to advertise directly to you. But that's still pretty creepy, right? Even if the companies are not actively exploiting that information, the fact that the data could be transmitted and recorded at all is problematic, though again, I should say the companies generally say they do not record user data in that way. So, whew, that's a relief, right?

Speaker 1: Number five: drones. Drones are legit creepy. Many drones have cameras mounted on them. That does allow potential filmmakers unprecedented access and capabilities. Now a low-budget film can have the equivalent of an expensive crane shot at a fraction of what it would cost to rent and operate a film crane with all the associated personnel, the safety features, all that kind of stuff. You could reduce all that down to an operator and a drone, and it would be much less expensive. But it also means a drone operator who's using one of these devices could use it to do stuff like peeping, super darn creepy, spying on neighbors and stuff. That's just the consumer technology version of drones, and that is already troubling.
Speaker 1: But then you have to remember there are also tons of military-grade drones, and they're being used to do everything from surveillance work to active strikes on military targets: weaponized drones. These drones may be semi-autonomous or completely under the control of an operator who's potentially hundreds of miles away. They greatly extend the surveillance capabilities of various government agencies and divisions, from military to law enforcement. In the United States, Congress passed a bill in twenty twelve giving the Federal Aviation Administration, or FAA, the authority to draw up rules for commercial and police drones in US airspace. The FAA hasn't been super fast to share that information. That prompted the Electronic Frontier Foundation, or EFF, to sue the FAA under the Freedom of Information Act to at least share a list of the public entities and private drone manufacturers that applied to fly drones in the United States, as well as thousands of pages related to license applications. But the FAA didn't, you know, explain how those entities were planning on using the drones. So that's a problem.

Speaker 1: Number four: 3D printers. Well, I just recently talked about MakerBot, and maybe you think the scariest thing about 3D printers is that they could lead to minor burns as you try to deal with melting plastic. In fact, there are other things to worry about as well, like fake ATM facades. See, back in twenty eleven, some thieves used a 3D printer to create a false front for an ATM terminal, and they installed a skimmer on some ATMs. So unsuspecting customers would come up, and it would look like a real ATM front; you couldn't necessarily tell immediately that there was a projection on there that was a false front. So they would put their ATM card into this, they would then type in their PIN, and meanwhile the skimmer was actually scanning the data on the card and recording it along with the PIN, allowing the thieves to steal more than four hundred thousand dollars in the process.
Speaker 1: And, you know, all it took was a 3D printer to create that convincing ATM facade. In twenty thirteen, a guy named Cody Wilson made headlines when he published files for his 3D-printed Liberator handgun, which fired .380 ammunition. That scared a ton of people, because it meant that anyone who had access to a 3D printer and the appropriate materials could have the opportunity to make an untraceable weapon. There'd be no background check required, because you just print the thing out. And it also raised other possible problems. If the plastic were not of sufficiently good quality, it could mean that the printed gun would not contain the explosive reaction properly when you fire the bullet, so it could end up breaking apart in a person's hand, causing injury to the person holding the gun. So even if the person who had printed the 3D gun did so as kind of just a proof of concept, and they're firing at, you know, just a paper target, there's the possibility that the gun itself could explode or fracture as part of this if it weren't made out of sufficiently strong material, and severe injury could follow. Cody Wilson, I should add, has recently been in the news again. He resigned his role as director of Defense Distributed, the company that he used to promote the design and distribution of 3D-printable gun files. It's unrelated to the gun side of things, so that organization still exists and continues to push Wilson's vision even without Wilson at the helm. I've got some more spooky things to talk about, but I grow tired, so I need to go and drink some blood, and by blood, I mean O-grade tea. I'll be right back after this word from our sponsors.

Speaker 1: Number three: driverless cars. This is still a big worry. Back when Chris and Dave were writing this, driverless cars were still kind of coming out of the very, very limited testing situations.
Speaker 1: Google was the best-known version. But now we actually have at least some rudimentary automated car systems out there in the real world, like Tesla's Autopilot, which has contributed to some notable accidents, including a fatality. Tesla has said that this system was not meant to be used as a driverless car solution, but people have still done it, because you hear a word like autopilot and you want to test it out, I guess. So that has been an issue. It has raised concerns that perhaps this autonomous car technology is nowhere near ready for full rollout, which I think most companies that are working on the technology would say is correct. They're still working on the tech to make it a reality. There have been a lot of stories published about various problems, philosophical problems, that you need to resolve in order to make a consistent and predictable autonomous car solution, one of them being the infamous trolley problem. The basic version of the trolley problem is that you've got an out-of-control trolley. It's going down some tracks, and there's a switch that will allow you to change the pathway of the trolley. If the trolley continues on the path where it's going now, it's going to collide with a group of people. If you throw the switch, it will change the direction of the trolley, and it will collide with one person. Do you throw the switch? If you do nothing, then maybe you feel like, I'm not involved; therefore my decision did not affect anybody. It just played out the way it was going to play out. If I throw the switch, is it that I'm actively condemning that other person to death? Other people would say no, by not choosing, you've actively chosen to condemn the first group to death. There are variations of this problem. Maybe you say, all right, well, you've got a choice.
Speaker 1: You can not throw the switch, which means the out-of-control trolley will eventually come to a stop, but everyone in the trolley is going to die as a result of this accident. Or you can throw the switch, and the trolley will hit somebody, but the trolley will stop and everyone who's in the trolley will survive. So either you actively kill someone, but everyone in the trolley lives, or you don't do anything and everyone in the trolley dies, but the person who is just innocently crossing the pathway, they live. These sorts of ethical problems are things that people talk about and debate amongst themselves, but it turns out to be an actual practical problem when you're designing autonomous car systems, because eventually you have to build in some sort of decision-making system for a car in the event that it encounters a non-avoidable car accident, where all the problems have aligned in such a way that there is no possible outcome in which there isn't a car accident. So what does the car do? Does it behave in a way that preserves the life of the person riding in the car? Does it behave in a way that preserves the lives of the people in the surrounding area? There are a lot of tough questions to answer. So MIT published a paper from a quiz called the Moral Machine. This quiz was designed to find out what people thought should be given priority in these situations, and it was distributed globally across social media platforms. They recorded forty million ethical decisions in total. Global preferences had certain consistencies, by the way. Generally speaking, people preferred to spare human lives over animal lives. So if you had an option where you can make a choice, but this animal will die as a result, or you can make a choice and that human will die, people would say, well, it's a shame, but I'd rather choose where the animal dies. Also, people in general would choose to spare more lives rather than fewer lives.
Speaker 1: So in my example with the group of people versus the one person, more people would feel comfortable with the one person losing their life as opposed to the group of people. Also, people in general want to spare children's lives more than adult lives. So this really just showed people's preferences in ethical decisions. However, the study authors stated that experts really should be the ones to make the final call when designing these algorithms; just going by public preference alone may not be the best decision. However, another thing to remember is that in twenty sixteen, thirty-seven thousand people died from car accidents. If driverless cars can reduce that number year by year, if we can find a way to make driverless cars more reliable so that it reduces the overall number of fatal accidents, that would be an incredible thing, and it would definitely be a good argument in support of autonomous cars. It is, however, possible that there's a psychological barrier to a machine, quote unquote, causing deaths, and that that could be enough to screw things up. Because while you might be able to statistically state that fewer people died because there were autonomous cars, the fact that the cars were autonomous and then people died has a psychological effect. You're thinking, oh, it's a machine killing a person.
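For what it's worth, the survey preferences described above can be turned into a toy decision rule. The sketch below is purely my own illustration of those reported preferences, with arbitrary weights; it is not MIT's methodology or any carmaker's actual algorithm.

```python
# Toy scoring rule reflecting the Moral Machine survey's reported preferences:
# spare humans over animals, more lives over fewer, children over adults.
# The weights are arbitrary and purely illustrative.

WEIGHTS = {"child": 4, "adult": 3, "animal": 1}

def harm_score(outcome):
    """Total 'cost' of an outcome, given as a list of those who would be harmed."""
    return sum(WEIGHTS[kind] for kind in outcome)

def choose(outcome_a, outcome_b):
    """Pick the outcome with the lower harm score."""
    return outcome_a if harm_score(outcome_a) <= harm_score(outcome_b) else outcome_b

# Swerving harms one adult pedestrian; staying the course harms three passengers.
print(choose(["adult"], ["adult", "adult", "child"]))   # -> ['adult']
```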
But the thing is, we 563 00:35:14,800 --> 00:35:18,759 Speaker 1: don't know for sure that some of these proposals would 564 00:35:18,840 --> 00:35:23,279 Speaker 1: work or what the other consequences of those actions might be. 565 00:35:24,000 --> 00:35:26,879 Speaker 1: Well, some of the proposed methods would 566 00:35:26,880 --> 00:35:28,520 Speaker 1: definitely have nasty consequences. 567 00:35:28,560 --> 00:35:29,080 Speaker 2: We know that. 568 00:35:29,280 --> 00:35:33,400 Speaker 1: So, for example, one possibility would be, let's put some 569 00:35:34,000 --> 00:35:36,840 Speaker 1: more iron in the oceans in order to spur algae 570 00:35:36,920 --> 00:35:42,239 Speaker 1: blooms to soak up carbon dioxide, which could help; 571 00:35:42,280 --> 00:35:45,200 Speaker 1: you could actually soak up CO two from the atmosphere. However, 572 00:35:46,000 --> 00:35:50,279 Speaker 1: that would also have a huge negative impact on the 573 00:35:50,320 --> 00:35:53,280 Speaker 1: ocean itself. You would create dead zones in the ocean 574 00:35:53,320 --> 00:35:56,920 Speaker 1: because of this algae bloom, and that means messing up 575 00:35:56,920 --> 00:36:00,480 Speaker 1: a really complicated ecosystem that lots of lifeforms depend upon, 576 00:36:00,520 --> 00:36:03,960 Speaker 1: including life forms that aren't, you know, directly in the ocean. 577 00:36:04,200 --> 00:36:08,440 Speaker 1: So you would have a ripple effect. Fittingly enough, since 578 00:36:08,440 --> 00:36:10,920 Speaker 1: we're talking about water, you could have die-offs happening in 579 00:36:11,040 --> 00:36:15,160 Speaker 1: other places in the world as a result of this, 580 00:36:15,760 --> 00:36:18,920 Speaker 1: and, you know, not even places where there are oceans. 581 00:36:18,920 --> 00:36:24,760 Speaker 1: So unintended consequences could be really, really nasty. Another tactic 582 00:36:25,239 --> 00:36:29,000 Speaker 1: besides trying to capture CO two is to find ways 583 00:36:29,040 --> 00:36:31,800 Speaker 1: to reflect more of the Sun's energy back off into 584 00:36:31,840 --> 00:36:34,720 Speaker 1: space without it getting absorbed by the Earth and then 585 00:36:34,960 --> 00:36:39,040 Speaker 1: emitted back into the Earth's atmosphere. The message here is 586 00:36:39,080 --> 00:36:41,719 Speaker 1: that the cure could end up being worse or at 587 00:36:41,800 --> 00:36:44,880 Speaker 1: least just as bad as the disease, though in a 588 00:36:44,920 --> 00:36:47,560 Speaker 1: different way. And ultimately we'd be taking a lot of 589 00:36:47,600 --> 00:36:51,279 Speaker 1: potentially irreversible actions without a full appreciation of what was 590 00:36:51,320 --> 00:36:58,520 Speaker 1: going to happen. However, if we employ these responsibly and carefully, 591 00:36:58,960 --> 00:37:02,000 Speaker 1: it's likely that we could use them as part of 592 00:37:02,040 --> 00:37:05,680 Speaker 1: an overall plan to help reduce climate change. Experts warn 593 00:37:05,800 --> 00:37:08,640 Speaker 1: us that these are not magic bullets. These are not 594 00:37:09,320 --> 00:37:14,560 Speaker 1: going to miraculously reverse the course that we've seen over 595 00:37:14,560 --> 00:37:18,799 Speaker 1: the last few decades. They would at best be a 596 00:37:18,840 --> 00:37:25,160 Speaker 1: good additional strategy along with reducing our carbon dioxide emissions, 597 00:37:25,160 --> 00:37:28,440 Speaker 1: which would be our most important action to take.
We 598 00:37:28,520 --> 00:37:31,000 Speaker 1: can't just assume that we're going to come up with 599 00:37:31,080 --> 00:37:35,319 Speaker 1: a technological solution that will allow us to continue to 600 00:37:35,400 --> 00:37:40,440 Speaker 1: behave the way we've been behaving and magically erase the 601 00:37:40,480 --> 00:37:44,880 Speaker 1: consequences of those actions. That's just not a realistic outlook 602 00:37:45,040 --> 00:37:48,359 Speaker 1: at this point. Number one on the list from Chris 603 00:37:48,400 --> 00:37:53,040 Speaker 1: and Dave was Internet surveillance, which kind of ties back 604 00:37:53,080 --> 00:37:56,520 Speaker 1: into that Google problem I mentioned earlier, but it goes 605 00:37:56,640 --> 00:38:00,919 Speaker 1: well beyond that. So Internet surveillance comes in all forms, right? 606 00:38:01,520 --> 00:38:05,120 Speaker 1: The social media we use, the actions we take 607 00:38:05,120 --> 00:38:07,480 Speaker 1: on there, all of that's getting tracked. All of that 608 00:38:07,600 --> 00:38:10,759 Speaker 1: is going into various buckets based on our profiles, 609 00:38:10,840 --> 00:38:15,560 Speaker 1: so that we are encountering the ads that are closest 610 00:38:15,640 --> 00:38:17,719 Speaker 1: tied to our behavior, so that we have the most 611 00:38:18,680 --> 00:38:21,080 Speaker 1: incentive to click on those ads or to 612 00:38:21,120 --> 00:38:23,960 Speaker 1: act on them in some way, thus benefiting the company 613 00:38:24,000 --> 00:38:26,600 Speaker 1: that makes the social platform as well as the company 614 00:38:26,600 --> 00:38:30,440 Speaker 1: that's doing the advertising, and whatever company is ultimately in 615 00:38:30,560 --> 00:38:33,080 Speaker 1: charge of the product or service that is being advertised. 616 00:38:33,800 --> 00:38:36,600 Speaker 1: So there's that. There are companies like Google that are 617 00:38:37,040 --> 00:38:40,920 Speaker 1: taking this to extremes, tracking all sorts of behaviors for 618 00:38:41,000 --> 00:38:43,200 Speaker 1: all sorts of stuff, so that whenever we're doing searches, 619 00:38:43,239 --> 00:38:47,360 Speaker 1: we're also getting served up search results that are catered 620 00:38:47,520 --> 00:38:50,759 Speaker 1: to us more and more, which in some ways is good, 621 00:38:51,120 --> 00:38:53,360 Speaker 1: you know, you're getting stuff that is more relevant to you, 622 00:38:53,800 --> 00:38:56,520 Speaker 1: and in other ways comes across as super creepy because 623 00:38:57,160 --> 00:39:00,520 Speaker 1: it means that there's this enormous corporation out there that 624 00:39:00,840 --> 00:39:03,839 Speaker 1: might know you better than you know yourself, and that's 625 00:39:03,920 --> 00:39:07,560 Speaker 1: kind of worrisome. But then you have other things like 626 00:39:07,920 --> 00:39:11,560 Speaker 1: the NSA, which a couple of years ago was famously 627 00:39:11,600 --> 00:39:15,879 Speaker 1: revealed to have been tapping into all sorts of 628 00:39:15,960 --> 00:39:22,520 Speaker 1: different communication tools to spy on communications between lots of 629 00:39:22,520 --> 00:39:26,920 Speaker 1: different people in an effort to promote national security.
But 630 00:39:28,120 --> 00:39:30,640 Speaker 1: you could argue it was also a huge violation of 631 00:39:30,680 --> 00:39:34,920 Speaker 1: people's expectation of privacy, and that it also took almost 632 00:39:34,920 --> 00:39:40,880 Speaker 1: a presumed-guilty approach and applied it to absolutely everybody 633 00:39:40,960 --> 00:39:45,759 Speaker 1: who uses these forms of communication, from cellular phones to 634 00:39:46,920 --> 00:39:51,400 Speaker 1: website traffic to all sorts of stuff. Then you have hackers, 635 00:39:51,920 --> 00:39:57,120 Speaker 1: either state-backed hackers who are working to spy 636 00:39:57,320 --> 00:40:01,520 Speaker 1: on behalf of a government or independent hackers who are 637 00:40:01,560 --> 00:40:04,200 Speaker 1: just trying to gather as much information as possible to 638 00:40:04,239 --> 00:40:07,960 Speaker 1: exploit it. Maybe that information is your bank account, or 639 00:40:08,000 --> 00:40:10,759 Speaker 1: maybe it's your Social Security number; it could be all 640 00:40:10,800 --> 00:40:13,160 Speaker 1: sorts of stuff, but ultimately they're doing it so they 641 00:40:13,200 --> 00:40:16,160 Speaker 1: can make money and they have no real concern about 642 00:40:16,160 --> 00:40:21,320 Speaker 1: what happens to you and your information. Information is valuable, 643 00:40:21,760 --> 00:40:26,239 Speaker 1: whether it's for a government trying to protect itself 644 00:40:26,480 --> 00:40:29,680 Speaker 1: or the citizens that it represents. Ideally, we would like 645 00:40:29,680 --> 00:40:31,440 Speaker 1: to at least see a government try to protect the 646 00:40:31,440 --> 00:40:37,319 Speaker 1: people it represents, though maybe that's being naive. Or maybe it's 647 00:40:37,360 --> 00:40:40,439 Speaker 1: a, you know, a corporation that's doing this in order 648 00:40:40,520 --> 00:40:43,000 Speaker 1: to make a profit, or a hacker that's doing this 649 00:40:43,080 --> 00:40:45,600 Speaker 1: in order to make a profit in an even more 650 00:40:45,680 --> 00:40:50,480 Speaker 1: unethical way than the corporations are. Information is valuable. It's 651 00:40:50,480 --> 00:40:52,720 Speaker 1: also a good reminder of why you should do stuff 652 00:40:52,760 --> 00:40:55,399 Speaker 1: like use VPNs when you can so that you can 653 00:40:55,440 --> 00:41:00,279 Speaker 1: protect your activities from prying eyes and not have to 654 00:41:00,320 --> 00:41:04,560 Speaker 1: worry quite so much about your every move being spied upon. 655 00:41:05,640 --> 00:41:09,600 Speaker 1: Don't use public Wi-Fi, especially to do anything sensitive. 656 00:41:10,160 --> 00:41:15,480 Speaker 1: You know, be careful, be responsible with your browsing activities 657 00:41:16,239 --> 00:41:20,880 Speaker 1: so that you limit the ways that you can be exploited. 658 00:41:20,920 --> 00:41:23,960 Speaker 1: I'm not saying don't make use of these various platforms, 659 00:41:24,000 --> 00:41:26,480 Speaker 1: social media. I'd be a hypocrite if I did. I 660 00:41:26,600 --> 00:41:30,840 Speaker 1: use them all the time. Just know what you're getting 661 00:41:30,840 --> 00:41:34,839 Speaker 1: into and be aware of what could potentially happen. If 662 00:41:34,880 --> 00:41:38,800 Speaker 1: you're comfortable with those consequences, I say you're all good. 663 00:41:38,840 --> 00:41:41,600 Speaker 1: If you're not, then you need to think about ways 664 00:41:41,600 --> 00:41:45,239 Speaker 1: you can change your behavior.
Whether it's using VPNs or 665 00:41:45,320 --> 00:41:48,880 Speaker 1: backing off on using the Internet as much, or whatever 666 00:41:48,880 --> 00:41:52,719 Speaker 1: it may be, so that you feel you're using technology 667 00:41:52,800 --> 00:41:56,240 Speaker 1: to benefit you as opposed to having other people receive 668 00:41:56,280 --> 00:42:01,000 Speaker 1: a benefit because of you. Don't be used, in other words. 669 00:42:01,520 --> 00:42:03,640 Speaker 1: Now, as I say that, I think you should also 670 00:42:03,640 --> 00:42:05,440 Speaker 1: go to TeePublic dot com slash tech Stuff. We've 671 00:42:05,480 --> 00:42:07,520 Speaker 1: got a merchandise store. It's awesome. You can go buy 672 00:42:07,560 --> 00:42:12,000 Speaker 1: stuff and every purchase helps the show. Again, hypocrite, I know, 673 00:42:12,880 --> 00:42:16,759 Speaker 1: but that's why spooky. Hey guys, if you want to 674 00:42:16,760 --> 00:42:18,760 Speaker 1: get in touch with me and talk about how scared 675 00:42:18,840 --> 00:42:22,919 Speaker 1: you are after this spooky episode, you can go over 676 00:42:23,000 --> 00:42:26,719 Speaker 1: to tech stuff podcast dot com. That's our website. You'll 677 00:42:26,719 --> 00:42:29,960 Speaker 1: find all the contact information there. Check it out, and 678 00:42:30,520 --> 00:42:41,239 Speaker 1: I will haunt you again really soon. Tech Stuff is 679 00:42:41,239 --> 00:42:45,799 Speaker 1: an iHeartRadio production. For more podcasts from iHeartRadio, visit the 680 00:42:45,840 --> 00:42:49,480 Speaker 1: iHeartRadio app, Apple Podcasts, or wherever you listen to your 681 00:42:49,520 --> 00:42:53,800 Speaker 1: favorite shows.