Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and a love of all things tech, and as we close in on Halloween, I thought it would be fun to do some, you know, Halloween-themed episodes of TechStuff. Unfortunately, I'm also in the middle of doing about fifty other things, including some interesting changes for TechStuff that will be coming up in the new year. Interesting changes meaning you're gonna get more of it. There's going to be not just, uh, the classic TechStuff episodes that you've come to expect, where I do the deep dives into technology, but some other stuff as well that I'm really excited to talk about more as we get closer. And I'm also working on the launch of a brand new podcast, or rather the revival of a podcast I used to host just in my spare time. It's coming up pretty soon, in the middle of December. I'll talk more about that as we get closer as well. So the short of it is that all of these things have taken a lot of my time and attention in the short term, and, uh, it turns out the scariest word out of all the scary words around Halloween is deadline. I mean, the word dead is right there. Spooky. And so I decided that rather than give you a halfhearted episode, something that was put together in a rush and not researched properly, I would rather play an episode that I recorded back in two thousand eighteen. This episode originally published on October thirty-first, Halloween, two thousand eighteen. It is called TechStuff's Spooky Halloween Spectacular. And I'll have a little more to say afterward, but I hope you enjoy this episode from two thousand eighteen. Let's listen in.

Speaker 1: Hey there, everybody, and welcome to TechStuff. I am your host, Jonathan Strickland, executive vampire here at HowStuffWorks. Happy Halloween! That is, if you're listening to this on Halloween.
Otherwise, that was the worst possible introduction I could have given for this episode, and I should feel badly about it, but I don't. Nope. But in celebration of Halloween, we are going to take a look at spooky tech, or, you know, scary tech, or at least some technology that could be kind of creepy. See, here's the problem I'm running into, y'all: I've already done an episode about tech in haunted house attractions, so that's finished. I already did one on ghost hunting technology years ago, where I gave a skeptical view of what all that was about. And sooner or later, after you do a few years of a technology podcast, you start to run out of the fun stuff like that. So for this episode, I turned to the HowStuffWorks website and pulled up a classic article, which is also always a dangerous thing when you're talking technology. This article is titled 10 Scary Modern Technologies. It was co-written by Dave Roos and my former editor and TechStuff co-host Chris Pollette. So get ready for some tech that goes bump in the night, I guess. And actually, to be more serious, the technology I'm listing here does not fall into the supernatural or ghostly categories at all. The technology represents stuff that could be unsettling or, at worst, could cause massive, enormous problems due to overreach or unintended consequences. So there are some serious entries in here, even though I'm having a little bit of fun with the presentation. So let us get started.

Number ten on the list: Holosonics and the Audio Spotlight system. This is more in the creepy, unsettling vein. Technology that focuses sound, that's what this is all about. It's focusing sound into a narrow beam, which provides the opportunity for ultra-targeted sound for stuff like advertising. It's not necessarily creepy, but not necessarily welcome either.
But imagine that you're walking through a store, and just as you're passing in front of a certain store item, let's say it's Cookie Crisp cereal, you start hearing a voice whispering to you: "Hey, that Cookie Crisp looks pretty good." It might seem really weird, this little disembodied voice, and as soon as you get past a certain point, you can't hear it anymore. You just have this experience as you're going through the store: you keep hearing these targeted sounds in very narrow spots, and as soon as you're out of those spots, you can't hear them. That sounds pretty creepy, right? Well, how does it work? According to the company, the secret is in the size of the sound waves compared to the size of the sound source. The company states that if you have a sound source that is much larger than the comparable size of the sound waves it's producing, you get more directionality out of your sound, so you can focus it more like a beam. So if you were to create loudspeakers that are much, much, much larger than the sound waves they're producing, and sound waves can measure from a few inches up to several feet in size, you could direct those sound waves in a beam, more like a directional beam, rather than having them propagate outward equally in all directions. But that would not work very well for something like targeted advertising, because you would have to have these enormous speakers perched behind the Cheerios, and that would be very distracting. So this company has gone with an approach that has these devices producing ultrasonic beams of sound. Ultrasonic sound waves are very, very tiny. They're also normally imperceptible, at least directly. They are imperceptible to us; we cannot hear in this frequency range. They do have a very strong directionality as a result of the way they are generated. So, according to the company, as the ultrasonic beam travels through the air, the inherent properties of the air cause the ultrasound to change shape in a predictable way.
This gives rise to frequency components in the audible band, which can be accurately predicted and therefore precisely controlled. By generating the correct ultrasonic signal, we can create within the air itself any sound desired. So, in other words, it's the interaction of these ultrasonic frequencies with the air itself that causes the ultrasonic frequencies to change, and that's what produces the audible frequencies. And because this is all very controllable in a predictable environment, you can produce whatever sounds you want. This is based off the work of an inventor named Woody Norris. He demonstrated this technology at a TED talk in the early two thousands. Rather than generating the audible sound on the face of the speaker as a traditional speaker would, where the speaker is moving a diaphragm in and out and pushing air around, the ultrasonic emitter creates a column of the air itself to act like a speaker, which is pretty nifty. I'm sorry, I meant spooky.
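If you want to see the size comparison at the heart of this, here's a minimal sketch in Python. The speed of sound, the thirty-centimeter panel size, and the sixty-kilohertz carrier are round-number assumptions for illustration, not Holosonics' actual specs; the point is just that a source beams sound when it is large compared to the wavelength, and wavelength equals the speed of sound divided by frequency.

```python
# Minimal sketch: why ultrasonic emitters can beam sound while
# ordinary speakers can't. All values are round-number assumptions.

SPEED_OF_SOUND = 343.0  # meters per second, in air at about 20 C

def wavelength(frequency_hz: float) -> float:
    """Wavelength in meters for a sound wave in air."""
    return SPEED_OF_SOUND / frequency_hz

# A source focuses sound into a beam roughly when its size is large
# compared to the wavelength it emits.
for label, freq, source_size_m in [
    ("bass note (100 Hz)", 100, 0.3),       # 30 cm woofer
    ("speech (1 kHz)", 1_000, 0.3),
    ("ultrasound (60 kHz)", 60_000, 0.3),   # 30 cm emitter panel
]:
    wl = wavelength(freq)
    ratio = source_size_m / wl
    print(f"{label}: wavelength {wl:.4f} m, source/wavelength ratio {ratio:.1f}")

# The 100 Hz wave is over 3 m long, far bigger than the speaker, so it
# spreads in all directions; the 60 kHz wave is under 6 mm, tiny
# compared to the panel, so it travels as a narrow beam.
```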
Number nine: DNA hacking. So back in two thousand three, scientists finished mapping out the human genome. But that was obviously just the beginning; that's just, uh, a long list of base pairs. Next came the work to examine the genome closely and determine which base pairs, out of the more than three billion pairs that make up the human genome, are responsible for different stuff, like our, uh, chances of developing a disease, for example. Because if you could determine that, and if you could determine a way of changing that part of the DNA so that it's eliminated, maybe you could make someone not develop that disease. You could potentially wipe out certain diseases. And beyond that, what about hacking the human genome so that we can create designer human beings, human beings who have traits that we consider to be superior? Now that's the stuff of lots of science fiction and horror cautionary tales, stuff like Gattaca, saying, yeah, but who gets to decide what is superior? And what happens to people who aren't able to take advantage of that technology, and what happens to the people who do take advantage of that technology? There are a lot of ethical questions around it.

However, beyond that, there are other scary things to consider. DNA hacking could allow for individually focused biological warfare. So imagine that you're living in a world where someone can get access to your DNA. Maybe they get a skin sample or some hair or something; they're able to get something from you that has traces of your DNA in it, and they're able to analyze your DNA and see what makes you you and look for any vulnerabilities. Maybe they find out that you have a predisposed weakness to a particular type of virus, and then they engineer a virus to attack you specifically. That was actually the premise of a two thousand twelve article in The Atlantic, and that article hypothesized a future in which the President of the United States could be targeted and assassinated through the use of a particularly nasty virus that was tailor-made for the president. Yikes. Or imagine more than that: maybe a widespread plague, à la Stephen King's The Stand, engineered through a deep understanding of DNA paired with a really crap containment strategy. So how realistic is all of that? Well, in the short term, it's probably not terribly realistic. It's certainly possible, but not necessarily plausible. It does require a deep understanding of DNA and a means to manipulate it easily in order to pull it off. We've seen some advances in those areas. There are a lot more ways that we can manipulate DNA than there used to be, but it's not exactly easy to do. It's easier, sure, but that's a matter of degrees. However, you can make a convincing argument that it would not require too deep an understanding to cause real harm unintentionally, and that would be a very difficult argument to counter.
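As a toy illustration of what examining the genome at the base-pair level means, here's a short Python sketch. The sequences and the variant position are completely invented for the example; real genomics works over billions of base pairs with far more sophisticated tooling.

```python
# Toy illustration (not real genomics): checking a DNA snippet for a
# single-letter variant, the kind of base-pair-level question
# researchers ask of the ~3 billion pairs in the human genome.
# The sequences here are invented for the example.

reference = "ATGGCGTACGTTAGC"   # hypothetical reference sequence
sample    = "ATGGCGTACATTAGC"   # hypothetical sample from an individual

# Compare position by position to find single-letter differences.
variants = [
    (i, ref_base, sample_base)
    for i, (ref_base, sample_base) in enumerate(zip(reference, sample))
    if ref_base != sample_base
]
print(variants)  # [(9, 'G', 'A')] -> one variant at position 9
```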
And while DNA hacking could produce all sorts of different futuristic results, we do already have a culture of biohackers, also sometimes known as grinders, who have taken body alteration to new places. These folks are not working on a DNA level; they're not changing themselves fundamentally in that way. Instead, they're doing alterations, you know, upgrades. So one example would be: there are people who have chosen to have small magnets implanted under the skin of their fingertips. So they have these little nubby magnets sitting in their fingertips. Kind of weird, right? Well, what's the purpose of that? Well, because of the effects of magnetism and electromagnetism, they would actually be able to sense when they were near magnetic or electromagnetic fields. So it'd be kind of like Spider-Man's spidey sense; they'd feel tugging on their fingertips as they passed through a field. But instead of detecting danger like Spider-Man would, you'd be able to tell, like, whether or not electricity was flowing through a conductor, for example, because you could bring your fingers close to that conductor. Maybe it's some wires: you bring your fingers close, and if you start feeling the pulses from an alternating magnetic field, a fluctuating magnetic field, that would tell you, oh, electricity, alternating current, is flowing through here, because I can feel it in my fingertips. That would be something you normally would not be able to feel. So you kind of have a sixth sense due to having these magnets implanted in your fingertips. And people have actually done this. However, a word of caution: from what I understand, these operations are pretty painful. They usually are not done with anesthetic. There's not really a place you can go that's medically, you know, licensed to do this, because it's not a standard medical procedure. As far as I know, there are no accredited medical facilities that do it. I think any doctor who did perform this would run the risk of being barred from practice.
So you're usually talking about enterprising body modification specialists, you know, like piercers, who will do this kind of operation. And any time you're talking about introducing something foreign to the body, you're also raising other risks, like infection, so you've got to be super duper careful about that kind of thing. In other words, you're not gonna find me putting magnets under my fingertips anytime soon.

Number eight on the list: cyber war. Oh boy. Well, the scenario the article specifically lays out is an all-out cyber warfare attack where one nation targets another nation's infrastructure, like its power grid system or water system. And sure enough, the U.S. Department of Homeland Security has reported that there's lots of evidence to show that various hackers have infiltrated critical infrastructure with exactly those kinds of tactics, like Russian hackers getting into things like power plants and gas pipelines, and that's terrifying. And before that report had even made the news, security experts had for years been warning that Chinese hackers have been infiltrating American infrastructure; CNN reported on that in two thousand fourteen. So this is not exactly a new story. It's a continuing story that continues to be really problematic and concerning. So the infiltration part is a reality. We know for a fact hackers from other countries have infiltrated various systems. There have been traces found of their activities, so we know that there have at least been people snooping around our systems. Whether or not they've installed anything to help shut stuff down is another matter, but they've definitely been there, and they appear to have been there from places like Russia and China. That does not automatically mean that the infiltration was state directed, in other words, that it was a government-backed project, but that seems to be the general consensus: that these were likely the activities of a state-backed group of hackers. What about shutting everything down? Is that realistic?
Well, it could be more challenging to do, but not necessarily impossible. Many utilities have started installing self-healing systems. A self-healing system isn't quite as cool as it sounds; it's not like it's Wolverine in infrastructure form. But it does involve having a system that, when it detects problems, automatically tries to reroute services to get around those problems. So with a power grid, it might be that if a smart power grid system detects there's a short for some reason, or there's a break in connectivity at some point, it may try to reroute power to work around that as much as possible. That could help confound a cyber warfare attack a little bit. At least it might mitigate the impact of an attack, though preventing one entirely, maybe not.
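Here's a minimal sketch of the rerouting idea behind those self-healing systems, in Python. The toy grid, the substation names, and the breadth-first search are illustrative assumptions; real utility control software is far more complex than this.

```python
# Minimal sketch of "self-healing": when a link fails, automatically
# search for an alternate route. Toy graph; names are invented.
from collections import deque

grid = {  # adjacency list: which substations connect to which
    "plant": ["sub_a", "sub_b"],
    "sub_a": ["plant", "city"],
    "sub_b": ["plant", "sub_c"],
    "sub_c": ["sub_b", "city"],
    "city":  ["sub_a", "sub_c"],
}

def find_route(graph, start, goal, failed_links=frozenset()):
    """Breadth-first search for a path that avoids failed links."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for nxt in graph[node]:
            link = frozenset((node, nxt))
            if nxt not in seen and link not in failed_links:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no route survives the failures

print(find_route(grid, "plant", "city"))
# ['plant', 'sub_a', 'city']
print(find_route(grid, "plant", "city",
                 failed_links={frozenset(("sub_a", "city"))}))
# ['plant', 'sub_b', 'sub_c', 'city'] -- rerouted around the break
```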
But then there are also attacks on other systems, not just power grids and water and gas, which are all scary enough. There's the evidence that showed Russian hackers were targeting election systems in the United States leading up to the twenty sixteen elections. I talked about this in a recent episode of TechStuff, so I'm not gonna go all the way through it again. But the really insidious thing about those attacks is they don't even have to be super successful to be effective. If you can sow doubt in the minds of a nation's citizens as to the validity of any given election, you have undermined the very foundation of that nation's government. A government that doesn't have the confidence of its population is on shaky ground and has to move more and more toward totalitarianism in order to maintain power. If you don't have any confidence that your system works, then you don't have any confidence in your government at all. So cyber war is something that is continuing right now. It is actually happening. It is already in place. And obviously I've given examples of how the US has been the target of cyber warfare, but don't forget, the US has engaged in it too. The United States has not been, you know, just the poor victim in all these cases. The United States has certainly played a hand in cyber warfare activities. One example would be Stuxnet, the computer virus that was designed to sabotage uranium enrichment facilities in Iran. So this is not something that everyone else is doing and the United States is the victim. This is something that everyone is doing as much as they can and stepping it up as much as they can. Well, we have a lot more scary technology to talk about, but I need to have a sip of tea to comfort myself. Let's take a quick break to thank our sponsor.

Speaker 1: Number seven is the technological singularity. Now, out of all the science fiction ideas I find particularly interesting, this one ranks near the top. The singularity refers to a general concept that could be brought about in several different ways, but from a very high level, the idea goes something like this. Imagine that technology has advanced to the point that the newest stuff coming out is already designing the next generation of stuff. And imagine that the gaps between these generations are getting smaller and smaller, and eventually you reach a point where the present is defined by constant change, and that's the only way you can define it, because things change so quickly it is almost impossible to describe the present in any coherent way. That's how quickly everything is evolving. The thing that would fuel this, in most of the scenarios that involve the technological singularity, would be the emergence of superhuman intelligence. However, that would not necessarily require just, uh, you know, a computer AI. That's one possible version: pure artificial intelligence that has superhuman capabilities in processing information. This is your basic Deep Thought from The Hitchhiker's Guide to the Galaxy, or Skynet from The Terminator, that scenario.
This would be a scenario in which we humans have created an AI so powerful we are unable to control it, and then it goes on to redefine our world in ways that we could not anticipate, because we cannot operate on the same level as this superhuman intelligence. But that is not necessarily the only pathway to the technological singularity. Another way might be that humans find a way to boost our own intelligence, and thus we evolve beyond what we traditionally think of as being human. We might do this through a deeper understanding of biology; we could boost our intelligence that way, going back to the concept of DNA hacking and things related to that. Or it might involve using technology to create cyborg-like beings, where we merge with technology on some level, and with tech and biology working together, we boost our intelligence to new levels and achieve superhuman intelligence that way. So, bottom line, is this a possibility? Well, it beats me. But there are a lot of super smart people on either side of this issue. Some people say the singularity is essentially a foregone conclusion: it will happen, and the only question is when. But there are other people who say there might be some fundamental barriers that we're not likely to get over, and those barriers will block the singularity from ever happening. One of the frequent criticisms of various, uh, singularity scenarios is that a lot of it rests on the belief that we're going to see progress continue at a pace that's similar to what Moore's law has observed with computer processing power. And the thing is, that pace may not be realistic, or sustainable, or even applicable to some technologies. Moore's law applies to the processing power of computers, generally speaking, but it may not apply to other elements that would be necessary to bring about superhuman intelligence, because processing power by itself is not intelligence. You also have to have the software side, and you've got a lot of other pieces that have to be in place.
However, if it is possible, it could very well mean the end of the human race as we know it today. Now, that doesn't necessarily mean it's the end of humanity entirely. It just may mean that humanity will transition into something different, so it could be a new beginning. It's not necessarily the end of everything. But still: spooky.

Number six: Google Glass. I mean, that was the number on the article. You remember Google Glass, the augmented reality glasses. Back when Chris and Dave were working on this article, Google Glass was still a real thing. It was poised to become an actual consumer product outside of the relatively small sample of bleeding-edge adopters, a.k.a. Glassholes, as we were sometimes called, and I was one of them. I had a pair of Google Glass. The glasses were part augmented reality headset, part user interface for the world around you. They included a camera, which could pull in information, and a Bluetooth chip, so the glasses could communicate with a paired mobile device, and through that mobile device the glasses could also share information like GPS coordinates. So these glasses, while giving you potentially incredible access to information about the world around you, could also gather information about the world around you for the benefit of Google, and suddenly this company could potentially access information from cameras mounted on faces all over the world. And the glasses also had microphones, because, you know, you could use voice commands to make your glasses do stuff. But that also meant that, in theory, Google could listen in as well, not just see everything but hear everything, which raises some big privacy concerns, not just for the people wearing the glasses but for everyone around those people. And Google makes money with information, so you would effectively be generating product for Google to sell by wearing a pair of those glasses and walking around everywhere.
Google would be the head of a big surveillance state, far more invasive than a network of closed-circuit cameras, if such technology were used unethically. And while Google Glass is now far more limited in its rollout (you only see it in a few industries at this point), the company, Google, is still very much in the business of knowing where its customers are. In August two thousand eighteen, numerous tech journals reported on a study conducted by Douglas C. Schmidt of Vanderbilt University. He found that a stationary Android phone running Chrome in the background would ping Google servers with location data three hundred forty times in a twenty-four-hour period. Even if you turned the location history feature off on the phone, the phone was still sending location data back to Google, according to the study. Google, by the way, has disputed the findings of the study. Then there are the numerous personal assistant devices that are out there, including Google Home, that also are always listening for commands. And of course, that's just Google. There are other companies out there, like Apple and Amazon, that also have technologies similar to these. All of these could be monitoring users and sending data back so that the companies might later exploit that information for profit, usually to sell you stuff, to advertise directly to you. But that's still pretty creepy, right? Even if the companies are not actively exploiting that information, the fact that the data could be transmitted and recorded at all is problematic. Though again, I should say, the companies generally say they do not record user data in that way, so that's a relief, right?
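For a sense of scale on that Vanderbilt figure, here's a quick bit of arithmetic in Python, using only the three-hundred-forty-pings-per-day number cited above:

```python
# Quick arithmetic on the study's figure: 340 location pings in
# 24 hours works out to roughly one ping every four minutes.
pings_per_day = 340
minutes_per_day = 24 * 60
print(minutes_per_day / pings_per_day)  # ~4.2 minutes between pings
print(pings_per_day / 24)               # ~14.2 pings per hour
```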
Number five: drones. Drones are legit creepy. Many drones have cameras mounted on them, and that allows potential filmmakers unprecedented access and capabilities. Now a low-budget film can have the equivalent of an expensive crane shot at a fraction of what it would cost to rent and operate a film crane with all the associated personnel, the safety features, all that kind of stuff. You could reduce all that down to an operator and a drone, and it would be much less expensive. But it also means a drone operator who's using one of these devices could use it to do stuff like peeping (super darn creepy) to spy on neighbors and stuff. That's just the consumer technology version of drones, and that is already troubling. But then you have to remember there are also tons of military-grade drones, and they're being used to do everything from surveillance work to active strikes on military targets: weaponized drones. These drones may be semi-autonomous or completely under the control of an operator who's potentially hundreds of miles away. They greatly extend the surveillance capabilities of various government agencies and divisions, from military to law enforcement. In the United States, Congress passed a bill in two thousand twelve giving the Federal Aviation Administration, or FAA, the authority to draft rules for commercial and police drones in US airspace. The FAA hasn't been super fast to share that information. That prompted the Electronic Frontier Foundation, or EFF, to sue the FAA under the Freedom of Information Act to at least share a list of the public entities and private drone manufacturers that applied to fly drones in the United States, as well as thousands of pages related to license applications. But the FAA didn't, you know, explain how those entities were planning on using the drones. So that's a problem.

Number four: 3D printers. Well, I just recently talked about MakerBot, and maybe you think the scariest thing about 3D printers is that they could lead to minor burns as you try to deal with melting plastic. But in fact there are other things to worry about as well, like fake ATM facades.
See, back in two thousand eleven, some thieves used a 3D printer to create a false front for an ATM terminal, and they installed a skimmer on some ATMs. Unsuspecting customers would come up, and it would look like a real ATM front; you couldn't necessarily tell immediately that there was a false front sitting over the real one. So they would put their, uh, their ATM card into this, they would then type in their PIN, and meanwhile the skimmer was actually scanning the data on the card and recording it along with the PIN, allowing the thieves to steal more than four hundred thousand dollars in the process. And you know, all it took was a 3D printer to create that convincing ATM facade. In two thousand thirteen, a guy named Cody Wilson made headlines when he published files for his 3D-printed Liberator handgun, which fired .380 ammunition. That scared a ton of people, because it means that anyone who had access to a 3D printer and the appropriate materials could have the opportunity to make an untraceable weapon. There'd be no background check required, because you just print the thing out. And it also raised other possible problems. If the plastic were not of sufficiently good quality, it could mean that the printed gun would not contain the explosive reaction properly when you fire the bullet, so it could end up breaking apart in the person's hand, causing injury to the person holding the gun. So even if the person who had printed the 3D gun did so as kind of just a proof of concept, and they're firing at, you know, just a paper target, there's the possibility that the gun itself could explode or fracture if it weren't made out of sufficiently strong material, and severe injury could follow. Cody Wilson, I should add, has recently been in the news again.
He resigned his role as director of Defense Distributed, the company that he used to promote the design and distribution of 3D-printable gun files. It's unrelated to the gun side of things, so that organization still exists and continues to push Wilson's vision even without Wilson at the helm. I've got some more spooky things to talk about, but I grow tired, so I need to go and drink some blood. And by blood, I mean Earl Grey tea. I'll be right back after this word from our sponsors.

Speaker 1: Number three: driverless cars. This is still a big worry. Back when Chris and Dave were writing this, driverless cars were still kind of coming out of the very, very limited testing situations; Google's was the best known version. But now we actually have at least some rudimentary automated car systems out there in the real world, like Tesla's Autopilot, which has contributed to some notable accidents, including a fatality. Tesla has said that this system was not meant to be used as a hands-off, driverless car solution, but people have still done it, because you hear a word like autopilot and you want to test it out, I guess. So that has been an issue. It has raised concerns that perhaps this autonomous car technology is nowhere near ready for full rollout, which I think most companies that are working on the technology would say is correct: they're still working on the tech to make it a reality. There have been a lot of stories published about various problems, philosophical problems, that you need to resolve in order to make a consistent and predictable autonomous car solution, one of them being the infamous trolley problem. The basic version of the trolley problem is that you've got an out-of-control trolley. It's going down some tracks, and there's a switch that will allow you to change the pathway of the trolley.
And if the trolley continues on the path it's going down now, it's going to collide with a group of people. If you throw the switch, it will change the direction of the trolley, and it will collide with one person. Do you throw the switch? If you do nothing, then maybe you feel like you're not involved, that your decision did not affect anybody, that it just played out the way it was gonna play out, whereas if you throw the switch, you're actively condemning that other person to death. Other people would say no: by not choosing, you've actively chosen to condemn the first group to death. There are variations of this problem. Maybe you say, all right, well, you've got a choice. You can not throw the switch, which means the out-of-control trolley will eventually come to a stop, but everyone in the trolley is going to die as a result of this accident. Or you can throw the switch and the trolley will hit somebody, but the trolley will stop and everyone who's in the trolley will survive. So either you actively kill someone, but everyone in the trolley lives, or you don't do anything and everyone in the trolley dies, but the person who is just innocently crossing the pathway lives. These sorts of ethical problems are things that people talk about and debate amongst themselves, but it turns out to be an actual practical problem when you're designing autonomous car systems, because eventually you have to build in some sort of decision-making system for a car in the event that it encounters an unavoidable accident, where everything has aligned in such a way that there is no possible outcome in which there isn't a crash. So what does the car do? Does it behave in a way that preserves the life of the person riding in the car? Does it behave in a way that preserves the lives of the people in the surrounding area? There are a lot of tough questions to answer.
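Here's a toy Python sketch of what building in such a decision-making system could look like: score each available maneuver under an explicit policy and pick the least harmful one. The weights and scenarios are invented for illustration; this is not anyone's real deployed algorithm.

```python
# Toy sketch of an unavoidable-crash decision rule. The policy below
# (spare more lives, weight humans over animals, weight children
# extra) loosely mirrors the broad preferences the Moral Machine quiz
# reported; all numbers are invented for illustration.

def harm_score(outcome):
    """Lower is 'preferred' under this toy policy."""
    humans = outcome["adults"] + outcome["children"]
    return (
        humans * 10                  # weight human lives heavily
        + outcome["children"] * 5    # extra weight on children
        + outcome["animals"] * 1
    )

options = [
    {"name": "stay on course", "adults": 3, "children": 0, "animals": 0},
    {"name": "swerve left",    "adults": 1, "children": 0, "animals": 0},
    {"name": "swerve right",   "adults": 0, "children": 0, "animals": 2},
]

choice = min(options, key=harm_score)
print(choice["name"])  # "swerve right": fewest weighted casualties
```

The uncomfortable part, of course, is that someone has to pick those weights before the car ever leaves the factory.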
So MIT published a paper based on a quiz called the Moral Machine. This quiz was designed to find out what people thought should be given priority in these situations, and it was distributed globally across social media platforms. They recorded forty million ethical decisions in total. Global preference had certain consistencies, by the way. Generally speaking, people prefer to spare human lives over animal lives. So if you had an option where you can make a choice where this animal will die as a result, or you can make a choice where that human will die, people would say, well, it's a shame, but I'd rather choose the one where the animal dies. Also, people in general would choose to spare more lives rather than fewer. So in my example with the group of people versus the one person, more people would feel comfortable with the one person losing their life as opposed to the group of people. Also, people in general want to spare children's lives more than adult lives. So this really just showed people's preferences in ethical decisions. However, the study authors stated that experts really should be the ones to make the final call when designing these algorithms; just going by public preference alone may not be the best decision. However, another thing to remember is that tens of thousands of people die from car accidents every year in the United States. If driverless cars can reduce that number year by year, if we can find a way to make driverless cars more reliable so that it reduces the overall number of fatal accidents, that would be an incredible thing, and it would definitely be a good argument in support of autonomous cars. It is possible, however, that there's a psychological barrier around a machine, quote unquote, causing deaths, and that could be enough to screw things up, because while you might be able to statistically state that fewer people died because there were autonomous cars, the fact that the cars were autonomous and people still died has a psychological effect.
Number two: geoengineering. So 574 00:35:28,760 --> 00:35:31,600 Speaker 1: this is the use of science and technology to, you know, 575 00:35:31,680 --> 00:35:34,520 Speaker 1: quote unquote hack the planet. This is one of those 576 00:35:34,560 --> 00:35:37,960 Speaker 1: ideas that's meant to help counteract problems like climate change. 577 00:35:38,440 --> 00:35:42,280 Speaker 1: So the scientific consensus tells us, yes, there is climate change, 578 00:35:42,560 --> 00:35:45,720 Speaker 1: and yes, it is largely due to human causes, chiefly 579 00:35:45,800 --> 00:35:48,120 Speaker 1: the increase of CO2 in the atmosphere along 580 00:35:48,120 --> 00:35:52,120 Speaker 1: with other greenhouse gases. So by creating technologies designed to 581 00:35:52,640 --> 00:35:55,880 Speaker 1: capture and sequester carbon dioxide so that it's not in 582 00:35:55,920 --> 00:36:01,480 Speaker 1: the atmosphere anymore, we could help slow or maybe even 583 00:36:01,520 --> 00:36:05,440 Speaker 1: stop the process of climate change. But the thing is, 584 00:36:06,080 --> 00:36:10,000 Speaker 1: we don't know for sure that some of these proposals 585 00:36:10,040 --> 00:36:14,320 Speaker 1: would work, or what the other consequences of those actions 586 00:36:14,400 --> 00:36:17,480 Speaker 1: might be. Well, some of the 587 00:36:17,480 --> 00:36:20,920 Speaker 1: proposed methods would definitely have nasty consequences; we know that. So, 588 00:36:21,000 --> 00:36:25,719 Speaker 1: for example, one possibility would be, let's put some more 589 00:36:25,840 --> 00:36:28,880 Speaker 1: iron in the oceans in order to spur algae blooms 590 00:36:29,320 --> 00:36:33,840 Speaker 1: to soak up carbon dioxide, which 591 00:36:33,880 --> 00:36:36,680 Speaker 1: could actually pull CO2 out of the atmosphere. However, 592 00:36:37,560 --> 00:36:41,799 Speaker 1: that would also have a huge negative impact on the 593 00:36:41,800 --> 00:36:44,760 Speaker 1: ocean itself. You would create dead zones in the ocean 594 00:36:44,800 --> 00:36:48,400 Speaker 1: because of this algae bloom, and that means messing up 595 00:36:48,440 --> 00:36:52,000 Speaker 1: a really complicated ecosystem that lots of life forms depend upon, 596 00:36:52,040 --> 00:36:55,440 Speaker 1: including life forms that aren't, you know, directly in the ocean. 597 00:36:55,719 --> 00:36:59,960 Speaker 1: So you would have a ripple effect. Fittingly enough, since 598 00:37:00,000 --> 00:37:02,440 Speaker 1: we're talking about water, you could have die-offs happening in 599 00:37:02,520 --> 00:37:06,680 Speaker 1: other places in the world as a result of this, 600 00:37:07,239 --> 00:37:10,400 Speaker 1: and, you know, not even places where there are oceans. 601 00:37:10,440 --> 00:37:16,280 Speaker 1: So unintended consequences could be really, really nasty. Another tactic, 602 00:37:16,719 --> 00:37:20,480 Speaker 1: besides trying to capture CO2, is to find ways 603 00:37:20,520 --> 00:37:23,320 Speaker 1: to reflect more of the Sun's energy back off into 604 00:37:23,360 --> 00:37:26,320 Speaker 1: space without it getting absorbed by the Earth and then 605 00:37:26,440 --> 00:37:30,560 Speaker 1: emitted back into the Earth's atmosphere.
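For a rough sense of why reflecting sunlight is such a powerful and risky lever, here's a back-of-the-envelope sketch using the standard zero-dimensional energy-balance model. The constants are textbook values; the one-point albedo bump is purely illustrative, not a real proposal.

# Zero-dimensional energy-balance model: a planet's effective temperature
# comes from balancing absorbed sunlight against emitted thermal radiation:
#   S * (1 - albedo) / 4 = sigma * T**4
S = 1361.0       # solar constant at Earth, W/m^2
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def effective_temp(albedo: float) -> float:
    """Equilibrium effective temperature (kelvin) for a given planetary albedo."""
    return (S * (1.0 - albedo) / (4.0 * SIGMA)) ** 0.25

baseline = effective_temp(0.30)  # Earth's albedo is roughly 0.30 -> about 255 K
brighter = effective_temp(0.31)  # hypothetical geoengineered bump of 0.01
print(f"{baseline:.1f} K -> {brighter:.1f} K (a drop of {baseline - brighter:.1f} K)")

Even a single percentage point of extra reflectivity cools the whole planet by about a degree in this toy model, which is exactly why a small miscalculation could swing things much further than intended. And notice everything the model leaves out, greenhouse feedbacks included; that gap is the episode's point about unintended consequences.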
The message here is 606 00:37:30,600 --> 00:37:33,239 Speaker 1: that the cure could end up being worse than, or at 607 00:37:33,320 --> 00:37:36,360 Speaker 1: least just as bad as, the disease, though in a 608 00:37:36,400 --> 00:37:39,080 Speaker 1: different way. And ultimately we'd be taking a lot of 609 00:37:39,120 --> 00:37:42,799 Speaker 1: potentially irreversible actions without a full appreciation of what's 610 00:37:42,840 --> 00:37:50,000 Speaker 1: going to happen. However, if we employ these techniques responsibly and carefully, 611 00:37:50,440 --> 00:37:53,480 Speaker 1: it's likely that we could use them as part of 612 00:37:53,520 --> 00:37:57,200 Speaker 1: an overall plan to help reduce climate change. Experts warn 613 00:37:57,320 --> 00:38:00,279 Speaker 1: us that these are not magic bullets. These are not 614 00:38:00,800 --> 00:38:06,040 Speaker 1: going to miraculously reverse the course that we've seen over 615 00:38:06,080 --> 00:38:10,200 Speaker 1: the last few decades. They would at best be a 616 00:38:10,360 --> 00:38:16,680 Speaker 1: good additional strategy along with reducing our carbon dioxide emissions, 617 00:38:16,719 --> 00:38:19,960 Speaker 1: which would be our most important action to take. We 618 00:38:20,040 --> 00:38:22,520 Speaker 1: can't just assume that we're going to come up with 619 00:38:22,560 --> 00:38:26,799 Speaker 1: a technological solution that will allow us to continue to 620 00:38:26,880 --> 00:38:31,920 Speaker 1: behave the way we've been behaving and magically erase the 621 00:38:31,960 --> 00:38:36,440 Speaker 1: consequences of those actions. That's just not a realistic outlook 622 00:38:36,560 --> 00:38:39,880 Speaker 1: at this point. Number one on the list from Chris 623 00:38:39,880 --> 00:38:44,520 Speaker 1: and Dave was Internet surveillance, which kind of ties back 624 00:38:44,560 --> 00:38:48,000 Speaker 1: into that Google problem I mentioned earlier, but it goes 625 00:38:48,160 --> 00:38:51,720 Speaker 1: well beyond that. So Internet surveillance comes in all sorts 626 00:38:51,719 --> 00:38:55,239 Speaker 1: of forms, right? The social media we use, the 627 00:38:55,360 --> 00:38:58,200 Speaker 1: actions we take on there, all of that's getting tracked. 628 00:38:58,640 --> 00:39:01,400 Speaker 1: All of that is going into various boxes based 629 00:39:01,440 --> 00:39:05,760 Speaker 1: on our profiles, so that we are encountering the ads 630 00:39:05,800 --> 00:39:08,719 Speaker 1: that are most closely tied to our behavior, so that we 631 00:39:08,760 --> 00:39:11,600 Speaker 1: have the most incentive to click on 632 00:39:11,640 --> 00:39:13,719 Speaker 1: those ads or to act on them in some way, 633 00:39:13,840 --> 00:39:17,440 Speaker 1: thus benefiting the company that makes the social platform, as 634 00:39:17,480 --> 00:39:19,760 Speaker 1: well as the company that's doing the advertising, and whatever 635 00:39:19,800 --> 00:39:23,360 Speaker 1: company is ultimately in charge of the product or service 636 00:39:23,400 --> 00:39:26,960 Speaker 1: that is being advertised.
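As a cartoon of how that profile-to-ad matching works, here's a tiny sketch. The profile, the interest tags, and the scoring rule are all invented for illustration and don't describe any platform's actual system.

# Toy behavioral ad targeting: score each ad by how well its tags
# overlap with the interests a platform has accumulated in a user's profile.
user_profile = {"hiking": 0.9, "podcasts": 0.7, "horror_games": 0.4}  # invented weights

ads = {
    "trail_boots": {"hiking": 1.0},
    "mic_bundle": {"podcasts": 0.8, "audio": 0.6},
    "energy_drink": {"gaming": 0.5, "horror_games": 0.3},
}

def score(ad_tags: dict[str, float]) -> float:
    """Dot product of an ad's tags with the user's tracked interests."""
    return sum(w * user_profile.get(tag, 0.0) for tag, w in ad_tags.items())

best_ad = max(ads, key=lambda name: score(ads[name]))
print(best_ad)  # -> trail_boots; the profile, not the person, picked the ad

The unsettling part the episode is pointing at is the profile dictionary itself: every click and search quietly adjusts those weights, whether or not you ever see them.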
So there's that. There are companies 637 00:39:27,000 --> 00:39:31,359 Speaker 1: like Google that are taking this to extremes, tracking all 638 00:39:31,400 --> 00:39:33,520 Speaker 1: sorts of behaviors across all sorts of stuff, so that 639 00:39:33,520 --> 00:39:37,000 Speaker 1: whenever we're doing searches, we're also getting served up search 640 00:39:37,080 --> 00:39:40,880 Speaker 1: results that are catered to us more and more, which 641 00:39:41,160 --> 00:39:43,760 Speaker 1: in some ways is good, you know, you're getting stuff 642 00:39:43,800 --> 00:39:46,040 Speaker 1: that is more relevant to you, and in other ways 643 00:39:46,080 --> 00:39:49,640 Speaker 1: comes across as super creepy, because it means that there's 644 00:39:49,680 --> 00:39:53,440 Speaker 1: this enormous corporation out there that might know you better 645 00:39:53,480 --> 00:39:56,920 Speaker 1: than you know yourself, and that's kind of worrisome. But 646 00:39:56,960 --> 00:40:00,120 Speaker 1: then you have other things like the NSA; 647 00:40:00,280 --> 00:40:04,239 Speaker 1: a couple of years ago it was famously revealed that the 648 00:40:04,360 --> 00:40:07,840 Speaker 1: NSA was tapping into all sorts of different 649 00:40:07,880 --> 00:40:13,960 Speaker 1: communication tools to spy on the communications between lots 650 00:40:13,960 --> 00:40:18,320 Speaker 1: of different people in an effort to promote national security. 651 00:40:18,360 --> 00:40:22,000 Speaker 1: But you could argue it was also a huge violation 652 00:40:22,040 --> 00:40:25,959 Speaker 1: of people's expectation of privacy, and that it also took 653 00:40:26,000 --> 00:40:31,760 Speaker 1: almost a presumed-guilty approach and applied it to absolutely 654 00:40:31,840 --> 00:40:36,719 Speaker 1: everybody who uses these forms of communication, from cellular 655 00:40:36,719 --> 00:40:41,399 Speaker 1: phones to website traffic to all sorts of stuff. Then 656 00:40:41,440 --> 00:40:46,600 Speaker 1: you have hackers, either state-backed hackers who are 657 00:40:46,600 --> 00:40:52,080 Speaker 1: working to spy on behalf of a government, or independent 658 00:40:52,160 --> 00:40:55,000 Speaker 1: hackers who are just trying to gather as much information 659 00:40:55,000 --> 00:40:58,120 Speaker 1: as possible to exploit it. Maybe that information is your 660 00:40:58,120 --> 00:41:01,759 Speaker 1: bank account, or maybe it's your Social Security number; it 661 00:41:01,760 --> 00:41:04,319 Speaker 1: could be all sorts of stuff. But ultimately they're doing 662 00:41:04,320 --> 00:41:06,399 Speaker 1: it so they can make money, and they have no 663 00:41:06,560 --> 00:41:09,600 Speaker 1: real concern about what happens to you and your information. 664 00:41:10,920 --> 00:41:15,680 Speaker 1: Information is valuable, whether it's for a government trying to 665 00:41:15,760 --> 00:41:20,719 Speaker 1: protect itself or the citizens that it represents. Ideally, 666 00:41:20,760 --> 00:41:22,439 Speaker 1: we would like to at least see a government try 667 00:41:22,440 --> 00:41:28,000 Speaker 1: to protect the people it represents.
Maybe that's being naive. 668 00:41:28,239 --> 00:41:31,359 Speaker 1: Or whether it's, you know, a corporation that's doing 669 00:41:31,360 --> 00:41:33,920 Speaker 1: this in order to make a profit, or a hacker 670 00:41:33,960 --> 00:41:36,600 Speaker 1: that's doing this in order to make a profit in 671 00:41:36,600 --> 00:41:40,400 Speaker 1: an even more unethical way than the corporations are, information 672 00:41:40,480 --> 00:41:43,440 Speaker 1: is valuable. It's also a good reminder of why you 673 00:41:43,440 --> 00:41:46,400 Speaker 1: should do stuff like use VPNs when you can, so 674 00:41:46,440 --> 00:41:50,879 Speaker 1: that you can protect your activities from prying eyes and 675 00:41:51,000 --> 00:41:54,760 Speaker 1: not have to worry quite so much about your every 676 00:41:54,800 --> 00:41:59,960 Speaker 1: move being spied upon. Don't use public WiFi, especially 677 00:42:00,040 --> 00:42:04,800 Speaker 1: to do anything sensitive. You know, be careful, be responsible 678 00:42:05,360 --> 00:42:10,840 Speaker 1: with your browsing activities so that you limit the ways 679 00:42:10,920 --> 00:42:13,600 Speaker 1: that you can be exploited. I'm not saying don't make 680 00:42:13,719 --> 00:42:16,759 Speaker 1: use of these various platforms, social media included. I'd be a 681 00:42:16,840 --> 00:42:19,120 Speaker 1: hypocrite if I did; I use them all the time. 682 00:42:20,120 --> 00:42:24,480 Speaker 1: Just know what you're getting into and be aware of 683 00:42:24,520 --> 00:42:28,480 Speaker 1: what could potentially happen. If you're comfortable with those consequences, 684 00:42:29,200 --> 00:42:31,279 Speaker 1: I say you're all good. If you're not, then you 685 00:42:31,320 --> 00:42:34,520 Speaker 1: need to think about ways you can change your behavior, 686 00:42:34,600 --> 00:42:38,400 Speaker 1: whether it's using VPNs or backing off on using the 687 00:42:38,440 --> 00:42:41,600 Speaker 1: Internet as much, or whatever it may be, so that 688 00:42:42,120 --> 00:42:45,880 Speaker 1: you feel you're using technology to benefit you, as opposed 689 00:42:45,920 --> 00:42:49,719 Speaker 1: to having other people receive a benefit because of you. 690 00:42:51,120 --> 00:42:54,600 Speaker 1: Don't be used, in other words.
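Since "use a VPN" is the actionable advice here, one small sanity check you can run yourself is to compare the IP address the outside world sees before and after you connect. The snippet below uses api.ipify.org, one public IP-echo service among many; this is only a quick spot check that your visible address changed, not proof that all of your traffic is protected.

# Quick VPN spot check: ask a public IP-echo service what address the
# outside world currently sees for you. Run it once before connecting
# to your VPN and once after; if the address doesn't change, your
# traffic may not actually be going through the tunnel.
import urllib.request

def visible_ip() -> str:
    with urllib.request.urlopen("https://api.ipify.org") as response:
        return response.read().decode("utf-8")

if __name__ == "__main__":
    print("The internet currently sees you as:", visible_ip())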
I hope you guys 691 00:42:54,680 --> 00:42:58,879 Speaker 1: enjoyed that episode from two thousand eighteen. I will have a brand new 692 00:42:58,880 --> 00:43:01,719 Speaker 1: episode ready to go on Wednesday. If you are a 693 00:43:01,719 --> 00:43:05,319 Speaker 1: fan of horror games, I think you will enjoy it. 694 00:43:05,440 --> 00:43:08,080 Speaker 1: It's going to be sort of an overview 695 00:43:08,120 --> 00:43:12,279 Speaker 1: of some famous scary games and maybe some that are 696 00:43:12,320 --> 00:43:15,120 Speaker 1: not so famous, but ones that I think people should 697 00:43:15,160 --> 00:43:17,840 Speaker 1: know about. So that is something to look forward to 698 00:43:17,880 --> 00:43:20,239 Speaker 1: on Wednesday. On Friday, we are going to have a 699 00:43:20,320 --> 00:43:24,400 Speaker 1: classic episode of the most classic variety. We're talking about 700 00:43:24,440 --> 00:43:28,760 Speaker 1: an old-school tech Stuff episode, and it does relate 701 00:43:28,840 --> 00:43:33,480 Speaker 1: back to the spooky theme of Halloween, and it's famous 702 00:43:33,520 --> 00:43:37,520 Speaker 1: for me getting a little heated. I got a 703 00:43:37,560 --> 00:43:42,080 Speaker 1: little agitated on that episode. Long 704 00:43:42,160 --> 00:43:45,239 Speaker 1: time listeners of tech Stuff will recognize it, but it 705 00:43:45,320 --> 00:43:49,279 Speaker 1: is from, like, a decade ago. So that's something to 706 00:43:49,280 --> 00:43:52,040 Speaker 1: look forward to on Friday. As I mentioned in the 707 00:43:52,120 --> 00:43:55,359 Speaker 1: earlier part of this episode, I will have some pretty 708 00:43:55,400 --> 00:43:59,120 Speaker 1: big announcements about tech Stuff leading up to the 709 00:43:59,160 --> 00:44:02,759 Speaker 1: new year. I'm really excited about it. And I'll have 710 00:44:02,840 --> 00:44:07,040 Speaker 1: that other podcast that will be launching soon; that's also exciting. 711 00:44:07,280 --> 00:44:12,200 Speaker 1: And before I go, if you have not already subscribed 712 00:44:12,239 --> 00:44:16,600 Speaker 1: to it, I highly recommend you check out our podcast 713 00:44:16,840 --> 00:44:23,480 Speaker 1: Thirteen Days of Halloween. It is a phenomenal series that 714 00:44:23,719 --> 00:44:27,960 Speaker 1: was produced largely out of our office in Atlanta. The 715 00:44:28,040 --> 00:44:30,920 Speaker 1: people who have been working on it worked so hard 716 00:44:31,280 --> 00:44:38,760 Speaker 1: to create really immersive, effective horror stories. They are really 717 00:44:39,400 --> 00:44:42,680 Speaker 1: great works of art, I think, and I really 718 00:44:42,680 --> 00:44:45,640 Speaker 1: hope that in the future I can get involved in 719 00:44:45,719 --> 00:44:49,719 Speaker 1: that particular type of work, because I was blown away, 720 00:44:50,080 --> 00:44:52,080 Speaker 1: and it's the sort of thing that 721 00:44:52,080 --> 00:44:53,440 Speaker 1: when you hear it, you just want to be a 722 00:44:53,480 --> 00:44:57,160 Speaker 1: part of it, because that's how interesting and innovative it is. 723 00:44:57,440 --> 00:44:59,960 Speaker 1: So if you have not already checked it out, Thirteen 724 00:45:00,080 --> 00:45:03,160 Speaker 1: Days of Halloween, go listen to an episode or two. 725 00:45:03,920 --> 00:45:07,800 Speaker 1: Do it, you know, with headphones on, lights off, 726 00:45:08,239 --> 00:45:12,120 Speaker 1: in a quiet room if you can, and, you know, 727 00:45:13,120 --> 00:45:15,120 Speaker 1: make sure the light switch isn't too far away, because 728 00:45:15,160 --> 00:45:17,880 Speaker 1: it does get spooky. If you have any suggestions for 729 00:45:17,920 --> 00:45:20,399 Speaker 1: future episodes of tech Stuff, I welcome you to let 730 00:45:20,440 --> 00:45:22,840 Speaker 1: me know. You can reach out on Twitter; that's the 731 00:45:22,880 --> 00:45:25,560 Speaker 1: best way. The handle for the show is TechStuff 732 00:45:25,880 --> 00:45:30,200 Speaker 1: HSW, and I'll talk to you again really soon. 733 00:45:35,320 --> 00:45:38,359 Speaker 1: Tech Stuff is an I Heart Radio production. For more 734 00:45:38,440 --> 00:45:41,799 Speaker 1: podcasts from I Heart Radio, visit the I Heart Radio app, 735 00:45:41,960 --> 00:45:45,120 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.