Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? So originally I had a different plan for today. I was getting work done on an episode that I will still bring to you. It'll just probably be a little while longer before I do it, because as I was working earlier on this episode, I developed an ocular migraine, which is still affecting me right now. For those who are not familiar with ocular migraines, you end up getting these weird effects on your vision. I don't have any pain, at least not yet, but I can't see very well. So it took me a while to even open up the recording software to be able to do this bit. But yeah, I am not able to do my work right now because I can't really see. These have happened before, though. They don't last that long, usually. This one's a longer one, but they clear up pretty much on their own, and most of the time I don't get a migraine-migraine with them, so here's hoping that's the case today. But I didn't want to leave you without an episode, so I thought I would bring you this one that we published a few years back, in September of twenty twenty. It is called Tech Stuff Looks Inside the Black Box, because I think this topic in particular is incredibly relevant today, particularly when we are talking about artificial intelligence. A lot of AI systems work inside what we would call a black box, and so I thought, why not bring that one back and have that one rerun today while I get my vision under control. I hope you're all well, and I'll chat with you again at the end of the episode. So this is Tech Stuff Looks Inside the Black Box, from September second, twenty twenty. Today, I thought it was a good time to take another opportunity to chat about one of the subjects I really hammer home in this series.
Speaker 1: And I don't make any apologies for this. It's about critical thinking. So yeah, this is another Critical Thinking in Technology episode. Now, in these episodes, I explain how taking time and really thinking through things is important so that we make the most informed decisions we can, and so that we aren't either fooling ourselves or, you know, allowing someone else to fool us when it comes to technology. Though, I'll tell you a secret: using these skills in all parts of your life is a great idea, because it can be really easy for us to fall into patterns where we let ourselves believe things just because it's convenient, or, you know, it reaffirms our biases, our prejudices, that kind of thing. So if you use critical thinking beyond the realm of technology, I ain't gonna be mad. Specifically, I wanted to talk about a general category of issues in tech that some refer to as the black box problem. Now, this is not the same thing as the black box that's onboard your typical airplane. In fact, I'll explain what that is first, because it's pretty simple, and then we can move on. First, the black box inside airplanes is typically orange. So right off the bat, we have a problem with nomenclature, right? I mean, you had one job, black box. Actually, that's not true. The black box has a very important job, and it requires a couple of things to work. But the black box, which is orange, is all about maintaining a record of an aircraft's activities in a housing capable of withstanding tremendous punishment. Another, more appropriate name for this device is the flight data recorder. Sensors in various parts of the plane detect changes and then send data to the flight data recorder, which, you know, records them.
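A minimal Python sketch of that recording idea, assuming invented parameter names and an arbitrary buffer size: a rolling window of timestamped samples, so the most recent stretch of the flight is always preserved. Real flight data recorders are purpose-built hardware writing to crash-survivable memory, not Python; only the concept carries over.

```python
import time
from collections import deque
from dataclasses import dataclass

@dataclass
class Sample:
    timestamp: float
    parameter: str   # e.g. "altitude_ft" or "throttle_pct" (invented names)
    value: float

class FlightDataRecorder:
    """Toy recorder: keeps a rolling window of the most recent samples."""

    def __init__(self, max_samples: int = 100_000):
        # A deque with maxlen silently drops the oldest sample as new ones arrive.
        self._buffer: deque[Sample] = deque(maxlen=max_samples)

    def record(self, parameter: str, value: float) -> None:
        self._buffer.append(Sample(time.time(), parameter, value))

    def dump(self) -> list[Sample]:
        # What investigators would read back after recovering the unit.
        return list(self._buffer)

recorder = FlightDataRecorder()
recorder.record("altitude_ft", 31_000)
recorder.record("throttle_pct", 82.5)
```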
Speaker 1: If a pilot makes any adjustments to any controls, whether it's the flight stick or a knob or a button or a switch or whatever, the control not only does whatever it was intended to do, assuming everything's in working order, but it also sends a signal that is recorded on the flight recorder. So the job of the flight recorder is to create as accurate a representation of what went on with that aircraft as is possible. The Federal Aviation Administration, or FAA, in the United States has a long list of parameters that the flight recorder is supposed to keep track of, more than eighty in fact, and these include not just the aircraft's systems but what was going on in the environment. So if a pilot encounters a problem on a flight, or, in a worst case scenario, in the event of a crash, the flight recorder represents an opportunity to find out what actually went wrong. You know, was it a malfunction, was it pilot error, was it weather? The crash-survivable memory units inside the heavy-duty casing of the black box are really meant to act as a lasting record. So if the recorder is recoverable, it gives investigators a chance to find out what happened. But, as I said, that's not the black box I really wanted to talk about for today's episode, so I'm not going to go into any more detail about that. Now, you could argue that the black box I want to talk about is sort of the opposite of what we find in airplanes, because in an airplane, the black box contains a record of everything that has gone on, and it can help explain why a certain outcome has happened. In technology in general, we use the term black box to refer to a system or technology where we know what goes into it and we can see what comes out of it, but we have no idea of what went on in the middle of that process. We don't have a way to understand the process by which the device takes input and produces output.
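Here is a tiny sketch of that input-and-output framing. Pretend the function body is sealed away; all an outsider can do is feed in inputs, record the outputs, and guess at the hidden process. The function itself is invented purely for illustration.

```python
def black_box(x: float) -> float:
    # Imagine this body is hidden from us entirely.
    return 3.0 * x + 1.0

# From the outside, all we ever see is pairs of inputs and outputs.
observations = [(x, black_box(x)) for x in range(5)]
print(observations)  # [(0, 1.0), (1, 4.0), (2, 7.0), (3, 10.0), (4, 13.0)]
# We might guess the rule is "triple it and add one," but we can't confirm
# that by inspection; we can only keep probing.
```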
Speaker 1: Now, in most cases, we're not talking about an instance where literally nobody understands what's going on with a device or a system. It's more like the creators of whatever system we're talking about have purposefully made it difficult or impossible for the average person to understand, or in some cases even see, what a technology is doing. Sometimes it's intentional, sometimes it's not, so maybe "purposefully" is being a little strong there, but that's often how it unfolds. Let's take a general example that has created issues with a particular subsection of tech heads, and those would be the gear heads, you know, the people who love to work on vehicles like motorcycles and cars and trucks and stuff. In what you might think of as the good old days, at least from the perspective of DIY technology (there were plenty of other things that were wrong back then), a car's systems were pretty darn accessible. A motorist would need to spend time and effort to learn how the car worked and what each component was meant to do, but that was actually an achievable goal. So with a bit of study and some hands-on work, you could suss out how an engine works. You know, how spark plugs cause a little explosion by igniting a mixture of fuel and air inside an engine's cylinders, how that explosion would force out a piston, which connects to a crankshaft, and how that reciprocating motion of the piston would translate into rotational motion of the crankshaft that could then be transmitted ultimately to the wheels through a transmission. You could learn what the carburetor does, how the various fans and belts work and what they do, you know, where the oil pan is and how to change out oil, and all that kind of stuff. What's more, you could make repairs yourself. If you had the tools, the replacement parts, the knowledge, and the time, you could swap out parts. You could customize your vehicle.
Speaker 1: You know, I've known a lot of people who have taken on cars as projects. They'll purchase an old junker and then they will lovingly restore it to its former glory, or turn it into something truly transformational. And all of that is possible because those old cars had really accessible systems. They were relatively simple electromechanical systems. Once you understood how they worked, you could see how they worked, or how they were supposed to work, and you could understand what was going on. Through that understanding, you could address stuff when things weren't going well. And that's how cars were for decades. But that began to change in the late nineteen sixties, and it really accelerated, no pun intended, in the nineteen seventies. So what happened? Well, in nineteen sixty eight, leading into nineteen sixty nine, Volkswagen introduced a new standard feature for their Type three vehicle, which was sometimes called the Volkswagen fifteen hundred or sixteen hundred; there were a couple of names for it. These were family cars, and Volkswagen's intent was to create a vehicle with a bit more luggage and passenger space than their Type one, which was also known as the Volkswagen Beetle. The feature for these cars that I wanted to talk about was an electronic fuel injection system that was controlled by a computer chip, and the marketing for this particular feature said that this quote "electronic brain" end quote was quote "smarter than a carburetor" end quote. Now, the purpose of a carburetor is to mix fuel with air at a ratio that is suitable for combustion inside the engine cylinders. But an engine doesn't need exactly the same ratio of fuel to air from moment to moment. It actually varies: as a vehicle runs longer, or it travels faster, or it starts climbing a steep hill, or, you know, lots of stuff, the ratio changes somewhat. And the carburetor manages this with a couple of valves, one called the choke and another called the throttle, among other elements.
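The core calculation behind all of this, whether a carburetor approximates it mechanically or an "electronic brain" does it digitally, is hitting a target air-fuel ratio. A minimal sketch follows; the roughly 14.7-to-1 stoichiometric ratio for gasoline is real, but the function and the numbers around it are invented for illustration.

```python
# For gasoline, complete combustion wants roughly 14.7 parts air
# to 1 part fuel, by mass (the stoichiometric ratio).
STOICH_AFR = 14.7

def fuel_mass_g(air_mass_g: float, target_afr: float = STOICH_AFR) -> float:
    """Given the mass of air headed into the cylinders, return how much
    fuel to meter in to hit the target air-fuel ratio."""
    return air_mass_g / target_afr

# Cruising: aim for the stoichiometric mixture.
print(fuel_mass_g(500.0))                   # ~34.0 g of fuel
# Cold start or hard acceleration: run richer (more fuel per unit of air).
print(fuel_mass_g(500.0, target_afr=12.5))  # 40.0 g of fuel
```

The advantage of doing this electronically is that the controller can re-run the calculation continuously from live sensor readings instead of relying on mechanical valves.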
Speaker 1: But the carburetor is all mechanical, and while it works, it's not as precise as an electronic system could be. And that's where the Volkswagen system came in. Volkswagen was pushing this as a more efficient and useful component than a carburetor. It would prevent the engine from being flooded with fuel without enough air for combustion. It would handle the transitions of fuel and air mix ratios more quickly and precisely. It was the start of something big, but were it not for some other external factors, it might not have taken off the way it did, or at least not as quickly as it did. Those other factors, as I said, were external, and there were a pair of doozies. One was a growing concern that burning fossil fuels was having a negative impact on the environment, which turned out to be absolutely the case. Cities like Los Angeles, California, where getting around pretty much requires having a car, were dealing with some really serious smog problems, and so organizations like the Environmental Protection Agency in the United States began to draft requirements to reduce car emissions, which would mean that automotive companies would have to create more efficient engine systems. The other major factor that contributed to this was the oil crisis of the nineteen seventies, which I talked about not long ago in a different Tech Stuff podcast. This was a geopolitical problem that threw much of the world into a scramble to curb fossil fuel consumption because the supply was limited. The double whammy of environmental concerns and the oil crisis forced a lot of car companies to rethink their previous strategy, which was pretty much "more power, make bigger, much fast, go go go, go guzzle." That's kind of how they were thinking back in the day. Turns out that was an unsustainable option.
Speaker 1: And if you look back, especially at American cars during the fifties and sixties, you see that trend of the engines getting bigger and more powerful, and that was just the way things were going until we started to see these external changes come in. And so more car companies began to incorporate computer-controlled fuel injection systems. But this move also marked a move away from accessible design, making it harder for the DIY crowd to work on cars. Working on a damaged carburetor was one thing; dealing with a malfunctioning computer chip was another. It didn't fall into the typical skill set of your amateur mechanic. And of course we didn't stop at computer-controlled fuel injection systems. Over time, we saw a lot more automotive systems make the transition to computer control. Today, your average car has computer systems that control the engine, the transmission, the doors, the entertainment system, the windows. And these systems are all individual, and they have a name: they're called electronic control units, or ECUs. Collectively, they form the controller area network, or CAN. The connections themselves, the physical connections, are called the CAN bus, and that's really just a way of saying these are the physical connectors that allow data to pass from one ECU to another. And there wasn't really a central processing unit or anything, no central brain. It was more like ECUs that depend upon one another would send relevant information to each other and not to anything else. So, you know, if the door sensor is showing a door as open, it can send an alert to other systems so that that information is appropriately dealt with.
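Here is a toy sketch of that idea. On a real CAN bus, frames carry an identifier rather than a destination address, and each ECU decides for itself which frames to act on; the specific ID number and ECU behavior below are invented for illustration.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class CanFrame:
    can_id: int   # identifies what the message is about, not who it's for
    data: bytes

class CanBus:
    """Toy broadcast bus: every attached ECU sees every frame."""

    def __init__(self) -> None:
        self._ecus: list[Callable[[CanFrame], None]] = []

    def attach(self, ecu: Callable[[CanFrame], None]) -> None:
        self._ecus.append(ecu)

    def send(self, frame: CanFrame) -> None:
        for ecu in self._ecus:
            ecu(frame)

DOOR_STATUS_ID = 0x120  # made-up identifier for this example

def dashboard_ecu(frame: CanFrame) -> None:
    # The dashboard only cares about door-status frames; it ignores the rest.
    if frame.can_id == DOOR_STATUS_ID and frame.data[0] == 1:
        print("Dashboard ECU: door-ajar warning on")

bus = CanBus()
bus.attach(dashboard_ecu)
bus.send(CanFrame(DOOR_STATUS_ID, bytes([1])))  # door sensor ECU: "open"
```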
Speaker 1: Now, at the same time that these individual systems were evolving, so too we saw the rise of what would become the onboard diagnostic system, or OBD. And the OBD keeps an eye on what's going on with the various systems in the car, and it sends notifications to the driver via dashboard indicators when something is outside normal operating parameters. So let's say that this diagnostic computer picks up that there's something hinky happening with the fuel-air mixture, and it activates that pesky check engine light on the dashboard that gives you next to no useful information. The problem is that these days it can be challenging, or sometimes impossible, to figure out exactly what caused that check engine light to come on without access to some special equipment and expertise. The car's systems have become so sophisticated that it can be a challenge to figure out what exactly has gone awry. Mechanics use devices called OBD scan tools, and these tools connect to the computer on board a car, and then the car provides an error code to the scanner. This, by the way, took a long time to standardize, because you've got a lot of different car companies out there, and obviously there was a need to move toward standardization so that you didn't have to have fifty different scan tools and fifty different code charts to deal with all the different car companies. But the code corresponds to the specific issue the OBD has detected. So not only do you need a special piece of equipment to diagnose what has gone wrong with the car, you also need to know the codes, or else you haven't really learned anything. If I get an eight-digit code and I don't know what that code refers to, then I'm not really any better off than just looking at a check engine light.
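To see why the raw code is useless without the chart, here is a toy lookup built on a few real standardized OBD-II trouble codes (today's generic codes are actually five characters, a letter plus four digits); the lookup function itself is just for illustration.

```python
# A few of the standardized, generic OBD-II diagnostic trouble codes.
# "P" means powertrain; a leading "P0" marks a generic (non-manufacturer) code.
DTC_TABLE = {
    "P0171": "System too lean (bank 1)",
    "P0300": "Random/multiple cylinder misfire detected",
    "P0301": "Cylinder 1 misfire detected",
    "P0420": "Catalyst system efficiency below threshold (bank 1)",
}

def describe(code: str) -> str:
    # Without a table like this, the code alone tells you almost nothing.
    return DTC_TABLE.get(code, f"{code}: not in the generic table; "
                               "check manufacturer-specific documentation")

print(describe("P0301"))  # Cylinder 1 misfire detected
print(describe("P1234"))  # P1xxx codes are manufacturer-specific
```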
Speaker 1: On top of all of that, even if you know what is wrong, you might not be able to easily access the problem or fix it, due to the level of complexity, sophistication, and computerization of vehicles. Not all cars or motorcycles or whatever are equal, obviously. Some are a bit easier to work on than others; some require a lot of specific care, though. For example, if you're driving a Tesla, chances are the amount of personal tinkering you're going to do on your car is going to be fairly limited. Now, I'm not saying it's impossible, just that it's really challenging. So, in general, we've seen cars go from a mechanical system, or electromechanical system, that the average person could understand and work on, to a group of interconnected, specialized computer systems that are increasingly difficult to access. Cars have become a type of black box. This can be extra frustrating for gear heads who actually have an understanding of the underlying mechanical issues that could cause problems. They might even know how to solve an issue if they could just get to it, but they are finding themselves with fewer options for addressing underlying issues. Now, cars are just one example of technologies that have moved toward a black-box-like system. There are lots of others. But apart from making it harder to tinker with your tech, what's the problem? When we come back, I'll talk about some of the pitfalls of turning tech into a black box. But first, let's take a quick break. We're back. So the car transformed from a purely electromechanical technology to one that increasingly relies on computer systems. But the computer itself can also be something of a black box for people. Now, in the very early days of the personal computer, it was hobbyists who were ordering kits through the mail and then building computers at home. Typically, these hobbyists had a working understanding of how the computer systems operated, you know, the actual way in which they would accept inputs and process information and then produce outputs. Before high-level programming languages, programmers also had to kind of think like a computer in order to program them to carry out functions. As computer languages became more high-level, meaning there was a layer of abstraction between the programmer and the actual processes that were going on at the hardware level of the computer, that connection began to get more tenuous.
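As a quick illustration of those layers, here is the same job written twice in Python: once leaning on the language's built-ins, and once with every step spelled out, which is closer in spirit to how earlier programmers had to think (though real low-level work meant assembly, not Python).

```python
# High level: one line. The runtime quietly handles the looping and all
# the bookkeeping for you.
total_high = sum(range(1_000_000))

# Lower level: the same sum with each step made explicit. This is closer
# to the sequence of operations the machine actually carries out:
# initialize, compare, add, increment, repeat.
total_low = 0
i = 0
while i < 1_000_000:
    total_low = total_low + i
    i = i + 1

assert total_high == total_low  # same result, different abstraction levels
```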
Speaker 1: Now, I'm not saying that programmers today don't have a real understanding of how computers work, but rather that this understanding is less critical, because programming languages, computer engines, app developer kits, you know, software developer kits, and so on provide a framework that reduces the amount of low-level work programmers need to do in order to build stuff as software. For the average user, you know, someone who isn't learned in the ways of computer science, computers are pretty much black boxes. They work until they don't. You push buttons on a keyboard, or you click on a mouse, or you touch a screen, and, you know, the computer does the stuff. How it does stuff, like how it detects a screen touch and then translates that into a command that is then executed to produce a specific result, you know, that's not important to us. We don't care or need to know how that works in order to enjoy the benefits of it. So for us, it's just the way things are. You push that button and this thing happens. It just does. The black box of the computer system, which can be a desktop, a laptop, tablet, smartphone, video game console, you know, whatever, just takes care of what we need it to do. That's not to say a computer is an impenetrable black box. You can learn how they work, and how programming languages work, and so on. Computer science and programming classes are all built around that. So while a computer system is effectively a black box to the average user, it wasn't made that way by design, and it can be addressed on a case-by-case basis, depending on the time, the interest of the individual computer user, and their dedication to learning. But sometimes people will set out to make technologies with the intent of them being black boxes from the get-go. These technologies are dependent, in part or in whole, on obfuscating how they work, in other words, on obscuring it.
336 00:21:33,560 --> 00:21:38,480 Speaker 1: Sometimes that's in an effort to protect an invention from copycats, 337 00:21:38,880 --> 00:21:40,919 Speaker 1: the whole idea being that if you come up with 338 00:21:41,000 --> 00:21:44,480 Speaker 1: something really clever, you don't want someone else to come 339 00:21:44,520 --> 00:21:47,760 Speaker 1: along lift that idea and do the same thing you 340 00:21:47,800 --> 00:21:50,720 Speaker 1: are doing, but selling it for less money or something. 341 00:21:51,240 --> 00:21:55,320 Speaker 1: But other times you might be hiding how something works 342 00:21:55,440 --> 00:21:59,480 Speaker 1: specifically with the intent to deceive. And now it's time 343 00:21:59,800 --> 00:22:04,000 Speaker 1: to look much further back than the nineteen sixties. It 344 00:22:04,080 --> 00:22:08,119 Speaker 1: was seventeen seventy in Europe. As the story goes, the 345 00:22:08,160 --> 00:22:11,480 Speaker 1: European world had seen a great deal of advancement in 346 00:22:11,560 --> 00:22:16,440 Speaker 1: mechanical clockwork devices. At that point. Clocks themselves, often powered 347 00:22:16,440 --> 00:22:20,600 Speaker 1: by winding a spring and keeping time using gears with 348 00:22:20,960 --> 00:22:24,239 Speaker 1: a reliable and consistent basis that was much better than 349 00:22:24,280 --> 00:22:28,800 Speaker 1: earlier methods, even allowed people the ability to carry a 350 00:22:28,960 --> 00:22:34,199 Speaker 1: time keeping device with them phenomenal. Based on similar principles, 351 00:22:34,480 --> 00:22:37,760 Speaker 1: various tinkerers had come up with toys and distractions that 352 00:22:37,920 --> 00:22:42,680 Speaker 1: also ran on clockwork, like gears and springs. Some of 353 00:22:42,720 --> 00:22:46,399 Speaker 1: these were quite elaborate, such as figures that appeared to 354 00:22:46,440 --> 00:22:52,040 Speaker 1: play musical instruments, and one of them was particularly impressive. 355 00:22:52,480 --> 00:22:56,040 Speaker 1: It appeared to be an automaton that could play expert 356 00:22:56,119 --> 00:23:00,840 Speaker 1: level chess. The figure, made out of wood, was dressed 357 00:23:00,920 --> 00:23:04,680 Speaker 1: in Turkish costume, leading to it being called the Turk, 358 00:23:04,960 --> 00:23:09,440 Speaker 1: or sometimes the mechanical Turk. If you were to sit 359 00:23:09,600 --> 00:23:13,800 Speaker 1: down to play against the Turk, you, as an opponent, 360 00:23:13,880 --> 00:23:16,560 Speaker 1: would move a piece, and then you would watch as 361 00:23:16,600 --> 00:23:19,800 Speaker 1: this mechanical figure would shift and move a piece of 362 00:23:19,840 --> 00:23:23,360 Speaker 1: its own in response. And the Turk was a pretty 363 00:23:23,440 --> 00:23:28,280 Speaker 1: good chess player. It frequently beat the opponent's at face. 364 00:23:28,400 --> 00:23:32,600 Speaker 1: Sometimes it would lose to particularly strong players, but it 365 00:23:32,680 --> 00:23:35,960 Speaker 1: held its own pretty darn well. The man behind this 366 00:23:36,080 --> 00:23:40,399 Speaker 1: invention was Wulfgan von Kimplin, who was in the service 367 00:23:40,440 --> 00:23:45,119 Speaker 1: of Maria Teresa, Impress of the Holy Roman Empire. He 368 00:23:45,160 --> 00:23:48,480 Speaker 1: had been invited to view a magician's performance in the court. 
369 00:23:48,600 --> 00:23:52,520 Speaker 1: So the story goes and The Impress had invited him 370 00:23:52,560 --> 00:23:55,800 Speaker 1: specifically and afterwards asked him what he thought, and allegedly 371 00:23:56,119 --> 00:23:59,360 Speaker 1: he boasted he could create a much more compelling illusion 372 00:23:59,400 --> 00:24:02,760 Speaker 1: than anything this magician did. Now, according to the story, 373 00:24:02,960 --> 00:24:06,640 Speaker 1: the Empress essentially said, oh yeah, well, prove it, buster, 374 00:24:07,160 --> 00:24:09,760 Speaker 1: and he was given six months to do just that. 375 00:24:10,440 --> 00:24:13,080 Speaker 1: The turk was what he had to show for it 376 00:24:13,160 --> 00:24:17,920 Speaker 1: in six months time, and it reportedly went over like Gangbusters. 377 00:24:18,440 --> 00:24:22,000 Speaker 1: The wooden turk stood behind a cabinet, on top of 378 00:24:22,040 --> 00:24:26,399 Speaker 1: which was the chessboard, and Kempland would reportedly open the 379 00:24:26,440 --> 00:24:30,679 Speaker 1: cabinet doors and reveal some gears and mechanics to prove 380 00:24:30,960 --> 00:24:35,399 Speaker 1: that it was purely a mechanical system. In fact, the 381 00:24:35,480 --> 00:24:39,440 Speaker 1: gears were masking a hidden compartment behind them, in which 382 00:24:39,480 --> 00:24:44,399 Speaker 1: a human chess player was sitting inside, hunched over, keeping 383 00:24:44,480 --> 00:24:47,480 Speaker 1: track of a game, using a smaller chessboard in front 384 00:24:47,520 --> 00:24:51,320 Speaker 1: of him, and using various levers to move the turk's 385 00:24:51,440 --> 00:24:55,919 Speaker 1: limbs in response. Now, a lot of folks suspected that 386 00:24:56,119 --> 00:24:59,280 Speaker 1: something was up from the get go, but you know, 387 00:24:59,400 --> 00:25:02,600 Speaker 1: part of the fun of a magic trick is just 388 00:25:02,760 --> 00:25:07,040 Speaker 1: not knowing what's going on. Some folks try very hard 389 00:25:07,119 --> 00:25:10,080 Speaker 1: to figure out the process. I am not one of them. 390 00:25:10,480 --> 00:25:13,280 Speaker 1: Others are just happy to be entertained by a very 391 00:25:13,320 --> 00:25:16,760 Speaker 1: well performed trick. But in a way, the Turk was 392 00:25:16,880 --> 00:25:20,439 Speaker 1: a kind of black box. In fact, you could argue 393 00:25:20,440 --> 00:25:23,840 Speaker 1: that a lot of magic tricks pretty much fall into 394 00:25:23,880 --> 00:25:29,200 Speaker 1: the black box category. The process is purposefully hidden from 395 00:25:29,240 --> 00:25:31,840 Speaker 1: the viewer. If we could see what the magician was 396 00:25:31,880 --> 00:25:35,480 Speaker 1: doing from beginning to end, all the way through and 397 00:25:35,800 --> 00:25:40,240 Speaker 1: without any misdirection, then it wouldn't be magic. We might 398 00:25:40,520 --> 00:25:44,040 Speaker 1: admire the skill of the magician, how quickly they were 399 00:25:44,040 --> 00:25:49,119 Speaker 1: able to do things, but we wouldn't really consider it magical. 400 00:25:49,840 --> 00:25:53,040 Speaker 1: So the output is dependent upon people not knowing the 401 00:25:53,080 --> 00:25:57,280 Speaker 1: process the inputs went through. Now that's not to say 402 00:25:57,280 --> 00:25:59,600 Speaker 1: that you can't appreciate a really good magic trick even 403 00:25:59,600 --> 00:26:01,600 Speaker 1: if you know how it's done. 
Speaker 1: One of the best examples I know of is Penn and Teller. They did a phenomenal version of the cups and balls routine where they used clear plastic cups and balls of aluminum foil to demonstrate how cups and balls works, and you can watch the entire time, and even being able to see through the cups and see the moves as they're being made, Teller does them with such skill that it is truly phenomenal. It doesn't hurt that Penn is spouting off a lot of nonsense at the same time and misdirecting even as you're watching what's going on. I highly recommend you check it out on YouTube. Look for "Penn and Teller cups and balls." You won't be disappointed. Now, the Turk, as far as I can tell, was always intended to be an entertainment, not necessarily something that was specifically meant to perpetrate some sort of hoax. You wouldn't call a stage magician a huckster or a con man or anything like that. Their occupation is dependent upon misdirection and making impossible acts seem like they really happened, but always, or nearly always, with the implication that it's all an illusion or a trick of some sort. But not everyone is quite so forthcoming about the fact that the thing they are doing is done through trickery. For the scam artist, the black box creates an incredible opportunity. As technological complexity outpaces the average person's understanding, the scam artist can create fake gadgets and devices that they claim can do certain things, and then count upon the ignorance of the average person to get away with it. Typically, the go-to scam is to convince people with money to pour investments into the hoax technology in an effort to fund whatever the next phase of development is supposed to be, whether that's to bring a prototype into a production model, or to refine a design, or whatever. But the end result is pretty much the same across the board.
Speaker 1: The con artist tries to wheedle as much money out of their marks as they can before they pull up stakes and skip town, or they find some way to shift focus or punt any promises about delivering results further into the future, like a "that's a future-me problem" kind of approach. Once in a blue moon, you might find someone who is just hoping to buy enough time to come up with a way to do their hoax for realsies, or at least to simulate it closely enough that people are satisfied. That typically doesn't work out so well. Theranos. I'll get back to that. So let's talk about some examples of outright scams that leaned heavily on the black box concept, whether by having their supposed and actual operating mechanisms hidden, or by obscuring how they really worked with a lot of nonsensical claims and technobabble. One historical scam artist was a guy named Charles Redheffer, who claimed to have built a perpetual motion machine. If he had managed to do such a thing, it would have been a true feat, as it would break the laws of physics as we understand them. So let's go over why that is, just pretty quickly. For perpetual motion to work, and thus for free energy in general to work, a machine would need to be able to operate with absolutely no energy loss, and for free energy, it would have to generate that energy in some way. A perpetual motion machine, once set into motion, would never stop moving unless someone or something specifically intervened. If it were left to its own devices, it would continue to do whatever it was doing until the last syllable of recorded time, to borrow a phrase from the Bard. Now, if we look at our understanding of thermodynamics, we'll see that doing this in the real world is impossible, or at least it would go against fundamental ways that we understand our universe to work. The first law of thermodynamics says that energy is neither created nor destroyed.
Speaker 1: Energy can, however, be converted from one form into another. So if you hold a water balloon over the head of a close personal friend, let's say it's Ben Bowlin of Stuff They Don't Want You to Know, the water balloon has a certain amount of potential energy. If you let go of the balloon, that potential energy converts into kinetic energy, the energy of movement. You didn't create or destroy energy here; it just changed forms. So if you have what you claim to be a perpetual motion machine and you set it in motion, the energy you gave that machine at that initial point should sustain it forever, and it would never have that initial energy change form into some other type of energy that could then escape the system and show a net energy loss for the system itself. Remember, the energy is not being destroyed, but it can be lost to the system in another form. This means that such a machine could not have any parts that had any contact with one another, which would make it a really strange machine. And that's because friction would be a constant means for energy to convert from one form to another form, in this case, kinetic energy, the energy of movement, into heat. Friction is the resistance surfaces have to moving against each other. So if the machine has any moving parts at all, those parts will be encountering friction, which means some of that moving energy will be converted to heat and thus escape the system. So the overall system of the machine itself will have a net loss of energy. There will be less energy to keep it going, which means gradually it will slow down and ultimately just stop. It might take a long time if the machine is particularly well designed, but it will eventually happen. You would need some form of energy input to keep things going on occasion, kind of like a little push.
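Here is a quick worked version of both ideas in Python. The first half converts the balloon's potential energy into kinetic energy (the mass and height are made up for the example); the second half shows why any friction at all, however tiny, eventually stops a machine.

```python
import math

g = 9.81  # gravitational acceleration, m/s^2

# Part 1: the first law in action. A 1 kg water balloon held 2 m above
# a friend's head stores m*g*h joules of potential energy.
m, h = 1.0, 2.0
potential = m * g * h       # ~19.6 J
v = math.sqrt(2 * g * h)    # speed at impact if all of it becomes kinetic
kinetic = 0.5 * m * v**2    # ~19.6 J: nothing created, nothing destroyed
print(f"potential: {potential:.1f} J, kinetic at impact: {kinetic:.1f} J")

# Part 2: why friction dooms perpetual motion. Start a flywheel with
# 100 J and bleed off just 0.1% of its energy as heat every revolution.
energy = 100.0
revolutions = 0
while energy > 0.01:        # "stopped" for practical purposes
    energy *= 1 - 0.001     # tiny frictional loss per revolution
    revolutions += 1
print(f"coasts for about {revolutions:,} revolutions, then stops")
```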
Speaker 1: Imagine that you've got a swing, like a rope with a tire at the end of it, and no one's in it right now. You would have to give that tire a little push every now and then to keep it swinging; otherwise it will eventually stop. But that means you wouldn't have a perpetual motion machine. There are other factors that similarly make perpetual motion impossible. If the machine makes any sort of sound, then some of the energy of operation is going into creating the vibrations that make sound. Sound itself is energy, it's kinetic energy, so that would mean the machine as a whole would be losing energy through that sound. A machine operating inside an atmosphere has to overcome the friction of moving through air. And the list goes on. Moreover, if we could build a perpetual motion machine, we'd be able to harness it for energy, but only up to whatever the initial energy was that got it moving in the first place, because, again, energy cannot be created. We can build devices that can harness other forms of energy and convert that energy into, say, electricity, but these are not perpetual motion or free energy machines. These machines are just collecting and converting energy that's already in the system, or already present, so they're not making anything. Redheffer, however, claimed to have built a perpetual motion machine that could potentially serve as a free energy generator. Now, if true, this would have been an astonishing discovery. Not only would our understanding of the universe be proven to be wrong, but we would also have access to an inexhaustible supply of energy. Redheffer showed off what he said was a working model of his design in Philadelphia, and he was asking for money to fund the construction of a larger, practical version of his design. A group of inspectors from the city came out to check out how this thing worked, and they noticed something hinky was going on.
Speaker 1: Even though Redheffer was doing his best to run interference and prevent anyone from getting too close a look at the machine, the gears of the device, which was supposedly powering a second machine, were worn down in such a way that it was pretty clear that it was actually the second machine that was providing the energy to turn the quote unquote "perpetual motion machine," not the other way around. So if we were talking about cars, this would be like discovering that the wheels turning were causing the pistons of the engine to reciprocate in their cylinders. It's going the opposite way. So the investigators then hired a local engineer named Isaiah Lukens to build a similar device, using a secondary machine to provide power to what would be the perpetual-motion-type machine, and then they showed it to Redheffer, who saw that the jig was up, and he hoofed it out of town to New York City. He tried to pull essentially the same scam there, this time using a machine that was secretly powered by a hand crank in a hidden room on the other side of the wall. Technically, it was just a feller sitting there with a hand crank in one hand and a sandwich in the other, providing the work to turn this machine. Robert Fulton, an engineer of great renown, exposed the whole device as a fraud when he pulled apart some boards on the wall and revealed the man sitting there cranking away, and Redheffer fled again. Records of what happened next are sketchy. It seems he might have tried to pull the same dang scheme in Philadelphia again a bit later, but he disappeared from the historical record after reportedly refusing to demonstrate his new device. When we come back, I'll compare this to what I mentioned before, Theranos, before we chat about other concerns regarding the black box problem. But first, let's take another quick break.
Okay. So Theranos: 571 00:36:50,600 --> 00:36:54,560 Speaker 1: this is the biomedical technology company that was founded by 572 00:36:54,600 --> 00:36:58,759 Speaker 1: Elizabeth Holmes, and she is currently awaiting trial on 573 00:36:59,320 --> 00:37:03,200 Speaker 1: charges of federal fraud in the United States. The trial 574 00:37:03,280 --> 00:37:06,480 Speaker 1: was supposed to begin in August twenty twenty, but has 575 00:37:06,520 --> 00:37:11,680 Speaker 1: since been delayed until twenty twenty one due to COVID nineteen. Now, 576 00:37:11,800 --> 00:37:17,080 Speaker 1: the pitch for Theranos was really, really alluring. What if 577 00:37:17,400 --> 00:37:21,280 Speaker 1: engineers could make a machine capable of testing a single 578 00:37:21,400 --> 00:37:25,799 Speaker 1: droplet of blood for more than one hundred possible illnesses 579 00:37:25,960 --> 00:37:29,920 Speaker 1: and conditions? So rather than going through multiple blood draws 580 00:37:29,960 --> 00:37:33,040 Speaker 1: and tests to try and figure out what's wrong, you 581 00:37:33,080 --> 00:37:36,080 Speaker 1: could get an answer based off one little pinprick within 582 00:37:36,120 --> 00:37:39,279 Speaker 1: a couple of hours. Maybe you would even be able 583 00:37:39,320 --> 00:37:42,600 Speaker 1: to buy a Theranos machine for your home, kind of 584 00:37:42,640 --> 00:37:45,080 Speaker 1: like a desktop printer, and that would allow you to 585 00:37:45,080 --> 00:37:48,680 Speaker 1: do a quick blood test at a moment's notice. Maybe 586 00:37:48,680 --> 00:37:50,719 Speaker 1: you would get a heads up about something you should 587 00:37:50,760 --> 00:37:53,840 Speaker 1: talk to your doctor about, preventing tragedy in the process. 588 00:37:54,400 --> 00:37:56,800 Speaker 1: You might learn that with some changes in your lifestyle, 589 00:37:56,840 --> 00:38:00,360 Speaker 1: you could improve your overall health or stave off various illnesses. 590 00:38:01,200 --> 00:38:05,840 Speaker 1: It would democratize medicine, giving the average person more control 591 00:38:05,960 --> 00:38:09,120 Speaker 1: and knowledge about their own health and giving them a 592 00:38:09,160 --> 00:38:14,560 Speaker 1: better starting point for conversations with their doctors. And yeah, 593 00:38:14,880 --> 00:38:19,080 Speaker 1: that's a great goal. It's a fantastic sales pitch, and 594 00:38:19,280 --> 00:38:23,920 Speaker 1: it did get Holmes and Theranos a lot of interested 595 00:38:24,000 --> 00:38:26,880 Speaker 1: investors who really wanted to tap into this, because not 596 00:38:26,920 --> 00:38:30,800 Speaker 1: only is it something that you would want for yourself, 597 00:38:30,880 --> 00:38:34,680 Speaker 1: you could easily see that if this is possible, that 598 00:38:34,800 --> 00:38:37,160 Speaker 1: business is going to be like the next Apple; it'll 599 00:38:37,200 --> 00:38:41,640 Speaker 1: become a trillion dollar company. Something that powerful would undoubtedly 600 00:38:41,920 --> 00:38:46,960 Speaker 1: become a powerhouse. Now I've done full episodes about Theranos 601 00:38:47,040 --> 00:38:50,600 Speaker 1: and how it fell apart because, spoiler alert, that's exactly 602 00:38:50,600 --> 00:38:55,520 Speaker 1: what happened. The technology just didn't work. But I think 603 00:38:55,560 --> 00:38:58,480 Speaker 1: a lot of what happened with Theranos was largely dependent 604 00:38:58,560 --> 00:39:03,919 Speaker 1: upon naivete, ignorance, and wishful thinking.
Our technology can do 605 00:39:04,000 --> 00:39:07,399 Speaker 1: some pretty astounding stuff, right? I mean, if you had 606 00:39:07,400 --> 00:39:10,480 Speaker 1: told me in two thousand that by the end of 607 00:39:10,480 --> 00:39:13,880 Speaker 1: the decade I would be carrying around a device capable 608 00:39:13,920 --> 00:39:17,120 Speaker 1: of really harnessing the power of the Internet in my 609 00:39:17,280 --> 00:39:19,920 Speaker 1: pocket and I would have access to it all the time, 610 00:39:20,600 --> 00:39:23,799 Speaker 1: I would have thought you were bonkers. So if technology 611 00:39:23,800 --> 00:39:27,040 Speaker 1: can do incredible things like that, why can't it do 612 00:39:27,120 --> 00:39:31,960 Speaker 1: something equally incredible with blood tests? The idea is that, well, 613 00:39:32,080 --> 00:39:35,400 Speaker 1: we're already seeing this amazing stuff happen. Why isn't this 614 00:39:35,440 --> 00:39:39,440 Speaker 1: other amazing thing possible? And that is dangerous thinking. It 615 00:39:39,520 --> 00:39:44,000 Speaker 1: equates all technological advances and developments, and that's just not 616 00:39:44,320 --> 00:39:49,680 Speaker 1: how reality works. Moore's law, the observation that, generally speaking, 617 00:39:50,120 --> 00:39:54,880 Speaker 1: the number of transistors on a chip, and with it computational power, doubles roughly every two years, has really helped fuel 618 00:39:55,040 --> 00:40:00,239 Speaker 1: a misunderstanding about technology in general. We extend that same 619 00:40:00,520 --> 00:40:04,279 Speaker 1: crazy growth to all sorts of fields in technology where 620 00:40:04,320 --> 00:40:07,520 Speaker 1: it doesn't actually apply, and it gives us the motivation 621 00:40:07,640 --> 00:40:12,680 Speaker 1: to fool ourselves into thinking that the impossible is actually possible. 622 00:40:13,440 --> 00:40:17,319 Speaker 1: That, I think, is what happened with Theranos. Now, I'm 623 00:40:17,320 --> 00:40:22,120 Speaker 1: not saying Holmes set out to deceive people. I don't 624 00:40:22,160 --> 00:40:26,719 Speaker 1: know what she really believed was possible, but based on 625 00:40:26,840 --> 00:40:30,480 Speaker 1: what I've read and seen and listened to, to me, it 626 00:40:30,560 --> 00:40:33,280 Speaker 1: sounds like she figured there was at least a decent 627 00:40:33,440 --> 00:40:37,640 Speaker 1: chance her vision would become possible. And so a lot 628 00:40:37,640 --> 00:40:41,840 Speaker 1: of Theranos's activities, in my personal opinion, appear to have 629 00:40:41,920 --> 00:40:45,600 Speaker 1: been meant to stall for time while engineers were working 630 00:40:45,640 --> 00:40:49,120 Speaker 1: on very hard problems to make the blood testing device 631 00:40:49,239 --> 00:40:53,319 Speaker 1: work as intended. The further into the process, the more 632 00:40:53,440 --> 00:40:55,840 Speaker 1: the company had to spin its wheels to make it seem 633 00:40:55,880 --> 00:40:58,960 Speaker 1: like it was making more progress than it actually was. 634 00:40:59,560 --> 00:41:02,319 Speaker 1: The company had raised an enormous amount of money from 635 00:41:02,360 --> 00:41:05,759 Speaker 1: the investors, so they were beholden to them. They had 636 00:41:05,800 --> 00:41:09,280 Speaker 1: also secured agreements with drug store chains to provide services 637 00:41:09,320 --> 00:41:13,040 Speaker 1: to customers, so they needed to perform a service.
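(As a quick sketch of why that doubling figure breeds such optimism, here's the compounding arithmetic, using nothing but the two-year doubling rate mentioned above:)

```python
# Compounding of "doubles every two years": growth factor after N years is 2**(N/2).
for years in (2, 10, 20, 40):
    factor = 2 ** (years / 2)
    print(f"after {years:2d} years: x{factor:,.0f}")
# after  2 years: x2
# after 10 years: x32
# after 20 years: x1,024
# after 40 years: x1,048,576
```

A million-fold gain over forty years is roughly the trajectory transistor counts actually followed. Assuming blood-assay chemistry, or any other field, rides the same curve is exactly the trap being described.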
It 638 00:41:13,239 --> 00:41:16,320 Speaker 1: had to show progress, even if behind the scenes things 639 00:41:16,360 --> 00:41:19,640 Speaker 1: had actually stalled out. On top of that, you also 640 00:41:19,719 --> 00:41:24,719 Speaker 1: have the reports of executives like Holmes herself living the 641 00:41:24,800 --> 00:41:29,840 Speaker 1: high life and really enjoying incredible benefits of wealth because 642 00:41:29,840 --> 00:41:33,160 Speaker 1: of the enormous investment into the company, so that plays 643 00:41:33,200 --> 00:41:38,160 Speaker 1: a part too. Theranos's operations were effectively a black box 644 00:41:38,200 --> 00:41:41,000 Speaker 1: to the outside world. It was meant to misdirect and 645 00:41:41,120 --> 00:41:44,480 Speaker 1: give the implication that things were working fine behind the scenes, 646 00:41:45,000 --> 00:41:47,239 Speaker 1: while the people who were actually there were trying to 647 00:41:47,320 --> 00:41:51,000 Speaker 1: keep up the illusion while simultaneously attempting to solve what 648 00:41:51,040 --> 00:41:55,239 Speaker 1: appeared to be impossible problems. At some point, based on 649 00:41:55,320 --> 00:41:59,160 Speaker 1: how things unfolded, I would say that executives at Theranos 650 00:41:59,239 --> 00:42:03,520 Speaker 1: appeared to be perpetrating a scam, not just, you know, 651 00:42:04,120 --> 00:42:07,800 Speaker 1: trying to maintain an illusion while getting things to work. 652 00:42:08,080 --> 00:42:12,040 Speaker 1: They were actively scamming people. In my opinion, maybe they 653 00:42:12,040 --> 00:42:14,879 Speaker 1: were still holding out hope that it would ultimately work out, 654 00:42:15,239 --> 00:42:17,920 Speaker 1: but that doesn't change that it was a classic case 655 00:42:18,000 --> 00:42:20,760 Speaker 1: of smoke and mirrors to hide what was really happening, 656 00:42:20,960 --> 00:42:25,400 Speaker 1: such as using existing blood testing technology from other companies 657 00:42:25,719 --> 00:42:28,800 Speaker 1: in order to run tests while claiming that the results 658 00:42:28,800 --> 00:42:32,840 Speaker 1: were coming from actual Theranos devices. But again, this is 659 00:42:32,880 --> 00:42:35,319 Speaker 1: all my own opinion based on what I've seen and 660 00:42:35,440 --> 00:42:38,320 Speaker 1: read about the subject. A court will have to determine 661 00:42:38,320 --> 00:42:42,120 Speaker 1: whether or not Holmes and others actually committed fraud. A 662 00:42:42,160 --> 00:42:44,839 Speaker 1: lot of the technology we rely upon in our day 663 00:42:44,880 --> 00:42:48,960 Speaker 1: to day lives is complicated stuff, and there are limited 664 00:42:49,000 --> 00:42:51,160 Speaker 1: hours in the day, and it's a bit much to ask 665 00:42:51,200 --> 00:42:53,799 Speaker 1: anyone to become an expert on all things tech to 666 00:42:53,840 --> 00:42:56,560 Speaker 1: figure out exactly how they work. Tech is also becoming 667 00:42:56,600 --> 00:42:59,400 Speaker 1: more and more specialized, so you might become an expert 668 00:42:59,440 --> 00:43:03,280 Speaker 1: in one area of technology and be completely ignorant of another. 669 00:43:03,560 --> 00:43:06,760 Speaker 1: That's not unusual, because it takes a lot of time 670 00:43:06,800 --> 00:43:09,759 Speaker 1: to become an expert in specific areas of tech these days; 671 00:43:09,760 --> 00:43:13,799 Speaker 1: they've become so specialized.
But by overlooking the how, we 672 00:43:13,880 --> 00:43:18,080 Speaker 1: can make ourselves vulnerable to bad actors out there when 673 00:43:18,080 --> 00:43:21,920 Speaker 1: it comes to technology. Maybe they are actively trying to 674 00:43:21,920 --> 00:43:24,640 Speaker 1: pull the wool over our eyes, or maybe they're just 675 00:43:24,960 --> 00:43:29,000 Speaker 1: simply misguided and they misunderstand how stuff works. But either way, 676 00:43:29,360 --> 00:43:33,320 Speaker 1: our own ignorance of how tech does what it does, 677 00:43:33,480 --> 00:43:36,719 Speaker 1: and the limitations that we all face based on the 678 00:43:36,760 --> 00:43:40,040 Speaker 1: fundamental laws of the universe as we understand them, that 679 00:43:40,120 --> 00:43:44,600 Speaker 1: all makes us potential marks or targets. That's where critical 680 00:43:44,640 --> 00:43:48,240 Speaker 1: thinking comes in and plays a part. Knowing to ask 681 00:43:48,520 --> 00:43:52,680 Speaker 1: questions and to critically examine the answers, and to ask 682 00:43:52,800 --> 00:43:56,160 Speaker 1: follow up questions, and to not accept claims at face 683 00:43:56,280 --> 00:43:59,640 Speaker 1: value are all important traits. Now, we do have to 684 00:43:59,680 --> 00:44:03,600 Speaker 1: be careful not to go so far as to embrace denialism. 685 00:44:03,920 --> 00:44:07,680 Speaker 1: If we are confronted with compelling evidence that supports a claim, 686 00:44:08,120 --> 00:44:11,160 Speaker 1: we need to be ready to accept that claim. I'm 687 00:44:11,160 --> 00:44:13,160 Speaker 1: not advocating for you guys to just go out there 688 00:44:13,200 --> 00:44:16,520 Speaker 1: and say that any and every claim is just bogus. 689 00:44:16,600 --> 00:44:20,040 Speaker 1: That's not the point. I'll close this out by talking 690 00:44:20,040 --> 00:44:23,560 Speaker 1: about something we're seeing unfold in real time around us, 691 00:44:24,120 --> 00:44:27,799 Speaker 1: and that involves machine learning and AI systems. Now, if 692 00:44:27,800 --> 00:44:30,720 Speaker 1: you follow the circles that report on this kind of stuff, 693 00:44:31,080 --> 00:44:35,359 Speaker 1: you will occasionally see calls for transparency. Those calls are 694 00:44:35,400 --> 00:44:39,120 Speaker 1: to urge people who are designing these machine learning systems 695 00:44:39,160 --> 00:44:43,080 Speaker 1: and AI systems to show their work, as it were, 696 00:44:43,239 --> 00:44:46,719 Speaker 1: and to have the systems themselves show their work. It's 697 00:44:46,760 --> 00:44:48,880 Speaker 1: not enough to create a system that can perform a 698 00:44:48,920 --> 00:44:52,759 Speaker 1: task like image recognition and then give us results. We 699 00:44:52,880 --> 00:44:56,400 Speaker 1: need to know how the system came to the conclusions 700 00:44:56,440 --> 00:44:59,560 Speaker 1: that it produced. We need this in order to check 701 00:44:59,560 --> 00:45:05,320 Speaker 1: for stuff like biases, which is a serious issue in artificial intelligence. Honestly, 702 00:45:05,360 --> 00:45:08,520 Speaker 1: it's a really big problem for tech in general, but 703 00:45:08,640 --> 00:45:12,240 Speaker 1: we're really seeing it play out rather spectacularly in AI. 704 00:45:12,680 --> 00:45:15,720 Speaker 1: Now I'll give you an example that I've already alluded to: 705 00:45:16,280 --> 00:45:21,640 Speaker 1: facial recognition technology.
The US National Institute of Standards and 706 00:45:21,680 --> 00:45:27,120 Speaker 1: Technology conducted an investigation in twenty nineteen into facial recognition technologies, 707 00:45:27,680 --> 00:45:31,560 Speaker 1: and it found that algorithms were pretty darn good at 708 00:45:31,560 --> 00:45:36,680 Speaker 1: identifying Caucasian faces, but if they were analyzing a Black 709 00:45:37,000 --> 00:45:41,680 Speaker 1: or an Asian face, they were far less accurate, sometimes 710 00:45:42,040 --> 00:45:47,040 Speaker 1: one hundred times more likely to falsely identify somebody based 711 00:45:47,120 --> 00:45:52,480 Speaker 1: on an image. The worst error rates involved identifying Native Americans. 712 00:45:52,840 --> 00:45:55,600 Speaker 1: So let's let that sink in, because when we talk 713 00:45:55,600 --> 00:45:59,839 Speaker 1: about issues like systemic racism, we sometimes forget about how 714 00:45:59,880 --> 00:46:03,520 Speaker 1: that can manifest in ways that aren't as intuitive or 715 00:46:03,640 --> 00:46:08,239 Speaker 1: obvious as the really overt stuff. We live in a 716 00:46:08,280 --> 00:46:12,360 Speaker 1: world that has cameras all over the place. Surveillance is 717 00:46:12,400 --> 00:46:14,840 Speaker 1: a real thing that's going on all the time. Police 718 00:46:14,880 --> 00:46:19,360 Speaker 1: and other law enforcement agencies rely heavily on facial recognition 719 00:46:19,440 --> 00:46:24,000 Speaker 1: algorithms to identify suspects and to search for people of interest. 720 00:46:24,480 --> 00:46:28,000 Speaker 1: And if those algorithms have a low rate of reliability 721 00:46:28,080 --> 00:46:32,560 Speaker 1: for different ethnicities, a disproportionate number of people who have 722 00:46:32,760 --> 00:46:36,720 Speaker 1: no connection to any investigation are going to be singled 723 00:46:36,719 --> 00:46:41,360 Speaker 1: out by mistake by these algorithms. Lives can be disrupted, 724 00:46:41,520 --> 00:46:46,360 Speaker 1: careers can be ruined, relationships hurt, all because a computer 725 00:46:46,440 --> 00:46:51,839 Speaker 1: program can't tell the difference between two different faces. That 726 00:46:52,280 --> 00:46:55,360 Speaker 1: is a serious problem, and it points to a couple 727 00:46:55,400 --> 00:46:58,319 Speaker 1: of things. One of the big ones is a lack 728 00:46:58,360 --> 00:47:01,600 Speaker 1: of diversity on the design side of things. We've seen 729 00:47:01,640 --> 00:47:03,600 Speaker 1: this with tech for a long time. There is a 730 00:47:04,400 --> 00:47:08,560 Speaker 1: really critical diversity issue going on with technology. The people 731 00:47:08,600 --> 00:47:12,279 Speaker 1: who are building algorithms and training machine learning systems are 732 00:47:12,400 --> 00:47:14,880 Speaker 1: largely failing to do so in a way that can 733 00:47:14,920 --> 00:47:21,040 Speaker 1: be equally applicable across different ethnicities. Meanwhile, organizations like the 734 00:47:21,040 --> 00:47:25,080 Speaker 1: American Civil Liberties Union are calling upon law enforcement agencies 735 00:47:25,120 --> 00:47:29,919 Speaker 1: to stop relying on technology like this entirely, pointing out 736 00:47:29,920 --> 00:47:34,279 Speaker 1: that the potential for harm to befall innocent people outweighs 737 00:47:34,320 --> 00:47:38,640 Speaker 1: the benefits of using the tech to catch criminals.
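(To show the shape of the check that audits like NIST's perform, here's a minimal Python sketch with invented scores and hypothetical group labels. It is not NIST's actual methodology, just the basic idea of computing a false match rate separately for each demographic group:)

```python
from collections import defaultdict

# Minimal sketch of a demographic error-rate audit for a face matcher.
# Each record is (group, matcher_score, is_true_match); the data is invented.
records = [
    ("group_a", 0.91, True), ("group_a", 0.32, False), ("group_a", 0.55, False),
    ("group_b", 0.88, True), ("group_b", 0.74, False), ("group_b", 0.69, False),
]
THRESHOLD = 0.6  # score at or above which the system declares a match

false_matches = defaultdict(int)
non_matches = defaultdict(int)
for group, score, is_true_match in records:
    if not is_true_match:
        non_matches[group] += 1
        if score >= THRESHOLD:  # system says "match" for a true non-match
            false_matches[group] += 1

for group in sorted(non_matches):
    rate = false_matches[group] / non_matches[group]
    print(f"{group}: false match rate = {rate:.0%}")
```

When one group's false match rate comes out many times another's, and NIST reported gaps as large as a hundredfold, the same system is effectively a worse product for that group.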
A 738 00:47:38,800 --> 00:47:43,160 Speaker 1: machine learning system trained to do something like identify people 739 00:47:43,320 --> 00:47:47,680 Speaker 1: based on their faces needs to be transparent so that 740 00:47:47,960 --> 00:47:51,840 Speaker 1: when a bias becomes evident, engineers can go back to 741 00:47:51,920 --> 00:47:54,520 Speaker 1: the machine learning system and look and see where it 742 00:47:54,560 --> 00:47:57,799 Speaker 1: went wrong, and then train it to eliminate the bias. 743 00:47:58,239 --> 00:48:01,920 Speaker 1: Without transparency, it can be hard or impossible to figure 744 00:48:01,920 --> 00:48:06,000 Speaker 1: out exactly where things are going wrong within the system. Meanwhile, 745 00:48:06,480 --> 00:48:11,120 Speaker 1: real people in the real world are suffering the consequences. Now, 746 00:48:11,200 --> 00:48:13,799 Speaker 1: if we extend this outward and we look into a 747 00:48:13,840 --> 00:48:17,520 Speaker 1: future where artificial intelligence is undoubtedly going to play a 748 00:48:17,640 --> 00:48:20,879 Speaker 1: critical part in our day to day experiences, we see 749 00:48:20,880 --> 00:48:24,400 Speaker 1: how we need to avoid these black box situations. We 750 00:48:24,480 --> 00:48:28,280 Speaker 1: need to understand why a system will generate a particular 751 00:48:28,320 --> 00:48:31,799 Speaker 1: output given specific inputs. We've got to be able to 752 00:48:31,960 --> 00:48:34,120 Speaker 1: check the systems to be certain they are coming to 753 00:48:34,160 --> 00:48:39,960 Speaker 1: the right conclusions. Artificial intelligence has enormous potential to augment 754 00:48:40,080 --> 00:48:43,320 Speaker 1: how we go about everything from running errands to performing 755 00:48:43,360 --> 00:48:46,480 Speaker 1: our jobs, but we need to be certain that the 756 00:48:46,480 --> 00:48:51,400 Speaker 1: guidance we receive is dependable, that it's the right course of action. 757 00:48:52,080 --> 00:48:55,240 Speaker 1: And so I hope this episode has really driven home 758 00:48:55,400 --> 00:48:58,319 Speaker 1: how important it is for us to hold technology up to 759 00:48:58,400 --> 00:49:02,640 Speaker 1: a critical view. It's not that technology is inherently good 760 00:49:03,239 --> 00:49:07,080 Speaker 1: or bad, or that people are specifically acting in an 761 00:49:07,120 --> 00:49:11,840 Speaker 1: ethical or unethical way, but rather that without using critical thinking, 762 00:49:11,840 --> 00:49:15,640 Speaker 1: we can't be certain if what we're relying upon is 763 00:49:15,840 --> 00:49:20,480 Speaker 1: actually reliable or not. I also urge, as always, that 764 00:49:20,560 --> 00:49:24,440 Speaker 1: we pair compassion with critical thinking. I think there's a 765 00:49:24,560 --> 00:49:28,560 Speaker 1: tendency for us to kind of assign blame and intent 766 00:49:28,880 --> 00:49:32,880 Speaker 1: when things go wrong, and sometimes that is appropriate, but 767 00:49:32,960 --> 00:49:36,960 Speaker 1: I would argue that we shouldn't jump to that conclusion 768 00:49:37,080 --> 00:49:40,840 Speaker 1: right off the bat. Sometimes people just make bad choices, 769 00:49:41,160 --> 00:49:44,560 Speaker 1: or they are misinterpreting things, but they don't have any 770 00:49:44,719 --> 00:49:48,200 Speaker 1: intent to mislead.
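(On that transparency point, one concrete example of what "showing its work" can look like in practice is permutation importance, a common model-inspection technique: shuffle one input feature at a time and measure how much the model's accuracy drops. Below is a minimal sketch on synthetic data using scikit-learn; it's one illustrative tool, not the method any system discussed here actually uses:)

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)

# Synthetic data: feature 0 actually drives the label; feature 1 is pure noise.
X = rng.normal(size=(500, 2))
y = (X[:, 0] > 0).astype(int)

model = LogisticRegression().fit(X, y)

# Shuffle each feature in turn and measure the accuracy drop; a large drop
# means the model leans heavily on that feature.
result = permutation_importance(model, X, y, n_repeats=20, random_state=0)
for i, importance in enumerate(result.importances_mean):
    print(f"feature {i}: importance = {importance:.3f}")
```

If a feature that ought to be irrelevant, say a proxy for ethnicity, turns out to carry a large importance score, that is exactly the kind of red flag transparency is supposed to surface before a system is deployed.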
So while I do advocate that we 771 00:49:48,360 --> 00:49:53,040 Speaker 1: use critical thinking as much as possible, let's be decent, 772 00:49:53,280 --> 00:49:56,760 Speaker 1: nice human beings whenever we do that. If it turns 773 00:49:56,760 --> 00:50:01,000 Speaker 1: out someone is truly being unethical and trying to deceive others, 774 00:50:01,480 --> 00:50:04,840 Speaker 1: that's obviously a different story. But before you know for sure, 775 00:50:05,640 --> 00:50:09,880 Speaker 1: I say we employ that compassion, and hopefully we are 776 00:50:09,920 --> 00:50:13,839 Speaker 1: able to solve these problems before they have these real 777 00:50:13,840 --> 00:50:19,360 Speaker 1: world impacts, because the consequences of those are dramatic and 778 00:50:19,480 --> 00:50:24,200 Speaker 1: terrible and avoidable if we use critical thinking. I hope 779 00:50:24,200 --> 00:50:27,000 Speaker 1: you enjoyed that rerun episode from just a few years ago. 780 00:50:28,360 --> 00:50:33,319 Speaker 1: Boy, it seems like a real, totally different era, because that 781 00:50:33,440 --> 00:50:38,400 Speaker 1: was obviously several months into the COVID nineteen pandemic, when 782 00:50:38,600 --> 00:50:41,239 Speaker 1: lots of us were still on lockdown and stuff. Very 783 00:50:41,280 --> 00:50:45,600 Speaker 1: different time from today. But yeah, as I said, black boxes 784 00:50:45,719 --> 00:50:49,960 Speaker 1: are still very much an ongoing topic in tech, particularly 785 00:50:50,040 --> 00:50:52,319 Speaker 1: in the field of artificial intelligence, but not just that. 786 00:50:53,160 --> 00:50:55,920 Speaker 1: We see it in the right to repair movement as well, 787 00:50:56,280 --> 00:51:00,880 Speaker 1: as advocates argue that companies shouldn't be able to obfuscate the 788 00:51:00,920 --> 00:51:04,560 Speaker 1: workings of their products so that it becomes impossible 789 00:51:04,560 --> 00:51:06,240 Speaker 1: for you to do any kind of maintenance or repair 790 00:51:06,280 --> 00:51:11,239 Speaker 1: on them yourself, or with an independent operation that can 791 00:51:11,280 --> 00:51:13,319 Speaker 1: do it for maybe less than what it would cost 792 00:51:13,360 --> 00:51:17,480 Speaker 1: you to take it to a quote unquote official repair site. 793 00:51:17,800 --> 00:51:21,120 Speaker 1: So yeah, the black box is still very much a thing, 794 00:51:21,280 --> 00:51:25,239 Speaker 1: still very much an ongoing concern in the field of technology, 795 00:51:25,280 --> 00:51:28,440 Speaker 1: and I'm sure we'll end up talking about it again 796 00:51:28,480 --> 00:51:32,520 Speaker 1: and again as the various artificial intelligence stories continue to unfold. 797 00:51:32,960 --> 00:51:36,120 Speaker 1: I hope you are all well, and I will talk 798 00:51:36,120 --> 00:51:46,400 Speaker 1: to you again really soon. Tech Stuff is an iHeartRadio production. 799 00:51:46,680 --> 00:51:51,719 Speaker 1: For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 800 00:51:51,840 --> 00:51:57,440 Speaker 1: or wherever you listen to your favorite shows.