Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and I love all things tech, and today I thought it was a good time to take another opportunity to chat about one of the subjects I really hammer home in this series, and I don't make any apologies for this. It is not about convergence, though longtime listeners of Tech Stuff know that for years that was my favorite trend to cover. It's about critical thinking. So yeah, this is another Critical Thinking in Technology episode. Now, in these episodes, I explain how taking time and really thinking through things is important so that we make the most informed decisions we can, and so that we aren't either fooling ourselves or, you know, allowing someone else to fool us when it comes to technology. Though I'll tell you a secret: using these skills in all parts of your life is a great idea, because it can be really easy for us to fall into patterns where we let ourselves believe things just because it's convenient or, you know, it reaffirms our biases, our prejudices, that kind of thing. So if you use critical thinking beyond the realm of technology, I ain't gonna be mad. Specifically, I wanted to talk about a general category of issues in tech that some refer to as the black box problem. Now, this is not the same thing as the black box that's on board your typical airplane. In fact, I'll explain what that is first, because it's pretty simple, and then we can move on. First, the black box inside airplanes is typically orange. So right off the bat, we have a problem with nomenclature, right? I mean, you had one job, black box. Actually, that's not true. The black box has a very important job, and it requires a couple of things to work.
But the black box, which is orange, is all about maintaining a record of an aircraft's activities in a housing capable of withstanding tremendous punishment. Another name, or a more appropriate name, really, for this device, for the black box, is the flight data recorder. Sensors in various parts of the plane detect changes and then send data to the flight data recorder, which, you know, records them. If a pilot makes any adjustments to any controls, whether it's the flight stick or a knob or a button or a switch or whatever, the control not only does whatever it was intended to do, assuming everything's in working order, but it also sends a signal that is recorded on the flight recorder. So the job of the flight recorder is to create as accurate a representation of what went on with that aircraft as is possible. The Federal Aviation Administration, or FAA, in the United States has a long list of parameters that the flight recorder is supposed to keep track of, more than eighty in fact, and these include not just the aircraft systems but what was going on in the environment. So if a pilot encounters a problem on a flight, or, in a worst case scenario, in the event of a crash, the flight recorder represents an opportunity to find out what actually went wrong. You know, was it a malfunction? Was it pilot error? Was it weather? Was the crash survivable? Memory units inside the heavy-duty casing of the black box are really meant to act as a lasting record, so if the recorder is recoverable, it gives investigators a chance to find out what happened. But, as I said, that's not the black box I really wanted to talk about for today's episode, so I'm not going to go into any more detail about that.
Now, you could argue that the black box I want to talk about is sort of the opposite of what we find in airplanes, because in an airplane, the black box contains a record of everything that has gone on, and it can help explain why a certain outcome has happened. In technology in general, we use the term black box to refer to a system or technology where we know what goes into it and we can see what comes out of it, but we have no idea what went on in the middle of that process. We don't have a way to understand the process by which the device takes input and produces output. Now, in most cases, we're not talking about an instance where literally nobody understands what's going on with a device or a system. It's more like the creators of whatever system we're talking about have purposefully made it difficult or impossible for the average person to understand or, in some cases, even see what a technology is doing. Sometimes it's intentional, sometimes it's not, so maybe purposefully is being a little strong there, but that's often how it unfolds. Let's take a general example that has created issues with a particular subsection of tech heads, and those would be the gearheads, you know, the people who love to work on vehicles like motorcycles and cars and trucks and stuff. In what you might think of as the good old days, at least from the perspective of DIY technology (there were plenty of other things that were wrong back then, but in those terms), a car's systems were pretty darn accessible. The motorist would need to spend time and effort to learn how the car worked and what each component was meant to do, but that was actually an achievable goal. So with a bit of study and some hands-on work, you could suss out how an engine works.
You know, how spark plugs cause a little explosion by igniting a mixture of fuel and air inside an engine's cylinders; how that explosion would force out a piston, which connects to a crankshaft; and how that reciprocating motion of the piston would translate into rotational motion of the crankshaft that could then be transmitted, ultimately, to the wheels through the transmission. You could learn what the carburetor does, how the various fans and belts work and what they do, you know, where the oil pan is and how to change out oil and all that kind of stuff. What's more, you could make repairs yourself. If you had the tools, the replacement parts, the knowledge, and the time, you could swap out parts. You could customize your vehicle. You know, I've known a lot of people who have taken on cars as projects. They'll purchase an old junker and then they will lovingly restore it to its former glory or turn it into something truly transformational. And all of that is possible because those old cars had really accessible systems. They were relatively simple electromechanical systems. Once you understood how they worked, you could see how they worked, or how they were supposed to work, and you could understand what was going on. Through that understanding, you could address stuff when things weren't going well. And that's how cars were for decades. But that began to change in the late nineteen sixties, and it really accelerated, uh, no pun intended, in the nineteen seventies. So what happened? Well, in nineteen sixty eight, leading into nineteen sixty nine, Volkswagen introduced a new standard feature for their Type 3 vehicle, which was sometimes called the Volkswagen 1500 or 1600; there are a couple of names for it. These were family cars, and Volkswagen's intent was to create a vehicle with a bit more luggage and passenger space than their Type 1, which was also known as the Volkswagen Beetle.
The feature for these cars that I wanted to talk about was an electronic fuel injection system that was controlled by a computer chip, and the marketing for this particular feature said that this, quote, "electronic brain," end quote, was, quote, "smarter than a carburetor," end quote. Now, the purpose of a carburetor is to mix fuel with air at a ratio that is suitable for combustion inside the engine cylinders. But an engine doesn't need exactly the same ratio of fuel to air from moment to moment. It actually varies. As a vehicle runs longer, or it travels faster, or it starts climbing a steep hill, or, you know, lots of stuff, the ratio changes somewhat, and the carburetor manages this with a couple of valves, one called the choke, another called the throttle, among other elements. But it's all mechanical, and while it works, it's not as precise as an electronic system could be. And that's where the Volkswagen system came in. Volkswagen was pushing this as a more efficient and useful component than a carburetor. It would prevent the engine from being flooded with fuel and not enough air for combustion. It would handle the transitions of fuel and air mix ratios more quickly and precisely. It was the start of something big, but were it not for some other external factors, it might not have taken off the way it did, or at least not as quickly as it did. Those other factors, as I said, were external, and they were a pair of doozies. One was a growing concern that burning fossil fuels was having a negative impact on the environment, which turned out to be absolutely the case. Cities like Los Angeles, California, where getting around pretty much requires having a car, were dealing with some really serious smog problems, and so organizations like the Environmental Protection Agency in the United States began to draft requirements to reduce car emissions. That would mean that automotive companies would have to create more efficient engine systems.
The other major factor that contributed to this was the oil crisis of the nineteen seventies, which I talked about not long ago in a different Tech Stuff podcast. This was a geopolitical problem that threw much of the world into a scramble to curb fossil fuel consumption because the supply was limited. The double whammy of environmental concerns and the oil crisis forced a lot of car companies to rethink their previous strategy, which was pretty much more power, make it bigger, much fast, go, go, go, guzzle. That kind of is how they were thinking back in the day. Turns out that was an unsustainable option. And if you look back, especially at American cars during the fifties and sixties, you see that trend of the engines getting bigger and more powerful, and that was just the way things were going until we started to see these external changes come in. And so more car companies began to incorporate computer-controlled fuel injection systems. But this move also marked a move away from that accessible design, making it, you know, harder for the DIY crowd to work on cars. Working on a damaged carburetor was one thing. Dealing with a malfunctioning computer chip was another. It didn't fall into the typical skill set of your amateur mechanic. And of course we didn't stop at computer-controlled fuel injection systems. Over time, we saw a lot more automotive systems make the transition to computer control. Today, your average car has computer systems that control the engine and the transmission, the doors, the entertainment system, the windows, and these systems are all individual, and they have a name. They're called electronic control units, or ECUs. Collectively, they form the controller area network, or CAN, C-A-N. The connections themselves, the physical connections, are called the CAN bus, B-U-S, and that's really just a way of saying these are the physical connectors that allow data to pass from one ECU to another. And there wasn't really a central processing unit or anything.
There was no central brain. It was more like ECUs that depend upon one another would send relevant information to each other and not to anything else. So, you know, if the door sensor is showing a door is open, it can send an alert to other systems so that that information is appropriately dealt with. Now, at the same time that these individual systems were evolving, so too we saw the rise of what would become the onboard diagnostic system, or OBD, and the OBD keeps an eye on what's going on with the various systems in the car, and it sends notifications to the driver via dashboard indicators when something is outside normal operating parameters. So let's say that this diagnostic computer picks up that there's something hinky happening with the fuel-air mixture, and it activates that pesky check engine light on the dashboard that gives you next to no useful information. The problem is that these days it can be challenging or sometimes impossible to figure out exactly what caused that check engine light to come on without access to some special equipment and expertise. The car's systems have become so sophisticated that it can be a challenge to figure out what exactly has gone awry. Mechanics use devices called OBD scan tools, and these tools connect to the computer on board a car, and then the car provides an error code to the scanner. This, by the way, took a long time to standardize, because you've got a lot of different car companies out there, and obviously there was a need to move toward standardization so that you didn't have to have fifty different scan tools and fifty different code charts to deal with all the different car companies. But the code corresponds to the specific issue the OBD has detected. So not only do you need a special piece of equipment to diagnose what has gone wrong with the car, you also need to know the codes, or else you haven't really learned anything.
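To make that concrete, here is a minimal sketch of how a generic OBD-II trouble code breaks down. This is purely illustrative, decoding only the standardized five-character format (codes like P0301); it is not the software of any actual scan tool, and real tools carry far larger lookup tables plus manufacturer-specific extensions:

```python
# Minimal, illustrative decoder for generic five-character OBD-II
# diagnostic trouble codes (DTCs), e.g. "P0301". The point is that
# the raw code means nothing without a chart to interpret it.

SYSTEMS = {"P": "powertrain", "B": "body", "C": "chassis", "U": "network"}
P_SUBSYSTEMS = {
    "1": "fuel and air metering",
    "2": "fuel and air metering (injector circuit)",
    "3": "ignition system or misfire",
    "4": "auxiliary emissions controls",
    "5": "vehicle speed and idle control",
    "6": "computer output circuits",
    "7": "transmission",
    "8": "transmission",
}

def decode_dtc(code: str) -> str:
    system = SYSTEMS.get(code[0], "unknown system")
    origin = "generic (SAE)" if code[1] == "0" else "manufacturer-specific"
    area = P_SUBSYSTEMS.get(code[2], "other") if code[0] == "P" else "n/a"
    return f"{code}: {system}, {origin}, area: {area}, fault {code[3:]}"

# P0301 is the generic code for a cylinder 1 misfire.
print(decode_dtc("P0301"))
```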
If I get, you know, an eight-digit code, and I don't know what that code refers to, then I'm not really any better off than just looking at a check engine light. On top of all of that, even if you know what is wrong, you might not be able to easily access the problem or fix it, due to the level of complexity, sophistication, and computerization of vehicles. Not all cars, or motorcycles, or whatever are equal. Obviously, some are a bit easier to work on than others. Some require a lot of specific care, though. For example, if you're driving a Tesla, chances are the amount of personal tinkering you're going to do on your car is going to be fairly limited. Now, I'm not saying it's impossible, just that it's really challenging. So in general, we've seen cars go from a mechanical system, or electromechanical system, that the average person can understand and work on, to a group of interconnected, specialized computer systems that are increasingly difficult to access. The cars have become a type of black box. This can be extra frustrating for gearheads who actually have an understanding of the underlying mechanical issues that could cause problems. They might even know how to solve an issue if they can just get to it, but they are finding themselves with fewer options to address underlying issues. Now, cars are just one example of technologies that have moved toward a black box-like system. There are lots of others. But apart from making it harder to tinker with your tech, what's the problem? When we come back, I'll talk about some of the pitfalls of turning tech into a black box. But first, let's take a quick break. We're back. So the car transformed from a purely electromechanical technology to one that increasingly relies on computer systems. But the computer itself can also be something of a black box for people. In the very early days of the personal computer,
it was hobbyists who were ordering kits through the mail and then building computers at home. Typically, these hobbyists had a working understanding of how the computer systems operated, you know, the actual way in which they would accept inputs and process information and then produce outputs. Before high-level programming languages, programmers also had to kind of think like a computer in order to program them to carry out functions. As computer languages became more high-level, meaning there was a layer of abstraction between the programmer and the actual processes that were going on at the hardware level of the computer, that connection began to get more tenuous. Now, I'm not saying that programmers today don't have a real understanding of how computers work, but rather that this understanding is less critical, because programming languages, computer engines, app developer kits, you know, software developer kits, and so on provide a framework that reduces the amount of low-level work programmers need to do in order to build software. For the average user, you know, someone who isn't learned in the ways of computer science, computers are pretty much black boxes. They work until they don't. You push buttons on a keyboard, or you click on a mouse, or you touch a screen, and, you know, the computer does the stuff. How it does stuff, like how it detects a screen touch and then translates that into a command that is then executed to produce a specific result, you know, that's not important to us. We don't care or need to know how that works in order to enjoy the benefits of it. So for us, it's just the way things are. You push that button and this thing happens. It just does. The black box of the computer system, which can be a desktop, a laptop, tablet, smartphone, video game console, you know, whatever, it just takes care of what we need it to do. That's not to say a computer is an impenetrable black box.
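If you're curious what one of those layers of abstraction looks like when you peel it back, here's a tiny example using CPython's built-in dis module (the exact opcodes you see will vary by Python version):

```python
import dis

# A one-line, high-level operation...
add = lambda x, y: x + y

# ...and the lower-level bytecode the CPython interpreter actually
# runs for it. Expect something like LOAD_FAST x, LOAD_FAST y, a
# binary-add opcode, and RETURN_VALUE -- a small peek at machinery
# the high-level language normally hides from you.
dis.dis(add)
```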
You can learn how they work, and how programming languages work, and so on. Computer science and programming classes are all built around that. So while a computer system is effectively a black box to the average user, it wasn't made that way by design, and it can be addressed on a case-by-case basis, depending on the time, the interest of the individual computer user, and their dedication to learning. But sometimes people will set out to make technologies with the intent of them being black boxes from the get-go. These technologies are dependent in part or in whole on obfuscating how they work, in other words, by obscuring it. Sometimes that's in an effort to protect an invention from copycats, the whole idea being that if you come up with something really clever, you don't want someone else to come along, lift that idea, and do the same thing you are doing, but selling it for, you know, less money or something. But other times you might be hiding how something works specifically with the intent to deceive. And now it's time to look much further back than the nineteen sixties. It was seventeen seventy in Europe. As the story goes, the European world had seen a great deal of advancement in mechanical clockwork devices at that point. Clocks themselves, often powered by winding a spring and keeping time using gears, worked on a reliable and consistent basis that was much better than earlier methods, and they even allowed people the ability to carry a timekeeping device with them. Phenomenal. Based on similar principles, various tinkerers had come up with toys and distractions that also ran on clockwork, like gears and springs. Some of these were quite elaborate, such as figures that appeared to play musical instruments, and one of them was particularly impressive. It appeared to be an automaton that could play expert-level chess. The figure, made out of wood, was dressed in Turkish costume, leading to it being called the Turk, or sometimes the Mechanical Turk.
If you were to sit down to play against the Turk, you, as an opponent, would move a piece, and then you would watch as this mechanical figure would shift and move a piece of its own in response. And the Turk was a pretty good chess player. It frequently beat the opponents it faced. Sometimes it would lose to particularly strong players, but it held its own pretty darn well. The man behind this invention was Wolfgang von Kempelen, who was in the service of Maria Theresa, Empress of the Holy Roman Empire. He had been invited to view a magician's performance in the court, so the story goes, and the Empress had invited him specifically, and afterwards asked him what he thought. And allegedly he boasted he could create a much more compelling illusion than anything this magician did. Now, according to the story, the Empress essentially said, oh yeah, well prove it, buster, and he was given six months to do just that. The Turk was what he had to show for it in six months' time, and it reportedly went over like gangbusters. The wooden Turk stood behind a cabinet, on top of which was the chessboard, and Kempelen would reportedly open the cabinet doors and reveal some gears and mechanics to prove that it was purely a mechanical system. In fact, the gears were masking a hidden compartment behind them, in which a human chess player was sitting, hunched over, keeping track of the game using a smaller chessboard in front of them, and using various levers to move the Turk's limbs in response. Now, a lot of folks suspected that something was up from the get-go, but, you know, part of the fun of a magic trick is just not knowing what's going on. Some folks try very hard to figure out the process. I am not one of them. Others are just happy to be entertained by a very well performed trick. But in a way, the Turk was a kind of black box.
In fact, you could argue that a lot of magic tricks pretty much fall into the black box category. The process is purposefully hidden from the viewer. If we could see what the magician was doing from beginning to end, all the way through and without any misdirection, then it wouldn't be magic. We might admire the skill of the magician, how quickly they were able to do things, but we wouldn't really consider it magical. So the output is dependent upon people not knowing the process the inputs went through. Now, that's not to say that you can't appreciate a really good magic trick even if you know how it's done. One of the best examples I know of is Penn and Teller. They did a phenomenal version of the cups and balls routine where they used clear plastic cups and balls of aluminum foil to demonstrate how cups and balls works, and you can watch the entire time, and even being able to see through the cups and see the moves that are being made, Teller does them with such skill that it is truly phenomenal. It doesn't hurt that Penn is spouting off a lot of nonsense at the same time and misdirecting even as you're watching what's going on. I highly recommend you check it out on YouTube. Look for Penn and Teller cups and balls. You won't be disappointed. Now, the Turk, as far as I can tell, was always intended to be an entertainment, not necessarily something that was specifically meant to perpetuate some sort of hoax. You wouldn't call a stage magician a huckster or a con man or anything like that. Their occupation is dependent upon misdirection and making impossible acts seem like they really happened, but always, or nearly always, with the implication that it's all an illusion or a trick of some sort. But not everyone is quite so forthcoming about the fact that the thing they're doing is done through trickery. For the scam artist, the black box creates an incredible opportunity.
As technological complexity outpaces the average person's understanding, the scam artist can create fake gadgets and devices that they claim can do certain things, and then count upon the ignorance of the average person to get away with it. Typically, the go-to scam is to convince people with money to pour investments into the hoax technology in an effort to fund whatever the next phase of development is supposed to be, whether that's to bring a prototype into a production model or to refine a design or whatever. But the end result is pretty much the same across the board. The con artist tries to wheedle out as much money from their marks as they can before they pull up stakes and skip town, or they find some way to shift focus or punt any promises on delivering results further into the future, like a that's-a-future-me-problem kind of approach. Once in a blue moon, you might find someone who was just hoping to buy enough time to come up with a way to make their hopes real, for realsies, or at least to simulate it closely enough so that people are satisfied. That typically doesn't work out so well. Hmm, Theranos. I'll get back to that. So let's talk about some examples of outright scams that leaned heavily on the black box concept, whether by having their supposed and actual operating mechanisms hidden, or by obscuring how they really worked with a lot of nonsensical claims and technobabble. One historical scam artist was a guy named Charles Redheffer, who claimed to have built a perpetual motion machine. If he had managed to do such a thing, it would have been a true feat, as it would break the laws of physics as we understand them. So let's go over why that is, just pretty quickly. For perpetual motion to work, and thus for free energy in general to work, a machine would need to be able to operate with absolutely no energy loss, and, for free energy, it would have to generate that energy in some way.
A perpetual motion machine, once set into motion, would never stop moving, you know, unless someone or something specifically intervened. But if it were left to its own devices, it would continue to do whatever it was doing until the last syllable of recorded time, to borrow a phrase from the Bard. Now, if we look at our understanding of thermodynamics, we'll see that doing this in the real world is impossible, or at least it would go against fundamental ways that we understand how our universe works. The first law of thermodynamics says that energy is neither created nor destroyed. Energy can, however, be converted from one form into another. So if you hold a water balloon over the head of a close personal friend, let's say it's Ben Bowlin of Stuff They Don't Want You To Know, the water balloon has a certain amount of potential energy. If you let go of the balloon, that potential energy converts into kinetic energy, the energy of movement. You didn't create or destroy energy here; it just changed forms. So if you have what you claim to be a perpetual motion machine and you set it in motion, the energy you gave that machine at that initial point should sustain it forever, and it would never have that initial energy change form into some other type of energy that could then escape the system and show a net energy loss for the system itself. Remember, the energy is not being destroyed, but it can be lost from the system in another form. This means that such a machine could not have any parts that had any contact with one another, which would make it a really strange machine. And that's because friction would be a constant means for energy to convert from one form to another form, in this case kinetic energy, the energy of movement, into heat. Friction is the resistance surfaces have regarding moving against each other.
So if the machine has any moving parts at all, those parts will be encountering friction, which means some of that moving energy will be converted to heat and thus escape the system. So the overall system of the machine itself will have a net loss of energy. There will be less energy to keep it going, which means gradually it will slow down and ultimately just stop as a result. It might take a long time if the machine is particularly well designed, but it will eventually happen. You would need some form of energy input to keep things going on occasion, kind of like a little push. Imagine that you've got a swing, like a rope with a tire at the end of it. No one's in it right now. You would have to give that tire a little push every now and then to keep it swinging; otherwise, it will eventually stop. But that means you wouldn't have a perpetual motion machine. There are other factors that similarly make perpetual motion impossible. If the machine makes any sort of sound, then some of the energy of operation is going into creating the vibrations that make sound. Sound itself is energy, it's kinetic energy, so that would mean the machine as a whole would be losing energy through that sound. A machine operating inside an atmosphere has to overcome the friction of moving through air, and the list goes on. Moreover, if we could build a perpetual motion machine, we'd be able to harness it for energy, but only up to whatever the starting initial energy was to get it moving in the first place, because, again, energy cannot be created. We can build devices that can harness other forms of energy and convert that energy into, say, electricity, but these are not perpetual motion or free energy machines. These machines are just collecting and converting energy that's already in the system, already present, so they're not making anything.
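To put a rough number on "it will eventually happen," here's a minimal sketch of a spinning flywheel whose only loss is a friction torque proportional to its speed. All the constants are made up for illustration; the point is just that the stored kinetic energy drains away no matter how well the machine is built:

```python
# Toy model: flywheel slowing under friction, d(omega)/dt = -(b/I) * omega.
# Constants are arbitrary illustration values, not real hardware.

I = 2.0       # moment of inertia, kg*m^2
b = 0.05      # friction coefficient, N*m*s/rad
omega = 10.0  # initial angular speed, rad/s
dt = 0.1      # time step, s

for step in range(2001):
    if step % 500 == 0:
        energy = 0.5 * I * omega ** 2  # rotational kinetic energy, joules
        print(f"t = {step * dt:6.1f} s  speed = {omega:6.3f} rad/s  energy = {energy:8.3f} J")
    # Friction steadily converts the motion into heat that leaves the system.
    omega -= (b / I) * omega * dt
```

Run it and the energy falls toward zero; the only way to keep the wheel turning is to put energy back in, which is exactly what a perpetual motion machine claims it never needs.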
Redheffer, however, claimed to have built a perpetual motion machine that could potentially serve as a free energy generator. Now, if true, this would have been an astonishing discovery. Not only would our understanding of the universe be proven to be wrong, but we would also have access to an inexhaustible supply of energy. Redheffer showed off what he said was a working model of his design in Philadelphia, and he was asking for money to fund the construction of a larger, practical version of his design. A group of inspectors from the city came out to check out how this thing worked, and they noticed something hinky was going on, even though Redheffer was doing his best to run interference and prevent anyone from getting too close a look at the machine. The gears of the device, which was supposedly powering a second machine, were worn down in such a way that it was pretty clear that it was actually the second machine that was providing the energy to turn the quote unquote perpetual motion machine, not the other way around. If we were talking about cars, this would be like discovering that the wheels turning were causing the pistons of the engine to reciprocate in their cylinders. It's going the opposite way. So the investigators then hired a local engineer named Isaiah Lukens to build a similar device, using a secondary machine to provide power to what would be the perpetual-motion-type machine, and then they showed it to Redheffer, who saw that the jig was up, and he hoofed it out of town to New York City. He tried to pull essentially the same scam there, this time using a machine that was secretly powered by a hand crank in a secret room on the other side of the wall. Uh, technically, it was just a fellow sitting there with a hand crank in one hand and a sandwich in the other, providing the work to turn this machine.
Robert Fulton, a mechanical engineer of great renown, exposed the whole device as a fraud when he pulled apart some boards on the wall and revealed the man sitting there cranking away, and Redheffer fled again. Records of what happened next are sketchy. It seems he might have tried to pull the same dang scheme in Philadelphia again a bit later, but he disappeared from the historical record after reportedly refusing to demonstrate his new device. When we come back, I'll compare this to what I mentioned before, Theranos, before we chat about other concerns regarding the black box problem. But first, let's take another quick break. Okay. So, Theranos. This is the biomedical technology company that was founded by Elizabeth Holmes, and she is currently awaiting trial on charges of federal fraud in the United States. The trial was supposed to begin in August twenty twenty, but has since been delayed until twenty twenty one due to COVID-19. Now, the pitch for Theranos was really, really alluring. What if engineers could make a machine capable of testing a single droplet of blood for more than one hundred possible illnesses and conditions? So rather than going through multiple blood draws and tests to try and figure out what's wrong, you could get an answer based off one little pinprick within a couple of hours. Maybe you would even be able to buy a Theranos machine for your home, kind of like a desktop printer, and that would allow you to do a quick blood test at a moment's notice. Maybe you would get a heads-up about something you should talk to your doctor about, preventing tragedy in the process. You might learn that with some changes in your lifestyle, you could improve your overall health or stave off various illnesses. It would democratize medicine, giving the average person more control and knowledge about their own health and giving them a better starting point for conversations with their doctors. And yeah, that's a great goal.
It's a fantastic sales pitch, and 571 00:36:39,560 --> 00:36:44,240 Speaker 1: it did get Holmes and Theranos a lot of interested 572 00:36:44,320 --> 00:36:47,120 Speaker 1: investors who really wanted to tap into this, because not 573 00:36:47,200 --> 00:36:51,160 Speaker 1: only is it something that you would want for yourself, 574 00:36:51,200 --> 00:36:55,040 Speaker 1: you could easily see that if this is possible, that 575 00:36:55,120 --> 00:36:57,320 Speaker 1: business is going to be like the next Apple. It 576 00:36:57,320 --> 00:37:00,319 Speaker 1: will become a trillion dollar company. Something that powerful 577 00:37:00,640 --> 00:37:06,480 Speaker 1: would undoubtedly become a powerhouse. Now I've done full episodes 578 00:37:06,480 --> 00:37:09,920 Speaker 1: about Theranos and how it fell apart because, spoiler alert, 579 00:37:10,200 --> 00:37:14,279 Speaker 1: that's exactly what happened. The technology just didn't work. But 580 00:37:15,480 --> 00:37:17,880 Speaker 1: I think a lot of what happened with Theranos was 581 00:37:17,960 --> 00:37:23,640 Speaker 1: largely dependent upon naivete, ignorance, and wishful thinking. Our technology 582 00:37:23,719 --> 00:37:27,440 Speaker 1: can do some pretty astounding stuff, right? I mean, if 583 00:37:27,440 --> 00:37:30,440 Speaker 1: you had told me in two thousand that by the 584 00:37:30,520 --> 00:37:32,799 Speaker 1: end of the decade I would be carrying around a 585 00:37:32,840 --> 00:37:36,719 Speaker 1: device capable of really harnessing the power of the Internet 586 00:37:37,160 --> 00:37:39,480 Speaker 1: in my pocket and I would have access to it 587 00:37:39,560 --> 00:37:42,840 Speaker 1: all the time, I would have thought you were bonkers. 588 00:37:42,880 --> 00:37:46,640 Speaker 1: So if technology can do incredible things like that, why 589 00:37:46,760 --> 00:37:50,640 Speaker 1: can't it do something equally incredible with blood tests? The 590 00:37:50,719 --> 00:37:54,880 Speaker 1: idea is that, well, we're already seeing this amazing stuff happen, 591 00:37:55,080 --> 00:37:58,000 Speaker 1: why isn't this other amazing thing possible? And that is 592 00:37:58,160 --> 00:38:03,120 Speaker 1: dangerous thinking. It equates all technological advances and developments, 593 00:38:03,160 --> 00:38:08,000 Speaker 1: and that's just not how reality works. Moore's law, the 594 00:38:08,120 --> 00:38:13,280 Speaker 1: observation that, generally speaking, computational power doubles every two years, 595 00:38:13,719 --> 00:38:18,640 Speaker 1: has really helped fuel a misunderstanding about technology in general. 596 00:38:18,680 --> 00:38:22,719 Speaker 1: We extend that same crazy growth to all sorts of 597 00:38:22,800 --> 00:38:26,880 Speaker 1: fields of technology where that doesn't actually apply, and it 598 00:38:26,920 --> 00:38:30,080 Speaker 1: gives us the motivation to fool ourselves into thinking that 599 00:38:30,200 --> 00:38:35,200 Speaker 1: the impossible is actually possible. That, I think, is what 600 00:38:35,320 --> 00:38:39,120 Speaker 1: happened with Theranos. Now, I'm not saying Holmes 601 00:38:39,320 --> 00:38:43,880 Speaker 1: set out to deceive people.
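[Editor's note: to make that compounding concrete, here is a minimal Python sketch of what "doubling every two years" actually implies. Everything in it is illustrative; the function name and numbers are not from any real chip or product.]

```python
# Minimal sketch: Moore's-law-style doubling is exponential growth.
# The function below is hypothetical, purely for illustration.

def moores_law_factor(years: float, doubling_period: float = 2.0) -> float:
    """Relative computational power after `years`, if it doubles every `doubling_period` years."""
    return 2.0 ** (years / doubling_period)

# From 2000 to 2010, the trend predicts roughly a 32x improvement:
print(moores_law_factor(10))  # 32.0

# Assuming blood-assay chemistry (or any non-semiconductor field) rides
# the same curve is exactly the extrapolation error described above:
# the math only holds where the doubling observation actually applies.
```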
I don't know what she 602 00:38:44,040 --> 00:38:47,960 Speaker 1: really believed was possible, but based on what I've read 603 00:38:48,080 --> 00:38:51,400 Speaker 1: and seen and listened to, to me, it sounds like 604 00:38:51,560 --> 00:38:54,399 Speaker 1: she figured there was at least a decent chance her 605 00:38:54,520 --> 00:38:59,480 Speaker 1: vision would become possible, and so a lot of Theranos's activities, 606 00:38:59,719 --> 00:39:03,400 Speaker 1: in my personal opinion, appeared to have been meant to stall 607 00:39:03,560 --> 00:39:07,520 Speaker 1: for time while engineers were working on very hard problems 608 00:39:07,760 --> 00:39:11,360 Speaker 1: to make the blood testing device work as intended. The 609 00:39:11,440 --> 00:39:14,439 Speaker 1: further into the process it got, the more the company had 610 00:39:14,480 --> 00:39:16,560 Speaker 1: to spin its wheels to make it seem like it was 611 00:39:16,640 --> 00:39:20,640 Speaker 1: making more progress than it actually was. The company had 612 00:39:20,719 --> 00:39:23,520 Speaker 1: raised an enormous amount of money from the investors, so 613 00:39:23,560 --> 00:39:27,400 Speaker 1: they were beholden to them. They had also secured agreements 614 00:39:27,400 --> 00:39:30,719 Speaker 1: with drug store chains to provide services to customers, so 615 00:39:31,080 --> 00:39:34,960 Speaker 1: they needed to perform a service. It had to show progress, 616 00:39:35,080 --> 00:39:38,080 Speaker 1: even if behind the scenes things had actually stalled out. 617 00:39:38,560 --> 00:39:41,520 Speaker 1: On top of that, you also have the reports of 618 00:39:42,160 --> 00:39:45,959 Speaker 1: executives like Holmes herself living the high life and really 619 00:39:46,080 --> 00:39:51,640 Speaker 1: enjoying incredible benefits of wealth because of the enormous investment 620 00:39:51,680 --> 00:39:55,120 Speaker 1: into the company. So that plays a part too. Theranos's 621 00:39:55,160 --> 00:39:58,680 Speaker 1: operations were effectively a black box to the 622 00:39:58,719 --> 00:40:01,719 Speaker 1: outside world, meant to misdirect and give the 623 00:40:01,760 --> 00:40:05,520 Speaker 1: impression that things were working fine behind the scenes, while 624 00:40:05,560 --> 00:40:07,799 Speaker 1: the people who were actually there were trying to keep 625 00:40:07,880 --> 00:40:11,720 Speaker 1: up the illusion while simultaneously attempting to solve what appeared 626 00:40:11,760 --> 00:40:15,799 Speaker 1: to be impossible problems. At some point, based on how 627 00:40:15,840 --> 00:40:20,080 Speaker 1: things unfolded, I would say that executives at Theranos appeared 628 00:40:20,160 --> 00:40:24,760 Speaker 1: to be perpetrating a scam, not just, you know, trying 629 00:40:24,760 --> 00:40:28,560 Speaker 1: to maintain an illusion while getting things to work. They 630 00:40:28,560 --> 00:40:32,480 Speaker 1: were actively scamming people.
In my opinion, maybe they were 631 00:40:32,480 --> 00:40:35,160 Speaker 1: still holding out hope that it would ultimately work out, 632 00:40:35,520 --> 00:40:38,200 Speaker 1: but that doesn't change that it was a classic case 633 00:40:38,320 --> 00:40:41,040 Speaker 1: of smoke and mirrors to hide what was really happening, 634 00:40:41,280 --> 00:40:45,719 Speaker 1: such as using existing blood testing technology from other companies 635 00:40:46,000 --> 00:40:49,120 Speaker 1: in order to run tests while claiming that the results 636 00:40:49,120 --> 00:40:53,160 Speaker 1: were coming from actual Theranos devices. But again, this is 637 00:40:53,200 --> 00:40:55,719 Speaker 1: all my own opinion based on what I've seen and 638 00:40:55,760 --> 00:40:58,600 Speaker 1: read about the subject. A court will have to determine 639 00:40:58,600 --> 00:41:01,880 Speaker 1: whether or not Holmes and others actually committed fraud. 640 00:41:02,320 --> 00:41:04,880 Speaker 1: A lot of the technology we rely upon in our 641 00:41:05,000 --> 00:41:08,839 Speaker 1: day to day lives is complicated stuff, and there are 642 00:41:08,840 --> 00:41:11,239 Speaker 1: limited hours in the day. It's a bit much to 643 00:41:11,280 --> 00:41:13,680 Speaker 1: ask anyone to become an expert on all things tech 644 00:41:14,040 --> 00:41:16,520 Speaker 1: to figure out exactly how they work. Tech is also 645 00:41:16,560 --> 00:41:19,319 Speaker 1: becoming more and more specialized, so you might become an 646 00:41:19,320 --> 00:41:22,960 Speaker 1: expert in one area of technology and be completely ignorant 647 00:41:23,040 --> 00:41:26,600 Speaker 1: of another. That's not unusual, because it takes a lot 648 00:41:26,640 --> 00:41:29,600 Speaker 1: of time to become an expert in specific areas of tech 649 00:41:29,680 --> 00:41:32,960 Speaker 1: these days; they've become so specialized. But by overlooking the 650 00:41:33,040 --> 00:41:37,959 Speaker 1: how, we can make ourselves vulnerable to bad actors out 651 00:41:37,960 --> 00:41:41,680 Speaker 1: there when it comes to technology. Maybe they are actively 652 00:41:41,760 --> 00:41:44,480 Speaker 1: trying to pull the wool over our eyes, or maybe 653 00:41:44,520 --> 00:41:48,400 Speaker 1: they're just simply misguided and they misunderstand how stuff works. 654 00:41:48,560 --> 00:41:52,280 Speaker 1: But either way, our own ignorance of how tech does 655 00:41:52,480 --> 00:41:55,480 Speaker 1: what it does, and the limitations that we all 656 00:41:55,560 --> 00:41:58,239 Speaker 1: face based on, you know, the fundamental laws of the 657 00:41:58,320 --> 00:42:01,680 Speaker 1: universe as we understand them, that all makes us potential 658 00:42:02,080 --> 00:42:06,120 Speaker 1: marks or targets. That's where critical thinking comes in and 659 00:42:06,160 --> 00:42:10,759 Speaker 1: plays a part. Knowing how to ask questions and to critically 660 00:42:10,880 --> 00:42:14,640 Speaker 1: examine the answers, and to ask follow up questions, and 661 00:42:14,680 --> 00:42:19,440 Speaker 1: to not accept claims at face value are all important traits. Now, 662 00:42:19,480 --> 00:42:21,719 Speaker 1: we do have to be careful not to go so 663 00:42:21,760 --> 00:42:25,719 Speaker 1: far as to embrace denialism. If we are confronted with 664 00:42:25,800 --> 00:42:28,960 Speaker 1: compelling evidence that supports a claim, we need to be 665 00:42:29,040 --> 00:42:32,400 Speaker 1: ready to accept that claim.
I'm not advocating for you 666 00:42:32,440 --> 00:42:34,840 Speaker 1: guys to just go out there and say that any 667 00:42:34,880 --> 00:42:37,880 Speaker 1: and every claim is just bogus. That's not the point. 668 00:42:38,480 --> 00:42:41,640 Speaker 1: I'll close this out by talking about something we're seeing 669 00:42:41,760 --> 00:42:46,040 Speaker 1: unfold in real time around us, and that involves machine 670 00:42:46,120 --> 00:42:49,560 Speaker 1: learning and AI systems. Now, if you follow the circles 671 00:42:49,600 --> 00:42:52,359 Speaker 1: that report on this kind of stuff, you will occasionally 672 00:42:52,360 --> 00:42:56,480 Speaker 1: see calls for transparency. Those calls are to urge people 673 00:42:56,520 --> 00:43:00,560 Speaker 1: who are designing these machine learning systems and AI systems 674 00:43:01,000 --> 00:43:03,880 Speaker 1: to show their work, as it were, and to have 675 00:43:03,960 --> 00:43:07,520 Speaker 1: the systems themselves show their work. It's not enough to 676 00:43:07,560 --> 00:43:10,520 Speaker 1: create a system that can perform a task like image 677 00:43:10,560 --> 00:43:13,759 Speaker 1: recognition and then give us results. We need to know 678 00:43:14,160 --> 00:43:17,680 Speaker 1: how the system came to the conclusions it produced. 679 00:43:18,200 --> 00:43:21,399 Speaker 1: We need this in order to check for stuff like biases, 680 00:43:21,600 --> 00:43:25,839 Speaker 1: which is a serious issue in artificial intelligence. Honestly, it's 681 00:43:25,840 --> 00:43:29,200 Speaker 1: a really big problem for tech in general, but we're 682 00:43:29,200 --> 00:43:33,080 Speaker 1: really seeing it play out rather spectacularly in AI. Now 683 00:43:33,080 --> 00:43:35,840 Speaker 1: I'll give you an example that I've already alluded to: 684 00:43:36,560 --> 00:43:41,719 Speaker 1: facial recognition technology. The U.S. National Institute of Standards 685 00:43:41,800 --> 00:43:46,160 Speaker 1: and Technology conducted an investigation in twenty nineteen into facial 686 00:43:46,200 --> 00:43:51,320 Speaker 1: recognition technologies, and it found that algorithms were pretty darn 687 00:43:51,400 --> 00:43:56,200 Speaker 1: good at identifying Caucasian faces, but if they were analyzing 688 00:43:56,360 --> 00:44:00,920 Speaker 1: a Black or an Asian face, they were far less accurate, 689 00:44:01,280 --> 00:44:07,360 Speaker 1: sometimes one hundred times more likely to falsely identify somebody based 690 00:44:07,400 --> 00:44:13,080 Speaker 1: on an image. The worst error rates involved identifying Native Americans. 691 00:44:13,160 --> 00:44:15,919 Speaker 1: So let's let that sink in, because when we talk 692 00:44:15,920 --> 00:44:20,279 Speaker 1: about issues like systemic racism, we sometimes forget about how 693 00:44:20,320 --> 00:44:23,880 Speaker 1: that can manifest in ways that aren't as intuitive or 694 00:44:23,920 --> 00:44:28,520 Speaker 1: obvious as the really overt stuff. We live in a 695 00:44:28,560 --> 00:44:32,680 Speaker 1: world that has cameras all over the place. Surveillance is 696 00:44:32,680 --> 00:44:35,160 Speaker 1: a real thing that's going on all the time.
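[Editor's note: for anyone curious what "more likely to falsely identify" means in practice, here is a minimal Python sketch of the kind of per-group audit behind findings like NIST's. The records below are invented purely for illustration; no real data is used.]

```python
# Minimal sketch of a per-demographic false positive audit for a
# face-matching system. The records below are made-up, not NIST data.
from collections import defaultdict

# Each record: (group, truly_a_match, system_said_match) -- hypothetical.
results = [
    ("group_a", False, False), ("group_a", False, False),
    ("group_a", False, True),  ("group_b", False, True),
    ("group_b", False, True),  ("group_b", False, False),
]

non_matches = defaultdict(int)      # genuine non-match comparisons per group
false_positives = defaultdict(int)  # non-matches the system matched anyway

for group, truth, predicted in results:
    if not truth:  # only genuine non-matches can yield false positives
        non_matches[group] += 1
        if predicted:
            false_positives[group] += 1

for group, total in non_matches.items():
    print(f"{group}: false positive rate = {false_positives[group] / total:.0%}")
# A large gap between groups is the kind of disparity the NIST study measured.
```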
Police 697 00:44:35,200 --> 00:44:39,680 Speaker 1: and other law enforcement agencies rely heavily on facial recognition 698 00:44:39,719 --> 00:44:44,320 Speaker 1: algorithms to identify suspects and to search for people of interest, 699 00:44:44,760 --> 00:44:48,280 Speaker 1: and if those algorithms have a low rate of reliability 700 00:44:48,400 --> 00:44:52,799 Speaker 1: for different ethnicities, a disproportionate number of people who have 701 00:44:53,040 --> 00:44:57,000 Speaker 1: no connection to any investigation are going to be singled 702 00:44:57,040 --> 00:45:01,680 Speaker 1: out by mistake by these algorithms. Lives can be disrupted, 703 00:45:01,840 --> 00:45:06,680 Speaker 1: careers can be ruined, relationships hurt, all because a computer 704 00:45:06,760 --> 00:45:12,240 Speaker 1: program can't tell the difference between two different faces. That 705 00:45:12,600 --> 00:45:15,680 Speaker 1: is a serious problem, and it points to a couple 706 00:45:15,680 --> 00:45:18,600 Speaker 1: of things. One of the big ones is a lack 707 00:45:18,640 --> 00:45:21,920 Speaker 1: of diversity on the design side of things. We've seen 708 00:45:21,960 --> 00:45:24,560 Speaker 1: this with tech for a long time. There is a 709 00:45:24,680 --> 00:45:28,880 Speaker 1: really critical diversity issue going on with technology. The people 710 00:45:28,920 --> 00:45:32,600 Speaker 1: who are building algorithms and training machine learning systems are 711 00:45:32,719 --> 00:45:35,200 Speaker 1: largely failing to do so in a way that applies 712 00:45:35,239 --> 00:45:41,319 Speaker 1: equally well across different ethnicities. Meanwhile, organizations like the 713 00:45:41,360 --> 00:45:45,400 Speaker 1: American Civil Liberties Union are calling upon law enforcement agencies 714 00:45:45,400 --> 00:45:50,200 Speaker 1: to stop relying on technology like this entirely, pointing out 715 00:45:50,239 --> 00:45:54,600 Speaker 1: that the potential for harm to befall innocent people outweighs 716 00:45:54,640 --> 00:45:58,520 Speaker 1: the benefits of using the tech to catch, you know, criminals. 717 00:45:58,840 --> 00:46:03,080 Speaker 1: A machine learning system trained to do something like identify 718 00:46:03,160 --> 00:46:07,839 Speaker 1: people based on their faces needs to be transparent so 719 00:46:07,880 --> 00:46:12,040 Speaker 1: that when a bias becomes evident, engineers can go back 720 00:46:12,080 --> 00:46:14,760 Speaker 1: to the machine learning system and see where 721 00:46:14,760 --> 00:46:17,920 Speaker 1: it went wrong, and then retrain it to eliminate the bias. 722 00:46:18,560 --> 00:46:22,240 Speaker 1: Without transparency, it can be hard or impossible to figure 723 00:46:22,239 --> 00:46:26,240 Speaker 1: out exactly where things are going wrong within the system. Meanwhile, 724 00:46:26,760 --> 00:46:31,480 Speaker 1: real people in the real world are suffering the consequences. Now, 725 00:46:31,480 --> 00:46:34,080 Speaker 1: if we extend this outward and we look into a 726 00:46:34,160 --> 00:46:37,879 Speaker 1: future where artificial intelligence is undoubtedly going to play a 727 00:46:37,920 --> 00:46:41,200 Speaker 1: critical part in our day to day experiences, we see 728 00:46:41,200 --> 00:46:44,719 Speaker 1: how we need to avoid these black box situations.
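[Editor's note: as one example of what having a system "show its work" can look like, here is a minimal Python sketch of occlusion sensitivity, a simple transparency technique for image models: mask one patch of the input at a time and watch how the model's score moves. The model here is a hypothetical stand-in; any classifier returning a single score would do.]

```python
# Minimal sketch of occlusion sensitivity, one simple way to peek
# inside an image model: big score drops mark regions the model leans on.
import numpy as np

def occlusion_map(model, image: np.ndarray, patch: int = 8) -> np.ndarray:
    """Score drop per occluded patch, for a model returning a single score."""
    h, w = image.shape[:2]
    base_score = model(image)
    heat = np.zeros((h // patch, w // patch))
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            masked = image.copy()
            masked[i:i + patch, j:j + patch] = 0.0  # black out one patch
            heat[i // patch, j // patch] = base_score - model(masked)
    return heat

# Hypothetical usage with a dummy "model" (just the mean pixel value):
dummy_model = lambda img: float(img.mean())
heat = occlusion_map(dummy_model, np.random.rand(32, 32))
print(heat.shape)  # (4, 4): one sensitivity value per 8x8 patch
```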
We 729 00:46:44,760 --> 00:46:48,600 Speaker 1: need to understand why a system will generate a particular 730 00:46:48,640 --> 00:46:52,160 Speaker 1: output given specific inputs. We've got to be able to 731 00:46:52,239 --> 00:46:54,400 Speaker 1: check the systems to be certain they are coming to 732 00:46:54,440 --> 00:46:59,920 Speaker 1: the right conclusions. Artificial intelligence has enormous potential to 733 00:47:00,160 --> 00:47:03,040 Speaker 1: change how we go about everything from running errands to 734 00:47:03,160 --> 00:47:06,600 Speaker 1: performing our jobs, but we need to be certain that 735 00:47:06,640 --> 00:47:11,120 Speaker 1: the guidance we receive is dependable, that it's the right course 736 00:47:11,200 --> 00:47:14,760 Speaker 1: of action. And so I hope this episode has really 737 00:47:14,840 --> 00:47:18,160 Speaker 1: driven home how important it is for us to hold technology 738 00:47:18,320 --> 00:47:21,840 Speaker 1: up to a critical view. It's not that technology is 739 00:47:21,960 --> 00:47:27,040 Speaker 1: inherently good or bad, or that people are specifically acting 740 00:47:27,120 --> 00:47:30,960 Speaker 1: in an ethical or unethical way, but rather that without 741 00:47:31,120 --> 00:47:34,239 Speaker 1: using critical thinking, we can't be certain if what we're 742 00:47:34,239 --> 00:47:39,080 Speaker 1: relying upon is actually reliable or not. I also urge, 743 00:47:39,200 --> 00:47:44,080 Speaker 1: as always, that we pair compassion with critical thinking. I 744 00:47:44,120 --> 00:47:47,400 Speaker 1: think there's a tendency for us to kind of assign 745 00:47:47,600 --> 00:47:52,000 Speaker 1: blame and intent when things go wrong, and sometimes that 746 00:47:52,200 --> 00:47:56,200 Speaker 1: is appropriate, but I would argue that we shouldn't jump 747 00:47:56,320 --> 00:47:59,680 Speaker 1: to that conclusion right off the bat. Sometimes people just make 748 00:48:00,239 --> 00:48:04,160 Speaker 1: bad choices, or they are misinterpreting things, but they don't 749 00:48:04,280 --> 00:48:08,200 Speaker 1: have any intent to mislead. So while I do advocate 750 00:48:08,280 --> 00:48:11,880 Speaker 1: that we use critical thinking as much as possible, let's 751 00:48:12,320 --> 00:48:16,600 Speaker 1: be decent, nice human beings whenever we do that. If 752 00:48:16,640 --> 00:48:20,360 Speaker 1: it turns out someone is truly being unethical and trying 753 00:48:20,400 --> 00:48:24,320 Speaker 1: to deceive others, that's obviously a different story. But before 754 00:48:24,360 --> 00:48:28,279 Speaker 1: you know for sure, I say we employ that compassion, 755 00:48:28,880 --> 00:48:32,480 Speaker 1: and hopefully we are able to solve these problems before 756 00:48:32,880 --> 00:48:37,960 Speaker 1: they have these real world impacts, because the consequences of 757 00:48:38,000 --> 00:48:42,279 Speaker 1: those are dramatic and terrible and avoidable if we use 758 00:48:42,320 --> 00:48:46,279 Speaker 1: critical thinking. I hope you guys enjoyed this episode. We'll 759 00:48:46,320 --> 00:48:50,040 Speaker 1: be back with other new episodes that will probably touch 760 00:48:50,080 --> 00:48:53,640 Speaker 1: on critical thinking, but they won't be, you know, completely 761 00:48:53,680 --> 00:48:57,319 Speaker 1: built around the concept.
But if you guys have suggestions 762 00:48:57,320 --> 00:48:59,839 Speaker 1: for future topics I should tackle in tech Stuff, whether 763 00:49:00,040 --> 00:49:04,080 Speaker 1: it's a company, a trend, a personality in tech, a 764 00:49:04,160 --> 00:49:07,560 Speaker 1: specific technology you want to know how it works, anything 765 00:49:07,600 --> 00:49:09,920 Speaker 1: like that, let me know. Send me a message on Twitter. 766 00:49:10,000 --> 00:49:12,239 Speaker 1: The handle for the show is tech Stuff H S 767 00:49:12,440 --> 00:49:20,680 Speaker 1: W and I'll talk to you again really soon. Tech 768 00:49:20,719 --> 00:49:24,160 Speaker 1: Stuff is an I Heart Radio production. For more podcasts 769 00:49:24,160 --> 00:49:26,920 Speaker 1: from I Heart Radio, visit the I Heart Radio app, 770 00:49:27,080 --> 00:49:30,240 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.