Speaker 1: Welcome to TechStuff, a production of iHeartRadio's How Stuff Works. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with How Stuff Works and iHeartRadio, and I love all things tech. I always have to qualify that after I say it, because then I end up covering topics like this one today, where I don't love all things tech. I guess it's being a little disingenuous to make that claim. But recently, critical thinking has really been on my mind a lot. I always want to be a critical thinker, though, like most humans, I do have lapses. Sometimes I encounter a message that is so appealing that my desire for it to be true can override my skepticism, and I'll fail to ask myself, or anyone else for that matter, important questions to make sure that what is being promised is in fact realistic and achievable. My goal is to minimize the number of times I go along with a pitch simply because it was a really good pitch. But technology sometimes makes that really hard, and I want to talk about this challenge because it's one that we all encounter.

Technology is undeniably amazing. Just think about how much humans have achieved in a very short amount of time. For most of human history, our technological advancement was really slow. We completed some monumental achievements in art and architecture, and sadly, in finding new ways to kill each other, but apart from some early experiments in steam power and a few interesting ideas from geniuses like Leonardo da Vinci, we really didn't see incredibly rapid advances. Then we get to the nineteenth century, when a combination of factors led to the Industrial Revolution. That revolution increased productivity and led to conditions that made it possible for more innovators to experiment and expand our knowledge, understanding, and ability to exploit the world around us.
Then we get to the twentieth century and the development of computers and the transistor, miniaturization, mass-produced plastic, and some other important innovations that would allow for truly rapid technological evolution. Consider this: when I was a kid, there was no public internet, cell phones were pretty much restricted to R&D labs, and personal computers had just entered the market. Many of today's fastest-growing companies couldn't have existed, because the business they are based on didn't have a platform yet. For a while, it remained possible for the average person to know enough about the technology they encountered to deal with it when things went wrong, at least for most of that technology. Some of it, like television sets, was already well beyond the understanding of the average person. But this actually got harder to do as technology advanced, and we've seen it manifest in many ways.

A good example of this is in the automotive industry. Classic cars can be complicated, but with some training and practice, an owner can learn how to do maintenance and repairs on a lot of different parts of the car by themselves. There's a learning curve there, but it's totally possible, and there are a lot of people who love to take old cars and restore them as a sort of passion project. Today's cars tend to have components in them that are high tech and sealed away in such a way as to make them difficult or impossible to access without proprietary tools and a deep understanding of how they work. You might not be able to tell what's wrong with a car without a special diagnostic scanning tool, and after learning what's wrong, you might not be able to do anything about it yourself. As cars get more advanced, with features like various sensors and systems for stuff like lane assist, adaptive cruise control, parking assist, and more, they become harder for the owner to maintain.
They're turning into what folks call a black box, a type of technology where the inner workings are hidden away from the user. It doesn't literally have to be hidden from sight; it can just be so complicated that the average person finds it inaccessible. And this leads us to a real challenge. As we have learned more about the universe, we've specialized in our focus. We had to. We quickly reached a point at which it's pretty much impossible to have a deep understanding of all subjects. As our understanding has grown, we've pushed back the boundaries of ignorance, but now there's just too much to know for any one person to know it all. The stuff we've built has capitalized on this focused understanding, but it has also created a barrier for us. We might know that something works, but we don't necessarily understand how it works, because it's based on a principle that's alien to us. This was bound to happen, but it creates a dangerous situation, and it's dangerous for a few different reasons.

First, we as consumers can grow complacent. We expect stuff to work, and when that stuff doesn't work, we're frustrated. Worse, because we probably don't understand how that stuff works, we don't really know how to go about fixing it. On a positive note, that usually means there's opportunity for people with expertise to make a living as troubleshooters or repair professionals, but it means that as consumers we have less ability to work with the stuff we actually consume.

Second, our technology is approaching the point where it could be really dangerous if we don't understand exactly how it's working. As machine learning models and artificial intelligence become more sophisticated, it becomes more important for us to understand how these systems are coming to conclusions. It's pretty cool to say, hey, I built this machine learning model and I trained an AI on how to recognize a person's face.
Maybe you built the model to work with a camera manufacturer so that the cameras they make automatically detect a face and then focus on it. But that same technology could potentially be used in tracking and identification systems, and if that system were being used by, say, law enforcement, you'd want to understand exactly how the system is identifying people so that you can audit it and make sure that it's accurate and not generating a lot of false flags. Otherwise you run the risk of having the machine mistakenly identify people during an investigation, which at the very least could be disruptive.

Or, to go back to cars for a second, consider driverless cars. Now, I am still optimistic about driverless cars, but I've tempered my expectations on when we might see them. In my mind, I was thinking, well, a car decked out with sensors and a really fast computer system would be able to detect potential problems and react to them far faster and more logically than a human. I even thought a computer system would be able to monitor all the way around a vehicle, whereas a human is typically focused on whatever is directly in front of him or her, or perhaps in a mirror, but can't pay attention to all directions at the same time. And sure, machines can react in a fraction of the time it takes humans to do it. But machines are really good at handling routine situations and responding appropriately; the more unusual an event, the harder it is for the machine to cope with it. Machines typically aren't terribly adaptive, and so with many millions of cars on the road, plus bicyclists, pedestrians, animals, debris, weather events, and other factors, it's pretty rare for any drive of significant length to be completely quote unquote normal. So we need to design autonomous cars that can adapt to situations. But that also means we need to understand what decisions a car will make, or at the very least determine why a car behaved in one way versus an alternative.
And so there's a move in AI and artificial neural network circles to make these processes as transparent as possible so that we're not caught off guard when a machine takes a particular action.

And third, advanced technology has given us unrealistic expectations of exactly what is possible. After all, two hundred years ago, we wouldn't really dream of going up into space. A century ago, we might dream of it, but we still had no real understanding of how we would accomplish it. Then, within another five decades, we were sending people up to space, then to the Moon, and now we have private companies designing launch vehicles that can return to Earth to be refurbished and used again in future launches. That's pretty incredible. We've also seen technology go from the enormous to the very tiny. In the nineteen forties, a computer took up a lot of space, maybe the entire floor of a building, and its processing power would be a fraction of what you'd find in the average smartphone these days. Miniaturization and Moore's Law have conditioned us to think that technology is capable of pretty much anything. I mean, it has to be. We wouldn't have assumed that it would be easy to get your hands on a portable computer capable of acting like a camera, a communications device, and a direct link to the world's largest repository of human knowledge, even if most of that knowledge seems to be centered on cats. But that means when someone comes forward with extraordinary claims, it's easier for us to take them at face value. Technology has created an environment in which what was impossible yesterday becomes a mundane everyday task tomorrow, and this means that people can leverage that to our disadvantage. In some cases, you might be dealing with an outright snake oil salesman type, someone who knows very well that the dream they're peddling isn't based in reality.
But in other cases you might have sincere people who truly believe they've either cracked the code on something that was previously thought impossible, or who feel they're right on the cusp, and if they can just get enough funding to cover costs, they'll get the rest of the way there. Now, in a way, this could be a good thing, as it means that innovators have more access to resources than ever before, and it could lead to great discoveries. But in other cases it can lead to frustration, financial hardship, and worse. When we come back after this quick break, I'll give an example that's on everyone's minds right now.

There's probably no better company to point to when you're talking about the dangers of wishful thinking than Theranos. And I know it's been in the news a lot. You've probably heard tons about it; maybe you've seen the documentary, read the various articles, or listened to the podcasts about it. But we're going to talk about it a little bit more in this context, and just to give you an overview in case you haven't encountered this: a woman named Elizabeth Holmes founded the company after she dropped out of Stanford. I do not know her. I do not know whether or not she sincerely believed, or believes, that the tech she was seeking to invent could really work. But I do know that as of right now, it isn't working, and that's a big problem.

For those of you who are unfamiliar with the story, let's give a quick summary. Elizabeth Holmes wanted to create tech that could disrupt the health care industry. It would, in theory, give more control and agency to consumers, who could learn much more about their health on their own without the need to make an appointment with a doctor and undergo numerous blood draws to have various blood tests performed. The basic idea was that Theranos would develop a device capable of taking a very small sample of blood, small enough to be drawn from the tip of a finger.
The device would then run a battery of tests to look for indicators of different conditions and diseases and, within a relatively short time, produce the results, giving the user more information about their health, which, in theory, would help that person have meaningful conversations with a physician if there were any markers that raised concern. And it's a very powerful, very compelling idea. There are technologies like labs-on-a-chip and microprocessors designed to detect the presence of certain markers that indicate the presence of illness, but this would wrap all of that up into a single package. In fact, Holmes worked on a project related to this after her first year at Stanford, when she joined the Genome Institute in Singapore and was overseeing blood tests. She first envisioned a sort of arm band that would use microneedles, and those microneedles would both draw small blood samples and administer medication on an as-needed basis. She later led a team working on a machine that would accept a small capsule containing a blood sample and attempt to run multiple tests on that sample.

This was in stark contrast with the normal medical procedure, in which a doctor or nurse would draw numerous vials of blood for testing and send those samples off to one of two major lab testing companies in the United States. If Theranos's technology worked, the company could totally upend that system in the US. Patients could go to a clinic to have a test run and get the results back in hours, rather than having to take a trip to the doctor's office, sit for a blood draw, and wait several days. It could end up being cheaper than the old approach. Theranos executives like Holmes stressed that this would put patients in control of their own health information and could provide many benefits, such as a heads-up for possible problems in the future or catching something early enough to treat it before it became too severe.
But the problem is that all of this depended on a very big if: whether the Theranos technology actually worked. As it turned out, the tech wasn't working. At least, it wasn't working at the level the company was striving for. The engineers at Theranos were trying really hard to create a diagnostic device that could take that small blood sample and run it through a lot of tests. But as it turns out, that's actually incredibly complicated. Using such a small sample was already a huge challenge. On top of that, you have issues to worry about, like contamination. A contaminated sample could give off false positives, creating a situation in which a patient believes they might have a particular disease or condition when that isn't really the case, or it could mask something that the patient would need to know about, but because the results would be inconclusive, they wouldn't know about it.

Now, perhaps the hope was that the company would be able to develop the technology rapidly with the help of large investments. And sure enough, there were a lot of folks with deep pockets who poured money into Theranos, and you can sort of understand why. If it all worked, it would be a revolution in medicine. The company would stand to gain billions of dollars. The cause appeared to be noble, and the outcome looked like it would be incredibly profitable. It was a very tempting opportunity, and many didn't resist that temptation. On top of all that, it was dependent upon technology, and as I mentioned, we've come to a point where we believe technology can do just about anything. So it didn't seem outside the realm of possibility that a microchip inside a sufficiently sophisticated machine would be able to run a series of tests on a small blood sample and come up with meaningful results.
But flash forward a few years, and a bombshell of an article revealed that there was really a shell game going on at Theranos. Now we know the technology was a failure, that Theranos was depending upon the same sort of machines the company was purporting to replace with its innovative approach, that hundreds or thousands of patients in trial locations were potentially at risk due to unreliable results, and that the company used some pretty draconian tactics to keep employees in line and prevent them from speaking out about what was going on. It's about as bad an outcome as you could imagine.

What's more, there are those who say that even if everything had worked, the whole enterprise was misguided in the first place. A piece in Wired by Noam Cohen, titled "The Other Big Lesson We Should Learn From Theranos," cites a couple of those people. Cohen mentions Faye Flam, who wrote a piece in Bloomberg arguing that Theranos was tapping into another deep human desire: the illusion of controlling our own destiny. Through Theranos, we could end up getting our own test results and then applying our own interpretations to them. Perhaps we would interpret them in a way that is most comforting to us, or one that seems to align with our preconceived ideas about our health. This isn't exactly the best way to handle a medical issue. Surely it makes more sense to have a trained medical professional provide an unbiased, objective interpretation of test results that gives you the best chance to take appropriate actions that will help you lead a better, healthier life. So yeah, it was a really compelling sales pitch. No wonder so many people were on board. If it worked, it would be cheaper, less painful, more convenient, and, at least on the surface, more empowering than the established method. It was the sort of thing we'd want to believe in, so people did.

I think autonomous cars are following a similar trajectory.
Now, to be clear, I feel that a lot of great work has been done on autonomous cars. They are much further along than a mythical blood testing device that only ever got approval for performing one type of blood test when it was supposed to be able to run more than one hundred of them. But we still have a long way to go. Unfortunately, because of our experiences dealing with truly amazing tech and the expectation that, of course, technology can take care of the problem, we've had some high-profile accidents that prove this isn't the most reliable philosophy.

Again, it would be understandable to put a lot of faith in the tech. Google's self-driving cars, which have been pioneers in the field, famously operated at first in secret and then openly for hundreds of thousands of driving miles without a single accident, or at least that was the official story. There had been a few accidents, most of which were likely the fault of a human driver, either the safety operator in the autonomous vehicle or the driver of another car, but later reports suggested that there were some serious accidents, at least a few of which were caused by the autonomous driving system behaving in an unexpected way. The company kept these accidents quiet, and so there was an unearned expectation of safety with this tech.

Then enter Tesla and the Autopilot feature in the company's electric vehicles. While the company issued a statement that made it clear this feature wasn't supposed to replace a human driver, that didn't stop people from trying it out that way. Most of those people didn't have any problems, but in at least a couple of cases, drivers using Autopilot ended up in tragic situations. One of those was the case of Joshua Brown. In two thousand sixteen, his Tesla Model S crashed into a semi truck. Brown had been using the Autopilot feature, according to the vehicle's data logs.
Out of the thirty-seven minutes Brown had the Autopilot feature turned on, his hands were on the wheel for just twenty-five seconds total, in direct violation of the policy that Tesla had set. The company stressed that Autopilot isn't meant to replace human drivers and that the car's driver should have had their hands on the wheel at all times. The second fatal incident happened in March two thousand eighteen, when Walter Huang's Model X veered into a highway safety barrier. Recently, his family sued Tesla, alleging the company was aware of the dangers of the feature, that Huang had been operating the vehicle within the parameters of Autopilot, and that Tesla had been using drivers like Huang to beta test changes to the feature in the wild. That suit is just getting started as I record this episode.

And I don't mean to pick on Tesla; after all, I started this by talking about how Google kept several accidents on the QT. One of Uber's self-driving cars in its beta test program in Arizona struck and killed a pedestrian in March two thousand eighteen. State prosecutors decided that Uber isn't liable for the accident, but that the safety operator who was in the car might bear some responsibility for failing to act before the accident. According to VentureBeat, the operator was streaming a video of The Voice and watching that rather than the road. It's quite possible that companies pushing autonomous car technology are doing their best to keep incidents quiet in an effort to avoid government regulations and interference, which could threaten the profitability of such a pursuit. But at the same time, these high-profile incidents have dealt a blow to consumer confidence in the technology in general, and they really reinforce that driving is more complicated than just staying within your lane and braking if something is in the way.
Before I wrap up this section, I do want to also mention that, at least according to Tesla, the Autopilot feature has proven to be safer than human drivers operating vehicles unassisted. According to Tesla's report, there was one accident per two point eight seven million miles driven where Autopilot was engaged and one accident per one point seven six million miles driven when it wasn't, and, according to government statistics, the average is an automobile crash every four hundred thirty-six thousand miles. However, skeptical researchers have found that Tesla hasn't always been honest, or at the very least correct, about safety reports. A firm called Quality Control Systems sued the United States National Highway Traffic Safety Administration, or NHTSA, in order to get the data the agency said had proved that Tesla's Autopilot had cut back on crashes by forty percent. In fact, QCS found that in the cases in which all the data was available, which was just a fraction of the cases NHTSA had used to come up with that mark, the Autosteer feature on Autopilot actually increased crash rates. So what does all this mean? I'll get back to that in a second, but first let's take a quick break.

So am I saying you shouldn't believe anyone? Well, that's not quite it. I think a better thing to say is don't accept anything at face value. Sometimes people can just be wrong about stuff. It happens to me more often than I care to admit. But the responsible thing to do in those cases is to own up to the mistake and correct it where you can. I am certain that at least some people who thought they were onto some sort of free energy device, for example, really did believe they were onto something. Others might have suspected that what they were pursuing was impossible, but they had already invested too much to back out at any rate.
Whether it's a perpetual motion machine or an over-unity engine, the simple fact is that these devices have never been proven to actually work as described. Supporters say this is because there are powerful entities, like petroleum companies, that will use every means to keep such devices from being deployed. But at the level of classical physics, such a device would have to defy laws of physics that have stood the test of time. Now, does that mean that such a device is impossible? No, it's not impossible, but it does mean you need truly extraordinary, irrefutable proof that it worked. In other cases, people are being outright dishonest in an effort to advance their own agendas. They might take efforts to hide any deficiencies in the technology, or to overly elevate stuff that's working to make it seem more important than it is. They might just be stalling for time in the hopes that a breakthrough is right around the corner and they can reap the benefits once it all pans out.

Now, we're going to see technology continue to advance and evolve. In most cases, we'll see it do so gradually, perhaps so gradually that we don't really appreciate how incredible that technology can be. I have owned smartphones for about ten years or so, and now I take it for granted that I have access to them. But as a kid, it would have totally floored me to know that such a thing would be possible in my lifetime, let alone that I would actually own one. When confronted with claims about technology, it's good to ask questions, questions like: How is this possible? How does it work? What is it doing differently from earlier versions of this tech? If it's a technology that relates to a specialized field, it might be necessary to consult experts in that field to get good answers. There's no shame in that. If someone presented a technology to me with claims that the whole thing worked on quantum principles, I'd need to consult with an expert.
I have only the most basic, high-level understanding of quantum physics, and once you get past that, it's all beyond me. I might suspect something fishy, but I'd have no way of knowing on my own whether my suspicions were warranted. I would need to consult with someone far more educated and experienced than I am in the world of quantum physics to get a better handle on it. Now, the more vague the claims, the more skepticism you should apply. If the claim includes disconnected scientific terminology, particularly if it is getting into fields like quantum physics, that's a red flag, and you need to pay closer attention to those claims. Or the claim might even include non-scientific or meaningless language, which is another big warning sign. Maybe you'll see a device that claims that if you wear it, it will boost your quote unquote energy in some way. But what does that actually mean? Terms must first be defined, and then you can move on to the next question, which is, well, how the heck does it do this thing you claim it's doing? For gadgets or technologies that cite experts, it's good to find out who those experts are. If there's language like "studies show that blah blah blah," it's good to find out who did the actual study. Was it a reputable third party that could provide an objective, unbiased point of view, or was it an in-house team or a biased party that's lending credence to claims without actually finding out if those claims have merit? Moreover, we have to remember that tech isn't magic, though science fiction author Arthur C. Clarke did once observe that any sufficiently advanced technology would seem to be magic to us. Technology has limits. There are fundamental physical limits that tech can't break through. And just because we see tech doing some stuff really well doesn't mean it can do everything equally well.
I know I go on about critical thinking a lot on this show, but the reason I do that is I want people to apply that skill set in their lives to make better-informed decisions. I want you guys to avoid pitfalls, whether they are purposefully placed in your path or not. I want you to be able to spot a mistake or a scam. I want you to follow your suspicions when you feel something isn't on the up and up. And along with that, I do urge the use of compassion. Please keep in mind that not everyone hawking tech that promises too much is doing so out of malice or greed. Some could be genuinely misled about what the tech can do, and so it's a good idea to have critical thinking and compassion go hand in hand. Try to understand not just how realistic the claim is, but the person making the claim. If they're intentionally trying to mislead people and take advantage of them, well, they're kind of scummy, and I feel they should be called out on that behavior. But maybe they're just believing in something they want to believe in because of the promise it makes. That doesn't necessarily make them bad. It might mean they are gullible, or that they are in a situation that they desperately want out of, and the promise seems to suggest an escape route.

So, long story short, don't believe all the hype. Ask questions. Ask for clarification when you get answers, to make sure that those answers are actually substantive and mean something. Be prepared to dismiss a claim if the support for that claim is lacking. Also be prepared to accept a claim if the support merits it. One of the biggest complaints about skeptics is that they are seen as people who dismiss claims out of hand, and for some people that is true, although we typically call them deniers rather than skeptics.
But most of us try to keep in mind that if extraordinary proof for a claim exists, we should be willing to adjust our world view to incorporate this new idea, even if it previously seemed impossible. The proof just has to be there. And don't just assume everyone is out to pull one over on you. Just be aware that there are those people out there too. In short, be good human beings, and keep in mind, again, that as technology advances, we're going to keep running into this problem. Because we see it do amazing things in one arena, we might expect it can do equally amazing things in another, and that's not always the case.

Well, that's it for this soapbox edition of TechStuff and my regular call for critical thinking. I think it's particularly important to consider it now in the wake of things like Theranos and Facebook and all of its controversies and related technological issues. And of course you can and should use critical thinking well outside the world of technology. You should apply it pretty much everywhere in your life so that you can be reasonably sure you're getting the real deal and not being misled. If you guys have suggestions for a future episode of TechStuff, you can contact me. The email address for the show is techstuff at howstuffworks dot com, or you can drop me a line on Facebook or Twitter; the handle for both of those is TechStuff HSW. You can head on over to our website, that's techstuffpodcast dot com, which has an archive of all of our previous episodes, plus links to background on the show, as well as to our online store, where every purchase you make goes to help the show. We greatly appreciate it, and I will talk to you again really soon.

TechStuff is a production of iHeartRadio's How Stuff Works. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.