1 00:00:04,400 --> 00:00:07,760 Speaker 1: Welcome to TechStuff, a production from iHeartRadio. 2 00:00:12,080 --> 00:00:14,800 Speaker 1: Hey there, and welcome to TechStuff. I'm your host, 3 00:00:14,960 --> 00:00:18,000 Speaker 1: Jonathan Strickland. I'm an executive producer with iHeartRadio, 4 00:00:18,040 --> 00:00:22,040 Speaker 1: and I love all things tech and a lot of 5 00:00:22,120 --> 00:00:25,000 Speaker 1: you out there who have been listening to this show 6 00:00:25,160 --> 00:00:27,680 Speaker 1: for a while know that I like to get 7 00:00:27,720 --> 00:00:31,400 Speaker 1: on a soapbox about critical thinking, and I also talk 8 00:00:31,640 --> 00:00:35,800 Speaker 1: a lot about compassion and how I believe we really 9 00:00:36,200 --> 00:00:40,480 Speaker 1: need to employ both critical thinking and compassion together if 10 00:00:40,600 --> 00:00:44,240 Speaker 1: we are to be good human beings. In many places 11 00:00:44,280 --> 00:00:48,280 Speaker 1: around the world, including here in the United States, we 12 00:00:48,320 --> 00:00:51,520 Speaker 1: are seeing a play-by-play of what happens when 13 00:00:52,400 --> 00:00:55,360 Speaker 1: we ignore critical thinking and compassion, when we do not 14 00:00:55,560 --> 00:01:01,240 Speaker 1: incorporate those qualities into our systems. When we do that, 15 00:01:01,360 --> 00:01:04,560 Speaker 1: then there are people who are specifically left out of 16 00:01:04,560 --> 00:01:09,479 Speaker 1: those systems who suffer because of those systems, from pandemic 17 00:01:09,560 --> 00:01:12,680 Speaker 1: responses to people on the sidelines who are judging the 18 00:01:12,680 --> 00:01:17,360 Speaker 1: actions and motivations of protesters who, let's face it, are 19 00:01:17,440 --> 00:01:21,480 Speaker 1: actually just asking that they be treated like human beings. 20 00:01:21,800 --> 00:01:24,640 Speaker 1: This is a problem that has a real-world negative 21 00:01:24,680 --> 00:01:31,360 Speaker 1: impact on countless lives. We need to examine ideas and 22 00:01:31,440 --> 00:01:35,880 Speaker 1: claims carefully and make sure that those ideas and those 23 00:01:35,880 --> 00:01:39,200 Speaker 1: claims have merit. But we also need to remember that 24 00:01:39,360 --> 00:01:43,520 Speaker 1: other people are, you know, people. Some of them might 25 00:01:43,560 --> 00:01:48,480 Speaker 1: be knowingly engaged in creating or perpetrating falsehoods for whatever reasons, 26 00:01:49,440 --> 00:01:52,200 Speaker 1: and that's obviously bad. You know, if their motivations are 27 00:01:52,240 --> 00:01:55,240 Speaker 1: bad and they're acting on those bad motivations, they are 28 00:01:55,280 --> 00:01:58,400 Speaker 1: not good people. But others might be doing this without 29 00:01:58,560 --> 00:02:01,760 Speaker 1: realizing it or knowing that the information that they're spreading 30 00:02:01,960 --> 00:02:04,960 Speaker 1: is wrong, which is also bad, but that is potentially 31 00:02:05,000 --> 00:02:08,720 Speaker 1: repairable with the right approach. So today's episode is going 32 00:02:08,760 --> 00:02:12,080 Speaker 1: to be another one about critical thinking and probably a 33 00:02:12,120 --> 00:02:14,840 Speaker 1: bit of compassion thrown in there too, and I'll frame 34 00:02:14,880 --> 00:02:17,480 Speaker 1: it within the context of technology for the most part.
35 00:02:17,520 --> 00:02:21,520 Speaker 1: I'm not going to do a rundown of, you know, 36 00:02:21,720 --> 00:02:25,240 Speaker 1: current events. But please keep in mind that we should 37 00:02:25,280 --> 00:02:29,000 Speaker 1: be employing these qualities everywhere, not just when it comes 38 00:02:29,080 --> 00:02:32,280 Speaker 1: to should I buy this gadget or is it too 39 00:02:32,320 --> 00:02:35,560 Speaker 1: good to be true? Critical thinking really is about digging 40 00:02:35,600 --> 00:02:38,560 Speaker 1: beneath the surface level of a topic or an issue 41 00:02:38,919 --> 00:02:42,400 Speaker 1: to understand what's really going on. And it's related to 42 00:02:42,440 --> 00:02:45,760 Speaker 1: stuff like the scientific method, and it's a skill that, 43 00:02:46,120 --> 00:02:49,960 Speaker 1: like most skills, gets better the more you use it. 44 00:02:49,960 --> 00:02:53,799 Speaker 1: It's not the same thing as outright denial. I'm not 45 00:02:54,000 --> 00:02:57,079 Speaker 1: telling you to go out and just deny stuff. It's 46 00:02:57,080 --> 00:03:00,720 Speaker 1: really more about asking questions and looking for the answers 47 00:03:00,760 --> 00:03:03,960 Speaker 1: and evaluating those answers, and if the answers hold up, 48 00:03:04,280 --> 00:03:09,120 Speaker 1: then accepting those answers, even if the answer doesn't align 49 00:03:09,200 --> 00:03:12,440 Speaker 1: with what you had already guessed was going to be 50 00:03:12,480 --> 00:03:16,080 Speaker 1: the answer. You've got to be willing to accept 51 00:03:16,120 --> 00:03:19,320 Speaker 1: it if it holds up under evaluation. And one other 52 00:03:19,360 --> 00:03:20,480 Speaker 1: thing I want to get out of the way is 53 00:03:20,520 --> 00:03:23,480 Speaker 1: that I don't mean to come across as being on 54 00:03:23,520 --> 00:03:25,760 Speaker 1: a high horse here. I want to tell you guys 55 00:03:25,960 --> 00:03:28,400 Speaker 1: something that I am not very proud of. It took 56 00:03:28,560 --> 00:03:33,760 Speaker 1: me an embarrassingly long time to really embrace critical thinking. 57 00:03:34,240 --> 00:03:37,560 Speaker 1: I didn't even hear the term until I was in college, 58 00:03:37,880 --> 00:03:40,080 Speaker 1: and my professors talked about it as though it had 59 00:03:40,080 --> 00:03:43,280 Speaker 1: been part of my education the entire time, and it wasn't. 60 00:03:43,800 --> 00:03:46,960 Speaker 1: And while I learned about critical thinking in college, I 61 00:03:47,200 --> 00:03:49,920 Speaker 1: was not good at it. I'd say it wasn't until 62 00:03:49,920 --> 00:03:52,360 Speaker 1: I was in my mid-thirties that I started really 63 00:03:52,800 --> 00:03:55,480 Speaker 1: understanding it and employing it, and to this day I 64 00:03:55,480 --> 00:03:58,200 Speaker 1: still slip up. It's not unusual for me to realize 65 00:03:58,240 --> 00:04:02,440 Speaker 1: that something I had previously just accepted really required more 66 00:04:02,480 --> 00:04:05,800 Speaker 1: scrutiny and understanding. So you can think of this episode 67 00:04:05,800 --> 00:04:10,200 Speaker 1: as being inspired by my own slow journey toward critical thinking, 68 00:04:10,280 --> 00:04:13,440 Speaker 1: a journey that's still going on today. But let's start 69 00:04:13,440 --> 00:04:18,200 Speaker 1: with the obvious.
We've all seen countless advertisements of variable 70 00:04:18,320 --> 00:04:22,280 Speaker 1: messaging and quality, and we all understand that ads are 71 00:04:22,320 --> 00:04:26,320 Speaker 1: intended to motivate us to spend money on goods or services. 72 00:04:26,839 --> 00:04:29,400 Speaker 1: That includes stuff like political ads too, by the way. 73 00:04:29,440 --> 00:04:32,080 Speaker 1: In those cases, you can think of the transaction as 74 00:04:32,120 --> 00:04:35,240 Speaker 1: a vote rather than an exchange of money, unless we're 75 00:04:35,360 --> 00:04:38,120 Speaker 1: talking about ads that are asking for campaign contributions, in 76 00:04:38,120 --> 00:04:40,800 Speaker 1: which case, yeah, that's similar to an ad trying to 77 00:04:40,839 --> 00:04:43,360 Speaker 1: sell you a new car or a pair of sneakers or whatever. 78 00:04:43,960 --> 00:04:47,560 Speaker 1: So let's think about this from the perspective of someone 79 00:04:47,600 --> 00:04:52,120 Speaker 1: who designs ads. Their job ultimately is to create something 80 00:04:52,200 --> 00:04:56,800 Speaker 1: that motivates the audience to take action. Moreover, that action 81 00:04:56,920 --> 00:05:00,360 Speaker 1: typically means motivating them to spend some of their own 82 00:05:00,480 --> 00:05:03,240 Speaker 1: money on a product or service, and that could be 83 00:05:03,320 --> 00:05:06,279 Speaker 1: a tough hill to climb, right? So someone who is 84 00:05:06,360 --> 00:05:08,839 Speaker 1: good at creating ads has to think about how to 85 00:05:08,920 --> 00:05:11,640 Speaker 1: frame their subject in a way that speaks to the 86 00:05:11,680 --> 00:05:16,320 Speaker 1: intended audience. Typically, these ads need to convince that audience 87 00:05:16,360 --> 00:05:19,520 Speaker 1: that whatever is being sold is something that the audience 88 00:05:19,560 --> 00:05:23,400 Speaker 1: actually needs. Now, if whatever is being sold is already 89 00:05:23,480 --> 00:05:27,400 Speaker 1: an established thing, that's not as difficult, right? I mean, 90 00:05:28,160 --> 00:05:31,120 Speaker 1: if I'm trying to sell microwave ovens, I don't have 91 00:05:31,200 --> 00:05:34,760 Speaker 1: to explain what a microwave oven is because those have 92 00:05:34,839 --> 00:05:38,159 Speaker 1: existed for decades. Moreover, folks tend to understand the use 93 00:05:38,200 --> 00:05:40,479 Speaker 1: case for a microwave oven and how they can really 94 00:05:40,640 --> 00:05:43,960 Speaker 1: be convenient and fast compared to other methods of food preparation. 95 00:05:44,360 --> 00:05:47,640 Speaker 1: So all of that groundwork has already been done for me. 96 00:05:48,160 --> 00:05:50,920 Speaker 1: But let's say I come out with a new product, 97 00:05:51,120 --> 00:05:54,000 Speaker 1: or a product that hasn't really established a firm foothold 98 00:05:54,000 --> 00:05:56,520 Speaker 1: in the market. Now I have to convince my audience 99 00:05:56,920 --> 00:06:00,159 Speaker 1: that the thing I have created or that I am 100 00:06:00,200 --> 00:06:03,000 Speaker 1: tasked to sell is actually useful. I will have to 101 00:06:03,120 --> 00:06:07,280 Speaker 1: demonstrate how it solves a problem, or, and these ads 102 00:06:07,279 --> 00:06:09,880 Speaker 1: are particularly clever, I have to convince my audience that 103 00:06:09,960 --> 00:06:12,800 Speaker 1: they have a problem
they were previously unaware of, you know, 104 00:06:12,880 --> 00:06:15,279 Speaker 1: the problem they never knew they had, and that my 105 00:06:15,400 --> 00:06:18,839 Speaker 1: product solves that problem. I'm sure you've seen examples of 106 00:06:18,880 --> 00:06:21,880 Speaker 1: ads or critiques of ads that have some variation of 107 00:06:21,880 --> 00:06:23,640 Speaker 1: the phrase "it solves a problem you didn't even 108 00:06:23,680 --> 00:06:26,680 Speaker 1: know you had." That, by the way, is a red flag. 109 00:06:26,760 --> 00:06:28,839 Speaker 1: But we're going to get to red flags a little later. 110 00:06:29,160 --> 00:06:33,039 Speaker 1: So that's one of the responsibilities that ad creators must meet. 111 00:06:33,360 --> 00:06:36,240 Speaker 1: Another is to convince the audience that the specific variant 112 00:06:36,520 --> 00:06:39,799 Speaker 1: of the goods or services that the ad is about 113 00:06:40,320 --> 00:06:44,839 Speaker 1: is superior to all others in that same category. Going 114 00:06:44,880 --> 00:06:46,880 Speaker 1: back to the microwave oven, if I were to try 115 00:06:46,960 --> 00:06:49,039 Speaker 1: and market a new oven, I would need to convince 116 00:06:49,040 --> 00:06:52,080 Speaker 1: people that the oven I was producing was better than 117 00:06:52,120 --> 00:06:55,040 Speaker 1: the dozens of other variations that are already on the market, 118 00:06:55,440 --> 00:06:56,760 Speaker 1: or else I'm not going to get a good return 119 00:06:56,760 --> 00:06:59,440 Speaker 1: on my investment. I'm not gonna sell many ovens, so 120 00:06:59,480 --> 00:07:01,919 Speaker 1: I need to convince people my microwave is the best 121 00:07:02,480 --> 00:07:05,760 Speaker 1: in some way. Maybe it's less expensive, maybe it's more 122 00:07:05,880 --> 00:07:09,279 Speaker 1: energy-efficient, maybe it heats more evenly, or maybe I 123 00:07:09,400 --> 00:07:13,760 Speaker 1: just claim some combination of these things and then try 124 00:07:13,800 --> 00:07:15,880 Speaker 1: and go for that. All of this needs to be 125 00:07:15,960 --> 00:07:19,040 Speaker 1: conveyed in an effective way to the audience, and that 126 00:07:19,080 --> 00:07:22,240 Speaker 1: means it needs to grab attention, it needs to be memorable, 127 00:07:22,560 --> 00:07:25,640 Speaker 1: and typically it needs to be short. And really skilled 128 00:07:25,680 --> 00:07:29,680 Speaker 1: advertisers have become adept at sussing out some basic psychology 129 00:07:29,720 --> 00:07:33,120 Speaker 1: to trigger our impulses. So let's talk about some of 130 00:07:33,120 --> 00:07:35,600 Speaker 1: those red flags I just mentioned. I already said that 131 00:07:35,640 --> 00:07:38,000 Speaker 1: the concept of solving a problem you didn't know you 132 00:07:38,080 --> 00:07:42,440 Speaker 1: have should be a warning. In some cases, it may 133 00:07:42,600 --> 00:07:45,560 Speaker 1: very well be legitimate. The product or service might actually 134 00:07:45,560 --> 00:07:48,440 Speaker 1: take care of something that otherwise we had to do ourselves, 135 00:07:48,440 --> 00:07:51,480 Speaker 1: and we never even thought twice about it because in 136 00:07:51,480 --> 00:07:54,440 Speaker 1: our experience there was no way to offload that task. 137 00:07:55,120 --> 00:07:58,800 Speaker 1: But there is no shortage of products, particularly in the 138 00:07:59,000 --> 00:08:04,720 Speaker 1: as-seen-on-TV category, that really don't solve anything.
139 00:08:05,120 --> 00:08:08,840 Speaker 1: They might create an alternative way to do something, but 140 00:08:08,880 --> 00:08:12,000 Speaker 1: they don't actually save anyone time or effort. They 141 00:08:12,160 --> 00:08:15,160 Speaker 1: might not even work as well as the more established 142 00:08:15,200 --> 00:08:19,679 Speaker 1: methods they're supposed to replace, and they also cost you money. 143 00:08:19,720 --> 00:08:21,920 Speaker 1: You have to buy these things, right, so that would 144 00:08:21,920 --> 00:08:23,760 Speaker 1: mean for you it would be a net loss if 145 00:08:23,800 --> 00:08:27,200 Speaker 1: you bought one. As a rule, I assume any product 146 00:08:27,200 --> 00:08:30,960 Speaker 1: that relies on videos of actors being comically incapable of 147 00:08:31,000 --> 00:08:34,280 Speaker 1: doing something simple like opening a carton of milk to 148 00:08:34,400 --> 00:08:38,760 Speaker 1: fall into this category. Another red flag falls in line 149 00:08:38,760 --> 00:08:41,000 Speaker 1: with the old saying: if it looks too good to 150 00:08:41,040 --> 00:08:44,120 Speaker 1: be true, it probably is. I think that saying can 151 00:08:44,120 --> 00:08:48,319 Speaker 1: be applied to a good chunk of the ads I see 152 00:08:48,440 --> 00:08:51,520 Speaker 1: whenever I visit Facebook. I can't tell you guys how 153 00:08:51,520 --> 00:08:55,000 Speaker 1: frequently I've seen ads for stuff like electric guitars that 154 00:08:55,080 --> 00:08:57,959 Speaker 1: really go whole hog on this one. Now, I've been 155 00:08:58,000 --> 00:09:01,120 Speaker 1: doing a lot of research on guitars because I'm thinking 156 00:09:01,120 --> 00:09:04,480 Speaker 1: about finally learning to play one, but because I'm browsing 157 00:09:04,480 --> 00:09:08,040 Speaker 1: online to learn this stuff, and because I have not 158 00:09:08,120 --> 00:09:11,400 Speaker 1: taken better precautions to hide my browsing from Facebook, and 159 00:09:11,440 --> 00:09:14,040 Speaker 1: that is totally on me, I now get flooded with 160 00:09:14,080 --> 00:09:18,079 Speaker 1: ads for guitars when I look at Facebook. These ads 161 00:09:18,120 --> 00:09:22,520 Speaker 1: frequently claimed to sell some brand-name guitars, stuff like Fender, 162 00:09:22,720 --> 00:09:25,920 Speaker 1: you know, the Fender Stratocaster or the Gibson Les Paul, 163 00:09:26,480 --> 00:09:30,560 Speaker 1: but for ludicrously low prices. A guitar that could cost 164 00:09:30,559 --> 00:09:34,000 Speaker 1: six hundred dollars or more will list on one of these 165 00:09:34,000 --> 00:09:38,400 Speaker 1: ads for a tiny fraction of that price. So talk about the temptation. If you're an 166 00:09:38,440 --> 00:09:41,720 Speaker 1: aspiring musician or someone who's been playing on an entry 167 00:09:41,840 --> 00:09:46,199 Speaker 1: level instrument, but you would love to own something more professional, 168 00:09:46,320 --> 00:09:49,840 Speaker 1: more high-end, that sounds like an incredible offer.
The 169 00:09:49,880 --> 00:09:52,760 Speaker 1: ads I saw typically said that whatever store was posting 170 00:09:52,800 --> 00:09:55,439 Speaker 1: the ad was going out of business, or that they 171 00:09:55,440 --> 00:09:57,520 Speaker 1: had to make room for new stock; they had to 172 00:09:57,559 --> 00:10:00,320 Speaker 1: do a clearance sale of all this old inventory, and 173 00:10:00,360 --> 00:10:02,440 Speaker 1: the ad was creating a fiction to make it seem 174 00:10:02,480 --> 00:10:05,440 Speaker 1: as though there was some justification for those low prices, 175 00:10:05,480 --> 00:10:09,480 Speaker 1: to give at least some sense that this possibly could 176 00:10:09,520 --> 00:10:11,960 Speaker 1: be on the up and up. Now, did those ads 177 00:10:12,000 --> 00:10:16,160 Speaker 1: tempt me? Yeah, I'm human. But I also didn't believe them 178 00:10:16,200 --> 00:10:19,079 Speaker 1: immediately. I was suspicious. So I decided that I was 179 00:10:19,080 --> 00:10:22,080 Speaker 1: going to start opening up these ads in another browser 180 00:10:22,520 --> 00:10:25,760 Speaker 1: and look at the landing page. And I immediately grew 181 00:10:25,800 --> 00:10:29,800 Speaker 1: suspicious of them because the language was very similar to 182 00:10:29,840 --> 00:10:33,400 Speaker 1: other ones I had seen before, the details on the 183 00:10:33,440 --> 00:10:37,559 Speaker 1: page were really sparse, and also as soon as I 184 00:10:37,640 --> 00:10:40,040 Speaker 1: opened up more than one of them, I saw that 185 00:10:40,120 --> 00:10:43,640 Speaker 1: these ads, which were supposedly all for different companies and 186 00:10:43,720 --> 00:10:47,160 Speaker 1: different guitar shops, they all had different names and different 187 00:10:47,280 --> 00:10:50,160 Speaker 1: URLs, but they all landed on pages that 188 00:10:50,200 --> 00:10:53,520 Speaker 1: were identical except for the URL address. So 189 00:10:53,559 --> 00:10:57,679 Speaker 1: I wasn't getting redirected to the same website. No, there 190 00:10:57,679 --> 00:11:02,280 Speaker 1: were duplicates of this same website. That's a dead giveaway. 191 00:11:02,320 --> 00:11:04,920 Speaker 1: It's an indication that someone is casting as wide a 192 00:11:04,960 --> 00:11:07,920 Speaker 1: net as they can to try and trick people into 193 00:11:07,960 --> 00:11:11,320 Speaker 1: a transaction. So what is actually going on here? Well, 194 00:11:11,360 --> 00:11:13,720 Speaker 1: with many of these companies, most of which are running 195 00:11:13,760 --> 00:11:17,400 Speaker 1: out of places like China, the whole operation is a bait and switch. 196 00:11:17,800 --> 00:11:19,880 Speaker 1: The goal is to convince people that they are buying 197 00:11:19,920 --> 00:11:23,160 Speaker 1: something legitimate, like a Fender Stratocaster, but it could be anything. 198 00:11:23,160 --> 00:11:25,520 Speaker 1: It could be a costume piece, it could be some 199 00:11:25,640 --> 00:11:29,240 Speaker 1: other gadget, it could be knives, you name it. But 200 00:11:29,280 --> 00:11:31,800 Speaker 1: what the company will ship will be some sort of 201 00:11:31,880 --> 00:11:36,040 Speaker 1: cheap knockoff. And most companies won't even attempt to hide 202 00:11:36,040 --> 00:11:39,120 Speaker 1: this fact. It's just apparent as soon as you receive 203 00:11:39,400 --> 00:11:44,239 Speaker 1: the package.
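(As an editorial aside, the duplicate-landing-page check described above is easy to automate. Here is a minimal sketch in Python, with made-up URLs standing in for the suspect ad links, since the real ones aren't named in the episode: it fetches each landing page, strips the markup, hashes what's left, and flags different URLs whose pages are effectively identical, which is the same tell described here.)

```python
import hashlib
import re
from collections import defaultdict

import requests  # third-party HTTP library: pip install requests

def page_fingerprint(html: str) -> str:
    """Hash a page after stripping tags and whitespace, so cosmetic
    differences don't hide the fact that two pages share one template."""
    text = re.sub(r"<script.*?</script>", " ", html, flags=re.S | re.I)
    text = re.sub(r"<[^>]+>", " ", text)  # drop remaining tags
    text = re.sub(r"\s+", " ", text).strip().lower()
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

# Hypothetical stand-ins for the suspect ad landing pages.
suspect_urls = [
    "https://guitar-clearance-one.example/sale",
    "https://guitar-clearance-two.example/sale",
]

groups: dict[str, list[str]] = defaultdict(list)
for url in suspect_urls:
    try:
        response = requests.get(url, timeout=10)
        groups[page_fingerprint(response.text)].append(url)
    except requests.RequestException as error:
        print(f"could not fetch {url}: {error}")

# Several different URLs sharing one fingerprint is the dead giveaway:
# one storefront template cloned across many throwaway domains.
for fingerprint, urls in groups.items():
    if len(urls) > 1:
        print("identical landing pages:", urls)
```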
These companies are very slow to ship those products, 204 00:11:44,400 --> 00:11:47,600 Speaker 1: and really the goal is to waste enough time so 205 00:11:47,640 --> 00:11:50,280 Speaker 1: that by the time you finally get whatever that cheap 206 00:11:50,360 --> 00:11:54,319 Speaker 1: knockoff is, it's really hard for you to cancel that transaction. 207 00:11:54,400 --> 00:11:57,120 Speaker 1: The transaction has already gone through, and it's hard for 208 00:11:57,120 --> 00:12:00,120 Speaker 1: you to get your money back. Many of them have 209 00:12:00,200 --> 00:12:03,520 Speaker 1: no return policy, or they do have a return policy, 210 00:12:03,559 --> 00:12:05,920 Speaker 1: but it's going to cost you more to ship the 211 00:12:06,040 --> 00:12:09,559 Speaker 1: product back to the company than it cost you to 212 00:12:09,679 --> 00:12:12,520 Speaker 1: buy it in the first place. So that one-hundred-dollar 213 00:12:13,080 --> 00:12:16,920 Speaker 1: cheap knockoff of a Fender Stratocaster would end up costing 214 00:12:17,000 --> 00:12:19,880 Speaker 1: me two hundred dollars to ship back to the company 215 00:12:19,920 --> 00:12:22,840 Speaker 1: for a refund. So it's a net loss. So you 216 00:12:22,920 --> 00:12:25,240 Speaker 1: don't get the thing you actually, you know, paid for, 217 00:12:25,720 --> 00:12:28,079 Speaker 1: and you're left with no real way to fix that situation. 218 00:12:28,160 --> 00:12:30,800 Speaker 1: And since the company is operating out of another country, 219 00:12:30,960 --> 00:12:34,480 Speaker 1: you don't really have many options to seek justice. Now, 220 00:12:34,520 --> 00:12:37,400 Speaker 1: in my case, I started marking these ads and reporting 221 00:12:37,480 --> 00:12:41,120 Speaker 1: them to Facebook as being misleading, and this would prompt 222 00:12:41,160 --> 00:12:44,600 Speaker 1: Facebook to essentially block those ads from appearing on my page. Now, 223 00:12:44,600 --> 00:12:48,320 Speaker 1: it didn't mean that Facebook removed the ad; it's just 224 00:12:48,360 --> 00:12:50,679 Speaker 1: that I didn't see it anymore. But here's the thing. 225 00:12:50,960 --> 00:12:53,840 Speaker 1: As soon as I would block one version of that ad, 226 00:12:53,880 --> 00:12:57,200 Speaker 1: I would see another one that was nearly identical to 227 00:12:57,280 --> 00:12:59,680 Speaker 1: the first one, and it would go to a different 228 00:12:59,679 --> 00:13:01,920 Speaker 1: URL, but it was on that exact same 229 00:13:02,160 --> 00:13:05,760 Speaker 1: style web page, the same layout, same pictures, everything. So 230 00:13:05,800 --> 00:13:08,440 Speaker 1: it was the same scam. And then I would report 231 00:13:08,480 --> 00:13:10,680 Speaker 1: that ad, and then I would see another. And I 232 00:13:10,760 --> 00:13:13,160 Speaker 1: must have gone through at least half a dozen of 233 00:13:13,200 --> 00:13:16,480 Speaker 1: these before I stopped seeing versions of that ad. So today, 234 00:13:16,520 --> 00:13:19,160 Speaker 1: if I go to Facebook, I don't see that one anymore. 235 00:13:19,800 --> 00:13:22,040 Speaker 1: But that doesn't mean that there aren't other ads that are 236 00:13:22,080 --> 00:13:25,840 Speaker 1: following this same pattern. And again, none of this 237 00:13:26,000 --> 00:13:29,120 Speaker 1: means that Facebook is actually not running those ads for 238 00:13:29,200 --> 00:13:35,040 Speaker 1: other people. Facebook's revenue is almost entirely dependent upon advertising.
239 00:13:35,080 --> 00:13:38,680 Speaker 1: It is not in Facebook's financial interests to come down 240 00:13:38,760 --> 00:13:42,240 Speaker 1: hard on advertisers and demand that they are transparent and honest. 241 00:13:42,600 --> 00:13:45,280 Speaker 1: If Facebook instituted those rules, it would lose out on 242 00:13:45,440 --> 00:13:49,400 Speaker 1: millions of dollars of revenue every quarter. So Facebook has 243 00:13:49,440 --> 00:13:54,920 Speaker 1: a financial incentive to run those ads and to serve 244 00:13:54,960 --> 00:13:57,960 Speaker 1: as a platform for advertising that's both good and bad. 245 00:13:58,440 --> 00:14:01,920 Speaker 1: Bad ads will only hurt Facebook if there's a large 246 00:14:02,080 --> 00:14:05,880 Speaker 1: enough response among users, and typically that would mean that 247 00:14:05,960 --> 00:14:09,719 Speaker 1: users would have to abandon the platform or just 248 00:14:09,800 --> 00:14:12,680 Speaker 1: not use it at all, or somehow block ads across 249 00:14:12,720 --> 00:14:16,319 Speaker 1: all of Facebook. The company's standpoint is really more hands 250 00:14:16,320 --> 00:14:20,320 Speaker 1: off, essentially saying caveat emptor, or buyer beware. The 251 00:14:20,360 --> 00:14:23,840 Speaker 1: responsibility of figuring out which ads are legit would fall 252 00:14:23,880 --> 00:14:27,720 Speaker 1: to you, the user, not to Facebook. This also applies 253 00:14:27,760 --> 00:14:30,480 Speaker 1: to Facebook's stance on misinformation. Now, if you are in 254 00:14:30,520 --> 00:14:33,560 Speaker 1: the United States, you've likely seen a lot of news 255 00:14:33,640 --> 00:14:36,960 Speaker 1: about Mark Zuckerberg saying that he doesn't want Facebook to 256 00:14:37,000 --> 00:14:39,880 Speaker 1: be an arbiter of truth. He doesn't want Facebook to 257 00:14:39,880 --> 00:14:43,600 Speaker 1: declare to users which posts reflect honest messages and which 258 00:14:43,600 --> 00:14:46,080 Speaker 1: are misinformation. And he frames it in a way that makes 259 00:14:46,080 --> 00:14:48,920 Speaker 1: it sound like Facebook is trying to be an agnostic 260 00:14:49,000 --> 00:14:52,800 Speaker 1: platform upon which people can express themselves, freedom of speech, 261 00:14:53,240 --> 00:14:55,840 Speaker 1: the First Amendment in the United States, and that to 262 00:14:55,880 --> 00:14:58,960 Speaker 1: act otherwise would mean Facebook would have to replace this 263 00:14:59,040 --> 00:15:02,800 Speaker 1: with a company-approved expression of ideas. So, yeah, 264 00:15:02,880 --> 00:15:05,400 Speaker 1: they're framing it as very much a free speech 265 00:15:05,480 --> 00:15:09,400 Speaker 1: kind of issue. But is that what's really going on? 266 00:15:10,080 --> 00:15:13,600 Speaker 1: When we come back, we'll consider some alternatives by applying 267 00:15:13,680 --> 00:15:17,000 Speaker 1: some critical thinking. But before that, and I know this 268 00:15:17,080 --> 00:15:19,840 Speaker 1: is going to sound totally hypocritical, we're going to take 269 00:15:19,880 --> 00:15:22,240 Speaker 1: a brief break to thank our sponsor, and by the 270 00:15:22,360 --> 00:15:25,720 Speaker 1: end of this episode I'll have more to say about sponsors 271 00:15:25,760 --> 00:15:30,080 Speaker 1: and ads in our podcast in particular. But first, let's 272 00:15:30,080 --> 00:15:40,560 Speaker 1: take that quick break.
Zuckerberg says he doesn't want his 273 00:15:40,640 --> 00:15:43,600 Speaker 1: company to dictate what is truth, and he frames it 274 00:15:43,640 --> 00:15:45,560 Speaker 1: in a way that seems to say, who are we 275 00:15:45,640 --> 00:15:48,840 Speaker 1: to decide what is true and what is false? And 276 00:15:48,960 --> 00:15:52,560 Speaker 1: that's a fairly compelling argument, right? I mean, I don't 277 00:15:52,600 --> 00:15:55,400 Speaker 1: know about you, but I don't love the idea of 278 00:15:55,440 --> 00:16:01,160 Speaker 1: some centralized authority deciding, seemingly arbitrarily, which messages are true and 279 00:16:01,200 --> 00:16:05,240 Speaker 1: which ones are false. We've seen throughout Facebook's history that 280 00:16:05,320 --> 00:16:09,360 Speaker 1: whenever the company tweaks their algorithm, people get mad because 281 00:16:09,400 --> 00:16:11,960 Speaker 1: they see only a selection of the posts their friends 282 00:16:12,040 --> 00:16:14,640 Speaker 1: are making. Every time it happens, I have to look 283 00:16:14,640 --> 00:16:17,880 Speaker 1: for the settings that allow me to view Facebook posts 284 00:16:17,920 --> 00:16:22,200 Speaker 1: by most recent rather than whatever factors Facebook thinks are 285 00:16:22,240 --> 00:16:24,480 Speaker 1: more important to me. And even then, I know I'm 286 00:16:24,520 --> 00:16:27,800 Speaker 1: not actually seeing everything my friends are posting in reverse 287 00:16:27,880 --> 00:16:31,120 Speaker 1: chronological order. I'm just seeing some of it. So I 288 00:16:31,200 --> 00:16:34,160 Speaker 1: think that not wanting to be the arbiter of truth 289 00:16:34,520 --> 00:16:37,800 Speaker 1: is part of the reasoning. I think that's a kernel 290 00:16:37,920 --> 00:16:40,000 Speaker 1: of truth with Facebook. But I also think it's not 291 00:16:40,040 --> 00:16:43,320 Speaker 1: the only, or perhaps not even the primary, reason for 292 00:16:43,360 --> 00:16:47,560 Speaker 1: that decision. So let's think critically about Facebook and how 293 00:16:47,600 --> 00:16:51,640 Speaker 1: Facebook makes money. So, as we already mentioned, Facebook makes 294 00:16:51,680 --> 00:16:55,400 Speaker 1: money through advertising, and it doesn't have a huge incentive 295 00:16:55,440 --> 00:16:57,560 Speaker 1: to make sure that those ads that are running on 296 00:16:57,560 --> 00:17:01,920 Speaker 1: the platform are for legitimate businesses and purposes, though the 297 00:17:01,960 --> 00:17:04,600 Speaker 1: company definitely does have an interest in preventing anything that 298 00:17:04,640 --> 00:17:08,480 Speaker 1: would be damaging to Facebook from running on that platform. 299 00:17:08,480 --> 00:17:11,800 Speaker 1: But another component to this is that Facebook makes more 300 00:17:11,880 --> 00:17:16,600 Speaker 1: money the longer people stay on Facebook actively. So if 301 00:17:16,600 --> 00:17:20,119 Speaker 1: you can convince people to actively stay on Facebook for longer, 302 00:17:20,640 --> 00:17:23,040 Speaker 1: you can serve them more ads, and that means you 303 00:17:23,080 --> 00:17:27,080 Speaker 1: make more money. Therefore, it benefits Facebook to employ strategies 304 00:17:27,119 --> 00:17:31,640 Speaker 1: that keep people glued to the website longer. One way 305 00:17:31,640 --> 00:17:34,760 Speaker 1: to do that is to design algorithms that show posts 306 00:17:34,800 --> 00:17:37,680 Speaker 1: that are proven to get more engagement than others.
And 307 00:17:37,720 --> 00:17:40,920 Speaker 1: by engagement, we're talking about stuff like posts that get 308 00:17:41,119 --> 00:17:44,000 Speaker 1: lots of comments, or a lot of people are sharing 309 00:17:44,000 --> 00:17:47,040 Speaker 1: it to their own pages or friends' pages, and the 310 00:17:47,160 --> 00:17:50,199 Speaker 1: number of likes that those posts are getting. Posts that 311 00:17:50,359 --> 00:17:55,080 Speaker 1: encourage people to participate and perpetuate, essentially, and those are 312 00:17:55,080 --> 00:17:58,720 Speaker 1: the posts that keep people on Facebook longer. Thus, those 313 00:17:58,720 --> 00:18:01,879 Speaker 1: are the ones that mean more ads can be served 314 00:18:01,920 --> 00:18:04,639 Speaker 1: for longer amounts of time to those people. In the 315 00:18:04,680 --> 00:18:07,560 Speaker 1: business model for Facebook, all of this is a good thing. 316 00:18:08,000 --> 00:18:12,200 Speaker 1: So it's in Facebook's interests to design algorithms that can 317 00:18:12,280 --> 00:18:16,480 Speaker 1: identify the types of posts that are driving engagement and 318 00:18:16,520 --> 00:18:19,680 Speaker 1: then serve them to a larger spectrum of Facebook's users. 319 00:18:19,680 --> 00:18:22,720 Speaker 1: Remember I said earlier, I know I'm not seeing all 320 00:18:22,720 --> 00:18:24,880 Speaker 1: of the stuff my friends are posting, even when I'm 321 00:18:24,880 --> 00:18:29,960 Speaker 1: trying to view my Facebook in, you know, chronological order. Well, 322 00:18:30,000 --> 00:18:32,480 Speaker 1: that's because Facebook is kind of choosing which ones I 323 00:18:32,520 --> 00:18:35,840 Speaker 1: see and which ones I don't. And usually it's trying 324 00:18:35,880 --> 00:18:38,200 Speaker 1: to pick the ones that are more likely to drive 325 00:18:38,280 --> 00:18:42,000 Speaker 1: high engagement, and those are the ones I'm seeing. Once 326 00:18:42,040 --> 00:18:45,040 Speaker 1: you understand that that's how Facebook works, you can start 327 00:18:45,080 --> 00:18:48,720 Speaker 1: to craft posts that take advantage of this property. You 328 00:18:48,760 --> 00:18:52,600 Speaker 1: can game the system. You can build stuff that, by 329 00:18:52,640 --> 00:18:55,960 Speaker 1: its very nature, is designed to drive engagement. Now, it 330 00:18:56,080 --> 00:18:59,200 Speaker 1: doesn't have to be true. In fact, if you restrict 331 00:18:59,200 --> 00:19:02,720 Speaker 1: yourself to posting stuff that's only true, you're not likely 332 00:19:02,760 --> 00:19:05,720 Speaker 1: to drive that much engagement. But it does need to 333 00:19:05,760 --> 00:19:09,600 Speaker 1: be really compelling. Things that are more outrageous or tap 334 00:19:09,680 --> 00:19:13,919 Speaker 1: into basic emotional responses, whether positive or negative, tend to 335 00:19:13,960 --> 00:19:18,080 Speaker 1: work really well. So it is with that understanding that 336 00:19:18,200 --> 00:19:22,680 Speaker 1: people can craft messages that contain misinformation and have those 337 00:19:22,720 --> 00:19:27,200 Speaker 1: messages perpetuate quickly across platforms like Facebook, and those messages 338 00:19:27,240 --> 00:19:30,000 Speaker 1: can be really harmful.
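(Facebook's actual ranking system is proprietary, so what follows is only a toy sketch in Python of the incentive structure being described; the weights and the sample posts are invented for illustration. Each post gets an engagement score from weighted counts of likes, comments, and shares, and the feed simply surfaces the highest scorers, regardless of accuracy or recency.)

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    likes: int
    comments: int
    shares: int

# Invented weights: comments and shares count for more than likes,
# since they keep people on the site and spread the post further.
WEIGHTS = {"likes": 1.0, "comments": 4.0, "shares": 8.0}

def engagement_score(post: Post) -> float:
    return (WEIGHTS["likes"] * post.likes
            + WEIGHTS["comments"] * post.comments
            + WEIGHTS["shares"] * post.shares)

def rank_feed(posts: list[Post], limit: int = 10) -> list[Post]:
    """Surface the highest-engagement posts first. Truth and
    chronology play no part in the ordering; only engagement does."""
    return sorted(posts, key=engagement_score, reverse=True)[:limit]

feed = rank_feed([
    Post("friend_a", "calm, accurate news item", likes=12, comments=1, shares=0),
    Post("friend_b", "outrage bait, accuracy optional", likes=40, comments=55, shares=30),
])
for post in feed:
    print(round(engagement_score(post)), post.author, "-", post.text)
```

A post built to provoke comments and shares wins that sort of ranking every time, which is exactly the property the host says people learn to game.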
And whether the person who made 339 00:19:30,080 --> 00:19:33,920 Speaker 1: the message intended to push a specific agenda, or they 340 00:19:33,920 --> 00:19:36,560 Speaker 1: wanted to discredit some other point of view they don't 341 00:19:36,560 --> 00:19:39,160 Speaker 1: agree with, or they just wanted to find a way 342 00:19:39,160 --> 00:19:43,399 Speaker 1: to drive engagement for whatever reason, the motivations are immaterial. 343 00:19:43,400 --> 00:19:46,400 Speaker 1: They could range from irritating but mostly benign to downright 344 00:19:46,520 --> 00:19:49,440 Speaker 1: mean-spirited, but the outcome ends up being the same. 345 00:19:49,560 --> 00:19:54,679 Speaker 1: Misinformation spreads like wildfire across Facebook and beyond. And if 346 00:19:54,680 --> 00:19:58,280 Speaker 1: you are encountering misinformation all the time, and it's popping 347 00:19:58,320 --> 00:20:00,040 Speaker 1: up like crazy in your feed and being reinforced 348 00:20:00,119 --> 00:20:03,119 Speaker 1: that way, you start to get the impression that 349 00:20:03,119 --> 00:20:05,840 Speaker 1: that stuff is the real deal, even if it doesn't 350 00:20:05,880 --> 00:20:08,920 Speaker 1: sound legit to you. The fact that Facebook can become 351 00:20:08,920 --> 00:20:13,080 Speaker 1: a fire hose of misinformation helps reinforce those messages, even 352 00:20:13,119 --> 00:20:15,920 Speaker 1: if the messages themselves lack merit. Compare it to living 353 00:20:15,920 --> 00:20:19,119 Speaker 1: in a place like China or North Korea, where nearly 354 00:20:19,160 --> 00:20:22,800 Speaker 1: all messaging requires state-level approval, and that means the 355 00:20:22,800 --> 00:20:27,560 Speaker 1: government gets to dictate what information the citizens can access. Well, 356 00:20:27,600 --> 00:20:30,159 Speaker 1: that sort of thing is bound to shape thoughts and opinions. 357 00:20:30,160 --> 00:20:35,320 Speaker 1: If you never get to see anything outside the approved stuff, 358 00:20:35,720 --> 00:20:37,679 Speaker 1: then you have no idea that it even exists. We 359 00:20:37,720 --> 00:20:41,760 Speaker 1: don't have this innate ability to know the truth from falsehood. 360 00:20:42,200 --> 00:20:46,359 Speaker 1: Facebook, meanwhile, has little incentive to change anything. Facebook isn't 361 00:20:46,440 --> 00:20:48,720 Speaker 1: in the business of trying to make things better for 362 00:20:48,760 --> 00:20:51,760 Speaker 1: the average person. That's not their business model. They're in 363 00:20:51,760 --> 00:20:54,520 Speaker 1: the business of making money, and that increased level of 364 00:20:54,560 --> 00:20:58,960 Speaker 1: engagement drives revenue, which means more money for Facebook. Acting 365 00:20:59,000 --> 00:21:02,800 Speaker 1: against that would go against the company's self-interest. Moreover, 366 00:21:03,160 --> 00:21:08,960 Speaker 1: there's a concept that's important to Facebook's operations called safe harbor. Now, basically, 367 00:21:09,000 --> 00:21:11,800 Speaker 1: this concept is that if a company is acting as 368 00:21:11,880 --> 00:21:15,639 Speaker 1: a means of communication, it cannot be held responsible for 369 00:21:15,760 --> 00:21:20,560 Speaker 1: the stuff that's said using that company's services. So let's 370 00:21:20,600 --> 00:21:23,720 Speaker 1: say that someone were to call up another person 371 00:21:23,760 --> 00:21:26,760 Speaker 1: on the phone and threaten that person.
So 372 00:21:27,160 --> 00:21:30,400 Speaker 1: person A calls person B and issues a threat over 373 00:21:30,440 --> 00:21:33,879 Speaker 1: the phone. The phone company would not be responsible for 374 00:21:33,960 --> 00:21:38,000 Speaker 1: that horrible act. The phone company just runs the infrastructure 375 00:21:38,080 --> 00:21:41,040 Speaker 1: that was used to make the phone call. The same 376 00:21:41,080 --> 00:21:44,560 Speaker 1: sort of concept generally applies to platforms like Facebook. The 377 00:21:44,680 --> 00:21:47,280 Speaker 1: idea is that if a user were to post something 378 00:21:47,359 --> 00:21:51,919 Speaker 1: illegal or disruptive on Facebook, the company wouldn't be held responsible, 379 00:21:51,960 --> 00:21:55,119 Speaker 1: particularly if the company could show that it had acted 380 00:21:55,160 --> 00:21:59,040 Speaker 1: promptly in response to complaints or reports about the transgression. 381 00:21:59,480 --> 00:22:02,800 Speaker 1: If Facebook takes a stance on misinformation, there might 382 00:22:02,840 --> 00:22:05,240 Speaker 1: be a fear that it would be shifting into a 383 00:22:05,359 --> 00:22:09,040 Speaker 1: role of accountability, which might seem to undermine the safe 384 00:22:09,040 --> 00:22:13,400 Speaker 1: harbor argument. In addition to that, monitoring posts and labeling 385 00:22:13,440 --> 00:22:17,200 Speaker 1: stuff that is spreading misinformation as such, or removing such 386 00:22:17,240 --> 00:22:21,200 Speaker 1: messages, or whatever the plan is, would require an investment 387 00:22:21,320 --> 00:22:24,560 Speaker 1: on Facebook's part. The company would have to spend time, energy, 388 00:22:24,640 --> 00:22:28,119 Speaker 1: and resources, and all that boils down to money, to 389 00:22:28,200 --> 00:22:31,880 Speaker 1: create a method to identify and label or remove posts 390 00:22:31,920 --> 00:22:35,960 Speaker 1: that are promoting misinformation. So not only would Facebook be 391 00:22:36,040 --> 00:22:39,399 Speaker 1: cutting back on the types of posts that drive engagement 392 00:22:39,800 --> 00:22:42,760 Speaker 1: and thus generate revenue, the company would also have to 393 00:22:42,800 --> 00:22:46,480 Speaker 1: pour money into that effort. Asking a business to spend 394 00:22:46,560 --> 00:22:49,560 Speaker 1: money so that it can make less money is a 395 00:22:49,720 --> 00:22:54,600 Speaker 1: pretty tough argument. It's no wonder Zuckerberg is opposed to it. Yes, 396 00:22:54,880 --> 00:22:58,560 Speaker 1: it would be a huge responsibility to determine which messages 397 00:22:58,640 --> 00:23:01,800 Speaker 1: reflect reality and which do not. It would bring Facebook 398 00:23:01,840 --> 00:23:06,000 Speaker 1: under enormous scrutiny and criticism. Any person or group or 399 00:23:06,040 --> 00:23:10,480 Speaker 1: whatever that saw messages being labeled or removed would raise 400 00:23:10,520 --> 00:23:13,119 Speaker 1: an enormous stink over it, and the company would have 401 00:23:13,119 --> 00:23:15,480 Speaker 1: to deal with that, right? So, let's say people who 402 00:23:15,480 --> 00:23:18,640 Speaker 1: are perpetrating misinformation see that their messages are being taken down; 403 00:23:18,720 --> 00:23:21,400 Speaker 1: then they can just make it an even bigger thing. 404 00:23:21,800 --> 00:23:27,960 Speaker 1: But more than that, combating misinformation hurts the company's bottom line.
Now, 405 00:23:28,000 --> 00:23:31,560 Speaker 1: as I record this, news is breaking that dozens of 406 00:23:31,600 --> 00:23:34,960 Speaker 1: Facebook employees have staged a walkout in protest of 407 00:23:34,960 --> 00:23:38,560 Speaker 1: how Zuckerberg and other high-level management at Facebook have 408 00:23:38,640 --> 00:23:44,000 Speaker 1: refused to intervene when it comes to misinformation and inflammatory posts. Now, 409 00:23:44,040 --> 00:23:46,159 Speaker 1: the particular posts that are at the center of this 410 00:23:46,240 --> 00:23:49,040 Speaker 1: protest originated from the office of the President of the 411 00:23:49,080 --> 00:23:52,880 Speaker 1: United States. So the stakes are very high on this one, 412 00:23:53,520 --> 00:23:57,159 Speaker 1: and I'm definitely dogging on Facebook a lot here, but 413 00:23:57,240 --> 00:23:59,400 Speaker 1: I want to be clear that it's not the only 414 00:23:59,400 --> 00:24:02,560 Speaker 1: corporation to do this kind of stuff. The stage 415 00:24:02,600 --> 00:24:05,840 Speaker 1: had been set long before Facebook was even a thing. 416 00:24:06,560 --> 00:24:10,480 Speaker 1: Back in the nineteen sixties, financial analysts and big thinkers 417 00:24:10,600 --> 00:24:15,160 Speaker 1: started a gradual shift towards a focus on shareholder value, 418 00:24:15,760 --> 00:24:19,840 Speaker 1: pairing that idea with the idea that executives should receive 419 00:24:19,920 --> 00:24:25,080 Speaker 1: compensation that traded high salaries for stock options. The logic 420 00:24:25,240 --> 00:24:28,159 Speaker 1: was that if the leaders in your company have a 421 00:24:28,200 --> 00:24:31,400 Speaker 1: personal stake in the performance of the company, they will 422 00:24:31,480 --> 00:24:35,159 Speaker 1: make decisions that will be best for the company. But 423 00:24:35,280 --> 00:24:38,280 Speaker 1: it hasn't necessarily played out that way, as time and 424 00:24:38,320 --> 00:24:42,120 Speaker 1: again executives have made decisions that were incredibly positive from 425 00:24:42,200 --> 00:24:46,840 Speaker 1: a shareholder value and thus a personal finance perspective, but 426 00:24:47,200 --> 00:24:52,960 Speaker 1: had negative consequences for stuff like customer satisfaction or employee 427 00:24:53,000 --> 00:24:57,760 Speaker 1: conditions and more. In recent years, more leaders have changed their tune, including Jack Welch, 428 00:24:57,760 --> 00:25:01,320 Speaker 1: who was seen as the champion of shareholder value back 429 00:25:01,359 --> 00:25:04,080 Speaker 1: in the nineteen eighties, when he was the CEO of General 430 00:25:04,160 --> 00:25:07,040 Speaker 1: Electric. Even Jack Welch has come 431 00:25:07,040 --> 00:25:12,040 Speaker 1: out and said that this is a bad approach and 432 00:25:12,160 --> 00:25:16,560 Speaker 1: that really the priorities should be customers first, then employees, 433 00:25:16,920 --> 00:25:20,840 Speaker 1: and then shareholders. They should be behind the focus on 434 00:25:20,920 --> 00:25:24,080 Speaker 1: customers and employees. But this is a view that's taking 435 00:25:24,119 --> 00:25:27,399 Speaker 1: a very long time to percolate throughout business as a whole, 436 00:25:27,600 --> 00:25:29,399 Speaker 1: and in the meantime we see a lot of leaders 437 00:25:29,480 --> 00:25:33,720 Speaker 1: making decisions that result in enormous increases in wealth at 438 00:25:33,760 --> 00:25:36,800 Speaker 1: others' expense.
Now, why did I bring all that up 439 00:25:36,840 --> 00:25:39,359 Speaker 1: in the first place? Well, I did it to show 440 00:25:39,440 --> 00:25:42,320 Speaker 1: that by taking a particular stance or claim, in this 441 00:25:42,359 --> 00:25:46,359 Speaker 1: case Zuckerberg's explanation for why Facebook shouldn't weigh in on 442 00:25:46,480 --> 00:25:49,040 Speaker 1: whether or not something is accurate or true, we can 443 00:25:49,080 --> 00:25:52,040 Speaker 1: actually ask ourselves questions and look at the subject on 444 00:25:52,119 --> 00:25:54,840 Speaker 1: a deeper level. And at the end of the day, 445 00:25:54,960 --> 00:25:59,280 Speaker 1: you might find yourself agreeing with Zuckerberg's decision, even if 446 00:25:59,400 --> 00:26:03,360 Speaker 1: his explanation for that decision is not the full truth. Now, 447 00:26:03,400 --> 00:26:06,080 Speaker 1: I personally do not agree with his decision, but that's 448 00:26:06,080 --> 00:26:07,960 Speaker 1: my own point of view, and I don't wish to 449 00:26:08,040 --> 00:26:10,520 Speaker 1: argue that my point of view is the quote unquote 450 00:26:10,720 --> 00:26:13,640 Speaker 1: right one. It's just the one I happen to have. 451 00:26:14,119 --> 00:26:16,119 Speaker 1: I do think it's important to get as full an 452 00:26:16,200 --> 00:26:20,560 Speaker 1: understanding as possible before weighing in with opinions. Now, this 453 00:26:20,640 --> 00:26:23,200 Speaker 1: is a skill, and like all skills, we all need 454 00:26:23,240 --> 00:26:25,440 Speaker 1: to practice at it to get better, and I include 455 00:26:25,440 --> 00:26:28,920 Speaker 1: myself in that. You've likely encountered messages that reinforced 456 00:26:29,000 --> 00:26:33,320 Speaker 1: a previously held belief. These are very easy to accept: 457 00:26:33,640 --> 00:26:36,040 Speaker 1: as the messages we see fall in line with what 458 00:26:36,200 --> 00:26:39,919 Speaker 1: we already believe, we're more inclined to accept those kinds 459 00:26:40,000 --> 00:26:43,119 Speaker 1: of messages, right? And we're more likely to dismiss a 460 00:26:43,200 --> 00:26:47,199 Speaker 1: message that conflicts with something we really believe. This is 461 00:26:47,240 --> 00:26:50,800 Speaker 1: a natural human response, and that means it's also the 462 00:26:50,840 --> 00:26:53,720 Speaker 1: type of response other people can count on when they 463 00:26:53,760 --> 00:26:59,520 Speaker 1: craft messages. For example, I hold some pretty liberal beliefs, 464 00:26:59,800 --> 00:27:02,560 Speaker 1: and I'm not asking any of you to subscribe to 465 00:27:02,600 --> 00:27:06,760 Speaker 1: my beliefs. Some of you might have very conservative beliefs, 466 00:27:06,800 --> 00:27:09,080 Speaker 1: and I'm not going to tell you to change. Rather, 467 00:27:09,240 --> 00:27:11,600 Speaker 1: I'm framing it this way so that we understand where 468 00:27:11,640 --> 00:27:15,399 Speaker 1: I'm coming from. So, if I encounter a message that 469 00:27:15,520 --> 00:27:18,600 Speaker 1: indicates the President of the United States has said or 470 00:27:18,720 --> 00:27:23,000 Speaker 1: done something particularly upsetting to me, I am predisposed to 471 00:27:23,040 --> 00:27:27,159 Speaker 1: accept that that report is absolutely true. And partly that 472 00:27:27,200 --> 00:27:29,320 Speaker 1: has to do with history.
The president does have a 473 00:27:29,440 --> 00:27:32,160 Speaker 1: very long record of saying and doing things that I 474 00:27:32,200 --> 00:27:36,800 Speaker 1: find upsetting, so precedent has conditioned me to expect such 475 00:27:36,840 --> 00:27:40,920 Speaker 1: things from him moving forward. But the rest is because 476 00:27:41,240 --> 00:27:46,280 Speaker 1: I hold certain biases, and if a message reinforces those biases, 477 00:27:47,040 --> 00:27:50,400 Speaker 1: I'm likely to buy into that message. And that's precisely 478 00:27:50,520 --> 00:27:54,360 Speaker 1: when I need to employ critical thinking. When I encounter 479 00:27:54,480 --> 00:27:57,560 Speaker 1: these messages, I actually do research to see if they're 480 00:27:57,560 --> 00:28:00,640 Speaker 1: true or relevant. Like, I see a lot of messages 481 00:28:00,640 --> 00:28:04,479 Speaker 1: that contain quotes supposedly said by the President, and I 482 00:28:04,560 --> 00:28:07,640 Speaker 1: try to do research to make sure that those are 483 00:28:07,760 --> 00:28:11,600 Speaker 1: things that he actually said, to verify that somewhere the 484 00:28:11,640 --> 00:28:14,679 Speaker 1: President is recorded as saying that quote, and also to 485 00:28:14,680 --> 00:28:17,760 Speaker 1: see if there's more context around the quote, and that 486 00:28:17,920 --> 00:28:22,240 Speaker 1: it's not something that was pulled out that, by itself, 487 00:28:22,359 --> 00:28:27,000 Speaker 1: is really awful but within context isn't. And sometimes I 488 00:28:27,000 --> 00:28:29,760 Speaker 1: can't find any evidence of the quote outside of the 489 00:28:29,800 --> 00:28:33,360 Speaker 1: original message I saw, and then I'm just thinking, well, 490 00:28:33,480 --> 00:28:36,440 Speaker 1: this could just be made up because it's playing 491 00:28:36,600 --> 00:28:41,960 Speaker 1: to my expectations, and that's not enough. Likewise, I could 492 00:28:42,080 --> 00:28:45,640 Speaker 1: encounter a message that praises someone from the liberal side 493 00:28:45,640 --> 00:28:49,160 Speaker 1: of the spectrum, and once again, my bias means I'm 494 00:28:49,200 --> 00:28:51,960 Speaker 1: predisposed to accept that as fact because it falls in 495 00:28:52,040 --> 00:28:55,040 Speaker 1: line with my personal worldview. But I also research 496 00:28:55,240 --> 00:28:58,000 Speaker 1: these messages to make sure they are real and within 497 00:28:58,360 --> 00:29:03,200 Speaker 1: the proper context. I do not want to blindly accept 498 00:29:03,440 --> 00:29:06,320 Speaker 1: that a message that happens to align with my 499 00:29:06,360 --> 00:29:11,120 Speaker 1: personal worldview is true and then go on to perpetuate misinformation. 500 00:29:12,360 --> 00:29:15,920 Speaker 1: Doing this is not always easy. I find that, especially 501 00:29:15,920 --> 00:29:19,800 Speaker 1: when I'm particularly emotional, it's really challenging to remember to 502 00:29:19,880 --> 00:29:22,920 Speaker 1: apply critical thinking. But I also think that's when we 503 00:29:22,960 --> 00:29:26,160 Speaker 1: need to rely on it the most. On the mild side, 504 00:29:26,600 --> 00:29:29,800 Speaker 1: it might mean you're less likely to spread falsehoods, but 505 00:29:29,880 --> 00:29:32,320 Speaker 1: on the heavier side, it could mean you helped defuse 506 00:29:32,480 --> 00:29:36,240 Speaker 1: a dangerous situation.
When we come back, I'll go into 507 00:29:36,280 --> 00:29:39,400 Speaker 1: some more elements of critical thinking and compassion that are 508 00:29:39,440 --> 00:29:42,200 Speaker 1: applicable in the tech world and beyond. But first let's 509 00:29:42,240 --> 00:29:52,800 Speaker 1: take another quick break. In past episodes, you can hear 510 00:29:52,840 --> 00:29:57,160 Speaker 1: me get really emotional about instances in which opportunistic people 511 00:29:57,480 --> 00:30:01,160 Speaker 1: have used technology to take advantage of others in various ways. 512 00:30:01,760 --> 00:30:04,400 Speaker 1: The schemes all tend to fall back on the common 513 00:30:04,440 --> 00:30:08,200 Speaker 1: failures that we have as humans, and honestly, there is 514 00:30:08,320 --> 00:30:12,120 Speaker 1: nothing really new about those schemes. They're tricks that 515 00:30:12,160 --> 00:30:16,720 Speaker 1: con artists have been relying upon for centuries. The tools 516 00:30:16,760 --> 00:30:20,080 Speaker 1: of the trade are the same. What changes is whatever 517 00:30:20,160 --> 00:30:23,760 Speaker 1: is being sold and whatever platform you're using in order 518 00:30:23,760 --> 00:30:26,320 Speaker 1: to sell it. And it doesn't matter if we're talking 519 00:30:26,360 --> 00:30:30,600 Speaker 1: about a gadget or a philosophy. And I think all 520 00:30:30,600 --> 00:30:34,880 Speaker 1: scams are reprehensible, but I get particularly angry at those 521 00:30:34,920 --> 00:30:38,640 Speaker 1: that are taking aim at already vulnerable targets. For example, 522 00:30:39,160 --> 00:30:41,800 Speaker 1: people who are looking for a job. This is a 523 00:30:41,840 --> 00:30:45,000 Speaker 1: population of people who are seeking a means to earn 524 00:30:45,080 --> 00:30:48,480 Speaker 1: a living. They might be trying to land their first 525 00:30:48,600 --> 00:30:52,000 Speaker 1: steady job ever, or they might be trying to transition 526 00:30:52,120 --> 00:30:55,080 Speaker 1: from one job that's not so great into something that 527 00:30:55,160 --> 00:30:58,960 Speaker 1: they hope is better. Anyway, they are putting themselves out 528 00:30:59,040 --> 00:31:01,800 Speaker 1: there in search of opportunities, and that means they are 529 00:31:01,840 --> 00:31:06,360 Speaker 1: a vulnerable population, and some people find that irresistible. There 530 00:31:06,360 --> 00:31:10,840 Speaker 1: are numerous individuals and shady companies that take advantage of 531 00:31:10,960 --> 00:31:15,240 Speaker 1: job seekers. A common tactic is to advertise a job online, 532 00:31:15,480 --> 00:31:18,479 Speaker 1: but the advertised job is just bait. There is no 533 00:31:18,600 --> 00:31:22,000 Speaker 1: intent to offer that job to an applicant. In fact, 534 00:31:22,000 --> 00:31:25,480 Speaker 1: the job might not even exist. The goal is to 535 00:31:25,600 --> 00:31:28,320 Speaker 1: lure people into a job interview and then pull the 536 00:31:28,320 --> 00:31:30,800 Speaker 1: bait and switch, saying, oh, you know, we already filled 537 00:31:30,840 --> 00:31:33,680 Speaker 1: that position, but we have a totally different job that 538 00:31:33,720 --> 00:31:36,200 Speaker 1: we would love to offer you.
And thus they offer 539 00:31:36,280 --> 00:31:39,080 Speaker 1: up a different job to the applicant, and typically the 540 00:31:39,240 --> 00:31:44,320 Speaker 1: offered job, the new one, is less desirable than whatever 541 00:31:44,560 --> 00:31:47,240 Speaker 1: was originally being applied for. It might have lower pay, 542 00:31:47,280 --> 00:31:51,280 Speaker 1: it might have fewer benefits, or both, or it 543 00:31:51,360 --> 00:31:54,320 Speaker 1: may just be, you know, less desirable in job duties. 544 00:31:54,360 --> 00:31:58,120 Speaker 1: But the person doing the hiring knows that if 545 00:31:58,120 --> 00:32:00,680 Speaker 1: someone has taken the effort to go to a job interview, 546 00:32:01,240 --> 00:32:03,360 Speaker 1: and if they are looking for a job, if they're 547 00:32:03,400 --> 00:32:06,880 Speaker 1: in that place, they could be in need enough and 548 00:32:06,920 --> 00:32:11,120 Speaker 1: emotionally vulnerable enough to agree to this switch, even though 549 00:32:11,120 --> 00:32:15,720 Speaker 1: it wasn't what they were told when they first applied. Yeah, 550 00:32:15,800 --> 00:32:18,720 Speaker 1: that's one way to make me really angry, really fast. 551 00:32:19,240 --> 00:32:21,360 Speaker 1: And it is hard out there. I've been in a 552 00:32:21,400 --> 00:32:25,360 Speaker 1: steady gig since two thousand eight, and even from this 553 00:32:25,720 --> 00:32:29,760 Speaker 1: cushioned position, I know it's hard out there, and people 554 00:32:29,880 --> 00:32:32,120 Speaker 1: looking for a job fall into a category of people 555 00:32:32,120 --> 00:32:36,240 Speaker 1: who cannot afford to lose. They're looking for that chance 556 00:32:36,280 --> 00:32:40,840 Speaker 1: to land something steady. They're hopeful, and unethical jerk faces 557 00:32:40,880 --> 00:32:43,800 Speaker 1: will pounce on that. There are other groups that are 558 00:32:43,840 --> 00:32:46,280 Speaker 1: even shadier, that are really just trying to get as 559 00:32:46,360 --> 00:32:49,280 Speaker 1: much personal information as possible, and they have no job 560 00:32:49,360 --> 00:32:52,640 Speaker 1: to offer whatsoever. Or they are part of a pyramid 561 00:32:52,720 --> 00:32:55,440 Speaker 1: scheme that requires the job seeker to pour some of 562 00:32:55,480 --> 00:32:59,040 Speaker 1: their own limited money into that scheme, and the lure 563 00:32:59,240 --> 00:33:02,560 Speaker 1: is that if they bring other folks into that organization, 564 00:33:03,240 --> 00:33:05,240 Speaker 1: they will get some of the money of the people 565 00:33:05,320 --> 00:33:09,920 Speaker 1: that they bring in. But those schemes are entirely 566 00:33:09,920 --> 00:33:13,680 Speaker 1: dependent upon convincing more and more people to join further 567 00:33:13,880 --> 00:33:16,960 Speaker 1: down the chain, and typically only a few folks towards 568 00:33:17,000 --> 00:33:20,640 Speaker 1: the very top ever really make anything, and it's at 569 00:33:20,640 --> 00:33:25,280 Speaker 1: the expense of everyone below them. Those are classic pyramid schemes. 570 00:33:26,480 --> 00:33:28,600 Speaker 1: Now you've probably picked up on the fact that a 571 00:33:28,640 --> 00:33:31,120 Speaker 1: lot of this all has to do with catering to 572 00:33:31,200 --> 00:33:34,320 Speaker 1: what people want to believe is true, and that is 573 00:33:34,360 --> 00:33:37,120 Speaker 1: a huge part of it.
But another is relying upon 574 00:33:37,240 --> 00:33:43,200 Speaker 1: people interpreting information incorrectly. For example, Moore's law. This is 575 00:33:43,200 --> 00:33:45,880 Speaker 1: a great one. Now let's begin by saying Moore's law 576 00:33:46,760 --> 00:33:49,880 Speaker 1: came from an observation. It wasn't that Gordon Moore 577 00:33:50,200 --> 00:33:55,000 Speaker 1: declared this, you know, inherent law of the universe. Rather, 578 00:33:55,040 --> 00:33:59,640 Speaker 1: he was making an observation about how semiconductor companies were 579 00:33:59,680 --> 00:34:03,720 Speaker 1: able to fit more components on a square inch 580 00:34:03,840 --> 00:34:07,400 Speaker 1: of silicon wafer, and that there was a market 581 00:34:07,480 --> 00:34:11,200 Speaker 1: demand for doing more of that, which allowed them to 582 00:34:11,280 --> 00:34:15,920 Speaker 1: invest in the process of making, you know, the chips 583 00:34:15,920 --> 00:34:19,040 Speaker 1: that had even more transistors essentially on them. He was 584 00:34:19,120 --> 00:34:22,200 Speaker 1: looking at market trends that were supporting this 585 00:34:22,880 --> 00:34:26,680 Speaker 1: overall technological trend. But generally speaking, these days, we say 586 00:34:26,680 --> 00:34:30,200 Speaker 1: that Moore's law means that computers today are twice as powerful, 587 00:34:30,440 --> 00:34:35,319 Speaker 1: meaning faster at processing information, as computers from two years ago, 588 00:34:35,600 --> 00:34:38,000 Speaker 1: and the computers from two years ago are twice as 589 00:34:38,000 --> 00:34:40,600 Speaker 1: powerful as the ones from two years before that. So 590 00:34:41,040 --> 00:34:45,719 Speaker 1: every two years computer processing capabilities double. It's not a 591 00:34:45,760 --> 00:34:48,759 Speaker 1: hard and fast rule, but it's close enough to what 592 00:34:48,880 --> 00:34:52,160 Speaker 1: we observe to kind of serve as shorthand. But that 593 00:34:52,239 --> 00:34:56,360 Speaker 1: doesn't mean other technologies keep up at that same speed. 594 00:34:56,840 --> 00:35:01,560 Speaker 1: Battery technology, for example, doesn't. But because we've become accustomed 595 00:35:01,560 --> 00:35:04,600 Speaker 1: to computers evolving so quickly, it is easy for us 596 00:35:04,680 --> 00:35:08,720 Speaker 1: to make the mistake of extending that quality to other technologies. 597 00:35:09,160 --> 00:35:11,960 Speaker 1: And if you're an unscrupulous person you can jump on 598 00:35:12,040 --> 00:35:14,880 Speaker 1: that by making a spurious claim that sounds like it 599 00:35:14,920 --> 00:35:18,480 Speaker 1: could be possible and raking in the benefits. You guys 600 00:35:18,480 --> 00:35:22,640 Speaker 1: probably remember that I've done episodes on the company Theranos. 601 00:35:23,480 --> 00:35:26,920 Speaker 1: That's a company that aimed to produce a machine capable 602 00:35:27,000 --> 00:35:31,320 Speaker 1: of running hundreds of diagnostic tests on a single droplet 603 00:35:31,440 --> 00:35:34,760 Speaker 1: of blood. The ambition of the company and its founder, 604 00:35:34,800 --> 00:35:38,840 Speaker 1: Elizabeth Holmes, was to create a device that would revolutionize medicine.
605 00:35:38,920 --> 00:35:42,000 Speaker 1: With a machine the size of a desktop printer, it 606 00:35:42,120 --> 00:35:44,880 Speaker 1: could become a household appliance and would allow people to 607 00:35:44,960 --> 00:35:47,360 Speaker 1: run a test quickly, and then they could share that 608 00:35:47,440 --> 00:35:50,560 Speaker 1: information with their doctors. They could lead healthier lives. They 609 00:35:50,560 --> 00:35:54,319 Speaker 1: could micromanage their health. It's kind of in line 610 00:35:54,320 --> 00:35:57,759 Speaker 1: with that whole quantitative approach we were taking for a 611 00:35:57,840 --> 00:35:59,960 Speaker 1: very long time with all this stuff, like fitness trackers 612 00:36:00,040 --> 00:36:02,879 Speaker 1: that would give us detailed information about things like how 613 00:36:03,080 --> 00:36:05,040 Speaker 1: many hours of sleep we got and how many times we 614 00:36:05,080 --> 00:36:09,520 Speaker 1: toss and turn. It's in that same vein. And potentially 615 00:36:09,560 --> 00:36:12,000 Speaker 1: this would mean that a user would pick up warning 616 00:36:12,080 --> 00:36:14,359 Speaker 1: signs early enough to be able to take action before 617 00:36:14,480 --> 00:36:18,480 Speaker 1: things get really serious. It could save lives, and it 618 00:36:18,520 --> 00:36:22,080 Speaker 1: would make a crap ton of money. But there was 619 00:36:22,160 --> 00:36:26,960 Speaker 1: just one small problem. The technology didn't work, but a 620 00:36:26,960 --> 00:36:29,960 Speaker 1: lot of people believed it could work. I mean, we've 621 00:36:30,000 --> 00:36:33,480 Speaker 1: got some phenomenal technology at our disposal right now. Right? 622 00:36:33,880 --> 00:36:36,680 Speaker 1: If you have a smartphone, you've got a computer that 623 00:36:36,760 --> 00:36:39,960 Speaker 1: fits in your pocket. It's got a ton of processing 624 00:36:40,000 --> 00:36:43,960 Speaker 1: power compared to even old mainframe computers, and it can 625 00:36:43,960 --> 00:36:47,000 Speaker 1: connect to the internet. It can give you amazing communication 626 00:36:47,040 --> 00:36:50,520 Speaker 1: abilities as well as access to a truly enormous amount of 627 00:36:50,560 --> 00:36:54,520 Speaker 1: information that's been gathered around the world. It's a relatively 628 00:36:54,680 --> 00:36:58,479 Speaker 1: mundane piece of technology at this point, right? And when 629 00:36:58,520 --> 00:37:04,280 Speaker 1: the amazing becomes mundane, the impossible starts to sound plausible. 630 00:37:04,719 --> 00:37:07,759 Speaker 1: So why couldn't a machine use a tiny amount of 631 00:37:07,760 --> 00:37:10,480 Speaker 1: blood as a sample for hundreds of tests? I mean, 632 00:37:11,239 --> 00:37:14,760 Speaker 1: why couldn't it return accurate results in a matter of hours? 633 00:37:14,840 --> 00:37:18,160 Speaker 1: We've got technology that lets us shoot ultra high resolution 634 00:37:18,320 --> 00:37:22,319 Speaker 1: video using a phone. We've got tech that we can 635 00:37:22,400 --> 00:37:26,120 Speaker 1: talk to and give commands to, and it responds to us. 636 00:37:26,239 --> 00:37:27,880 Speaker 1: It stands to reason that we should be able to 637 00:37:27,960 --> 00:37:31,600 Speaker 1: run these sorts of tests on that small a sample 638 00:37:31,640 --> 00:37:35,719 Speaker 1: of blood. And that leap of logic is the problem.
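That leap is easy to put numbers on. Here is a minimal sketch contrasting the Moore's law doubling described a moment ago with a slower-improving technology like batteries. The five percent annual battery gain is an illustrative assumption chosen for the comparison, not a measured industry figure.

```python
# Minimal sketch: Moore's-law-style doubling vs. a slower-improving tech.
# Assumptions (illustrative): processing power doubles every 2 years;
# battery capacity improves about 5% per year. Real-world rates vary.

def compute_factor(years: float) -> float:
    """Relative processing power after `years`, doubling every 2 years."""
    return 2 ** (years / 2)

def battery_factor(years: float, annual_gain: float = 0.05) -> float:
    """Relative battery capacity after `years` at a fixed annual gain."""
    return (1 + annual_gain) ** years

for years in (2, 10, 20):
    print(f"After {years:2d} years: compute x{compute_factor(years):7.1f}, "
          f"battery x{battery_factor(years):4.2f}")

# After 20 years, compute is up ~1024x while battery is up only ~2.7x.
# Assuming every field rides the compute curve -- blood diagnostics
# included -- is exactly the leap of logic described above.
```

Two decades of exponential doubling buys you three orders of magnitude; two decades of ordinary incremental progress doesn't even buy you one. Intuitions trained on the first curve fail badly on the second.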
639 00:37:35,760 --> 00:37:38,880 Speaker 1: Our capabilities in one area can lead us to believe 640 00:37:38,960 --> 00:37:42,719 Speaker 1: we are equally capable in unrelated areas. It would be 641 00:37:42,760 --> 00:37:45,040 Speaker 1: like if I were a world class athlete in a 642 00:37:45,080 --> 00:37:48,279 Speaker 1: specific sport and I thought, just because of that, I'm 643 00:37:48,320 --> 00:37:54,320 Speaker 1: automatically equally as amazing in some other unrelated sport. Even 644 00:37:54,600 --> 00:37:59,080 Speaker 1: very intelligent people got caught up in this promise. Very 645 00:37:59,160 --> 00:38:03,680 Speaker 1: successful people were caught up by this lie, and the 646 00:38:03,760 --> 00:38:06,799 Speaker 1: promise was just so darn positive. It would be amazing 647 00:38:06,920 --> 00:38:09,080 Speaker 1: if we could run that many tests on such a 648 00:38:09,120 --> 00:38:11,960 Speaker 1: small sample of blood. And it would be even more 649 00:38:12,000 --> 00:38:14,280 Speaker 1: amazing if we could come up with a consumer version 650 00:38:14,320 --> 00:38:17,080 Speaker 1: of that technology that the average person could have in 651 00:38:17,160 --> 00:38:20,800 Speaker 1: their home. That kind of gadget could potentially save millions 652 00:38:20,840 --> 00:38:24,600 Speaker 1: of lives. Of course, you want that tech to exist 653 00:38:24,719 --> 00:38:28,080 Speaker 1: and to work, but wishes don't make things true. So 654 00:38:28,120 --> 00:38:31,000 Speaker 1: we've got a technology that would be awesome if it existed. 655 00:38:31,200 --> 00:38:34,520 Speaker 1: We've got a culture in business that revolves around risk. 656 00:38:34,960 --> 00:38:38,600 Speaker 1: We've got a startup culture that glorifies the concept of 657 00:38:38,640 --> 00:38:41,319 Speaker 1: fake it until you make it, which means you come 658 00:38:41,360 --> 00:38:43,759 Speaker 1: up with a goal and then you raise a whole 659 00:38:43,760 --> 00:38:46,200 Speaker 1: bunch of money, and then you flail around trying to 660 00:38:46,239 --> 00:38:50,560 Speaker 1: achieve that goal, and hopefully something you do sticks. Somewhere 661 00:38:50,560 --> 00:38:54,160 Speaker 1: along the line, you manage some level of success before 662 00:38:54,239 --> 00:38:57,480 Speaker 1: your money runs out, and while the money is rolling in, 663 00:38:57,840 --> 00:39:00,239 Speaker 1: you might as well live high on the hog. If 664 00:39:00,239 --> 00:39:02,000 Speaker 1: it doesn't work out, well, that's a bummer. But in 665 00:39:02,000 --> 00:39:05,680 Speaker 1: the startup world, failure is common. Fail fast is the 666 00:39:05,800 --> 00:39:10,279 Speaker 1: name of the game. Now, I don't know if Theranos 667 00:39:10,320 --> 00:39:14,520 Speaker 1: founder Elizabeth Holmes genuinely believed she was going to be 668 00:39:14,560 --> 00:39:17,879 Speaker 1: able to create the tech that she envisioned. If she did, 669 00:39:18,239 --> 00:39:21,040 Speaker 1: that wouldn't really surprise me. Like I said, the idea 670 00:39:21,200 --> 00:39:24,480 Speaker 1: is really attractive, and our technology is already so incredible 671 00:39:24,920 --> 00:39:27,759 Speaker 1: that you could be swayed to think this could be possible.
672 00:39:28,239 --> 00:39:30,680 Speaker 1: I also don't know if she still believed in it 673 00:39:30,880 --> 00:39:36,080 Speaker 1: by the end of that whole story, when numerous investigations 674 00:39:36,280 --> 00:39:40,400 Speaker 1: and leaks revealed the extent to which this technology definitely 675 00:39:40,480 --> 00:39:43,080 Speaker 1: did not work, and the extent to which the company 676 00:39:43,160 --> 00:39:46,120 Speaker 1: moved to hide that fact. But I do know this: 677 00:39:46,560 --> 00:39:51,120 Speaker 1: just because you believe in something super hard doesn't mean 678 00:39:51,160 --> 00:39:54,840 Speaker 1: that thing is magically true. Holmes herself is currently in 679 00:39:54,880 --> 00:39:57,840 Speaker 1: the middle of a legal battle over the entire Theranos 680 00:39:57,840 --> 00:40:01,440 Speaker 1: fallout, with a new charge that was filed against 681 00:40:01,520 --> 00:40:04,640 Speaker 1: her recently, like as of the week of the recording 682 00:40:04,640 --> 00:40:07,400 Speaker 1: of this, but for more than a decade her company 683 00:40:07,440 --> 00:40:11,600 Speaker 1: was able to get significant investments from numerous wealthy individuals 684 00:40:11,600 --> 00:40:16,960 Speaker 1: and organizations. The promise of a truly huge payout was 685 00:40:17,040 --> 00:40:20,239 Speaker 1: tempting because if the technology worked, this would be a 686 00:40:20,400 --> 00:40:23,400 Speaker 1: gold mine. Better than that, it would create a return 687 00:40:23,440 --> 00:40:26,319 Speaker 1: on investment that would be impossible to guess at. You 688 00:40:26,400 --> 00:40:29,840 Speaker 1: might be thinking, Wow, I'm gonna pour a hundred thousand 689 00:40:29,840 --> 00:40:33,000 Speaker 1: dollars into this, and I'm gonna be raking in ten 690 00:40:33,080 --> 00:40:37,200 Speaker 1: million dollars or more in no time, and that was 691 00:40:37,320 --> 00:40:40,520 Speaker 1: enough to shut down critical thinking. The message of Theranos 692 00:40:40,600 --> 00:40:43,719 Speaker 1: had multiple vectors of appeal. It appealed to your sense 693 00:40:43,719 --> 00:40:47,840 Speaker 1: of innovation. It appealed to the idea of democratization of medicine, 694 00:40:48,120 --> 00:40:53,560 Speaker 1: so sort of an altruistic appeal. Simultaneously, and somewhat conflictingly, 695 00:40:53,719 --> 00:40:57,280 Speaker 1: it also appealed to greed, and lots of people allowed 696 00:40:57,280 --> 00:41:00,760 Speaker 1: themselves to be swayed by this. Now I need to wrap 697 00:41:00,880 --> 00:41:03,840 Speaker 1: up this episode, but I said earlier that I would 698 00:41:03,840 --> 00:41:06,279 Speaker 1: take some time to talk about sponsors and ads that 699 00:41:06,360 --> 00:41:09,560 Speaker 1: I run on this show. Now, most of the time, 700 00:41:09,920 --> 00:41:13,359 Speaker 1: our sales department here at I Heart runs potential ad 701 00:41:13,400 --> 00:41:17,520 Speaker 1: deals by me for my approval, and I take this seriously. 702 00:41:18,000 --> 00:41:20,799 Speaker 1: I try to look into each company and product or 703 00:41:20,840 --> 00:41:24,880 Speaker 1: service before I give my approval. I don't want to 704 00:41:24,880 --> 00:41:28,000 Speaker 1: pass along something that's misleading if I can help it. 705 00:41:28,719 --> 00:41:31,680 Speaker 1: Sometimes I don't get a choice in ads, but that's 706 00:41:31,680 --> 00:41:34,120 Speaker 1: a rare thing.
So if I voice an ad for 707 00:41:34,200 --> 00:41:38,240 Speaker 1: something that's misleading or whatever, chances are the fault is mine. 708 00:41:38,680 --> 00:41:42,279 Speaker 1: I try to avoid it. Sponsors and advertisements are a 709 00:41:42,360 --> 00:41:45,439 Speaker 1: necessary part of what I do because without them, this 710 00:41:45,600 --> 00:41:48,520 Speaker 1: show would be an expense, but there would be no 711 00:41:48,560 --> 00:41:50,480 Speaker 1: way to generate money for it. We would just be 712 00:41:50,520 --> 00:41:53,400 Speaker 1: spending money to make the show and get nothing back, 713 00:41:53,800 --> 00:41:56,279 Speaker 1: and that means this show would go away. So I 714 00:41:56,360 --> 00:41:59,600 Speaker 1: try to balance the financial responsibilities I have as an 715 00:41:59,600 --> 00:42:03,799 Speaker 1: employee with the responsibilities I have as someone communicating 716 00:42:03,840 --> 00:42:06,680 Speaker 1: to you guys. And I wish I could share with 717 00:42:06,719 --> 00:42:08,919 Speaker 1: you a list of some of the deals that I've 718 00:42:08,920 --> 00:42:12,600 Speaker 1: said no to, but that would be really unprofessional of me. 719 00:42:13,600 --> 00:42:17,840 Speaker 1: Let's just say it's not a short list. But even 720 00:42:17,920 --> 00:42:20,640 Speaker 1: with all that said, I encourage people to think critically 721 00:42:20,800 --> 00:42:23,920 Speaker 1: about all messaging. You don't have to deny stuff right 722 00:42:23,960 --> 00:42:27,640 Speaker 1: out of hand, just question it, even if it aligns 723 00:42:27,640 --> 00:42:31,600 Speaker 1: with your beliefs. Heck, especially if it aligns with your beliefs. 724 00:42:32,200 --> 00:42:34,799 Speaker 1: In the long run, it can help your case. If 725 00:42:34,800 --> 00:42:37,160 Speaker 1: you're able to suss out the truth from the lies, 726 00:42:37,520 --> 00:42:40,880 Speaker 1: you can build your argument for your side more effectively, 727 00:42:41,120 --> 00:42:45,080 Speaker 1: and you don't leave yourself open to attacks on credibility. Moreover, 728 00:42:45,560 --> 00:42:48,479 Speaker 1: we need to exercise those skills to make sure we're 729 00:42:48,560 --> 00:42:52,120 Speaker 1: just being good humans, that we are taking care of 730 00:42:52,160 --> 00:42:56,480 Speaker 1: ourselves and of each other. Using those two things, critical 731 00:42:56,520 --> 00:43:00,200 Speaker 1: thinking and compassion, we can recognize when others are truly 732 00:43:00,320 --> 00:43:03,440 Speaker 1: in need. Then we can also help identify the best 733 00:43:03,480 --> 00:43:07,160 Speaker 1: ways to help them. If we think critically, we might 734 00:43:07,239 --> 00:43:10,400 Speaker 1: realize that changing a profile picture or tweeting a message 735 00:43:10,920 --> 00:43:15,600 Speaker 1: might not really be enough. It's more of a performance 736 00:43:15,840 --> 00:43:20,120 Speaker 1: than an action, and we can and should do more, 737 00:43:20,640 --> 00:43:23,640 Speaker 1: even if that's just to shut our yaps and listen, 738 00:43:24,440 --> 00:43:29,239 Speaker 1: and guys, I'm listening.
I promise. If you guys have 739 00:43:29,280 --> 00:43:33,560 Speaker 1: any suggestions for future episodes of TechStuff, whether it's 740 00:43:33,560 --> 00:43:36,560 Speaker 1: a specific technology, maybe it's a company, maybe it's a 741 00:43:36,560 --> 00:43:39,120 Speaker 1: personality in tech, maybe it's just a trend in tech 742 00:43:39,200 --> 00:43:42,239 Speaker 1: in general, reach out to me and let me know. 743 00:43:42,640 --> 00:43:45,560 Speaker 1: You can get in touch on Facebook or Twitter. The 744 00:43:45,640 --> 00:43:48,719 Speaker 1: handle at both of those is TechStuff H S 745 00:43:48,960 --> 00:43:57,719 Speaker 1: W, and I'll talk to you again really soon. Tech 746 00:43:57,760 --> 00:44:01,200 Speaker 1: Stuff is an I Heart Radio production. For more podcasts 747 00:44:01,200 --> 00:44:03,959 Speaker 1: from I Heart Radio, visit the i Heart Radio app, 748 00:44:04,120 --> 00:44:07,240 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.