Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and a lover of all things tech. And this is the Tech News for Thursday, September twenty one. And our first story is a big one. It's probably gonna take the whole first section because there's a lot to say, and then there's Jonathan's take on it as well, so let's get to it.

Speaker 1: The Wall Street Journal published an article this week revealing that their journalists got to see a report about Instagram that is concerning, to say the least, while also confirming a lot of commonly held suspicions. Now, I think most of us have at least a sense that social media can lead us to place unrealistic expectations on ourselves and the lives we lead. A lot of people have written or talked about the tendency for folks to put forward an idealized version of themselves online. Like, it's not the real them, it's an idealized version of them. It makes their lives out to seem more dramatic, more exciting, and more lush than they really are, and Instagram, with its focus on pictures, is particularly bad about this. A picture tells a thousand words, but you know, a lot of those words tend to be "look how awesome I am," or how awesome my house is, or how awesome this vacation spot is.

Speaker 1: Now, Facebook representatives, including Mark Zuckerberg himself, have frequently told the public and the United States Congress that the benefits of social media include a boost to mental health, because these platforms allow us to be part of communities and to connect with people that we care about. And that narrative might seem fairly convincing, particularly during COVID times, when a lot of us have had few or possibly no opportunities to see people we used to hang out with, you know, before COVID. But the report that The Wall Street Journal saw confirmed suspicions that Instagram can have devastatingly harmful effects on mental health, particularly for teenage girls.
Speaker 1: And this report didn't come from some third-party research group that you might suspect was applying confirmation bias to the investigation. By that, I mean if you have a group that has an agenda, like they expect to find something bad, then they often will look for something bad, and they'll ignore anything that is contrary to that particular, you know, sort of prejudice. But in this case, the investigative report came from inside Facebook itself. Researchers in the company studied the effects of Instagram on behavior and mental health and found that nearly a third of all teenage girls who are on Instagram traced serious issues with self-image, confidence, and mental health to their participation with Instagram.

Speaker 1: So let's make some very rough estimates here, just to get an idea of how many people we could be talking about. Instagram reportedly has around one billion users. That's billion with a B. Now, more than forty percent of those users are no older than twenty-two. That's more than forty percent, but let's just cut it at forty; that gives us four hundred million users who are twenty-two or younger. Now, let's assume about half of those are women. Statista, by the way, estimates that women make up more than half; it's more like fifty-one percent. But we're doing kind of a pen-on-a-napkin style estimation here. So that means two hundred million young women and girls are using Instagram. Now, let's say that twenty percent of that number are older than teenagers, so one fifth of that number. This is just me saying that one fifth of them are aged twenty or older. That's probably a generous estimate. It's probably fewer than that, but I'm just doing this for the purposes of illustration. That would leave us with one hundred sixty million teenage girls. And the report said that around thirty-two percent of the girls that were, you know, part of the study said they ended up experiencing a negative impact on their mental health that they attributed to their participation on Instagram.
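To make that chain of back-of-napkin arithmetic easy to follow, here is a minimal sketch of the same estimate in Python. The inputs (one billion users, roughly forty percent aged twenty-two or under, a fifty-fifty gender split, one fifth aged twenty or older, and thirty-two percent reporting harm) are just the rough figures discussed above; the variable names and the exact cutoffs are only for illustration.

```python
# Back-of-napkin estimate of how many teenage girls on Instagram
# might be affected, using the rough numbers discussed above.

total_users = 1_000_000_000          # ~1 billion Instagram users (reported)
share_age_22_or_under = 0.40         # "more than 40%" -- cut it at 40%
share_women = 0.50                   # assume roughly half are women and girls
share_aged_20_or_older = 0.20        # assume one fifth are aged 20 or older

young_users = total_users * share_age_22_or_under           # ~400 million
young_women = young_users * share_women                      # ~200 million
teenage_girls = young_women * (1 - share_aged_20_or_older)   # ~160 million

share_reporting_harm = 0.32          # ~32% in the internal report
affected = teenage_girls * share_reporting_harm              # ~51 million

print(f"Estimated teenage girls on Instagram: {teenage_girls / 1e6:.0f} million")
print(f"Estimated number reporting a negative impact: {affected / 1e6:.1f} million")
```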
Speaker 1: So if we apply this across the board and say that is representative, that would mean that more than fifty-one million teenage girls could be struggling with mental health problems that are at the very least exacerbated by their time on Instagram. And by the way, it gets worse. Those mental health issues can ripple into other dangerous and catastrophic consequences, everything from developing eating disorders to even suicidal thoughts. Fifty-one million teenage girls. And again, that's a rough estimate, and I took some pretty conservative shortcuts getting there.

Speaker 1: Okay, so what's actually leading to the issues that we're seeing with mental health? One possible contributor, and again we're talking about possibilities here, would be a tendency called social comparison, and that's exactly what it sounds like. It's when you see the depiction of someone else's life and then you compare your own life to that person's life, and you come back with, like, a negative feeling about your own situation. So maybe it's an influencer who looks like they're in phenomenal shape. This happened to me. I'm not a teenage girl, I'm a middle-aged man. But when I started on my fitness journey, which I have totally fallen off of because of COVID and then a lack of resolve, I'll say, on my part. Anyway, when I was on that journey, I actually looked at some Instagram accounts that were about fitness, which meant that Instagram was serving up tons of different pictures of people who were in insane shape. I mean, Greek statues would be jealous of them. And so I was inundated with images of people who were in much better shape than I was, some of which appeared to suggest that they got there in a very short amount of time.
Speaker 1: Obviously, this stuff can be, rather than motivating, discouraging, right? Like if I'm seeing people who are like, "Yeah, you just must not be doing it right, because I got to where I am in three weeks," or something like that, some ludicrous explanation, that can be very discouraging to me. Well, that's just one example. Or maybe the influencer is wearing designer clothing. Maybe they're hanging out in gorgeous locations. I mean, how many times have you opened Instagram and just browsed stuff and saw incredibly hot people hanging out on yachts or by pools and stuff like that?

Speaker 1: In fact, I mentioned influencers. We do have an entire profession called influencer that is geared toward that kind of experience, to project this picture of idealization, and their whole job is to influence you, to convince you that they're living out their best life and that they're being showered with amazing products and trips and stuff, and that your life could be better if it was more like their life. And then you've got the issue with people getting obsessed over numbers on Instagram, like how many followers they might have, or how many likes a post gets, or how many comments a post receives. And again, if we look at influencers, we see them constantly trying to prompt engagement, often by including some sort of question as a caption for a photo. So I happen to really like cool cosplay Instagram profiles, people who put in tons of work and make incredible costumes, so I get a mix of all sorts of cosplayers served up. Now, some of these cosplayers will, you know, end up posting provocative images, maybe something a bit steamy, right? Then they'll include a caption that might say something like "cheeseburgers or hot dogs?", which could possibly have nothing to do with the image itself, but that gets people commenting, because increased engagement is currency for influencers.
Speaker 1: Heck, I remember when some influencers were using the trick of "spell out such-and-such in the comments letter by letter," meaning that you would post each letter, in order, as a separate comment, in order to prove something, I guess. That was really just a trick to kind of get more comments, because if you get a high comment count as an influencer, your price tag starts to go up for brands, right? Meanwhile, people who are comparing themselves against these sorts of influencers may start to feel really badly about themselves, like they're not popular, that something's wrong with them, and Instagram reinforces that experience again and again.

Speaker 1: Worse, Instagram has a vicious cycle built into it, because so many young people use Instagram to interact with their peers. It's almost as if being on the platform is necessary just to take part in socialization with your peer group, and the experience, even when negative, can lead to compulsive use of the platform. It's like, you can't quit it because you need it. Now, I'm not just spouting off my own opinions here, although I'm doing a bit of that too. According to The Wall Street Journal, that internal report that Facebook researchers generated backs a lot of this stuff up. The researchers generated that report back in the spring of twenty twenty. But wait, it gets worse, because in the spring of this year, twenty twenty-one, Mark Zuckerberg said at a congressional hearing that, essentially, their research indicated that social platforms like Instagram have a beneficial, positive effect on mental health, which is the exact opposite of what this internal report found. Now, again, it's an internal report. Facebook did not share this outside the company, so I don't think you can say that Facebook in general, like executives in general, were unaware of it. Possibly Zuckerberg was unaware of it.
Speaker 1: I mean, maybe this is like a case with Independence Day, where the president isn't told about the alien at Area 51 for the purposes of plausible deniability. Maybe that happened. I doubt it, but maybe. Anyway, Congresspeople had already remarked on how Zuckerberg seemed evasive when answering questions, like he was trying to find ways to talk around things without actually addressing them, and that had already raised concerns even before the existence of this report was known. But this report confirms that those concerns were well warranted. And to make matters even worse, even knowing this, even knowing the potentially disastrous effect the experience of being on Instagram can have on young people, Facebook has been hard at work developing a version of Instagram for kids under the age of thirteen. Now, let that settle in for a bit. You have a company knowing that its product can have harmful effects on young girls, they have a whole report about it, and they're working to make a product for even younger ones.

Speaker 1: Also, I should add that while the report focuses on young girls, those are not the only people who experience a negative impact from this stuff. I mean, I just mentioned that I experienced this on a certain scale, so it can affect people of all genders and ages. Teen girls appear to be, you know, as a population, slightly more vulnerable to this sort of influence, but that doesn't mean they're the only ones who end up feeling lesser-than after spending time on Instagram. It happens to a lot of people. So then the question comes down to: what is to be done about this? Now, I suspect that without intense external pressure, not much is going to change at Facebook and Instagram. Facebook really hasn't seemed to take any massive steps toward reducing the harm it causes unless public opinion, political pressure, or, most effectively and most tellingly, pressure from advertisers has forced it to do so.
Speaker 1: Facebook is a heck of an example of a capitalist organization gone to the extreme, in which the primary purpose of the organization is to return value to shareholders, and everything else is secondary or maybe not even a consideration. That being said, Democratic lawmakers here in the United States are starting to apply that kind of pressure right now. In fact, today they've been calling on Zuckerberg to give up on this idea of an Instagram app for kids. There's also talk of another massive investigation into the company, and armed with this internal report, I think Zuckerberg is destined for another very uncomfortable hearing in front of Congress.

Speaker 1: But for most of us, I think the study shows that parents really need to be aware of these influences and to take an active role in helping their children with stuff like self-esteem and confidence, and an understanding that the representations we see of people online are often a fabrication and are not reflective of reality, nor are they reflective of a person's value. We need to educate everyone that most of the content that falls into this kind of harmful tendency is ultimately meant to serve one of a couple of different purposes that are tightly related. Often, it's all in the effort to sell you something. Now, it might be a specific branded product, or the thing on sale might actually be the brand of the person who's sharing the content themselves, or it could be to boost a person's status, essentially saying, "Look at how amazing my life is, because I'm the best." But either way, understanding that the images that you see are not necessarily showing you what's really going on, and also that the reason these images exist is to push out a specific branding message, all of that is helpful. I mean, I see this everywhere, obviously, like this tendency, especially in relation to Instagram.
Speaker 1: It's everywhere. Like, there are plenty of pop-up experiences that really only exist for the purposes of people taking selfies at them and to kind of generate this image of a fun experience. But the whole experience is just about taking those selfies. Like, there's nothing more substantive to the experience; that's all it is. It's just surface level. That in itself may not, you know, be harmful on the face of it, but it contributes to this tendency I'm talking about. Also, the report found that most other social platforms don't have nearly the same negative impact as Instagram. Stuff like TikTok tends to be focused more on performance than on appearance. Stuff on Snapchat tends to feature lots of ridiculous filters, and these things don't seem to contribute as much of a negative effect on mental health.

Speaker 1: So, long, long coverage of that one story, but I think this one is really important. I don't think there was quite enough for me to do a full episode dedicated to it, unless I get an expert in, but I really wanted to talk about it, because it's pretty damning that Facebook had its own internal report that said these things, and yet outwardly the company has behaved as if everything it does is fine, when that is patently not the case, and they know it. And that's the worst part. All right, we're gonna go and take a break, and when we come back, we'll have some other unrelated tech news.

Speaker 1: On Tuesday's episode this week, I mentioned that Apple was about to hold its iPhone thirteen event, but at that point it had not yet done so, as I was recording the show. Obviously it's happened since then, and some of you may know all about this. But really, the big surprise at the event was that some of the stuff that was rumored to be featured, like a really big update to the Apple Watch and new AirPods, those actually weren't part of the show.
Speaker 1: I mean, there was a new Apple Watch, but the big changes that had been rumored ended up being just a slightly larger screen and a couple of new features, but nothing like the redesign that folks were expecting. You know, nothing dramatic. The new watch looks fairly close to the previous ones, it's a little bigger with the screen, and the AirPods with the rumored shorter stems were a no-show at the event. There were also rumors about the iPhone itself, a big one being that Apple was going to build in some sort of satellite cell phone feature, so that there would be a transmitter where you could actually make a satellite phone call if you had to. Now, you wouldn't normally use the satellite system. There are issues with latency, and it tends to be a really expensive kind of thing to do. But according to these rumors, you would be able to make use of this feature if you happened to be in an area that had no cell service. So that way, if you needed to make an emergency phone call but you weren't close to a cell tower or a WiFi hotspot, you could do it. But that ended up not being part of the announcements either.

Speaker 1: So what we did get were new iPhones with new A15 Bionic chips powering them, that's a proprietary chip from Apple, a new dual camera system, and the two Pro models support a one-twenty-hertz refresh rate for the screen. Now, that means the screen refreshes one hundred twenty times per second. The iPad also got a facelift. But yeah, you know, the announcements were fairly modest in nature.
Speaker 1: Honestly, I wonder if it actually benefits Apple to keep holding these big marketing events unless there happens to be something that's going to really blow people's socks off, because the company set expectations ridiculously high with big reveals, you know, a decade ago or more, and now there's a tendency for folks to say, "Oh, that's it?", even if the stuff that Apple is showing is an improvement over previous models, because, you know, it's not like a showstopper kind of thing. But hey, I mean, it's a two-trillion-dollar company, and the folks running it are way smarter than I am, so I'm sure I'm just not seeing the big picture.

Speaker 1: In the past, I've talked about the NSO Group a few times on this show. That's the Israeli company famous for developing a malware tool that exploits security vulnerabilities in Apple's iMessage program. And just to be clear, that's not the only company to take aim at iMessage and create exploits. An American company called Accuvant once upon a time did the same thing. Now, I use the past tense here because Accuvant subsequently found itself absorbed into a larger company called Optiv, and reportedly Optiv is out of the exploit development game. But Accuvant was very much in that game, using security experts to not just find vulnerabilities in various software platforms, but to develop exploits that leveraged those vulnerabilities, then selling those as products to various other entities, usually government agencies. So while I heap a lot of criticism on NSO Group, which, you know, does a similar thing with the blessing and restrictions of the Israeli government, we are seeing the same sort of thing going on in other parts of the world, including here in the United States. Anyway, I bring all this up because MIT Technology Review reports that the U.S.
Speaker 1: Department of Justice recently fined three former U.S. intelligence and military personnel for working for the United Arab Emirates without US permission in what was called Project Raven. The operatives were using this exploit, they had purchased it from Accuvant, and then they were acting as mercenaries. They were deploying the exploit on behalf of the UAE, and reportedly the list of targets included American citizens and companies. And, you know, that's a big no-no, working on behalf of a foreign government when you're a U.S. citizen without, you know, the consent of the United States government. The fines amounted to a little less than one point seven million dollars. I'm assuming that's collectively, but the MIT Technology Review report was not specific. So, to be clear, the focus here is on those three individuals and that investigation, and there's not really any investigation into Optiv or the former Accuvant. So if you take it like that, you know, you could say, "Oh, well, the government says it's okay for U.S. companies to develop tools that specifically exploit some other company's product," you know, iMessage from Apple in this case, "it's just not okay to take that same product and then work with a foreign government." Pretty weird world we live in.

Speaker 1: Anyway, iMessage, as the Technology Review points out, is a very popular destination for malware because Apple includes it on every iPhone, so the install base is every single iPhone that's out there, and you can't uninstall it. Anyone using an iOS device who has your phone number can send you a message on iMessage, and iMessage automatically accepts those incoming messages; it doesn't matter whether you recognize the number or not. So if you build what's called a zero-click exploit, that's an exploit that doesn't need someone to click on a link or open up a file or anything like that, it just delivers the malware as long as there's a hit.
Speaker 1: Well, that means you can have one of these compromise your phone just because you received the message on iMessage. Apple has patched the vulnerabilities in iMessage a couple of times, including a patch called BlastDoor, but hackers are always looking for other cracks in the security around the app, and it's a constant seesaw battle between developers and hackers.

Speaker 1: In addition to that, one of those three people that have been fined by the US government for this thing is Daniel Gericke, who currently serves as the Chief Information Officer, or CIO, of ExpressVPN. Now, the VPN company says it was aware of Gericke's previous activities before they hired him, that he was completely transparent about that. And that might sound surprising, but the company said that his experience was what made him valuable to the company. So VPNs, or virtual private networks, are a way for people to log into one system in order to access other systems privately. Effectively, it masks your activities between you and whichever end server you're trying to access. The VPN is kind of like the man in the middle in this case, and they're, you know, encrypting everything so that snoopers don't know what it is you're actually doing. Anyway, ExpressVPN's stance was that Gericke's experience meant he understood security vulnerabilities, because he had been searching for them and then exploiting them. So this could help the company build more effective tools to protect security and privacy, because who do you want on your team other than, you know, the person who knows how to break those things? And that definitely makes sense from that perspective. Now, the bit about him working as a mercenary for a foreign government seems a bit much to me, but I get the concept of wanting to find someone to put on your team who has experience and familiarity with exploits.
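Since the VPN idea came up above: here is a minimal toy sketch in Python of the "man in the middle that encrypts everything so snoopers can't see what you're doing" picture. It is a deliberately simplified model, not how ExpressVPN or any real VPN protocol works; the shared key, the packet format, and every name here are invented for illustration, and it relies on the third-party cryptography package.

```python
# Toy model of the VPN idea described above: the client wraps its real
# request in an encrypted envelope, a snooper on the path sees only
# ciphertext, and the VPN endpoint decrypts and forwards the request.
# Purely illustrative; real VPN protocols are far more involved.

from cryptography.fernet import Fernet  # third-party 'cryptography' package

# Client and VPN server share a key (in reality, negotiated via a handshake).
shared_key = Fernet.generate_key()
tunnel = Fernet(shared_key)

def client_send(destination: str, payload: str) -> bytes:
    """Client side: encrypt the real destination and payload together."""
    return tunnel.encrypt(f"{destination}|{payload}".encode())

def snooper_view(wire_bytes: bytes) -> str:
    """Anyone watching the wire sees only opaque ciphertext."""
    return wire_bytes[:24].hex() + "..."

def vpn_forward(wire_bytes: bytes) -> tuple[str, str]:
    """VPN endpoint: decrypt, then forward to the real destination."""
    destination, payload = tunnel.decrypt(wire_bytes).decode().split("|", 1)
    return destination, payload

packet = client_send("example.com", "GET /private-page")
print("Snooper sees:", snooper_view(packet))   # meaningless bytes
print("VPN forwards:", vpn_forward(packet))    # ('example.com', 'GET /private-page')
```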
Speaker 1: A few weeks ago, I talked about how employees at Activision Blizzard had brought forth charges relating to a toxic work culture that had been part of that company for years, including charges of sexual harassment and assault, as well as a tendency for the company to protect certain employees accused of perpetrating or encouraging some truly awful behavior. There were also charges of general gender discrimination, like women getting paid significantly less for doing the same job as their male counterparts. Now the company is in the news again, because the Communications Workers of America, or CWA, alleges that Activision Blizzard intimidated employees in an effort to prevent unionization. In other words, they didn't want their employees to band together to form a union, so they set about hiring folks who would, you know, prevent that from happening. As such, the labor organization has now brought a lawsuit against Activision Blizzard. According to the organizing director of the CWA, Tom Smith, quote, "Activision Blizzard's response to righteous worker activity was surveillance, intimidation, and hiring notorious union busters," end quote. So that "righteous worker activity" reference, that's actually about two employees who were taking a stand against Activision Blizzard, protesting conditions, you know, the disparities and the sexual harassment, and bringing those things into the spotlight.

Speaker 1: I should add there's an ongoing discussion within the gaming community over whether or not it's a good idea to boycott Activision Blizzard titles. You know, there are a lot of Twitch streamers who have been talking about this, and some argue that this sends a message to the company, and it's a message that's hard to ignore, because if enough people are boycotting the products, that hurts the bottom line. It has to be a lot of people, and the odds of getting that many people to participate are pretty low, but if you can do it, it sends a definite message.
Speaker 1: But then you have other people saying no, game developers who are working under these awful conditions are doing so partly because they really want their work to be seen and experienced and enjoyed, and that a boycott kind of pours salt on an already, you know, painful wound. So it's a complicated situation.

Speaker 1: Speaking of complicated situations, I haven't talked about Theranos for a while, but the trial of the former CEO of that doomed company, Elizabeth Holmes, is now underway, and Erika Cheung, who once worked for Theranos, is testifying. And she actually has testified that she observed questionable processes in the company as Holmes continued to court investors to spend money on Theranos, and also as she was engaging in a rather lavish and eccentric lifestyle. All right, so, quick refresher on Holmes and Theranos. Holmes attended Stanford University before dropping out to go on to found a biotech medical company that would evolve into Theranos. Now, the goal was to develop a blood-scanning technology that could take a minuscule amount of blood as a sample and then run a battery of tests on it, potentially more than one hundred tests, all from that tiny sample, and then print out definitive and easy-to-understand results based on those tests. So the ideal goal was to produce something around the size of a desktop printer that could do all of this, potentially even creating a consumer version that people could buy and have in their own homes. They could then run tests to understand whether that tickle in the back of their throat is something serious or not, or they could scan to see whether they might be at risk for developing some sort of serious condition down the line based upon their blood work. This was all supposed to democratize medicine, to disrupt the entire blood-testing industry, and to empower patients so that they could have better, more informed conversations with their doctors.
Speaker 1: But there was a bit of a problem, and that problem was the tech was just not up to snuff. And some folks, including some people who served as advisors to Holmes while she was at Stanford, have expressed that they thought the tech might not ever work on such a small sample, that this was just asking too much from the tech and putting too much faith in the powers of technology. That alone is somewhat understandable. I mean, I see a lot of people putting a lot of hope in technology. And you can kind of get that, because most of us carry a device in our pockets or our handbags that is far more powerful than the most powerful personal computer from a decade ago, and we just carry it around with us wherever we go. I mean, that's what a smartphone is. So if that became a reality, then what can't tech do? That's sort of the thought process.

Speaker 1: Cheung's testimony included the revelation that Theranos staff had a six-data-point check that each, you know, blood test was supposed to pass in order to meet quality control standards. So, in other words, when you ran a test on the equipment, it should meet these six data points, and if it doesn't, then it's a failed test. But the staff were told to delete up to two of those six points if that would mean that the test would otherwise pass. So, in other words, it's as if you said, "Well, if we ignore these outliers, then everything's fine." This could be seen as a form of cherry-picking: looking for positives and then ignoring the negatives in order to boost your numbers. Cheung quit her job after working for Theranos for six months, disillusioned that she was working for an organization that was, in her mind, at best unprofessional and at worst downright unethical. Cheung was previously featured as a whistleblower in the book Bad Blood, which is probably the most thorough exposé on Theranos, certainly the most famous.
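To make that quality-control concern concrete, here is a minimal toy sketch in Python of the kind of outlier-dropping described in the testimony above: a run that fails a six-point check can be made to "pass" by deleting up to two of the points. The acceptable range, the specific values, and the function names are all invented purely for illustration; they are not from Theranos or the trial.

```python
# Toy illustration of the cherry-picking concern described above:
# a quality-control run has six data points, and a run "passes" only if
# every point falls inside an acceptable range. Dropping up to two
# outliers can flip a failing run into a passing one.
# All numbers and thresholds here are invented for illustration.

ACCEPTABLE_LOW, ACCEPTABLE_HIGH = 90.0, 110.0   # hypothetical QC limits
MAX_POINTS_TO_DROP = 2

def passes_qc(points):
    """True if every data point is within the acceptable range."""
    return all(ACCEPTABLE_LOW <= p <= ACCEPTABLE_HIGH for p in points)

def passes_after_dropping_outliers(points, max_drop=MAX_POINTS_TO_DROP):
    """Drop the worst offenders (up to max_drop) and re-check: the practice at issue."""
    def distance_from_range(p):
        if p < ACCEPTABLE_LOW:
            return ACCEPTABLE_LOW - p
        if p > ACCEPTABLE_HIGH:
            return p - ACCEPTABLE_HIGH
        return 0.0
    kept = sorted(points, key=distance_from_range)[: len(points) - max_drop]
    return passes_qc(kept)

run = [98.2, 101.5, 95.7, 104.0, 131.6, 72.3]    # two points fall well outside the range
print(passes_qc(run))                            # False: the run fails honestly
print(passes_after_dropping_outliers(run))       # True: "passes" once the outliers are deleted
```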
Speaker 1: Cheung was also featured in the HBO documentary The Inventor. Cross-examination of Cheung began yesterday, and it will pick up again tomorrow, on Friday. Well, I've got a couple more stories to cover, but before I do that, let's take another quick break.

Speaker 1: And now we move over to the world of NFTs, or non-fungible tokens. These have been likened by some people, including myself, to a sort of digital receipt that shows that you quote-unquote "own" some instance of something. Technically, what you own is a digital token that represents something else. That something else could be data in the form of, like, an illustration, or a digital baseball card, or a tweet, or something else. And the NFT market is built on top of blockchain technology, which means that the chain of ownership of an NFT is well established and distributed in a ledger, so everyone who's part of the system can see when an NFT changes hands from one entity to another and verify that, yes, ownership of this NFT has changed. So what can you do with NFTs? Well, you could collect them. You could try and do some speculative investing, so you could try and buy an NFT and hope that it improves in value and then sell it off for a profit. Or, if you're an employee of OpenSea, that's S-E-A, that's an NFT marketplace by the way, then you might engage in a little insider trading. Apparently an OpenSea employee secretly purchased NFTs that the employee knew were soon going to be featured on the front page of the OpenSea website, and I guess the idea was that the NFTs' value would increase due to the exposure of being on the front page of the website. Then the employee could sell off the NFTs at an inflated value for a tidy profit. Now, the company did not say who that employee was.
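Here is a minimal toy sketch in Python of the ledger idea described above: an append-only record of token transfers that anyone in the system can read and replay to verify ownership, which is also part of why this kind of trading leaves a visible trail. It is a simplified, in-memory model, not how Ethereum or OpenSea actually implement things; all of the token IDs and wallet names are made up.

```python
# Toy, in-memory model of the "distributed ledger" idea described above:
# every change of ownership for a token is appended to a public record
# that anyone in the system can read and verify. Purely illustrative;
# this is not how Ethereum or OpenSea actually work.

from dataclasses import dataclass

@dataclass(frozen=True)
class Transfer:
    token_id: str      # which NFT changed hands
    sender: str        # wallet giving up ownership ("mint" for creation)
    receiver: str      # wallet receiving ownership

ledger: list[Transfer] = []   # the shared, append-only record

def record_transfer(token_id: str, sender: str, receiver: str) -> None:
    """Append a transfer to the public ledger."""
    ledger.append(Transfer(token_id, sender, receiver))

def current_owner(token_id: str) -> str | None:
    """Anyone can replay the ledger to verify who owns a token now."""
    owner = None
    for t in ledger:
        if t.token_id == token_id:
            owner = t.receiver
    return owner

def history(token_id: str) -> list[Transfer]:
    """The full, visible chain of ownership for a token."""
    return [t for t in ledger if t.token_id == token_id]

# Example: because the chain of ownership is public, timing patterns
# (say, a wallet buying right before a big exposure bump) are traceable.
record_transfer("art-123", "mint", "wallet-A")
record_transfer("art-123", "wallet-A", "wallet-B")
print(current_owner("art-123"))   # wallet-B
print(history("art-123"))         # every hop, visible to everyone
```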
Speaker 1: A user on Twitter with the handle @ZuwuTV accused the head of product for OpenSea, a guy named Nate Chastain, of engaging in this behavior, but OpenSea gave no real confirmation that that's the employee in question, at least not as I record this episode. However, a Chinese news platform reported that Chastain's scheme brought in the equivalent of sixty-seven thousand dollars of Ether cryptocurrency. But that's according to the current value of Ether, because cryptocurrency values tend to fluctuate, so on one day it might be, you know, sixty-four thousand, on another day it might be sixty-eight thousand, so it can change pretty dramatically in short order. So it's kind of hard to make these kinds of calls. But yeah, not great. Not great to have any sort of insider trading. It's not the sort of thing that you can necessarily get away with quickly, because if people are paying attention to which digital wallet these NFTs are going to, then they can start to connect the dots. Even though, like, on the surface level it looks like it's anonymous, if you're really paying attention to stuff, you can start to draw some conclusions, and that is what happened in this case.

Speaker 1: Now, you might remember, if you've listened to previous news episodes of TechStuff, that General Motors has had to issue a global recall on the Chevrolet Bolt electric vehicle, and the problem stems from faulty batteries, which can develop a short circuit, and that leads to the battery overheating and potentially catching on fire. Now, the company has started to advise some Bolt owners that they shouldn't park their cars within fifty feet of any other vehicles, which, yowza. I mean, you know, think about that in different parking situations, like parking decks or parking lots. I live in Atlanta, and if you're going anywhere in the city, it's going to be a real challenge to find a spot that's fifty feet from all other vehicles.
Speaker 1: But that illustrates how serious this issue is and how seriously Bolt owners need to take this. Now, to be clear, GM has given this advice in response to Bolt owners who have called in to the company, into, like, a helpline, to ask what they should do when it comes to parking their vehicles. So this is not a proactive message that's going out to Bolt owners. It's more like a "Well, we would advise you not to park anywhere close to another car, just in case your car starts to catch on fire," like an answer to a question. Now, the recall is expected to cost GM somewhere in the neighborhood of one point eight billion dollars. As someone who is generally in favor of electric vehicles, I'm kind of sad to see this situation happen. Not the recall, I'm not sad that they're recalling the faulty vehicles, that's absolutely necessary. Instead, I'm just kind of, you know, bummed that the defective batteries are there in the first place, because anything that could contribute to a reluctance to move toward an electric fleet is going to make that process more difficult. But the fact remains that we really do need to transition away from fossil fuel vehicles. Actually, that includes the entire chain, from the power production facilities, like power plants, all the way down to the technology that we're relying upon day to day, like our vehicles.

Speaker 1: My last story today is a pretty cool one. CERN, the scientific organization that I think might be best known in the United States either as the company, or organization I should say, that ended up giving birth to the World Wide Web, because of Tim Berners-Lee, who was working at CERN at the time, or, more likely these days, it might be known as the organization in charge of the Large Hadron Collider, the particle accelerator.
Anyway, it's created a tool 590 00:35:25,360 --> 00:35:27,799 Speaker 1: that's being used by the organization as well as the 591 00:35:27,840 --> 00:35:30,880 Speaker 1: Institute of Global Health at the University of Geneva to 592 00:35:31,000 --> 00:35:34,600 Speaker 1: use machine learning to determine what the most effective methods 593 00:35:34,600 --> 00:35:37,520 Speaker 1: would be to prevent the spread of COVID nineteen in 594 00:35:37,600 --> 00:35:40,839 Speaker 1: school settings. So we're seeing a lot of unfortunate and 595 00:35:40,880 --> 00:35:44,920 Speaker 1: scary news about school systems being affected by COVID outbreaks, 596 00:35:45,239 --> 00:35:49,439 Speaker 1: and clearly students, teachers, and staff all want safe environments, 597 00:35:49,480 --> 00:35:51,239 Speaker 1: but it can be difficult to know how to go 598 00:35:51,320 --> 00:35:55,560 Speaker 1: about creating those environments, particularly when you've got a lot 599 00:35:55,560 --> 00:35:58,440 Speaker 1: of variables involved, and you also have different levels of 600 00:35:58,520 --> 00:36:02,160 Speaker 1: understanding about the behavior of the virus. Sometimes you get 601 00:36:02,200 --> 00:36:05,239 Speaker 1: information that appears to contradict earlier information, so that can 602 00:36:05,280 --> 00:36:08,640 Speaker 1: cause confusion. So the research with this tool has shown 603 00:36:08,760 --> 00:36:12,760 Speaker 1: that a few things are most effective at really preventing 604 00:36:12,800 --> 00:36:16,640 Speaker 1: the spread of COVID. Among those things are natural ventilation, 605 00:36:17,080 --> 00:36:21,880 Speaker 1: so open windows are good; HEPA filtration systems, 606 00:36:21,880 --> 00:36:24,360 Speaker 1: that's H E P A filtration systems, in order to 607 00:36:24,400 --> 00:36:28,719 Speaker 1: filter out contaminants in the air; and face masks. Those are 608 00:36:28,800 --> 00:36:32,440 Speaker 1: all the most effective strategies when they're all 609 00:36:32,520 --> 00:36:35,160 Speaker 1: used in combination with one another. On top of that 610 00:36:35,560 --> 00:36:38,920 Speaker 1: are practices that improve it even more, like social distancing, 611 00:36:39,000 --> 00:36:42,720 Speaker 1: so keeping everyone six feet apart or more, vaccinations, clearly 612 00:36:42,960 --> 00:36:46,600 Speaker 1: very, very important, and contact tracing in the event of 613 00:36:46,640 --> 00:36:49,480 Speaker 1: an actual case being detected, so that you can figure 614 00:36:49,480 --> 00:36:52,080 Speaker 1: out who that person has been in close contact with. 615 00:36:52,560 --> 00:36:56,439 Speaker 1: Since in most cases kids still can't get vaccinated yet, 616 00:36:56,600 --> 00:37:00,960 Speaker 1: especially younger kids, these steps are really 617 00:37:01,000 --> 00:37:05,839 Speaker 1: important because they can't benefit from the vaccination themselves. I mean, 618 00:37:05,840 --> 00:37:08,839 Speaker 1: they can benefit from the fact that adults around them 619 00:37:08,880 --> 00:37:12,799 Speaker 1: are vaccinated, but you know, the kids can't get vaccinated. Now, 620 00:37:12,840 --> 00:37:14,719 Speaker 1: you might say that this all just confirms what we 621 00:37:14,800 --> 00:37:18,400 Speaker 1: already thought, but that's an important part of science. Sometimes 622 00:37:18,719 --> 00:37:21,279 Speaker 1: we find out that the thing we suspected to be 623 00:37:21,360 --> 00:37:24,399 Speaker 1: true is true. That does happen.
But there are other 624 00:37:24,440 --> 00:37:28,120 Speaker 1: times where the science shows we don't have the full picture, 625 00:37:28,360 --> 00:37:31,520 Speaker 1: or maybe we're asking the wrong questions. Because that's how 626 00:37:31,560 --> 00:37:34,319 Speaker 1: science works and why it's not as simple as "this 627 00:37:34,440 --> 00:37:38,480 Speaker 1: is how it is" and nothing more. Now, the reason 628 00:37:38,560 --> 00:37:40,799 Speaker 1: I say all that is actually not to lecture you, 629 00:37:41,080 --> 00:37:43,319 Speaker 1: because I know it comes across that way. But that's 630 00:37:43,320 --> 00:37:45,480 Speaker 1: not why I say it. I say it because I 631 00:37:45,520 --> 00:37:49,680 Speaker 1: need to remind myself of this very thing, because I 632 00:37:49,840 --> 00:37:52,239 Speaker 1: know that I have been guilty of looking at a 633 00:37:52,280 --> 00:37:56,120 Speaker 1: situation and drawing a conclusion and then acting on that 634 00:37:56,200 --> 00:37:59,840 Speaker 1: conclusion without, you know, actually investigating whether or not that 635 00:38:00,080 --> 00:38:04,680 Speaker 1: conclusion was valid in the first place. That's not critical thinking, 636 00:38:04,800 --> 00:38:06,839 Speaker 1: and I've been guilty of it. So I say this 637 00:38:07,239 --> 00:38:12,320 Speaker 1: to try and hold myself more accountable. And yeah, responsibility 638 00:38:12,480 --> 00:38:17,759 Speaker 1: tastes bad. Well, that's the news for Thursday, September twenty one. 639 00:38:17,800 --> 00:38:20,000 Speaker 1: There was a ton of it, a lot of Jonathan 640 00:38:20,000 --> 00:38:23,160 Speaker 1: on a soapbox. I make no apologies for it. I 641 00:38:23,200 --> 00:38:26,040 Speaker 1: appreciate all of you who listen all the way through, and 642 00:38:26,160 --> 00:38:30,040 Speaker 1: I understand that you may not agree with my perspective. 643 00:38:30,440 --> 00:38:33,359 Speaker 1: I completely respect that, as long as, you know, you're 644 00:38:33,360 --> 00:38:36,520 Speaker 1: respecting others. That's the most important part, making sure that 645 00:38:36,600 --> 00:38:40,200 Speaker 1: we all make our own efforts to keep not just 646 00:38:40,280 --> 00:38:44,560 Speaker 1: ourselves safe, but everyone else too. Like, we gotta look 647 00:38:44,560 --> 00:38:47,680 Speaker 1: out for each other. Anyway, if you have suggestions for 648 00:38:47,760 --> 00:38:50,240 Speaker 1: topics I should cover on future episodes of Tech Stuff, 649 00:38:50,480 --> 00:38:52,480 Speaker 1: reach out to me. The handle for the show, it's 650 00:38:52,520 --> 00:38:56,960 Speaker 1: TechStuff HSW, and I'll talk to you again really soon. 651 00:39:02,440 --> 00:39:05,480 Speaker 1: Tech Stuff is an I Heart Radio production. For more 652 00:39:05,560 --> 00:39:08,920 Speaker 1: podcasts from I Heart Radio, visit the I Heart Radio app, 653 00:39:09,080 --> 00:39:12,240 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.