Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. And it's time for the tech news for Tuesday, September twenty-first, twenty twenty-one.

Here's a follow-up on a story that I talked about last week. The Wall Street Journal has recently published a few stories that put Facebook in a really negative light. One of those stories was about an internal study that looked at how divisive content rises in the ranks of the Facebook algorithm, because that kind of content encourages high engagement. So essentially, it's saying what most of us already know, which is that if you stir shinola up, folks will get het up about it and join the fracas. That's my colloquial way of saying that posting awful things gets a rise out of people; they in turn get emotional, and typically they then engage. And we know this is a thing because it's been going on since long before there ever was an Internet. Even back in the days when the Internet wasn't really much more than a bunch of message boards, we saw this in the form of flame wars. Everybody knows this. It's the basis of a lot of toxic behavior on the Internet and in the world in general. So none of this is new. But the study essentially confirmed that Facebook's algorithm promotes content like that and then amplifies it, so something that was already getting a lot of engagement could rise to the level where it goes viral, and when that stuff is divisive, it can do a lot of harm.
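To be clear, nobody outside Facebook knows the actual ranking model, so here's only a toy sketch of the general dynamic the study describes: if a feed scores posts by raw engagement signals, inflammatory posts that rack up comments and shares float to the top. Every number and field name below is invented for illustration.

```python
# Toy illustration of engagement-weighted feed ranking -- NOT Facebook's
# actual algorithm. Posts, weights, and field names are all invented.

posts = [
    {"title": "Cute dog photo",        "likes": 900, "comments": 40,  "shares": 30},
    {"title": "Inflammatory hot take", "likes": 400, "comments": 700, "shares": 500},
    {"title": "Vacation update",       "likes": 150, "comments": 20,  "shares": 5},
]

def engagement_score(post):
    # Comments and shares signal "active" engagement more strongly than
    # likes, so a naive ranker might weight them more heavily.
    return post["likes"] + 4 * post["comments"] + 8 * post["shares"]

# Rank the feed purely by engagement: the divisive post wins easily.
for post in sorted(posts, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):>6}  {post['title']}")
```

The point of the toy example is just that a ranker optimizing for raw engagement has no notion of whether that engagement is delight or outrage.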
Then there was the study that I talked about last week, in which Facebook researchers found that Instagram users can experience a negative impact on their mental health. Teenage girls in particular are vulnerable to this. The researchers found that some users, including up to thirty-two percent of teenage girls, experience self-esteem issues and worse, and that this links back to their activities on Instagram.

But those stories weren't the end of it. There was more. The Journal also published a report that said, as bad as Facebook is at dealing with these issues in English-speaking countries, it's even worse elsewhere. And that's because apparently Facebook just doesn't have the staff on hand to deal with problems in places where other languages are spoken, and so issues like misinformation campaigns can be even worse in those other places because there aren't enough moderators who can curb the problem. There aren't enough people who understand the language to recognize when those things are happening. CNBC's Salvador Rodriguez points out that a company that makes around thirty billion dollars in profit, and that's not revenue, that's profit, might be able to actually hire on some folks who understand other languages. And I think that's a pretty salient point. Facebook has responded in a blog post that essentially says the Wall Street Journal got stuff wrong. However, the blog post does not refute any of the points of the internal study that the Journal was citing. So I guess the wrong stuff would be any conclusions that the journalists were drawing as a result of reading over the data. I'm not sure that blog post is going to convince anyone. I feel like there's real momentum building against Facebook, and my guess is that soon it's going to be politically unfeasible to offer the company any sort of protection.

In related news, the MIT Technology Review gained possession of another Facebook internal study. These things just keep on coming out. This one was actually done way back in twenty nineteen. I say way back because twenty nineteen was before we ever had a pandemic. Do you remember those days?
Anyway, that study was looking into the proliferation of troll farms that were dedicated to spreading misinformation and propaganda in the lead-up to the presidential election in the United States. A former Facebook employee handed the report over to the MIT Technology Review. That employee reportedly had nothing to do with the study, but had access to it before they left Facebook. The study showed that troll farms were pushing out content that was seen by around one hundred forty million Americans every month, that these troll farms had created networks of Facebook pages, and that some fifteen thousand pages targeting US audiences actually originated out of Kosovo and Macedonia. Now keep in mind, this is after Facebook had already been through the wringer in the two thousand sixteen election, which means you would expect Facebook to have some protections in place to guard against the same sort of thing happening again just a few years later. But according to the Review, the few measures that Facebook did activate mostly prevented quote "the worst of the worst" end quote. Jeff Allen, author of the report, who was at the time a senior-level data scientist with Facebook, said that quote "we have empowered inauthentic actors to accumulate huge followings for largely unknown purposes" end quote. So, in other words, Facebook has given people who are not who they claim to be the ability to build enormous followings and then potentially to do, you know, whatever. It might be good, or it might be bad. Probably bad. So this ties in with our earlier story showing that Facebook is very much aware of how its systems enable bad actors to reach huge audiences and spread misinformation. When that misinformation convinces people that, say, the COVID vaccine is harmful, or that the virus itself is overblown, then it doesn't just affect those users on Facebook. It affects their behaviors, and people can get sick, and some of them will die.
Facebook, as you would imagine, has representatives who say the company is taking measures to keep its platforms safe and to eliminate these issues. But like I said, I feel that Facebook is pretty much on a collision course with the US government, and likely some other governments around the world, with the aim to break up the company or otherwise reduce its power to facilitate harm. I mean, it is pretty damning when internal studies within Facebook are confirming some of the accusations that people have made against the company, while representatives for the company have continuously denied or deflected those accusations. Their own studies, from inside the house, are saying the same thing. Not a good look.

A few weeks ago, I talked about how El Salvador adopted bitcoin as the country's national currency. Previously, it had mostly been relying on the US dollar. Well, recently bitcoin has had a bit of a dip. It dropped ten percent in value very quickly, not long ago. That's not to say that it's in a spiral or anything, but it did have a dip, and El Salvadoran President Nayib Bukele responded to that by directing the government to purchase even more bitcoin, buying the dip, in other words. The country purchased another one hundred fifty bitcoin, which was worth around six and a half million dollars at the time, and that brings the total number of bitcoin held by El Salvador to seven hundred. It will be interesting to see how El Salvador progresses while relying on a currency that has frequent, and frequently dramatic, fluctuations in value. I'm not saying it's impossible, but it's gonna be tricky. If bitcoin stabilizes and becomes more of a reliable kind of thing, then yeah, I guess it could work. But right now it's just so volatile that I am very curious to see how this works for El Salvador in the future.
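Just to put rough numbers on that purchase, here's a quick back-of-the-envelope sketch using only the figures mentioned above (one hundred fifty coins for about six and a half million dollars, seven hundred coins total). Everything follows from those two figures; the ten percent line at the end is purely illustrative of the volatility point.

```python
# Back-of-the-envelope math on El Salvador's "buy the dip" purchase,
# using only the approximate figures from the story above.

coins_bought = 150
purchase_cost_usd = 6_500_000      # "around six and a half million dollars"
total_holdings = 700

implied_price = purchase_cost_usd / coins_bought
holdings_value = total_holdings * implied_price

print(f"Implied price per bitcoin:        ${implied_price:,.0f}")   # ~$43,333
print(f"Value of 700 coins at that price: ${holdings_value:,.0f}")  # ~$30.3 million

# The volatility point: a 10% swing on those holdings moves roughly
# three million dollars in a national treasury overnight.
print(f"A 10% dip costs about:            ${holdings_value * 0.10:,.0f}")
```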
And of course, we talked in previous episodes about how a lot of critics are worried that because bitcoin is so frequently associated with stuff like money laundering and corruption, that could play a large role in El Salvador as well. But we'll have to wait and see.

Let's talk about tech and climate change for a second. So the Big Five, that being Alphabet, Amazon, Apple, Microsoft, and Facebook, have all stated that their respective companies have plans to go carbon neutral, and at least in some cases even carbon negative, over the next several years, with most of them aiming around twenty thirty for that goal. Microsoft even says that the company intends to remove enough carbon from the atmosphere to offset all the carbon the company has ever generated. Ever. So, in other words, by, I think it's twenty fifty, Microsoft plans on having systems in place that will effectively remove the equivalent of all the carbon the company has ever produced. That's pretty phenomenal, if it actually works out. Now, let's talk a little bit about actions, because those speak louder than words, right? You can claim to have a plan in place, but unless you do stuff about it, the plan doesn't do you much good. And when it comes to some of these companies, the promises being made might not actually be achievable, at least not by the companies all by themselves. Let's take Apple, for example. The Guardian reports that a semiconductor manufacturing company called TSMC, which is part of Apple's supply chain, has set its goal to become carbon neutral by twenty fifty. Now, Apple's goal was to get its entire supply chain to carbon neutral by twenty thirty, so there's a disconnect there of twenty years. I mean, that's huge, right? The company, TSMC, uses nearly five percent of all electricity in Taiwan while manufacturing silicon chips.
It also used around sixty-three million gallons of water in twenty nineteen for that purpose, and it will likely consume more over the near future, not less, at least once the ability to manufacture at scale has returned in full effect. And this is yet another reminder that the world is way more complicated than the way we typically see it in messaging. You might think, oh, Apple makes iPhones, that iPhone came from Apple. But we all know that's oversimplifying things way too much, because there are a lot of different companies that produce components within that iPhone that are not Apple. Apple partners with those companies; they are part of Apple's supply chain. So it is a much more complex ecosystem than just a single company. And we have to remember there's a ripple effect: the manufacturing process affects different economies around the world, as well as different environments. Now, I am not blaming Apple for this, because for one thing, the company made its own owned-and-operated businesses carbon neutral back in twenty eighteen. That is amazing. Hats off to Apple for doing that. That's great. This is just a reminder that when we hear these promises about attaining carbon neutrality across all lines of business, we have to remember it's not always entirely up to the companies that are making those promises, and we need to make sure that all the pieces are in place in order to take care of all the different components. We have more to talk about with tech and climate in just a moment, but first let's take a quick break.

Okay, before the break I was talking about carbon neutrality and climate change in tech. This is kind of related to that, in a way. I want to talk about the Bezos Earth Fund, which is obviously named after Jeff Bezos, the founder of Amazon.
That fund has pledged one billion dollars toward efforts to conserve and protect vulnerable areas around the world. This is part of a ten-billion-dollar commitment that Jeff Bezos has made toward preserving natural habitats and fighting climate change globally. The initial focus is on a tropical region of the Pacific Ocean, the tropical Andes Mountains, and the Congo Basin. Now, you might wonder how this actually works, because obviously you can't just dump airplane loads of cash on an area to make it better. The Pacific Ocean does not care if you make it rain dollar bills. No, this fund is actually doling out large amounts of money to established organizations that focus on this kind of work, on preservation and fighting climate change. So really, it's almost like a grant foundation that's granting this money to other organizations. That in itself is still a challenge, because even if you're just donating to charity yourself, you always want to try to select organizations that have the potential to do the most good with your donation and to make the best use of the money you offer. And also, once you hit a certain scale of donation, you might actually be giving an organization more money than that organization can manage. It can literally be too much of a good thing. And if you are really focusing on smaller, more local organizations that can do direct good in a region, then you might have to spend a lot more time managing that whole situation, because again, you don't want to overwhelm an organization by giving it more than it can handle. But at least this is some good news related to Amazon, which is nice, because the rest of my Amazon news is not quite so positive. One of those pieces is about sponsored results in Amazon item listings.
So when you go onto Amazon and you're searching for something, let's say it's a ball-peen hammer, you're going to get results for that, and those are going to include results from companies that have paid to be in your view, like on that first page. Amazon has been doing this for a while; it's not brand new. But over time the company has increased the number of sponsored item slots on those initial lists. In the past it might have been two to three items, and now it can be five or six on some searches. And like with search engines in general, we typically see people stick to the first page or first couple of pages of results when they're searching for stuff. Now, I have to admit I'm kind of an outlier, because I will frequently page through multiple pages in an effort to find the best fit for whatever item I'm looking for. I've always got this fear that the perfect fit was on the next page and I settled for something less. With these sponsored results, there's no algorithm around how popular the item is, or how well reviewed it is, or anything like that. It shows up there because the company paid for that spot. And Amazon has also increased the price for ads, which, yeah, I don't even know how much revenue Amazon must be pulling in when you factor both of those things together. They're showing more ads, they're charging more per ad, and they've got millions of people using the site. I mean, the site's been going hog wild, especially during the pandemic. Now, the amount charged is not a big amount on a per-click basis. If you look at how much Amazon charges for an ad for a single click, it's about a dollar twenty-seven. That's not very much, but that's one click. That's per click, right? You multiply that by potentially tens of thousands of clicks, and you apply that pricing across all items, across all categories, and you really start seeing the cash pile up, which I guess you could then throw at the Pacific Ocean.
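To make that multiplication concrete, here's a minimal sketch. Only the dollar-twenty-seven cost per click comes from the story; the click and listing counts are made-up placeholders, just to show how a small per-click price compounds.

```python
# Rough illustration of how per-click ad pricing compounds.
# Only the $1.27 cost-per-click comes from the story; the click and
# listing counts below are invented placeholders, not Amazon data.

cost_per_click = 1.27          # dollars, per the report
clicks_per_listing = 10_000    # hypothetical clicks on one sponsored listing
sponsored_listings = 50_000    # hypothetical listing count across categories

revenue = cost_per_click * clicks_per_listing * sponsored_listings
print(f"Hypothetical ad revenue: ${revenue:,.0f}")  # $635,000,000
```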
Amazon, by the way, says that it doesn't dedicate slots to sponsored items. In other words, it doesn't have a minimum or a set amount. It's not like every time you search for something you're going to get six sponsored results and then everything else. In fact, according to Amazon, it's possible you could do a search and get no sponsored items at all, depending on whatever it was you were searching for. Now, I've recently had to purchase a bunch of stuff, so anecdotally I can say that I have noticed way more sponsored items than I remember seeing in the past. However, I also have to say that could just be confirmation bias on my part. It might not actually reflect reality; it may just be that I'm noticing it more. Anyway, it's good to pay attention to this kind of thing when you're comparison shopping. Just because something is sponsored doesn't mean it's the best fit, but it also doesn't mean it's not the best fit. It might be the best option for you. It's just good to remember that it's a sponsored item, and that the reason it's showing up in your view has nothing to do with its price, its quality, its reviews, or anything like that. It may fit the criteria you need, but it doesn't necessarily, so keep that in mind when you're comparison shopping.

OSHA, a.k.a. the Occupational Safety and Health Administration, is a regulatory agency in the United States that is charged with assuring safe and healthy working conditions in the US by establishing and enforcing standards.
And when there is an incident that results in an injury, a serious injury, something beyond just slapping a band-aid on someone, companies are supposed to report that to OSHA. They are legally bound to do so. Well, the tech journalism site The Information took data from OSHA and looked at Amazon delivery stations. Now, delivery stations are hubs that exist between warehouses and customers. Amazon established delivery stations largely in an effort to eliminate the need to rely on other delivery services like UPS, and instead use its own owned-and-operated delivery vehicles to get packages to end customers. So the way Amazon typically works with shipping is that it has these massive warehouses that hold products. Products then get shipped to the delivery stations, and workers at the delivery stations sort and load the boxes onto delivery vehicles, which then go out to deliver the stuff to customers. And according to The Information, the rate of injuries at Amazon delivery stations is more than twice the industry average. People are getting hurt twice as often at Amazon delivery stations as people who work at comparable facilities. It's also a higher injury rate than what is seen at other Amazon facilities, including warehouses and sorting facilities. The delivery stations are the most dangerous. The Information also points out that companies only report to OSHA if those injuries are severe enough. Like I was saying before, it has to be something more than what would require a first-aid response. So we're not just talking about people stubbing a toe; we're talking about some serious injuries, everything from moderate to potentially critical, and who knows how many bumps and scrapes people are getting along the way. The study also found that Amazon had overstated the number of employees working at these delivery stations on more than one occasion.
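For context on why headcount matters here: workplace injury rates are normalized by hours worked. OSHA's standard recordable-incident-rate formula is injuries times 200,000 (that's 100 full-time workers at 40 hours a week, 50 weeks a year) divided by total hours worked. The staffing and injury counts below are invented, just to show how inflating the headcount deflates the apparent rate.

```python
# OSHA's standard recordable-incident-rate formula normalizes injuries
# per 100 full-time workers:
#   rate = (recordable injuries * 200,000) / total hours worked
# The injury and staffing numbers below are invented for illustration.

HOURS_PER_FTE = 2_000  # 40 hours/week * 50 weeks/year

def incident_rate(injuries: int, employees: int) -> float:
    """Injuries per 100 full-time-equivalent workers."""
    return injuries * 200_000 / (employees * HOURS_PER_FTE)

injuries = 6
print(f"Rate with the real headcount (100):    {incident_rate(injuries, 100):.1f}")  # 6.0
print(f"Rate with an inflated headcount (200): {incident_rate(injuries, 200):.1f}")  # 3.0
# Same six injuries, doubled headcount -- the rate looks half as bad.
```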
Now, that matters because it means Amazon's information to OSHA indicated that these delivery stations had more employees, and thus were more fully staffed. That would usually mean you would see fewer injuries, right? Because you're dividing the work well amongst the people you have. But if in fact fewer people are working there, then that means you have fewer people doing more work. And that work includes moving heavy stuff like heavy packages, as well as working around delivery vehicles that are coming and going all the time. So it's no wonder there have been lots of injuries. No real response from Amazon yet about this, apart from the company saying that it works hard to improve safety conditions at its facilities. Okay, well, I've got a few more stories to cover, but before I get to those, let's take another quick break.

Okay, we're going to talk a little bit more about workers' rights in tech organizations, in the form of Instacart. So the company is rumored to be preparing an initial public offering, or IPO. That's when a private company becomes a publicly traded company on a stock exchange. This process is lengthy, and it is not a sure thing; just take a look at WeWork for an example of that. But some Instacart workers are spearheading a campaign to convince users to boycott the app. They're using the hashtag #DeleteInstacart to promote their cause. So they want to put pressure on the company. If Instacart is planning on going public, then public opinion is going to impact the share price, and thus the company valuation and the IPO. So this is important stuff. And the workers have a list of demands that they want the company to meet. One of those is for Instacart to reinstate a previous commission pay model, where people got a dedicated amount of money per order. They also want a ten percent default tip built into the transaction. Right now, the default tip is at five percent.
They want it to be up to ten percent, because they say a lot of users never bother to go outside the default. I always go to twenty percent at least, and sometimes more than that, because these folks are working hard and they're doing a real service for me. But that's me. Anyway, they also want a system that does not penalize workers for stuff that is outside their control. For example, there's a cold brew coffee that I absolutely love, and it is almost never in stock in any of the stores. I always put it into the order just in case, but more often than not I get a message saying they were out. Right now, Instacart lets me review the results of my delivery, and I can say, oh, the delivery person didn't find all the items, and that can reflect badly on that person. It penalizes the Instacart gig worker, and it's not really their fault, right? If the store is out of stock, they can't do anything about that, so it's not fair. They want that changed. They also want occupational death benefits, and they want the system that assigns orders to workers to be more transparent; they say it's gotten less so over the years. The workers are asking users to delete the app to send a message to Instacart, because if the installed base for the app drops dramatically, they hope that will end up pressuring the company into granting these demands, since a shrinking user base will not look good on the way toward an initial public offering. Workers say Instacart's changes have resulted in lower pay: people are working just as hard as before, if not harder, but they're making less.
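Just to show why the default percentage is such a big deal: a minimal sketch, assuming a hypothetical forty-dollar order and a hypothetical weekly order count. Only the five and ten percent default rates come from the story.

```python
# Why the default tip percentage matters: most users reportedly never
# change it. The order size and order count here are hypothetical;
# only the 5% and 10% default rates come from the story.

order_total = 40.00     # hypothetical grocery order, in dollars
orders_per_week = 50    # hypothetical full-time shopper workload

for default_rate in (0.05, 0.10):
    weekly_tips = order_total * default_rate * orders_per_week
    print(f"{default_rate:.0%} default -> ${weekly_tips:,.2f} per week in tips")
# 5% default -> $100.00 per week; 10% -> $200.00. If customers just
# accept the default, doubling it roughly doubles tip income.
```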
The workers acknowledge that asking users to delete a delivery app during a pandemic is potentially a pretty big request. For what it's worth, I just deleted it from my phone, even though I have depended upon it quite heavily in the past, because I happen to believe in the cause of the workers.

All right, moving on. Have you been to Thailand in the last ten years? The reason I ask is that if you have, your data might be among the one hundred six million records of international travelers to Thailand that were found stored in a database on the web without password protection. In other words, anyone who got hold of the URL for that database could go and look through it and see the personal information of one hundred six million people who had visited the country over the last decade. A security researcher with Comparitech alerted Thai authorities about the problem last month, and the day following that alert, those authorities were able to secure the database. So how long was that database up? I don't know, but the search engine Censys indexed the database on August twentieth, twenty twenty-one. In other words, not only was the database unprotected and up on the web, but a search engine had indexed it, which means it would come up in search results for any sort of relevant query. The researcher found the database two days after it had been indexed, and on the twenty-third the database was under lock and key. The former URL of the database is still active, but if you were to visit it now, you would see a message that reads, and I quote, "this is honeypot. All access were logged" end quote. According to the authorities, no one obtained unauthorized access to the database, though I'm not sure how they determined that or whether that information is accurate. But once again, this shows how poor data management and security policies can make all of us vulnerable. The folks who were in that database did nothing wrong. At least, they did nothing wrong as far as being included in that database is concerned. I don't know that they're entirely innocent of all wrongdoing, but they had no control over the security of their own data, and that's a problem.
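To illustrate what "no password protection" means in practice, here's a minimal sketch of the kind of check a researcher might script: a plain HTTP request to see whether an endpoint answers without credentials. The URL is a placeholder and this is not Comparitech's actual tooling; only run checks like this against systems you're authorized to test.

```python
# Minimal sketch of checking whether an endpoint demands credentials.
# The URL is a placeholder; this is not Comparitech's actual tooling.
# Only probe systems you are authorized to test.

import requests

url = "https://db.example.com/records"  # hypothetical endpoint

resp = requests.get(url, timeout=10)
if resp.status_code in (401, 403):
    print("Endpoint demands authentication -- as it should.")
elif resp.ok:
    # An unprotected database answers like any public web page,
    # which is also why search engines can index it.
    print("Endpoint returned data with no credentials at all.")
else:
    print(f"Unexpected response: {resp.status_code}")
```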
Netflix has an interesting new offering with very specific parameters for a certain target audience. Viewers will get free access to about twenty-five percent of all the content on Netflix streaming, to watch free of charge. There's no subscription, there's no advertising, nothing. But there are a couple of requirements you have to meet. The first is that this offering is for Android users, so it only applies on Android devices. And the other is that the offering is for people in Kenya. And you might wonder what the what is going on, though maybe a lot of you are already saying, well, of course, the first taste is free. And that's exactly right. This is essentially a marketing campaign, with the ultimate goal of convincing Kenyans to subscribe to the paid Netflix service. You give them a sample of what is available on the platform. Android smartphones are popular in Kenya, often the most popular computing device there, so that represents a potential new audience for Netflix. But to get things rolling, the company needs to convince Kenyans that they want the service in the first place. And I think you're going to see lots of similar campaigns from different companies in different parts of the world. Many companies use growth as the most important metric of the organization's success, but for subscription-based businesses, there comes a point in any market where you start to hit saturation. You see growth slow down, you might even see it stabilize, and if you're in trouble, it might start to go into reverse. It is very hard to create new customers in a saturated market.
You've pretty much got everyone who wants to be on the service already on the service; there's nowhere new left to go. So it makes more sense to look for untapped markets elsewhere in the world and then aggressively go after those, and when you do that, boom, you get your growth back. I used to see this a lot when I worked for a multibillion-dollar cable content company that rhymes with Discovery. Anyway, we'll have to see if Netflix's Queen's Gambit pays off. That was a reference.

I've been talking a lot about NASA recently, with series on TechStuff dedicated to things like the evolution of space suits and the various space stations that have been in orbit around the Earth over the years, and I've also covered the Artemis program, NASA's plan to return to the Moon. Now, while the return of humans to the Moon is likely to be pushed back from the planned deadline because of stuff like, you know, not having space suits ready in time, other parts of the project are moving forward. One of those is to land a new rover on the surface of the Moon. The rover is called the Volatiles Investigating Polar Exploration Rover, or VIPER. Now, that tells me someone came up with an acronym they thought was totally badass and then worked their way backward from there. Or maybe I'm wrong. Maybe they came up with the full name first and then said, hang on, the acronym for this name is VIPER. That's badass. Anyway, NASA has identified the target landing spot for the rover. It's near the south pole of the Moon, on the edge of a crater called Nobile, and VIPER's mission will be to seek out resources that could be useful, some might say critical, for long-term Moon missions. We're talking stuff like ice, which could be used to generate not just water but also oxygen and even rocket fuel. This will mark the first time that anything has directly explored that specific region of the Moon.
Previously, we've studied it only through flyby missions, where satellites have gone by and taken measurements and metrics and pictures and stuff. But this will be the first time we'll have wheels on the ground, so to speak, in that region. The thought is that in areas of the Moon that are in permanent shadow, you might be able to find ice, but we won't know for sure until we go there. VIPER is scheduled to go lunar in twenty twenty-three.

And finally, in other space news: pooping in space is difficult. I've covered this extensively, some might even say gratuitously, in those TechStuff episodes about space suits. But recently this popped into the news again thanks to the crew of Inspiration4. That was the group of four non-astronauts who boarded a SpaceX Crew Dragon spacecraft last Wednesday, went into orbit, stayed there for a few days, and then returned to Earth this past weekend. Elon Musk, responding to questions about the mission, said that the next Crew Dragon space capsule would have quote "an upgraded toilet" end quote because, and I quote, "we had some challenges with it this flight." Now, you might wonder where the toilet is on the Crew Dragon capsule, and I'll tell you: it's located above the crew seats. It's on the ceiling. Now, in microgravity, up and down are largely meaningless, but it does mean that to use the toilet, you'd have to float up to the top of the capsule, orient yourself so that you are upside down with regard to the bottom of the capsule, and position yourself on the toilet. And you use a little privacy curtain that divides the capsule so that, you know, you're not turning it into a spectator sport. Once you're positioned on the toilet, you're also inside a glass dome, and then you do your business. You know, what a feeling, when you're pooping on the ceiling.
I made myself laugh when I thought that one up, which tells you that I might be forty-six years old, but I'm still a thirteen-year-old boy at heart. Anyway, the next Dragon capsule will have an improved toilet, and I cannot wait to learn what the heck that actually means.

And that's it. That's the news for Tuesday, September twenty-first, twenty twenty-one. If you have suggestions for topics I should cover on TechStuff, reach out to me on Twitter. The handle for the show is TechStuff HSW, and I'll talk to you again really soon.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.