Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and I love all things tech, and it's time for the tech news for Tuesday, October twenty-sixth. Also, I wanted to mention we will not be having a news episode on Thursday; we'll be bringing an episode of Smart Talks on that day. But let's get to some news for today. And y'all, I would love to say that today's episode has no news about Facebook, because I'm tired of talking about it, probably about as much as you're tired of hearing about it. But that ding dang darn company just keeps making headlines. I'm still waiting to find out what the new name of the company itself is going to be. There is this reported rebranding that's about to happen, and by the time you hear this, that news might have broken. And of course there are tons of think pieces out there as to why Facebook has decided on this rebrand strategy. I pretty much agree with Jake Hancock's take.
Speaker 1: He wrote a piece for Fast Company, and he said that the move is probably more of a thing that the company is doing for investor relations than anything else, and I feel like that's pretty on track. But there's some other things we need to cover. First: Facebook (as I record this, that's still the name of the company, so I'm just going to use that as the company name) has held its quarterly earnings call with investors. So CEO and "privacy is dead" guru Mark Zuckerberg took some time to address the fact that the company has been in the headlines for a couple of weeks due to whistleblowers coming forth with lots of allegations against the company, allegations that range from "Facebook has done some dumb stuff" to "Facebook has contributed to the downfall of democracy." Zuckerberg tried to downplay those reports, claiming that they quote "paint a false picture of the company" end quote. Now, I don't know if it's really a false picture. I do know that more employees and former employees have come forward with additional complaints and concerns about the company.
Speaker 1: However, one thing Facebook keeps on doing is making cash by the truckload. So in the last quarter, Facebook made twenty-nine billion dollars in revenue. Billion with a B. That's nearly thirty billion dollars in three months, or ten billion dollars per month in revenue. That is mind-boggling. Now, I'm not sure how many investors out there are concerned with stuff like ethics or investigations if their investment is raking in the big bucks on that kind of scale. I mean, from an investment standpoint, you could be saying, hey, I don't know, sure, they're doing some shady stuff, but look at all the green they're bringing me. And granted, I was talking just now about revenue, not profit. Those are two different things. So if we stick solely to profit, then over the last quarter we're actually talking about a measly nine billion dollars and a couple hundred million and change. Hardly enough reason to get out of bed in the morning. All right, anyway, all sarcasm aside, a company making those kinds of numbers is not likely to find itself dumped by investors.
Speaker 1: Granted, this could change if governments around the world start to force the company to break up and spin off various components into independent organizations, essentially making Facebook peel away the companies that it has acquired over the years, like Instagram and WhatsApp. There's also a possible shift in the ad space that could affect Facebook's ability to generate revenue. That would be a huge shakeup for the company, and we've already seen evidence that the company is trying to get ahead of that by counting the same person twice if that person interacts with an ad on more than one Facebook platform. So, for example, on Facebook and Instagram, they would count that one person two times if they interacted with the same ad. So that's a little odd, and there's a chance that the ad business will change Facebook's earning potential.
Speaker 1: That would have a big effect, and we might see some investors lose some confidence. But for now, the company is making some serious bank. On that same call, Zuckerberg addressed another growing concern in the company, namely that younger people don't seem to be as hooked into Facebook's products as older generations are. Those are some smart younger people. Facebook has seen younger people's time on its various platforms go into kind of a decline recently, as people started to favor other platforms. Zuckerberg has said that Facebook will quote "make serving young adults their north star" end quote. He said the company is looking to cater more features to young adults, which, from my observation, usually means that Facebook is going to take a look at what other popular platforms are doing and then either create something similar (in other words, copying those platforms) or just go out and buy whatever popular companies are doing well. See also: Instagram and WhatsApp. Now, the gorilla in this scenario would be TikTok, which remains really popular with young adults, so expect to see Instagram Reels become more prominent in the future.
But yeah, we 91 00:05:30,680 --> 00:05:33,760 Speaker 1: should add the fact that Facebook's user base is aging 92 00:05:34,200 --> 00:05:36,600 Speaker 1: and young people are not balancing that out. So in 93 00:05:36,600 --> 00:05:39,799 Speaker 1: other words, the people who are using Facebook are getting older. 94 00:05:40,040 --> 00:05:42,839 Speaker 1: Eventually they're going to you know, not be using Facebook 95 00:05:42,880 --> 00:05:47,560 Speaker 1: anymore because you don't do it after you die. And meanwhile, 96 00:05:47,880 --> 00:05:52,719 Speaker 1: younger people aren't coming in to replenish Facebook's user base, 97 00:05:53,160 --> 00:05:56,920 Speaker 1: so that could eventually lead to the company's decline. And 98 00:05:57,360 --> 00:05:59,800 Speaker 1: that's why I think there was such an emphasis on 99 00:06:00,000 --> 00:06:04,760 Speaker 1: Facebook trying to aggressively go after younger users. Next up, 100 00:06:05,320 --> 00:06:10,160 Speaker 1: Ours Technica published an article titled employees pleaded with Facebook 101 00:06:10,240 --> 00:06:14,440 Speaker 1: to stop letting politicians bend rules, and I guess you 102 00:06:14,520 --> 00:06:16,240 Speaker 1: kind of get the gist of it just from the 103 00:06:16,240 --> 00:06:20,279 Speaker 1: headline of the article, but essentially, Facebook was facing criticism 104 00:06:20,360 --> 00:06:23,800 Speaker 1: from conservatives, some of whom were claiming that the platform 105 00:06:23,880 --> 00:06:28,560 Speaker 1: was censoring conservative voices and was thus biased toward liberals. 
Speaker 1: So the company's response, according to employees, was to allow far-right conservatives to spread misinformation and otherwise violate Facebook policies without facing any kind of reprimand for that. One employee said that a director-level executive had issued a memo indicating that it would be really cool if, you know, Facebook's moderation policies just didn't apply to quote-unquote "political considerations." The employees have said that upper-level management, including Zuckerberg himself, has frequently moved to block any kind of moderation on these kinds of posts, even if those posts contained false and/or harmful information. A Facebook rep named Joe Osborne has said that the claims presented are based on a false premise, that premise being that Facebook is unconcerned with user safety. And maybe Osborne is being sincere. I fully admit that my own personal perception, that Facebook is at best concerned about safety only so far as it will keep heat off the company, could be totally wrong. That could be a complete misperception on my part. In fact, I hope I am wrong about that. That would be great.
Speaker 1: I just feel like my perception might be a little closer to the truth, based upon how things have played out over the last several years. On a slightly more oddball note, Facebook has retained the services of lawyer Matthew Rosengart. Rosengart most recently represented Britney Spears in her quest to free herself from the conservatorship her father had instituted on her. And that was a quest that she achieved; she was able to bring that to an end through the court system. Facebook's purpose for bringing Rosengart onto their side is apparently to discourage a production company called Anonymous Content. See, that company is adapting a book that has the title An Ugly Truth, and that book is all about Facebook around the years when the company was involved with stuff like the Cambridge Analytica scandal and misinformation campaigns. Essentially, the book attempts to explain how things got to where they are, and arguably where they continue to be today. Anonymous Content is making a television adaptation of that book, and they are titling it Doomsday Machine.
Speaker 1: Anyway, Rosengart sent a letter to the CEO of Anonymous Content, essentially threatening a lawsuit should the series contain any knowingly false statements about Facebook. Further, Rosengart claims that several of the passages in An Ugly Truth are wrong or misleading, and so, by extension, adapting those for television would qualify as being a big old no-no. Now, The Verge reports that, at least to their knowledge, Facebook has not pursued any sort of similar claim against the publisher or the author of An Ugly Truth, and that is somewhat puzzling. I mean, if the statements in the book are knowingly false, why didn't Facebook pursue litigation against the book or the publisher? Or is Facebook saying, hey, we say that that thing in the book is a lie, so if you put it on TV, then, you know, we said it was a lie and you did it anyway, so we'll sue you for, you know, having known false information in there? Which isn't how that should work, right? But I don't know for sure.
Speaker 1: Rosengart did say that Facebook was willing to consult on the TV show to make sure it's accurate, which, you know, sure. Anyway, no word yet on how Anonymous Content is responding to this message. One thing Facebook moderators did recently that was the right decision was to remove a video posted by the President of Brazil, Bolsonaro. He had claimed that COVID-19 vaccines were giving people AIDS, a claim that originated as a rumor that had no actual evidence to support it at all. It cited supposed British government officials, but there was no such citation. It would be like you walking up to me and saying, "Your dad says I can punch you," and me saying, "I don't think my dad would say that," but you just arguing that no, no, no, that statement's out there. That's not proof. So despite the fact that it's an unsubstantiated rumor, that hasn't stopped people, including the President of Brazil, from spreading it.
Speaker 1: Now, moderators have removed this video that Bolsonaro had posted, but it took three days before they did it, and that's a bit of a head-scratcher, because it seems like a pretty clear violation of Facebook's policies. But at least we have an instance of Facebook taking down a video that did violate Facebook's rules, even when that video came from a powerful politician, because too many times in the past we've seen Facebook turn a blind eye to such things. All right, that's all the Facebook news out of the way, though I think I might mention it once more before we're done. We're gonna take a quick break, and we'll come back with the rest of the news.

Speaker 1: Today, lawmakers in the United States are holding a hearing in which they will question representatives from TikTok, Snap (that's the company that owns Snapchat), and YouTube. So the focus of this hearing is going to be on how these platforms affect younger users.
Speaker 1: There's been increased scrutiny on how online platforms are influencing people, particularly younger people, and there's also a concern that these platforms haven't been fully transparent or forthcoming regarding that matter. The lawmakers are expected to ask things like how these companies protect user privacy, if they do at all, as well as how the content on the platforms might include stuff that could harm younger people, and how the algorithms that services like TikTok and YouTube rely upon can serve up progressively more harmful material over time. Also, how these platforms monetize user activity. I keep using the word platform, but it's social networking platforms, essentially. Now, YouTube has been in this hot seat before, having had a pretty infamous issue with its platform aimed at children, which subsequently became a target for bad actors and scam artists, as well as a home to some truly bizarre and sometimes disturbing material.
Speaker 1: I'm not sure if we're actually going to see anything productive emerge from this hearing, though it might serve as motivation for lawmakers to draft or support legislation aimed at protecting kids online. Of course, when it comes to law and tech, particularly in the online world, we frequently see that stuff that's meant to be helpful can sometimes be the opposite. So it's too early to say whether or not we're at a turning point here, or, if we are, whether it's a turning point that's a positive one. The Information reports that Apple is very likely to be the subject of an upcoming Department of Justice investigation. So you might wonder what's being investigated. Well, that would be whether or not Apple has a monopolistic hold on some aspects of the tech space, most notably in the field of apps, particularly for the mobile world. Now, this dovetails with the lawsuits flying between Apple and Epic Games, which are mostly about how Apple institutes an in-app purchasing system.
It's a 227 00:14:11,520 --> 00:14:15,360 Speaker 1: required system, you cannot use anything else, and it also 228 00:14:15,400 --> 00:14:18,679 Speaker 1: takes a cut of all in app purchases, and again, 229 00:14:18,760 --> 00:14:22,160 Speaker 1: developers are denied the opportunity to offer up any alternative 230 00:14:22,560 --> 00:14:26,480 Speaker 1: to Apple's in app system. That's not the only matter 231 00:14:26,640 --> 00:14:28,840 Speaker 1: that the d o J is interested in. Another is 232 00:14:28,880 --> 00:14:32,120 Speaker 1: the allegation that Apple requires third party apps to abide 233 00:14:32,120 --> 00:14:36,400 Speaker 1: by certain restrictions with regard to location tracking, but Apple's 234 00:14:36,520 --> 00:14:40,520 Speaker 1: own apps are not held to those same restrictions, which 235 00:14:40,520 --> 00:14:42,520 Speaker 1: does sound a bit any competitive. It sounds like you've 236 00:14:42,560 --> 00:14:44,560 Speaker 1: got two different sets of rules. You've got the rules 237 00:14:44,600 --> 00:14:46,520 Speaker 1: that I follow, and then there are the rules that 238 00:14:46,560 --> 00:14:49,520 Speaker 1: you have to follow, and my rules might be a little, 239 00:14:49,680 --> 00:14:52,920 Speaker 1: you know, more loose than yours. That's not fair. The 240 00:14:53,000 --> 00:14:55,720 Speaker 1: d o J first opened up an antitrust probe into 241 00:14:55,760 --> 00:14:59,280 Speaker 1: Apple back in twenty nineteen. According to the information the 242 00:14:59,320 --> 00:15:02,920 Speaker 1: probe re and ly scaled up in activity, potentially indicating 243 00:15:02,960 --> 00:15:07,000 Speaker 1: that an official investigation is to follow. 
Speaker 1: This past weekend, Tesla deployed an update to its so-called Full Self-Driving program, which, as I've mentioned in previous episodes, takes a really liberal definition of full self-driving. But this update, version ten point three, ended up causing some pretty scary situations. Apparently, the vehicles' Forward Collision Warning system, or FCW, would malfunction upon the installation of ten point three and would occasionally detect a collision when there was no such danger present. Some users even reported that their cars went into FCW mode even when no other cars were on the road, and in at least some of these cases, the Tesla vehicle would then automatically apply the brakes really hard. Often it would do so several times, even on short drives, making the cars effectively undrivable. Now, clearly a car that's just braking hard for no reason is itself a danger on the road and could potentially lead to actual collisions. Tesla recalled the update and rolled users back to version ten point two. Elon Musk tweeted that, you know, these things happen; it's a beta program, and with beta programs you expect to find bugs.
Speaker 1: I mean, honestly, that's the purpose of a beta program. You open up testing at scale so that you can quickly identify design flaws that otherwise might not pop up in a limited internal test. But this is where I start to take issue, because in this case, you're talking about a technology that can freaking kill you if it does not work properly. I'm actually surprised that internal testing didn't bring up any incidents of phantom warnings, because, based upon the reports of the people who were in the beta test, it seemed like it was a pretty common issue. And I would also say the cavalier attitude is definitely out of place. Anyway, update ten point three point one rolled out yesterday, and presumably vehicles will no longer detect ghosts in front of them and then slam on the brakes. Now, in other Tesla news, the company recently signed a huge deal with the rental car company Hertz, which has placed an order of one hundred thousand Tesla cars for around four point two billion (with a B) dollars.
Speaker 1: That's a pretty big deal, both in the sense that it's a big business deal and that it's a big deal to go so hard with Tesla in an effort to electrify a fleet of rental cars. The announcement of the deal gave a boost to Tesla's stock price, driving the company to a valuation of more than a trillion dollars. I'm curious if Hertz will charge customers if they return their rented Teslas with less than a full battery charge, because one way that rental car companies make cash is they charge you out the wazoo if you don't have a full gas tank when you return the vehicle. I do know that Hertz struck a deal with Tesla to let renters use the company's Supercharger stations around the United States and Europe. Pretty interesting development for a rental car company that was literally in bankruptcy just a year ago. Finally, I mentioned last week that some companies, including Lockheed Martin, plan to collaborate and launch a private space station into low Earth orbit by twenty twenty-seven.
Speaker 1: Well, Jeff Bezos, founder of Amazon.com and Blue Origin, announced that his space company also plans to build a commercial space station, and it has the operating name of Orbital Reef. Like the station I mentioned last week, the purposes of this one will include stuff like acting as a research lab for various sciences, presumably ones that can potentially lead to a profitable application, because this is going to be largely commercial research, not necessarily pure science research. It will also serve as a potential site where a company could maybe run a hotel for space tourism. Apart from these little details, Bezos and Blue Origin haven't revealed much more about their plans, such as whether or not the station will be modular in design like the International Space Station, or monolithic like the other proposed private space station we talked about last week. Nor do we have any concrete information on a timeline about this, other than the hope is that it will be in operation before the end of this decade.
Speaker 1: So, to all those Amazon workers out there who are debating unionization, let me just say: while you're peeing in a bottle, you could be staring at the stars, just not as close as Bezos will be. And that wraps up the tech news for today, Tuesday, October twenty-sixth, twenty twenty-one. I hope you're all well. Like I said, we will not have a news episode on Thursday; we will have an episode of Smart Talks on Thursday instead. If you haven't already started listening, you should check out the podcast Thirteen Days of Halloween. It's a 3D-audio spooky ghost story podcast, and on Thursday I will be in an episode of that series. It's pretty cool. Episode two is probably my favorite so far, and I'm not in that one; that's just my honest opinion. I think episode two is a really strong episode. But yeah, check it out. It's pretty creepy, fun, campfire-spooky-story type stuff, and I really like it a lot, so I hope you do too. If you have any suggestions for topics I should cover in future episodes of TechStuff, feel free to reach out to me.
Speaker 1: The best way to do that is on Twitter. The handle for the show is TechStuff HSW, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.