Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech, and we are covering some of the biggest tech stories of twenty twenty. Some of them, not all of them. In our last episode, we looked at the first three months, which means if we were to keep that pace, we would need four episodes to get through the year, assuming each quarter has about the same amount of news. However, if you listened to the last episode, you also know that I stuck with stories that played out through twenty twenty, so maybe that will help us out. I guess we'll see. I know that it's going to be at least three episodes. Spoiler: we don't finish it out today, but let's jump back into April twenty twenty. In a typical year, I would start off talking about April by mentioning the various pranks and jokes that brands pulled or tried to pull.
Speaker 1: That's something that exasperates many of my peers who cover technology, but I kind of like a good silly joke, as long as it's not, you know, mean spirited or just an outright lie posing as the truth in an effort to actually trick people. But the pandemic and the Black Lives Matter movement really changed things in the United States. Many states or cities had issued lockdown restrictions, and a lot of brands chose to give April first, twenty twenty, a pass. Personally, I would have preferred twice as many jokes and no pandemic. But the ball's in your court, grouchy tech journalists. Those jokes don't seem so bad now, do they? In our previous episode, I talked about how Zoom had a huge year due to more people moving their work and personal connections online. In April, we would see some concerns about Zoom's security.
Speaker 1: The company's privacy protections were brought into question, and we started seeing a trend of zoom bombing, in which people who were not invited to a Zoom session were showing up, often because someone had shared the link for that meeting in a way that wasn't secure. It would prompt Zoom to institute more accessible password protection options, end-to-end encryption, and other methods to make sure that the people who were joining a session were the ones who were meant to, not just people who happened to have a link to the online session. Now, granted, these are all options. You don't have to have password protection on your Zoom meeting, so there still are possibilities for people to barge in on a meeting they aren't really welcome to, but there are now more options to allow you to have more control over that. We also began to see various coronavirus tracking apps in development. The apps had a couple of real big challenges.
Speaker 1: So the idea is that these apps can track where you go, and if you should happen to be near someone else who happens to have the app and who has indicated that they have been exposed to COVID, then you would get a notification saying, hey, you came into contact with someone who has already indicated that they may have been exposed. If used properly, this can help mitigate the spread of the disease. But there's also a genuine concern for privacy, a general fear that such an app would violate the privacy of the people using it, or that the companies that are making the apps could then later on use that information in ways that we don't necessarily approve of. And if this were just a disease and people were just treating it as a disease, maybe all of these concerns are things that we could overlook. But we have to remember that there are other factors at play here. One of those is that some people hold pretty racist and xenophobic opinions about the origins of the disease.
Speaker 1: There were so many stories of people with Chinese heritage being harassed because of that perception, and people can be pretty awful if they choose to be, so there are a lot of things to balance out when it comes to tracking coronavirus. On top of that, the apps tend to be voluntary, which means each person has to choose to download and install it, and some people are just unwilling to do that. And then there's the fact that you would need to update the app to reflect a diagnosis. It's an imperfect solution, and in places like Europe, there were a lot of long discussions about how to balance out these various concerns. As it stands, there are lots of different apps. Most of them focus on relatively small regions, but much of the world is not covered by them. I mean, I live in Georgia, which is the home of the Centers for Disease Control, and we don't have an app for our state as of yet. We also started seeing cases of people damaging 5G towers in April of twenty twenty.
Speaker 1: As I mentioned in my episode about 5G myths, some people somehow connected these towers with the creation or spread of coronavirus, as if somehow, for some reason, the telecommunications companies were broadcasting a virus, which is just impossible. I mean, a virus that can affect humans. I guess you could broadcast a computer virus. That's a different kettle of fish. The pandemic also delayed one of the more anticipated criminal trials in recent tech history. Elizabeth Holmes, founder of medical startup Theranos, was to have her trial begin in twenty twenty, but the pandemic became a factor and the court would order the trial to be delayed until the spring of twenty twenty one. For those who don't know who Elizabeth Holmes is or what her company, Theranos, did, here's a super brief rundown. Holmes, who idolized Silicon Valley leaders like Steve Jobs, created a company that had the goal of developing a medical device.
Speaker 1: The device, which was projected to be about the size of your typical desktop printer, would be able to take a very small blood sample, the tiniest little droplet, and run hundreds of different analytical tests on that sample. Within a short time, perhaps an hour or two, the device would produce a report about that sample, giving the user information about their health, diagnosing any diseases or conditions, and, in theory, empowering the user. The idea was to democratize medicine in a way that would give users more information about their own health and a better way to interact with their primary care physician and the general medical establishment. Some doctors worried that this would cause people to misinterpret results, but it turns out they didn't really have much to fear, because the device never worked properly, at least not to the extent that the company wanted it to. It turned out the actual process was way more complicated than Holmes had first imagined, and her team of engineers were tackling problem after problem in order to try and make it work.
Speaker 1: In the meantime, the allegations against Holmes state that she and her fellow executives purposefully misled investors, including using equipment from established blood testing companies to run blood tests while claiming that a Theranos device was actually doing all the work. The house of cards came crashing down, but not before investors had poured more than seven hundred million dollars into it. Holmes is now charged with numerous counts of fraud, and we'll have to wait to see how that all turns out. One of the big early stories tied in with the coronavirus was how ill prepared hospitals were to meet the challenge of early serious cases. We saw some people try to help in novel ways, such as using 3D printers to create airflow splitters for ventilators. So ventilators take over the process of breathing for patients, and they are expensive pieces of machinery, and there's a limited number of them.
Speaker 1: So the thought was that by using splitters, you could divert air from one ventilator to multiple patients, like two patients, and that way you wouldn't have to have just one ventilator per patient, and thus you could stretch the limited resources you have further. And I think it was a fairly clever move. But I also have very strong feelings about how the United States government completely failed to meet the challenge of COVID on a national scale. But I'll spare you that rant. I'm sure you're all familiar with, and tired of hearing, those kinds of things already. But on a related note, in April, Elon Musk said he had sent ventilators to hospitals in California to help alleviate the resource scarcity. Musk had sent valuable equipment, but they weren't technically ventilators. They were machines that aid in breathing assistance, but they are a step down from ventilators themselves, which, as I said, handle breathing entirely for a patient. This, I think, was largely a case of bad communication, by which I mean Musk really did help these hospitals, just not in the way that was initially reported.
Speaker 1: So I don't want to throw shade on Musk for this, as he was doing something that actually was incredible, and the hospitals did have need of that equipment. But that being said, Musk also used his position and platform to argue against stuff like stay at home orders. He was defiant of them. He said that Tesla would continue to operate even in the face of stay at home orders, and he helped spread misinformation about the coronavirus. So while he did help some hospitals by sending them equipment, he also lent credence to false narratives about COVID, and I think the scales mostly weigh on the "you done did bad on this one" side. And I almost forgot: on April six, twenty twenty, Quibi, the short form, high production video streaming service, officially launched. Led by DreamWorks founder Jeffrey Katzenberg and former HP CEO Meg Whitman, the service could not have come out at a worse time. It was geared towards mobile users, the idea being that people like to consume media on their phones while they're waiting for stuff, like when they're on a bus or on a subway, or when they're in line for something.
Speaker 1: But of course, the service launched just as a pandemic was forcing most of us to stay at home. Even without the pandemic, a new video service was going to have a really tough fight to be competitive, but the use case for Quibi was pretty much wiped out before it could even get started. It launched a lot of exclusive programming and gave a lot of creative folks an outlet, but it couldn't tough it out, and it would shut down late in twenty twenty. So RIP, Quibi. If you want to learn more, I did do a couple of episodes about Quibi earlier this year. I did one a few months after it had launched, and then I did another one after the announcement that Quibi was going to shut down in December. There are a few other stories that happened in April, but they mostly involved how the heck do we do what we do when there's a pandemic going on? And I think we've heard, you know, enough of those to kind of know that these stories are largely the same, as companies big and small were grappling with the challenges of a world being thrown into chaos thanks to a virus.
Speaker 1: Two other stories I think I should mention, though, are that Nintendo revealed in April that hackers compromised one hundred sixty thousand accounts on the Nintendo network, and they stole funds from account holders in the process by using any stored funds that were in those accounts to do stuff like buy Fortnite's virtual currency, which, as we learned in a previous episode, would not be subject to federal taxes. The hackers exploited the older Nintendo Network ID system, which was used for the older 3DS and Wii U consoles. The company offered to refund money to those affected and urged users to change their passwords. Later on, Nintendo would reveal the hack compromised more accounts than they first realized, nearly twice as many, in fact. The other little story from April was that Apple introduced the iPhone SE, the budget model entry into the iPhone line, and it looked a lot like the older iPhone 8, which came out back in twenty seventeen. And while it is a budget model, that doesn't mean it's actually cheap. The baseline iPhone SE retailed for three hundred ninety nine dollars.
Speaker 1: Still, compared to top of the line phones in the iPhone line, that's definitely on the less expensive side. Now we're up to May. Airbnb announced it would lay off a quarter of its employees. The travel industry in general was hit super hard by the pandemic, from hotels to airlines to ride share companies and more. We'd see a lot of other companies follow suit, either laying off employees or ending contract worker agreements, all in an effort to scale back and tighten the belt to outlast the pandemic. So here are a few companies that laid off people in twenty twenty. Cisco laid off around thirty five hundred employees. Comcast laid off another thirty five hundred. AT&T was right behind with thirty four hundred. Salesforce laid off a thousand people. Microsoft laid off nine hundred sixty employees. PayPal laid off a hundred employees. And all of those companies managed to remain profitable during COVID. So that's a pretty tough pill to swallow, right? These companies making big cuts in an effort to remain profitable during a pandemic benefits the shareholders, but not the employees.
Speaker 1: Meanwhile, those workers are left to struggle and find new employment in an environment that's really, really hard. I mean, it is difficult to overstate how hard it is to get a new gig in the midst of a pandemic. And I'm sure some of you out there either may have dealt with this personally or know someone who is dealing with it. And you know, this is tough. And then there are the companies that made big cuts but were still unprofitable during COVID. The Walt Disney Company is a big one. Now, I'm a big Disney fan, but the company has got a pretty ugly tarnish on it, having laid off more than thirty thousand employees during the pandemic. And you could argue that a lot of that might be necessary from a business standpoint, because some places, some points of employment within Disney, like Disneyland, for example, those remain closed with no firm reopening dates. So it's hard to justify, hey, let's keep all the people from Disneyland on payroll indefinitely when we have no idea when we'll be able to reopen. You can see from a business standpoint how that's a tough call.
Speaker 1: But you could also argue that Disney has been printing money for the last few years, with franchises like Star Wars and Marvel bringing in billions of dollars from the box office alone, and that's before you even touch stuff like merchandising. The company has been losing money, however, and shareholders typically are not super crazy about that. So this is a very difficult road to travel here. I don't want to dismiss this by saying "Disney bad." I am not a fan of them laying off so many people during a pandemic, but at the same time, you have to acknowledge the realities of what business is. So it's not like there's an easy decision to make here. But man, talk about having the magic stripped away, or rather, having to reckon with the realization that magic is all in perception. And when the veil has fallen, it's a lot harder to, you know, recapture that feeling of, oh, it's a magical company.
Speaker 1: It's a company, and it's a company that is very much focused on its forward facing PR with the public, and that took a pretty big hit in twenty twenty. Boeing, Chevron and ExxonMobil also laid off thousands of employees and also remained unprofitable during the pandemic. Uber laid off a lot of its staff. Unemployment claims in the United States have hovered around one million each week, with some of the most recent being between eight hundred thousand and nine hundred thousand claims. The average in a year that is not going through a pandemic in the United States would be two hundred thousand claims or so. Now, we should keep in mind that as bad as all this is, at least it's not the levels that we saw unemployment hit in March, when the claims spiked up to six point nine million. Meanwhile, companies like DoorDash and Instacart saw their valuations skyrocket as demand for home delivery services increased. We saw a lot of this stuff too, with services specializing in deliveries growing quickly and hiring on a lot of people.
Speaker 1: So there were some industries that were actually in a hiring mode, not a laying off mode. Twitter would announce in May that it would flag tweets containing coronavirus misinformation on its service, even if those tweets should come from really important people such as President Trump. And this would kick off essentially a big old mess in the United States, as Twitter would become more aggressive in flagging tweets that contained misinformation, including those coming from Trump, and the categories of misinformation would extend beyond the pandemic as well. This would infuriate the President, who stepped up his opposition to Section 230, a bit of American legislation that gives online platforms protection against what their users post, as well as gives them the authority to moderate posts without fear of litigation. I did a full episode about Section 230 not too long ago. If you want to learn more about that and what the whole argument is both for and against it, I recommend you check that out. Now it's time for us to take a quick break.
Speaker 1: When we come back, we will get through spring and start pushing into the summer of twenty twenty. But first, let's take a quick break. May was when AT&T agreed to stop using "5G Evolution" in its marketing efforts. And if you listened to my episode about 5G myths versus reality, you know that AT&T had rebranded its LTE service, a 4G service, as 5G E, with the icon on users' phones changing to reflect this. Now, a lot of other carriers criticized AT&T for doing this. They claimed that this was a deliberate misdirection with the intent to make people think that they suddenly had a 5G phone, when in fact they didn't. They were using a 4G phone on a 4G network that was being labeled as 5G. AT&T faced some sanctions over using this in their marketing, and so ultimately the company stopped using the terminology in its various advertising campaigns, but the 5G E icon on the phones stuck around.
Also in May, Doug Loverro, who 324 00:19:32,640 --> 00:19:35,719 Speaker 1: had served as the head of Human Spaceflight Operations at 325 00:19:35,800 --> 00:19:40,880 Speaker 1: NASA for six months, resigned abruptly, and in his resignation note, 326 00:19:41,240 --> 00:19:44,560 Speaker 1: he alluded to making a mistake, but there was no 327 00:19:44,600 --> 00:19:49,200 Speaker 1: real elaboration on what that mistake was. Speculation and rumor 328 00:19:49,359 --> 00:19:51,600 Speaker 1: soon took hold. I mean, if there's ever a gap 329 00:19:51,640 --> 00:19:53,920 Speaker 1: in knowledge, people will fill it with whatever they can. 330 00:19:54,400 --> 00:19:56,560 Speaker 1: So could it be that he made some sort of 331 00:19:56,600 --> 00:20:00,439 Speaker 1: controversial decision regarding the use of a commercial space company like 332 00:20:00,520 --> 00:20:04,200 Speaker 1: SpaceX? Might it have something to do with the Artemis program, 333 00:20:04,520 --> 00:20:08,320 Speaker 1: which has the incredibly aggressive goal of getting people back 334 00:20:08,440 --> 00:20:12,920 Speaker 1: on the Moon by twenty twenty four? At first, there wasn't much 335 00:20:13,080 --> 00:20:16,679 Speaker 1: information on the matter, but later on in August, the 336 00:20:16,680 --> 00:20:20,400 Speaker 1: Wall Street Journal reported that a federal criminal probe into 337 00:20:20,440 --> 00:20:24,200 Speaker 1: Loverro's activities was commencing, one that was looking to see if 338 00:20:24,440 --> 00:20:28,280 Speaker 1: Loverro had given any information illegally to Boeing during the 339 00:20:28,280 --> 00:20:31,639 Speaker 1: bidding process to design and build a new human lunar 340 00:20:31,760 --> 00:20:35,800 Speaker 1: lander for the Artemis project. If he had, that would 341 00:20:35,840 --> 00:20:39,800 Speaker 1: be in violation of the Procurement Integrity Act, as NASA 342 00:20:40,040 --> 00:20:44,399 Speaker 1: isn't supposed to show any favoritism to any particular bidding entity.
343 00:20:44,440 --> 00:20:48,040 Speaker 1: You're not supposed to give information that might give one 344 00:20:48,359 --> 00:20:52,879 Speaker 1: bidder an advantage over others. And the announcement 345 00:20:52,920 --> 00:20:56,600 Speaker 1: of his resignation in May was particularly of concern because 346 00:20:56,640 --> 00:20:59,480 Speaker 1: it happened just two weeks before NASA was to rely 347 00:20:59,600 --> 00:21:03,600 Speaker 1: upon SpaceX launch vehicles and spacecraft to take astronauts to 348 00:21:03,600 --> 00:21:06,680 Speaker 1: the International Space Station. So this was just two weeks 349 00:21:06,720 --> 00:21:11,280 Speaker 1: before the first manned launch of a spacecraft from US 350 00:21:11,359 --> 00:21:15,640 Speaker 1: soil since the conclusion of the Space Shuttle program. Bad timing. 351 00:21:16,359 --> 00:21:19,520 Speaker 1: Besides the pandemic and the election, another thing we need 352 00:21:19,560 --> 00:21:22,600 Speaker 1: to remember about twenty twenty in the United States is the Black 353 00:21:22,640 --> 00:21:25,920 Speaker 1: Lives Matter movement and how it would play a big 354 00:21:25,960 --> 00:21:30,439 Speaker 1: part in how tech and society intersect. As organizers relied 355 00:21:30,480 --> 00:21:34,040 Speaker 1: on social platforms to bring people together to demonstrate in 356 00:21:34,080 --> 00:21:38,440 Speaker 1: the real world, so too did those who opposed the movement. 357 00:21:38,840 --> 00:21:43,400 Speaker 1: So you had rights activists clashing with white supremacists online 358 00:21:43,440 --> 00:21:47,080 Speaker 1: and off. Some platforms like Facebook tried to take a 359 00:21:47,200 --> 00:21:51,919 Speaker 1: hands-off, Switzerland-like approach.
Others, like Twitter, tried to 360 00:21:51,960 --> 00:21:56,200 Speaker 1: flag content that contained misinformation or that glorified violence, 361 00:21:56,600 --> 00:22:00,119 Speaker 1: including tweets from Trump, and it became the subject of 362 00:22:00,200 --> 00:22:04,440 Speaker 1: increasingly heated debates about what tech's role is in these 363 00:22:04,480 --> 00:22:08,320 Speaker 1: matters and whether or not these social platforms were making 364 00:22:08,359 --> 00:22:13,800 Speaker 1: things better or worse. Let's move into June. Early that month, 365 00:22:13,960 --> 00:22:17,440 Speaker 1: a group of Facebook employees staged a virtual walkout 366 00:22:17,560 --> 00:22:21,520 Speaker 1: in protest of Zuckerberg's lack of response to Trump's messaging 367 00:22:21,560 --> 00:22:26,080 Speaker 1: on Facebook, largely regarding the Black Lives Matter movement. They 368 00:22:26,080 --> 00:22:28,760 Speaker 1: were arguing that he was turning a blind eye to 369 00:22:28,880 --> 00:22:33,879 Speaker 1: posts that clearly violated Facebook's own terms of service. Internally, 370 00:22:34,200 --> 00:22:38,080 Speaker 1: Facebook managers were told not to retaliate or require employees 371 00:22:38,080 --> 00:22:41,280 Speaker 1: to use paid time off to cover for the walkout, 372 00:22:41,720 --> 00:22:45,080 Speaker 1: and so the company was facing criticism both from external 373 00:22:45,160 --> 00:22:48,760 Speaker 1: sources and from within the company itself.
Not that this 374 00:22:48,800 --> 00:22:52,120 Speaker 1: would change things very much over at Facebook for most 375 00:22:52,119 --> 00:22:55,400 Speaker 1: of twenty twenty. And you might think I have a bias 376 00:22:55,480 --> 00:22:58,320 Speaker 1: against Facebook. And you know what, I guess I kind 377 00:22:58,320 --> 00:23:01,240 Speaker 1: of do. As I've said in previous podcasts, if you 378 00:23:01,359 --> 00:23:04,680 Speaker 1: give the truth and lies an equal ground to stand 379 00:23:04,760 --> 00:23:08,399 Speaker 1: upon, lies are gonna win. It's just way easier to 380 00:23:08,400 --> 00:23:11,080 Speaker 1: tell people what they want to hear rather than to 381 00:23:11,119 --> 00:23:15,120 Speaker 1: make them listen to the truth. At the protests themselves, 382 00:23:15,160 --> 00:23:18,280 Speaker 1: we saw tech also play a role. People used drones 383 00:23:18,320 --> 00:23:22,320 Speaker 1: to get footage of protests, including at least 384 00:23:22,359 --> 00:23:26,879 Speaker 1: one case in which US Customs and Border Protection flew a 385 00:23:27,000 --> 00:23:30,480 Speaker 1: drone over Minneapolis. And I'm not talking about some little 386 00:23:30,560 --> 00:23:33,240 Speaker 1: quadcopter. The drone in this case appeared to be 387 00:23:33,320 --> 00:23:36,600 Speaker 1: a Predator B drone, a.k.a. an MQ 388 00:23:37,080 --> 00:23:40,080 Speaker 1: nine Reaper, which is designed to be a so called 389 00:23:40,400 --> 00:23:44,760 Speaker 1: hunter killer drone, as in, this was a drone designed 390 00:23:44,760 --> 00:23:48,040 Speaker 1: to be armed and used to track down a target 391 00:23:48,119 --> 00:23:51,000 Speaker 1: and fire upon it. Now, the agency said it was 392 00:23:51,040 --> 00:23:53,680 Speaker 1: operating the drone purely as a means to keep police 393 00:23:53,760 --> 00:23:58,040 Speaker 1: informed of potential problems like outbreaks of violence or looting.
394 00:23:58,440 --> 00:24:02,760 Speaker 1: But Customs and Border Protection's jurisdiction ends once you 395 00:24:02,840 --> 00:24:06,120 Speaker 1: get one hundred miles out from US borders. I mean, 396 00:24:06,119 --> 00:24:11,359 Speaker 1: it's Customs and Border Protection. Minneapolis is three hundred miles 397 00:24:11,400 --> 00:24:14,760 Speaker 1: away from the Canadian border, so that creates a pretty 398 00:24:14,840 --> 00:24:18,960 Speaker 1: thorny basis for operations. How can the U.S. 399 00:24:19,000 --> 00:24:23,240 Speaker 1: Customs and Border Protection justify flying a drone over Minneapolis 400 00:24:23,600 --> 00:24:26,879 Speaker 1: when it has no jurisdiction over that area? This is 401 00:24:27,880 --> 00:24:31,760 Speaker 1: a troubling matter. The use of drones raised multiple privacy 402 00:24:31,800 --> 00:24:35,359 Speaker 1: concerns and fueled allegations that the government was actively attempting 403 00:24:35,400 --> 00:24:40,800 Speaker 1: to disrupt citizens as they exercised their constitutionally guaranteed First 404 00:24:40,800 --> 00:24:45,880 Speaker 1: Amendment rights to public assembly and free speech. One Twitter follower 405 00:24:45,960 --> 00:24:48,439 Speaker 1: of mine suggested that I talk about how the federal 406 00:24:48,480 --> 00:24:52,200 Speaker 1: government used drones and other aircraft to jam cell phone 407 00:24:52,240 --> 00:24:55,560 Speaker 1: service over areas that had protesters. Now, as far as 408 00:24:55,560 --> 00:24:59,719 Speaker 1: I know, that never actually happened. Jamming cell phone service 409 00:24:59,800 --> 00:25:04,480 Speaker 1: is technically possible, but legally it's very tricky to do, 410 00:25:04,840 --> 00:25:08,000 Speaker 1: and it brings along a lot of collateral damage because 411 00:25:08,080 --> 00:25:12,160 Speaker 1: anyone in an area would find their service disrupted, whether 412 00:25:12,200 --> 00:25:14,800 Speaker 1: they were part of a protest or not.
You can 413 00:25:14,840 --> 00:25:18,160 Speaker 1: easily imagine how that could lead to disaster. I mean, 414 00:25:18,200 --> 00:25:21,000 Speaker 1: you could have a medical emergency, or there could be 415 00:25:21,040 --> 00:25:24,040 Speaker 1: a fire not even connected to the protests, and you 416 00:25:24,040 --> 00:25:27,679 Speaker 1: wouldn't have any way of contacting first responders to 417 00:25:27,760 --> 00:25:32,160 Speaker 1: get aid. But there were reports of various agencies, federal 418 00:25:32,280 --> 00:25:36,119 Speaker 1: and local, using things like helicopters, drones, and planes to 419 00:25:36,200 --> 00:25:41,960 Speaker 1: conduct surveillance on protesters, which definitely was happening. The cell 420 00:25:42,000 --> 00:25:46,840 Speaker 1: phone jamming probably not, but surveillance definitely happened, including some 421 00:25:46,880 --> 00:25:49,680 Speaker 1: allegations that the government was making use of devices called 422 00:25:49,880 --> 00:25:53,479 Speaker 1: sting rays. Now, a sting ray poses as a legit 423 00:25:53,600 --> 00:25:57,000 Speaker 1: cell phone service tower; as far as phones are concerned, 424 00:25:57,440 --> 00:26:00,040 Speaker 1: that's what it is. But its real purpose is to 425 00:26:00,119 --> 00:26:02,719 Speaker 1: pick up data from phones that are in the area. 426 00:26:03,119 --> 00:26:05,399 Speaker 1: So it's kind of like having someone posted at a 427 00:26:05,440 --> 00:26:08,159 Speaker 1: cell phone tower and making note of all the phones 428 00:26:08,200 --> 00:26:11,480 Speaker 1: that connect to that tower and to what those phones 429 00:26:11,520 --> 00:26:14,320 Speaker 1: are then connecting to, like if they're making a 430 00:26:14,359 --> 00:26:18,160 Speaker 1: call somewhere or uploading data to a service.
In October 431 00:26:18,200 --> 00:26:21,320 Speaker 1: of twenty twenty, a group of US representatives called for an 432 00:26:21,359 --> 00:26:25,280 Speaker 1: investigation into the surveillance techniques that were being used by 433 00:26:25,320 --> 00:26:29,840 Speaker 1: federal agencies during the Black Lives Matter protests and demonstrations, 434 00:26:30,160 --> 00:26:34,240 Speaker 1: questioning whether the government violated the constitutional rights of US citizens. 435 00:26:34,800 --> 00:26:37,919 Speaker 1: As I record this, that investigation has not concluded, but it 436 00:26:38,040 --> 00:26:41,720 Speaker 1: points to how serious this issue was. The protests would 437 00:26:41,760 --> 00:26:45,200 Speaker 1: affect the tech sector in other ways as well. Sony 438 00:26:45,320 --> 00:26:48,679 Speaker 1: chose to postpone its PlayStation five event, largely due to 439 00:26:48,720 --> 00:26:53,439 Speaker 1: the protests. Various online personalities such as YouTubers and Twitch 440 00:26:53,480 --> 00:26:56,720 Speaker 1: streamers would use their platforms to help raise money for 441 00:26:56,760 --> 00:27:00,840 Speaker 1: the activist movement. IBM canceled an entire program that 442 00:27:00,920 --> 00:27:04,000 Speaker 1: was using machine learning and artificial intelligence to build facial 443 00:27:04,000 --> 00:27:09,760 Speaker 1: recognition systems, recognizing that these technologies had a disproportionately negative 444 00:27:09,800 --> 00:27:13,720 Speaker 1: impact on people of color. Amazon announced it would stop 445 00:27:13,720 --> 00:27:17,320 Speaker 1: providing facial recognition software to police for the term of 446 00:27:17,359 --> 00:27:20,800 Speaker 1: a year, with company representatives saying that their 447 00:27:20,840 --> 00:27:25,080 Speaker 1: hope was that the US government would quote implement appropriate 448 00:27:25,280 --> 00:27:30,399 Speaker 1: rules end quote regarding facial recognition technologies.
During that year 449 00:27:30,440 --> 00:27:34,399 Speaker 1: long ban, Microsoft also announced it would not sell facial 450 00:27:34,440 --> 00:27:38,000 Speaker 1: recognition technology to police departments in the United States. So 451 00:27:38,040 --> 00:27:41,800 Speaker 1: we started to see corporations make a shift as 452 00:27:42,040 --> 00:27:45,679 Speaker 1: the public sentiment grew around the Black Lives Matter movement. 453 00:27:46,040 --> 00:27:50,560 Speaker 1: On a totally different note, Microsoft announced in June that 454 00:27:50,600 --> 00:27:53,800 Speaker 1: it was going to shut down its streaming video game 455 00:27:53,840 --> 00:27:58,679 Speaker 1: platform known as Mixer. So it's kind of like Twitch. 456 00:27:58,720 --> 00:28:01,560 Speaker 1: It's a platform for people to stream their video game 457 00:28:01,800 --> 00:28:05,680 Speaker 1: playing sessions to an audience, and they chose instead to 458 00:28:05,760 --> 00:28:09,800 Speaker 1: partner with Facebook Gaming. Some very high profile streamers 459 00:28:10,200 --> 00:28:15,280 Speaker 1: had been enticed to leave Twitch for lucrative contracts with Microsoft. 460 00:28:15,560 --> 00:28:18,600 Speaker 1: Among them were guys like Ninja and Shroud, two of the 461 00:28:18,640 --> 00:28:22,920 Speaker 1: most popular video game streamers in the world. Microsoft spent 462 00:28:23,320 --> 00:28:26,520 Speaker 1: a lot of money on these exclusive agreements with these 463 00:28:26,560 --> 00:28:29,800 Speaker 1: top streamers, hoping that they would bring their legions of 464 00:28:29,880 --> 00:28:33,760 Speaker 1: fans along with them to Mixer, but that really just 465 00:28:33,800 --> 00:28:37,600 Speaker 1: didn't happen. Twitch was pulling in fifteen million visits per 466 00:28:37,680 --> 00:28:42,560 Speaker 1: day with one hundred million monthly active users.
Mixer, on the 467 00:28:42,600 --> 00:28:47,360 Speaker 1: other hand, was managing ten million monthly active users, literally 468 00:28:47,560 --> 00:28:51,520 Speaker 1: one tenth of what Twitch was pulling. Microsoft had introduced 469 00:28:51,520 --> 00:28:55,480 Speaker 1: a lot of innovative features that users and critics were praising, 470 00:28:55,560 --> 00:28:58,640 Speaker 1: but it just couldn't gain much ground against the juggernauts 471 00:28:58,640 --> 00:29:02,440 Speaker 1: of Twitch and YouTube Gaming. The low return on investment 472 00:29:02,800 --> 00:29:05,440 Speaker 1: meant that Microsoft decided to pull the plug and folks 473 00:29:05,440 --> 00:29:08,000 Speaker 1: like Ninja and Shroud were free to go wherever they pleased, 474 00:29:08,320 --> 00:29:11,840 Speaker 1: and the Mixer technology would transition over to Facebook Gaming, 475 00:29:11,960 --> 00:29:15,520 Speaker 1: which I have honestly never checked out. I have no 476 00:29:15,600 --> 00:29:20,080 Speaker 1: idea how big Facebook Gaming is, which is odd. 477 00:29:20,400 --> 00:29:22,400 Speaker 1: It's one of those things that I've heard about, but 478 00:29:22,480 --> 00:29:25,320 Speaker 1: I have never experienced. I am more of a Twitch 479 00:29:25,400 --> 00:29:29,800 Speaker 1: and YouTube Gaming kind of guy. Also tying into this 480 00:29:30,440 --> 00:29:33,080 Speaker 1: was that in June we saw another surge of the 481 00:29:33,240 --> 00:29:37,520 Speaker 1: Me Too movement, this time focused on video games and streaming.
482 00:29:38,080 --> 00:29:42,080 Speaker 1: Many women streamers came forward with accusations against influential people 483 00:29:42,240 --> 00:29:46,160 Speaker 1: in the streaming community, most of them men, saying that 484 00:29:46,480 --> 00:29:50,080 Speaker 1: they had been manipulated by these people, they had been groomed, 485 00:29:50,160 --> 00:29:53,800 Speaker 1: they had been coerced, and it's all really ugly, ugly stuff. 486 00:29:54,120 --> 00:29:56,240 Speaker 1: And this was also when we started hearing more about 487 00:29:56,280 --> 00:30:00,800 Speaker 1: ongoing sexual harassment issues at the major video game developer Ubisoft, 488 00:30:01,200 --> 00:30:05,480 Speaker 1: which would go through a tumultuous series of personnel changes 489 00:30:05,560 --> 00:30:09,400 Speaker 1: as a result. And I recently did several episodes about Ubisoft, 490 00:30:09,600 --> 00:30:11,120 Speaker 1: so if you want to know more, you can check 491 00:30:11,120 --> 00:30:14,080 Speaker 1: those out. But some of the people who were ultimately 492 00:30:14,160 --> 00:30:16,400 Speaker 1: let go had been with the company for more than 493 00:30:16,440 --> 00:30:21,640 Speaker 1: a decade and included very high ranking executives. Meanwhile, over 494 00:30:21,680 --> 00:30:24,520 Speaker 1: at Facebook, the company was now dealing with more than 495 00:30:24,600 --> 00:30:28,680 Speaker 1: just criticism. Advertisers were starting to leave the platform.
They 496 00:30:28,680 --> 00:30:32,280 Speaker 1: were concerned that Facebook's reluctance to deal with harassment and 497 00:30:32,400 --> 00:30:35,800 Speaker 1: misinformation and hate groups would end up reflecting poorly on 498 00:30:35,840 --> 00:30:39,880 Speaker 1: the companies that were using Facebook to advertise there. Verizon 499 00:30:40,040 --> 00:30:43,680 Speaker 1: would announce it would withdraw ads from Facebook in late June, 500 00:30:43,720 --> 00:30:45,400 Speaker 1: and that must have been a serious wake up call 501 00:30:45,480 --> 00:30:48,560 Speaker 1: for the company because most of the companies that were 502 00:30:48,560 --> 00:30:51,240 Speaker 1: doing this, backing away from and boycotting Facebook, 503 00:30:51,280 --> 00:30:54,680 Speaker 1: were in the small to medium range, which honestly doesn't 504 00:30:54,680 --> 00:30:58,880 Speaker 1: represent very much of Facebook's revenue. It's a few 505 00:30:59,080 --> 00:31:02,360 Speaker 1: very, very large companies that represent the bulk of revenue. 506 00:31:02,400 --> 00:31:06,640 Speaker 1: So a company like Verizon definitely got a lot more attention, 507 00:31:06,800 --> 00:31:10,320 Speaker 1: and this became known as the hashtag Stop Hate for 508 00:31:10,440 --> 00:31:15,160 Speaker 1: Profit movement. More than one thousand advertisers would join this campaign, 509 00:31:15,640 --> 00:31:19,280 Speaker 1: but many of those also said that their boycott was 510 00:31:19,600 --> 00:31:22,640 Speaker 1: temporary and that the pandemic had made it too risky 511 00:31:22,760 --> 00:31:25,560 Speaker 1: to remain off Facebook for very long, like it was 512 00:31:26,280 --> 00:31:30,240 Speaker 1: a necessary evil, essentially, and the company managed to actually grow 513 00:31:30,280 --> 00:31:33,440 Speaker 1: its revenue despite this boycott.
As The New York Times 514 00:31:33,440 --> 00:31:36,120 Speaker 1: would point out in November, the movement did more to 515 00:31:36,200 --> 00:31:40,640 Speaker 1: hurt the platform's reputation than its actual bottom line. That's 516 00:31:40,640 --> 00:31:43,880 Speaker 1: not to say that a future, more widespread effort wouldn't 517 00:31:43,920 --> 00:31:47,200 Speaker 1: have a greater impact, but this was more like a 518 00:31:47,400 --> 00:31:51,800 Speaker 1: warning signal sent to Facebook than anything else. Facebook did 519 00:31:51,840 --> 00:31:55,800 Speaker 1: agree to ban negative ads about immigrants or those who 520 00:31:55,840 --> 00:31:59,640 Speaker 1: seek asylum, as well as ads that include hate 521 00:31:59,640 --> 00:32:04,200 Speaker 1: speech or discrimination. Zuckerberg said Facebook would also begin to 522 00:32:04,360 --> 00:32:08,680 Speaker 1: label quote unquote newsworthy posts that violate the site's policies, 523 00:32:08,920 --> 00:32:11,880 Speaker 1: the idea being that if a post is newsworthy, then 524 00:32:12,240 --> 00:32:16,400 Speaker 1: it can still be on Facebook, even if otherwise the 525 00:32:16,600 --> 00:32:20,800 Speaker 1: content of that post would violate Facebook's terms of service, 526 00:32:21,320 --> 00:32:26,280 Speaker 1: but now at least Facebook would flag those. Meanwhile, 527 00:32:26,600 --> 00:32:30,160 Speaker 1: YouTube would ban David Duke and Richard Spencer and a 528 00:32:30,160 --> 00:32:34,000 Speaker 1: few other prominent white supremacist channels from its platform after 529 00:32:34,120 --> 00:32:36,840 Speaker 1: being pressured to do so, and a year after the 530 00:32:36,880 --> 00:32:39,520 Speaker 1: company had said it was going to ban accounts that 531 00:32:39,560 --> 00:32:43,000 Speaker 1: posted videos supporting white supremacists.
So there was a big 532 00:32:43,040 --> 00:32:45,360 Speaker 1: question about why it took so long for these particular 533 00:32:45,440 --> 00:32:49,120 Speaker 1: high profile cases. By the company's own policies, this action 534 00:32:49,240 --> 00:32:52,239 Speaker 1: was long overdue, and it followed what other platforms like 535 00:32:52,360 --> 00:32:54,320 Speaker 1: Reddit had been doing in the wake of the Black 536 00:32:54,320 --> 00:32:58,080 Speaker 1: Lives Matter movement. These bans would sometimes include accounts or 537 00:32:58,160 --> 00:33:02,360 Speaker 1: subreddits that purported to support President Trump, turning this into 538 00:33:02,400 --> 00:33:05,200 Speaker 1: kind of a political controversy, though the reason for the 539 00:33:05,240 --> 00:33:09,360 Speaker 1: removal wasn't a political affiliation; it had to do 540 00:33:09,400 --> 00:33:13,600 Speaker 1: with the actual content within these accounts and subreddits. So, 541 00:33:13,640 --> 00:33:18,160 Speaker 1: for example, the subreddit community The_Donald was home to 542 00:33:18,240 --> 00:33:21,280 Speaker 1: a lot of posts that included hate speech and promoted 543 00:33:21,320 --> 00:33:25,600 Speaker 1: conspiracy theories that blamed minorities for various societal problems, so 544 00:33:25,640 --> 00:33:28,880 Speaker 1: it got the axe. Toward late June, we learned that 545 00:33:29,000 --> 00:33:32,440 Speaker 1: Uber was in talks to acquire the delivery service Postmates.
546 00:33:33,000 --> 00:33:36,880 Speaker 1: Uber already has its own food delivery service with Uber Eats, 547 00:33:36,920 --> 00:33:40,800 Speaker 1: and the talks progressed throughout the year, ending in November, when the 548 00:33:40,800 --> 00:33:44,160 Speaker 1: companies agreed to an all stock transaction that would see 549 00:33:44,200 --> 00:33:47,920 Speaker 1: Postmates move to Uber for the princely sum of two 550 00:33:47,960 --> 00:33:52,120 Speaker 1: point six five billion dollars worth of stock. Uber had 551 00:33:52,120 --> 00:33:56,040 Speaker 1: previously attempted to scoop up Grubhub, but the company 552 00:33:56,080 --> 00:33:58,160 Speaker 1: was beaten to the punch by a different company out 553 00:33:58,160 --> 00:34:05,280 Speaker 1: of Europe called Just Eat Takeaway for seven billion dollars. Holy cats. 554 00:34:05,960 --> 00:34:09,000 Speaker 1: One bit of more lighthearted news that I can add 555 00:34:09,000 --> 00:34:12,239 Speaker 1: to June was published on my birthday, that would be 556 00:34:12,360 --> 00:34:16,719 Speaker 1: June twenty sixth. That's when I learned that NASA has 557 00:34:16,800 --> 00:34:20,920 Speaker 1: a design competition, or rather had a design competition, and 558 00:34:21,000 --> 00:34:23,960 Speaker 1: the first prize of this competition was twenty thousand dollars, 559 00:34:24,239 --> 00:34:27,439 Speaker 1: second prize was ten thousand dollars, and third prize five 560 00:34:27,440 --> 00:34:32,120 Speaker 1: thousand dollars. And what you were meant to design was 561 00:34:32,560 --> 00:34:36,279 Speaker 1: a toilet. Yep. NASA was looking for designs for a 562 00:34:36,320 --> 00:34:39,319 Speaker 1: toilet that can work in microgravity as well as 563 00:34:39,360 --> 00:34:42,200 Speaker 1: the reduced gravity on the Moon.
The Moon's gravity is 564 00:34:42,200 --> 00:34:45,880 Speaker 1: about one sixth that of Earth's, and presumably this would 565 00:34:45,880 --> 00:34:49,600 Speaker 1: be on the lunar lander module of the future, so 566 00:34:49,880 --> 00:34:53,600 Speaker 1: they had very strict size and weight limitations for this 567 00:34:53,680 --> 00:34:57,279 Speaker 1: particular piece of technology. And because NASA intends to send 568 00:34:57,320 --> 00:34:59,719 Speaker 1: women to the Moon, it means the toilet has to 569 00:34:59,760 --> 00:35:03,240 Speaker 1: be able to work for both men and women. And as a bonus, 570 00:35:03,400 --> 00:35:06,640 Speaker 1: said NASA, and I swear I am not making this up, 571 00:35:07,200 --> 00:35:09,560 Speaker 1: is that the toilet should be able to deal with 572 00:35:10,200 --> 00:35:13,560 Speaker 1: an astronaut who needs to vomit without the astronaut having 573 00:35:13,600 --> 00:35:18,279 Speaker 1: to place their head in said toilet. That was the 574 00:35:18,360 --> 00:35:22,080 Speaker 1: thing that NASA was looking for. And if you have 575 00:35:22,120 --> 00:35:23,920 Speaker 1: a great idea for the design, well, I hate to 576 00:35:23,960 --> 00:35:27,600 Speaker 1: be the bearer of crappy news. But the deadline was in August, 577 00:35:28,320 --> 00:35:30,759 Speaker 1: so I guess I just flushed all those hopes and 578 00:35:30,840 --> 00:35:34,920 Speaker 1: dreams down the moon toilet. When we come back, we'll 579 00:35:34,960 --> 00:35:37,000 Speaker 1: head into July and see if I can compress the 580 00:35:37,040 --> 00:35:39,120 Speaker 1: second half of the year's news into one third of 581 00:35:39,120 --> 00:35:43,640 Speaker 1: an episode. Spoiler alert. I couldn't, so we will have 582 00:35:43,680 --> 00:35:46,080 Speaker 1: a part three. But before we get into all of that, 583 00:35:46,160 --> 00:35:56,200 Speaker 1: let's take another break.
I haven't really talked about it 584 00:35:56,320 --> 00:35:58,640 Speaker 1: so far in these episodes, but one of the stories 585 00:35:58,680 --> 00:36:01,279 Speaker 1: that played out over the course of twenty twenty was the 586 00:36:01,320 --> 00:36:04,680 Speaker 1: fate of TikTok and how the US government was continuing 587 00:36:04,719 --> 00:36:07,799 Speaker 1: to exert pressure on the company for its links to 588 00:36:08,080 --> 00:36:12,600 Speaker 1: a Chinese parent company. Like Huawei, the implication was that 589 00:36:13,120 --> 00:36:17,160 Speaker 1: the company was potentially vulnerable to Chinese government interference, and 590 00:36:17,239 --> 00:36:20,279 Speaker 1: since so many Americans were using the app, it could 591 00:36:20,280 --> 00:36:23,960 Speaker 1: pose a national security concern. Or if you look 592 00:36:23,960 --> 00:36:26,880 Speaker 1: at it another way, you could say this was a 593 00:36:26,960 --> 00:36:30,360 Speaker 1: high profile company that the US government could target in 594 00:36:30,440 --> 00:36:34,040 Speaker 1: an ongoing trade dispute between the United States and China 595 00:36:34,400 --> 00:36:38,000 Speaker 1: and thus be a point of pain that the US could 596 00:36:38,000 --> 00:36:41,239 Speaker 1: inflict on China. Now, the truth behind the motivations to 597 00:36:41,400 --> 00:36:46,759 Speaker 1: pressure TikTok is probably somewhere between those two pillars of philosophy.
598 00:36:47,120 --> 00:36:50,680 Speaker 1: Cybersecurity experts, however, have pointed out that it is unlikely 599 00:36:50,760 --> 00:36:54,480 Speaker 1: that China would be able to access data about US 600 00:36:54,600 --> 00:36:58,840 Speaker 1: TikTok users because that data would be stored on servers 601 00:36:58,880 --> 00:37:02,040 Speaker 1: that are here in the United States and thus outside 602 00:37:02,040 --> 00:37:05,240 Speaker 1: the direct grasp of China, and that the TikTok CEO 603 00:37:05,520 --> 00:37:09,520 Speaker 1: is an American CEO, so there might be a desire 604 00:37:09,760 --> 00:37:12,920 Speaker 1: to access all that stuff, but not necessarily an actual 605 00:37:12,960 --> 00:37:16,600 Speaker 1: ability to do that. Anyway, in July, we saw some 606 00:37:16,719 --> 00:37:21,400 Speaker 1: curious reactions to TikTok in corporate America. Wells Fargo sent 607 00:37:21,480 --> 00:37:24,920 Speaker 1: a directive to employees to delete TikTok from any corporate 608 00:37:24,960 --> 00:37:29,000 Speaker 1: owned devices, citing security concerns. Now, I don't think this 609 00:37:29,080 --> 00:37:32,560 Speaker 1: is actually that unusual. Companies must frequently deal with the 610 00:37:32,560 --> 00:37:36,799 Speaker 1: possibility of data breaches or security vulnerabilities, and many of 611 00:37:36,840 --> 00:37:40,000 Speaker 1: them are created not because of a hole in the 612 00:37:40,160 --> 00:37:43,120 Speaker 1: technical security that a company has put in place, but 613 00:37:43,280 --> 00:37:47,640 Speaker 1: rather the human element, the behavior of employees.
I felt 614 00:37:47,680 --> 00:37:50,480 Speaker 1: that Wells Fargo was actually pretty much in the right 615 00:37:50,520 --> 00:37:53,600 Speaker 1: with this one, as the directive really only covered corporate 616 00:37:53,680 --> 00:37:56,680 Speaker 1: owned devices, so to me, it's the same as if 617 00:37:56,719 --> 00:37:59,680 Speaker 1: my company were to tell me, hey, you can't install 618 00:37:59,760 --> 00:38:03,320 Speaker 1: games on your work computer because that computer doesn't 619 00:38:03,320 --> 00:38:06,279 Speaker 1: belong to you, it belongs to the company. So to me, 620 00:38:07,440 --> 00:38:11,040 Speaker 1: that works the same way with this Wells Fargo example. 621 00:38:11,160 --> 00:38:14,160 Speaker 1: TikTok's response, however, was to say that it was open 622 00:38:14,200 --> 00:38:19,040 Speaker 1: to talk with Wells Fargo about how TikTok secures user privacy, 623 00:38:19,080 --> 00:38:21,480 Speaker 1: and maybe if Wells Fargo had told employees that they 624 00:38:21,520 --> 00:38:24,840 Speaker 1: couldn't use TikTok on their own personal phones, I would 625 00:38:24,840 --> 00:38:29,239 Speaker 1: feel that that was a totally different conversation and would 626 00:38:29,320 --> 00:38:32,120 Speaker 1: be unjustifiable. But as it stands, I don't really see 627 00:38:32,120 --> 00:38:35,160 Speaker 1: a big deal here, because even if you remove the 628 00:38:35,360 --> 00:38:38,359 Speaker 1: Chinese element, let's say, all right, let's 629 00:38:38,360 --> 00:38:40,919 Speaker 1: say that it is impossible for China to get hold 630 00:38:40,960 --> 00:38:44,799 Speaker 1: of that information, which is probably true. It's still a 631 00:38:44,840 --> 00:38:48,880 Speaker 1: problem to have an app from a different company installed 632 00:38:48,920 --> 00:38:52,960 Speaker 1: onto a corporate owned device because it could represent a 633 00:38:53,000 --> 00:38:57,480 Speaker 1: potential security leak.
And we already give a lot of 634 00:38:57,480 --> 00:39:02,320 Speaker 1: our information over to companies, both private and publicly traded, 635 00:39:02,960 --> 00:39:06,120 Speaker 1: and that's already an issue. It's almost like the 636 00:39:07,000 --> 00:39:13,200 Speaker 1: problem is present. It's just that the ultimate 637 00:39:13,480 --> 00:39:18,759 Speaker 1: entity that we should be afraid of is misidentified, if 638 00:39:18,800 --> 00:39:22,640 Speaker 1: that makes sense. Now, interestingly, Amazon had also sent a 639 00:39:22,719 --> 00:39:26,680 Speaker 1: similar directive to its employees to delete TikTok from corporate 640 00:39:26,719 --> 00:39:29,640 Speaker 1: owned devices, but within a couple of hours of that, 641 00:39:29,800 --> 00:39:32,720 Speaker 1: Amazon sent a follow up email saying that the first 642 00:39:32,760 --> 00:39:38,239 Speaker 1: message was sent in error, so Amazon retracted that directive. Now, 643 00:39:38,280 --> 00:39:41,279 Speaker 1: we did see one big security breach in 644 00:39:41,400 --> 00:39:46,360 Speaker 1: July, one that was buzzworthy. Hackers gained access 645 00:39:46,400 --> 00:39:50,000 Speaker 1: to multiple high profile accounts on Twitter belonging to people 646 00:39:50,000 --> 00:39:54,760 Speaker 1: like Bill Gates, Barack Obama, Joe Biden, Elon Musk, and others. 647 00:39:55,320 --> 00:39:58,960 Speaker 1: The accounts blasted out messages promising something too good to 648 00:39:59,000 --> 00:40:02,319 Speaker 1: be true, and we all know what that means. But 649 00:40:02,560 --> 00:40:04,640 Speaker 1: in this case, the messages said that people should 650 00:40:04,640 --> 00:40:09,800 Speaker 1: send bitcoin, the digital cryptocurrency, to a particular digital wallet, 651 00:40:09,960 --> 00:40:13,479 Speaker 1: and in return they would receive double what they sent.
652 00:40:13,920 --> 00:40:16,640 Speaker 1: And if you're thinking, well, that just don't make no sense, 653 00:40:17,040 --> 00:40:19,480 Speaker 1: then you're way ahead of the game, because this is 654 00:40:19,560 --> 00:40:22,879 Speaker 1: just a scam that preys on the impulsive and the greedy. 655 00:40:23,000 --> 00:40:25,640 Speaker 1: If I walked up to you and said, hey, if 656 00:40:25,680 --> 00:40:27,920 Speaker 1: you give me a five dollar bill, I'll give you 657 00:40:27,960 --> 00:40:31,400 Speaker 1: ten bucks, you would probably be suspicious of that, and 658 00:40:31,440 --> 00:40:34,080 Speaker 1: you should be, because I'm a shifty character. But for 659 00:40:34,120 --> 00:40:38,280 Speaker 1: some reason, scammers use this technique with Bitcoin, and compromising 660 00:40:38,320 --> 00:40:42,120 Speaker 1: such high profile accounts to do this seems to give 661 00:40:42,160 --> 00:40:44,520 Speaker 1: the scam a bit more legitimacy. I mean, if it's 662 00:40:44,520 --> 00:40:46,680 Speaker 1: coming from Barack Obama, it's got to be legit, right? 663 00:40:46,960 --> 00:40:50,120 Speaker 1: So maybe that was the real difference here. So how 664 00:40:50,160 --> 00:40:53,160 Speaker 1: the heck did this hack even happen? Well, once again 665 00:40:53,200 --> 00:40:55,440 Speaker 1: we saw that the way into the system wasn't to 666 00:40:55,600 --> 00:40:59,760 Speaker 1: brute force passwords or anything like that. Rather, the scammers 667 00:40:59,800 --> 00:41:03,400 Speaker 1: targeted Twitter employees, trying to trick them into going to 668 00:41:03,520 --> 00:41:08,040 Speaker 1: a dummy website to enter their credentials and solve some 669 00:41:08,239 --> 00:41:12,080 Speaker 1: imaginary problem that was in the initial message.
And it's 670 00:41:12,080 --> 00:41:14,759 Speaker 1: the sort of approach that requires a very wide net, 671 00:41:14,840 --> 00:41:16,960 Speaker 1: but you only have to catch a few fishies for 672 00:41:17,000 --> 00:41:20,280 Speaker 1: it to pay off. And it paid off. The scammers 673 00:41:20,280 --> 00:41:24,440 Speaker 1: were able to get enough Twitter administrators to give them 674 00:41:24,480 --> 00:41:27,240 Speaker 1: that kind of access, and they began to use Twitter's 675 00:41:27,320 --> 00:41:32,000 Speaker 1: methods to access various accounts, which actually raises questions about 676 00:41:32,040 --> 00:41:34,640 Speaker 1: whether or not Twitter could ever do this to your account. 677 00:41:34,960 --> 00:41:39,759 Speaker 1: Could Twitter send out messages under your account without your permission? 678 00:41:40,040 --> 00:41:43,800 Speaker 1: But that's a topic for another discussion, I guess. Twitter 679 00:41:43,880 --> 00:41:46,440 Speaker 1: eventually got it all locked down, but it was a 680 00:41:46,440 --> 00:41:50,000 Speaker 1: sobering experience, particularly in a year where there was so 681 00:41:50,120 --> 00:41:54,600 Speaker 1: much misinformation running rampant regarding everything from the pandemic to 682 00:41:54,800 --> 00:41:58,680 Speaker 1: social movements to the upcoming US elections. Towards the end 683 00:41:58,719 --> 00:42:03,680 Speaker 1: of July, the US Congress called upon executives from Google, Apple, Amazon, 684 00:42:03,760 --> 00:42:07,400 Speaker 1: and Facebook to appear for hearings about whether these companies 685 00:42:07,400 --> 00:42:12,520 Speaker 1: are engaging in anti competitive practices and monopolizing various tech sectors, 686 00:42:12,760 --> 00:42:17,120 Speaker 1: primarily digital advertising.
And it was the first antitrust hearing 687 00:42:17,160 --> 00:42:20,440 Speaker 1: in the tech sector in the United States since Microsoft 688 00:42:20,480 --> 00:42:24,120 Speaker 1: had to do this back in the late nineties. Now, the outcome 689 00:42:24,160 --> 00:42:27,920 Speaker 1: of that Microsoft case was initially a threat of breaking 690 00:42:27,960 --> 00:42:32,279 Speaker 1: Microsoft up into separate companies, but ultimately Microsoft was able 691 00:42:32,320 --> 00:42:34,759 Speaker 1: to come to an agreement with the government that kept 692 00:42:34,800 --> 00:42:38,879 Speaker 1: it whole, so it didn't have to split up. These 693 00:42:38,880 --> 00:42:43,360 Speaker 1: discussions with Google, Apple, Amazon, and Facebook continue up to 694 00:42:43,400 --> 00:42:46,360 Speaker 1: the recording of this podcast, but for one company in particular, 695 00:42:46,440 --> 00:42:50,960 Speaker 1: and that is Facebook, things escalated dramatically. In December, a 696 00:42:51,000 --> 00:42:55,040 Speaker 1: collection of states and the US federal government sued Facebook 697 00:42:55,040 --> 00:42:59,440 Speaker 1: in a pair of antitrust lawsuits, and the Federal Trade Commission, 698 00:42:59,680 --> 00:43:03,360 Speaker 1: or FTC, started taking steps toward demanding that Facebook 699 00:43:03,400 --> 00:43:06,680 Speaker 1: divest itself of some of the properties it has acquired 700 00:43:06,719 --> 00:43:10,640 Speaker 1: over the years, like Instagram and WhatsApp. So Facebook has 701 00:43:10,640 --> 00:43:14,080 Speaker 1: a history of acquiring companies that fill in gaps in 702 00:43:14,160 --> 00:43:18,120 Speaker 1: what Facebook can offer, particularly from other types of social 703 00:43:18,120 --> 00:43:23,200 Speaker 1: networks that serve a slightly different purpose from Facebook's main site.
704 00:43:23,719 --> 00:43:26,680 Speaker 1: And sometimes it's just easier to buy up your competition 705 00:43:26,920 --> 00:43:30,440 Speaker 1: than to try to outperform them on their own turf. But 706 00:43:30,560 --> 00:43:33,880 Speaker 1: now various governments and agencies are arguing that Facebook has 707 00:43:33,880 --> 00:43:39,280 Speaker 1: effectively established an exclusive lock on certain online services, particularly 708 00:43:39,320 --> 00:43:43,399 Speaker 1: with regard to monetization, and that further, the company has 709 00:43:43,440 --> 00:43:47,600 Speaker 1: actively tried to discourage competition from other companies, and that's 710 00:43:47,600 --> 00:43:50,480 Speaker 1: a big no no. It's still way too early to 711 00:43:50,480 --> 00:43:52,400 Speaker 1: say how this is all going to turn out, and 712 00:43:52,440 --> 00:43:56,359 Speaker 1: it's quite possible that litigation could last several years. Now, 713 00:43:56,360 --> 00:43:59,360 Speaker 1: complicating matters is the fact that the US government changes 714 00:43:59,480 --> 00:44:02,640 Speaker 1: over election cycles, and new leaders might not share 715 00:44:02,680 --> 00:44:05,960 Speaker 1: the same concerns or beliefs about Facebook as the ones 716 00:44:06,000 --> 00:44:09,680 Speaker 1: who are currently pursuing this litigation, and there are those 717 00:44:09,680 --> 00:44:13,640 Speaker 1: who argue that should the antitrust lawsuits conclude with Facebook 718 00:44:13,680 --> 00:44:17,360 Speaker 1: having to divest itself of Instagram and WhatsApp, that the 719 00:44:17,440 --> 00:44:21,480 Speaker 1: underlying problem itself still won't be solved. So, in other words, 720 00:44:21,760 --> 00:44:24,839 Speaker 1: this process is going about things the wrong way.
But 721 00:44:25,040 --> 00:44:28,400 Speaker 1: it's a very complicated issue with a lot of vested 722 00:44:28,440 --> 00:44:31,719 Speaker 1: parties involved. Honestly, I don't know what the right 723 00:44:31,760 --> 00:44:35,720 Speaker 1: answer is here, or even if there is a right answer, 724 00:44:36,160 --> 00:44:39,600 Speaker 1: because tech is simple, but business is hard, you know. 725 00:44:40,760 --> 00:44:44,120 Speaker 1: By the end of July, Google's parent company, Alphabet, posted 726 00:44:44,160 --> 00:44:48,040 Speaker 1: the first decline in revenue in the company's history. Now, 727 00:44:48,120 --> 00:44:51,239 Speaker 1: that doesn't mean the company lost money. Rather, it means 728 00:44:51,280 --> 00:44:54,440 Speaker 1: the company made less money than it had the year before, 729 00:44:54,520 --> 00:44:57,600 Speaker 1: so it didn't grow. It indicated that the company wasn't 730 00:44:57,600 --> 00:45:00,520 Speaker 1: growing at the same rate as it had been. Alphabet 731 00:45:00,560 --> 00:45:03,880 Speaker 1: brought in a revenue of thirty eight point three billion 732 00:45:04,120 --> 00:45:08,520 Speaker 1: dollars for the second quarter of twenty twenty, so I'm not talking 733 00:45:08,520 --> 00:45:12,080 Speaker 1: about annual revenue. That's the revenue that the company generated 734 00:45:12,160 --> 00:45:15,960 Speaker 1: in just three months, just shy of forty billion dollars. 735 00:45:16,520 --> 00:45:20,080 Speaker 1: And this represented a two percent decline in revenues from 736 00:45:20,080 --> 00:45:22,960 Speaker 1: the previous year. However, it was still more than what 737 00:45:23,120 --> 00:45:27,240 Speaker 1: analysts had predicted for Alphabet given the circumstances of stuff 738 00:45:27,280 --> 00:45:30,959 Speaker 1: like the pandemic and social unrest, so it was still 739 00:45:31,080 --> 00:45:33,759 Speaker 1: good news for Alphabet.
And when you start talking in 740 00:45:33,840 --> 00:45:35,960 Speaker 1: terms like that, you begin to get a feeling of 741 00:45:36,000 --> 00:45:39,280 Speaker 1: the scope of those antitrust issues that the US government 742 00:45:39,560 --> 00:45:42,760 Speaker 1: and places like the European Union have really been looking into. 743 00:45:43,040 --> 00:45:45,360 Speaker 1: It's one thing to say we've had a two percent 744 00:45:45,400 --> 00:45:48,640 Speaker 1: decline in revenues, which you know isn't great, particularly in 745 00:45:48,640 --> 00:45:51,359 Speaker 1: a world that values year over year growth over just 746 00:45:51,440 --> 00:45:54,439 Speaker 1: about everything else. But when you break that down as 747 00:45:54,800 --> 00:45:57,600 Speaker 1: we only brought in thirty eight point three billion dollars 748 00:45:57,640 --> 00:46:00,800 Speaker 1: for these last three months, it kind of breaks your brain, 749 00:46:01,600 --> 00:46:04,200 Speaker 1: or rather it breaks my brain. I mean, your brain 750 00:46:04,280 --> 00:46:06,160 Speaker 1: might be just fine. I don't mean to 751 00:46:06,200 --> 00:46:10,480 Speaker 1: project my own limitations on you. But we have now 752 00:46:10,520 --> 00:46:13,080 Speaker 1: reached the end of July, and if I'm looking at 753 00:46:13,080 --> 00:46:16,399 Speaker 1: this calendar correctly, we still have a few months to go. 754 00:46:16,680 --> 00:46:18,279 Speaker 1: So it looks like I'm going to need to do 755 00:46:18,360 --> 00:46:21,120 Speaker 1: one more episode to cover everything else. And I don't 756 00:46:21,160 --> 00:46:23,439 Speaker 1: want to jump into August only to have to cut 757 00:46:23,480 --> 00:46:25,480 Speaker 1: off everything after a couple of stories for the next part. 758 00:46:25,480 --> 00:46:29,040 Speaker 1: So we're gonna pick up in August for part three 759 00:46:29,080 --> 00:46:31,840 Speaker 1: of our retrospective on the year.
And I guess I 760 00:46:31,880 --> 00:46:35,560 Speaker 1: could go on about how this is all really irrational 761 00:46:35,600 --> 00:46:37,800 Speaker 1: because of our way of tracking time. It's all a human 762 00:46:37,840 --> 00:46:40,520 Speaker 1: construct, so it doesn't really matter what I cover when. 763 00:46:40,640 --> 00:46:43,120 Speaker 1: But that's not really interesting or techie, so I'll 764 00:46:43,120 --> 00:46:46,520 Speaker 1: just spare you. In our next episode, we'll wrap up 765 00:46:46,560 --> 00:46:49,200 Speaker 1: the big stories of twenty twenty, and you can look forward to 766 00:46:49,239 --> 00:46:52,360 Speaker 1: a quick overview of the struggle between Epic Games and Apple, 767 00:46:52,960 --> 00:46:56,120 Speaker 1: as well as the continuing proliferation of streaming services and 768 00:46:56,160 --> 00:46:59,200 Speaker 1: how they are making huge waves in the entertainment industry 769 00:46:59,200 --> 00:47:02,160 Speaker 1: at large, and more in our next episode. If you 770 00:47:02,200 --> 00:47:05,319 Speaker 1: have suggestions for future episodes of tech Stuff, shoot them 771 00:47:05,320 --> 00:47:07,320 Speaker 1: my way. It can be a technology, it could be 772 00:47:07,360 --> 00:47:09,560 Speaker 1: a company, could be a personality in tech, could just 773 00:47:09,600 --> 00:47:12,200 Speaker 1: be a trend in tech. Any of those things are 774 00:47:12,239 --> 00:47:15,319 Speaker 1: fair game. Send me your suggestions over on Twitter. The 775 00:47:15,360 --> 00:47:18,560 Speaker 1: handle is tech Stuff hs W and I'll talk to 776 00:47:18,560 --> 00:47:27,319 Speaker 1: you again really soon. Tech Stuff is an I 777 00:47:27,440 --> 00:47:30,920 Speaker 1: Heart Radio production. For more podcasts from I Heart Radio, 778 00:47:31,280 --> 00:47:34,440 Speaker 1: visit the i Heart Radio app, Apple Podcasts or wherever 779 00:47:34,520 --> 00:47:36,040 Speaker 1: you listen to your favorite shows.