Speaker 1: Get in touch with technology with TechStuff from HowStuffWorks.com. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer at HowStuffWorks, and I love all things tech. And one of the greatest things about technology, in my opinion, is that it's always evolving. However, that does make covering technology somewhat challenging, or perhaps frustrating, or infuriating, particularly if you try to do an evergreen-style show like TechStuff. Ideally, an episode of TechStuff goes out and you should be able to listen to it today, next week, or two years from now. But sometimes stories will update, you know, they'll continue, more things will happen between the time I record a show and when it goes live. So this episode is all about giving updates to various stories I've mentioned in previous episodes of TechStuff, but mostly these are updates that are too short to merit a full episode on their own. So let's get started.
Speaker 1: I've done several episodes on the concept of net neutrality, including ones about the FCC reclassifying Internet service providers as common carriers, and the various campaigns for and against the whole idea of mandating net neutrality. So if you aren't familiar with the concept, net neutrality encompasses several things, but one of the things it says is that all traffic on the Internet should be treated equally, whether that is data that belongs to the Internet service provider itself, let's say that it also owns some other services, or to a competitor. All that data should be able to travel across the Internet without any sort of interference or preferential treatment, or negative treatment in the case of a competitor's services. It should all just go, no matter what device you have connected to the Internet, and no matter what service you connect to, it should be treated equally well.
Speaker 1: In early June 2018, the FCC's repeal of the previously established net neutrality rules took effect. Those rules included restrictions on Internet service providers, such as prohibiting them from giving preferential treatment to certain types of web traffic over others, or from charging more for different types of content. So, in other words, these rules were meant to keep ISPs from being able to serve up their own services on a fast track while potentially slowing down, or charging more, or both, for competing services. So as an example, imagine you have an Internet service provider, let's call it Bombast, and it has its own streaming video service, and you can watch certain films and certain television shows using that service. The ISP might make the service free to anyone who is a subscriber to that Internet service provider. So if you're already a Bombast customer, you get those services for free. And not only that, but they serve those videos on a fast track, right? They're using a lot of bandwidth to get that stuff to you, and that way, on your side, there's no buffering, there's no delays.
Speaker 1: It's a good user experience. However, that same ISP then puts the brakes on competing services like Netflix, so that you, as a Bombast customer, have a less user-friendly experience. When you try to use Netflix, everything takes forever to load. You might even have to pay extra just to be able to access the Netflix service. Under the previous net neutrality rules, that would be a big no-no, but now those rules are gone. Several states in the United States have introduced bills that are intended to act as a type of net neutrality on the state level, but that's kind of a haphazard way to go about it, because not every state is proposing such legislation. A little more than half of the states in the US have some form of legislation either passed or in process, and there's no guarantee that the bills that have been proposed will pass in all the states that have taken up the slack. Meanwhile, the US Senate voted to reverse the FCC repeal, and it passed in the Senate.
Speaker 1: However, in the United States we have not just the Senate, we also have the House of Representatives, and in the House, the measure did not see the same momentum. They needed 218 representatives to agree to bring that bill to the floor, just to have a vote on it, and every Democratic representative signed the petition for this, but it was still more than forty signatures short of getting the measure to the voting floor, at least as of the recording of this podcast. So net neutrality currently in the United States is a dead concept at the federal level. The FCC is now pushing that over to the Federal Trade Commission and saying this belongs to them, not to us; they'll make sure that no anticompetitive, monopolistic behaviors happen across the Internet. Meanwhile, critics of this say fat chance, that the net neutrality rules were our best way of making certain that users had a fair shake. So there's a lot of rhetoric on both sides of this issue, and it continues to be a big issue in the US. Recently, I also did some episodes giving an update on what Microsoft has been up to.
Speaker 1: One of the things I mentioned was a dual-touchscreen device called Andromeda. Well, on July 2, 2018, I saw an article stating that the dual-panel mobile device running the Andromeda OS has been put on hold. So again, just after I published this Microsoft update, they go and do this. But what's really going on? Well, one thing I did not mention in the update episodes for Microsoft is that back in 2015, Microsoft was calling Windows 10 the last version of Windows. Now, that was not to say that the company would release Windows 10, then brush its hands off and walk away whistling merrily. Instead, Windows was meant to evolve almost like it's a service on demand. So not your traditional operating system, where you would pay for an update or you would get a regular update. Windows 10 is the foundation for the service of an operating system, and Microsoft would continue to support it and update it with new features as time goes on, but there would be no Windows 11. There would just be updates to Windows 10 that would roll on out to users.
Speaker 1: One of those updates, which is scheduled for the fall of 2018, is codenamed Redstone 5. And what was supposed to happen was that the Andromeda operating system, the mobile version of Windows that was going to run on this upcoming dual-touchscreen foldable mobile device, would be included in that update to Windows 10. So you would have this interoperability between the full build of Windows for desktops and PCs and laptops, that kind of thing, and the mobile version that would exist on the Andromeda device. The Andromeda phone, or mobile device I should call it, was rumored to have cellular technology built into it, but not necessarily in a way that was meant to make it a competitor with iOS or Android. So, in other words, calling it a mobile device is more accurate than a phone. It might have had cellular technology built into it, but it was really meant to increase the device's functionality, allowing you to connect to the Internet whether you had WiFi access or not. So could you make calls on it?
Speaker 1: You might be able to use a service like Skype, which Microsoft owns, and make a call through that, but it wasn't necessarily meant to act as a smartphone. It was more like a device that could also make calls. But as it turns out, it's all a moot point. In the spring, Microsoft shook up the mobile division with another reorganization, and the company reevaluated its strategy. The elements of Andromeda OS either did not align with this new strategy or they weren't ready to go, but either way, they are not going to be part of the Redstone 5 update, and the development has been shelved, possibly forever. And some folks, like ZDNet's Mary Jo Foley, have pointed out this might not be a bad idea. The Andromeda was rumored to only be able to run Microsoft Store apps, which limited its utility considerably. Foley cites sources that say Microsoft may still try to develop a multi-screen device running a build of Windows in the future, but then it might more closely resemble a laptop in form factor rather than a handheld device.
Speaker 1: Back on May 22, 2018, I published an episode about Project Maven, which focused on incorporating elements of artificial intelligence into a drone program for the military. Google had been involved in that project and faced criticism, both from within and from outside the company, about its involvement. On June 1, 2018, several headlines reported that Google would not renew the Project Maven contract, according to Google employees. The CEO of Google Cloud, Diane Greene, addressed the company and announced that once the current contract is over, Google will not seek a renewal. And as a reminder, Google had maintained that its work on the project focused solely on non-aggressive technologies and open-source software, but emails from within Google showed that executives were really eager to land lucrative contracts with the Pentagon to provide services like machine learning, which could then be used to support surveillance efforts. Presumably, that same technology might in the future be used to create autonomous or semi-autonomous weapons platforms.
Speaker 1: So while Google might have the plausible deniability of saying our work was just to help with identifying places and people and activity, it wasn't meant to be used in any direct combat applications, you could argue, well, with some small adaptation, you could make that technology more of a weapons platform. So to me it's one of those kind of weak arguments. It's somewhat of a moot point now, though, because Google has backed away from that contract. The dramatic Bitcoin roller coaster story continues to this day. The cryptocurrency hit an all-time high of nearly $20,000 per bitcoin; that was toward the end of 2017. Now, remember, bitcoin is a digital currency that can be divided up into much smaller subunits. After reaching those incredible highs, the currency has slumped quite a bit. At the end of June 2018, the value dipped below $6,000 per bitcoin, which is still, you know, pretty high, but it's a huge drop from that twenty grand it enjoyed back in December 2017. And as I researched these updates, Bitcoin is in another bit of an upswing.
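For a sense of scale, the peak-to-trough swing described here, roughly $20,000 down to below $6,000, works out to about a 70 percent decline. A quick sketch of that arithmetic, using the round figures from this episode rather than exact market quotes:

```python
# Percentage decline between two price points, using the round
# figures mentioned in this episode (~$20,000 peak, ~$6,000 dip).
def pct_decline(peak: float, trough: float) -> float:
    """Return the peak-to-trough drop as a percentage of the peak."""
    return round((peak - trough) / peak * 100, 1)

print(pct_decline(20000, 6000))  # 70.0
```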
Speaker 1: It's recovered to above $6,000 again; it had dropped down to about $5,800, and now it's back above $6,000. But cryptocurrencies in general have had a really rough time of it recently. The Japanese government passed some strict laws to try and eliminate the use of cryptocurrencies in money laundering exploits, and some high-profile digital currency exchanges have been hacked recently. These sorts of events have really hurt cryptocurrencies in general. Just like all currencies, their value rests partly on perception. If we believe them to have value, they have value. And if everyone just woke up one morning and said, wait a minute, that's actually worthless, it would in fact be worthless, because no one would accept the currency for anything. So it's only our belief that it's worth anything that makes it worth anything. Now, this is true for any currency. With most currencies you have the assurance that there's a bank or a government behind them backing their value, but ultimately it's just that we all agree that it's worth something.
Speaker 1: If we all woke up and said, you know what, this paper money really isn't, you can't really do anything with it, it doesn't have any real worth to it, then the currency could collapse, if everybody did it. Now, the funny thing is that as I researched this, I was aware that this story could be totally different by the time the updates go live, which is how crazy and volatile Bitcoin tends to be. And it also means that I may have to do an update to my updates episode. The irony is not lost upon me. While I wallow in some self-pity as I think about this, let's take a quick break to thank our sponsor. Back at the beginning of 2018, I did three episodes about YouTube's history, and that also deserves a bit of an update now. Just as I was working on the YouTube episodes, the company was getting ready to roll out some major changes to its partner platform. The YouTube Partner Program is the platform that creators can use to monetize their videos, and essentially it allows channel owners to elect to have ads served against their videos.
Speaker 1: YouTube gets a cut and the video owner gets the rest. So if you have a really popular channel with videos that get a lot of views, you can stand to make some decent money from advertising. But in the spring of 2017, YouTube was trying to deal with a really controversial problem: some users had been uploading videos that featured hateful content. There could be racist content, misogynistic content, other forms of hate speech, and some of those users were YouTube partners. They had a large following, and they had ads served up against their videos, and advertisers were not too keen to have their brands associated with someone who was spouting off hateful garbage. So YouTube's response was to create a new algorithm that attempted to flag videos that might have problematic content in them and then demonetize those videos, essentially flip the switch off so that ads would no longer be served against those videos, and no matter how many views such a video would get, it would never earn any money for the creator.
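For a rough sense of what that "cut" means in practice, here's a back-of-the-envelope sketch. The 55/45 creator/YouTube split below is the commonly reported figure for ad revenue, not something stated in this episode, so treat it as an assumption:

```python
# Back-of-the-envelope ad-revenue split sketch.
# ASSUMPTION: the commonly reported 55% creator / 45% YouTube share;
# this episode does not state the actual percentage.
CREATOR_SHARE = 0.55

def split_revenue(ad_revenue_usd: float) -> tuple[float, float]:
    """Return (creator_cut, youtube_cut) for a given ad payout."""
    creator = round(ad_revenue_usd * CREATOR_SHARE, 2)
    return creator, round(ad_revenue_usd - creator, 2)

print(split_revenue(1000.0))  # (550.0, 450.0)
```

The point of the sketch is just that a demonetized video sits at zero on both sides of the split, no matter how many views it gets.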
Speaker 1: Now, I personally do not have a problem with YouTube demonetizing videos that advocate for violence or spout off hateful things. YouTube is a company; they have the right to do that, and I don't see a problem with cutting this off, especially considering that the business side of it is that YouTube has to work with the advertisers too, and not a lot of advertisers are too eager to have their brand associated with that kind of stuff. However, the algorithm affected a much broader spectrum of videos, including educational videos, advocacy videos, and news-oriented videos covering topics like racism in a newsworthy way. Some of these videos were getting tagged as being inappropriate, even when there was no real inappropriate content in them. Creators began to refer to it as the adpocalypse. YouTube had swung hard in an effort to keep major advertisers on the platform, because there were a lot of big ones that were threatening to leave. The alternative was losing some of those valuable clients, and we're talking big companies like auto companies and major retailers.
Speaker 1: However, in the process of trying to mollify these major companies, the creators who were actually making the content that ads were getting served up against were hurting. So YouTube adjusted the algorithm in the fall of 2017 to try and correct some of these problems, kind of a course correction, and the company noted that no fix was ever going to be perfect, and there would be some instances that would have to be handled on a case-by-case basis, where someone might see that a video they've uploaded has suddenly been flagged and demonetized, and you would have to take it up with YouTube and say, I really don't think that this should count, and here are my reasons why, and could you give me an explanation as to why my video has been demonetized and what I can do to fix this? Of course, the problem is, during this time you're not making any money off the views that are coming to that video, and you might eventually get monetized again. Let's say that YouTube says, oh, you're right, that was completely on us. The algorithm was overzealous. Let's fix this. It is monetized.
Speaker 1: We flipped the switch, you're good to go. Yeah, you're good to go from that point forward, but all the views that happened before then don't count, because no ads were served against your video at that time. And the way videos tend to work, I mean, there are long-tail videos that will get views months and months and months after they go live. I had a video go viral nine months after it was published on YouTube, so there are those cases. But for a lot of videos, it's the first couple of days, maybe the first week, where they get the majority of their views. So if it takes too long to monetize a video again, that's lost revenue. Anyway, now we're back up to January 2018. That was when I was actually publishing the episodes about YouTube. Up to that time, to qualify for the Partner Program, your YouTube channel would need to have accrued at least 10,000 views across however many videos you had. If you had one video that did great and you had 10,000 views on it, that counted.
Speaker 1: If you had a thousand videos and each one got ten views, that was going to do it too, although you would still be under review about whether or not you could join the Partner Program. But the overhaul ended up changing all of that anyway. Now, to be a partner on YouTube, you have to have at least one thousand subscribers to your channel, and you have to have at least four thousand hours of watch time over a twelve-month period across all your videos. That's somewhat problematic, because if you're a creator who specializes in short-form videos, you might end up having lots of views but still end up on the short end of total accumulated view time. Because let's say that I make a crazy popular video series, but each of those videos is about three minutes long, and let's say that I'm doing it maybe once a week. I'd have to go for a long time and hope for really, really big audience reach to reach that four thousand hours of total view time; people have to watch tons of my videos.
Speaker 1: I might have to step up production and start publishing much more frequently to get to that four thousand hours of total view time within twelve months. But if I make really good longer-form content, I could hit that number with fewer actual viewers, right? If each of my videos is like an hour long and it's really, really good, and people tend to watch the majority of the video, I'll hit that four-thousand-hour number way faster. Now, YouTube says that if you're uploading videos four or five times a week and you have about a thousand subscribers, it shouldn't be too hard to hit that four thousand hours of viewed content, but your mileage may vary. For those who have actually met those requirements, some of them have found the new approval process to take a really long time.
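The arithmetic behind that short-form versus long-form gap is easy to sketch. Assuming, optimistically, that every view watches the entire video, here's how many complete views each format needs to reach the threshold:

```python
import math

# Complete views needed to hit YouTube's 4,000-watch-hour threshold,
# assuming (optimistically) that every view watches the whole video.
TARGET_MINUTES = 4000 * 60  # 4,000 hours expressed in minutes

def full_views_needed(video_minutes: float) -> int:
    """Full watch-throughs required to accumulate the target watch time."""
    return math.ceil(TARGET_MINUTES / video_minutes)

print(full_views_needed(3))   # three-minute videos: 80000
print(full_views_needed(60))  # hour-long videos: 4000
```

In practice, average view duration is well under 100 percent, so the real view counts are even higher; the sketch just shows why short-form creators face a much steeper climb to the same watch-time bar.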
In April two thousand eighteen, the CEO of 330 00:19:18,080 --> 00:19:21,439 Speaker 1: YouTube said that the company would launch a pilot program 331 00:19:21,440 --> 00:19:25,160 Speaker 1: in which creators could provide more feedback about video content 332 00:19:25,520 --> 00:19:28,920 Speaker 1: in an effort to help YouTube find appropriate advertiser matches 333 00:19:29,240 --> 00:19:32,800 Speaker 1: and potentially sidestep false positive or flip flopping issues where 334 00:19:32,840 --> 00:19:36,000 Speaker 1: a video gets monetized and then demonetized and back again 335 00:19:36,119 --> 00:19:38,600 Speaker 1: multiple times. So, in other words, let's say that you're 336 00:19:38,600 --> 00:19:41,840 Speaker 1: making a comedy series, and let's say it's a little raunchy, 337 00:19:42,080 --> 00:19:46,040 Speaker 1: not outside of YouTube's rules and regulations. You're still 338 00:19:46,040 --> 00:19:48,919 Speaker 1: well within that, but maybe it's a little bit more 339 00:19:48,960 --> 00:19:52,439 Speaker 1: on the rowdy side than your typical video might be. 340 00:19:53,600 --> 00:19:57,000 Speaker 1: That might mean that certain advertisers wouldn't want to be 341 00:19:57,080 --> 00:20:00,000 Speaker 1: associated with you, but other advertisers might be perfectly fine. 342 00:20:00,240 --> 00:20:03,800 Speaker 1: And this process would allow you a way to tell YouTube, 343 00:20:04,359 --> 00:20:06,720 Speaker 1: here's what my video series has in it. Here are 344 00:20:06,760 --> 00:20:12,200 Speaker 1: the potentially problematic points. So, can you find advertisers 345 00:20:12,200 --> 00:20:15,679 Speaker 1: that are okay to work with my particular content, and 346 00:20:15,720 --> 00:20:19,160 Speaker 1: then YouTube will match the two up. This benefits everybody.
347 00:20:19,200 --> 00:20:22,000 Speaker 1: It benefits the advertiser, it benefits the creator, and ultimately 348 00:20:22,000 --> 00:20:25,760 Speaker 1: it benefits YouTube. YouTube is only making money when those 349 00:20:25,800 --> 00:20:29,760 Speaker 1: ads are being served against content, uh, 350 00:20:30,400 --> 00:20:33,400 Speaker 1: setting aside subscriber models and all that kind of stuff. So, 351 00:20:33,480 --> 00:20:35,880 Speaker 1: in other words, this is supposed to be the solution 352 00:20:35,880 --> 00:20:39,200 Speaker 1: for everybody. The problem is it hasn't yet been really formulated. 353 00:20:40,280 --> 00:20:43,080 Speaker 1: In June, YouTube rolled out some more features, such as 354 00:20:43,160 --> 00:20:46,520 Speaker 1: a channel membership model. This is where creators who are 355 00:20:46,520 --> 00:20:49,640 Speaker 1: partners can create a membership channel and charge users four 356 00:20:49,680 --> 00:20:54,240 Speaker 1: dollars a month to access special members only content, including 357 00:20:54,680 --> 00:20:58,680 Speaker 1: video streams and emojis. If this sounds familiar, it's because 358 00:20:58,680 --> 00:21:02,199 Speaker 1: it's pretty much what Twitch does with live streaming. With Twitch, 359 00:21:02,280 --> 00:21:05,520 Speaker 1: you can be a subscriber and you can get uh 360 00:21:05,720 --> 00:21:09,679 Speaker 1: special material through your subscription. And creators with ten thousand 361 00:21:09,760 --> 00:21:13,719 Speaker 1: subscribers can sell merchandise directly from their channels through a 362 00:21:13,760 --> 00:21:16,159 Speaker 1: company called Teespring. But Teespring also takes a 363 00:21:16,160 --> 00:21:18,880 Speaker 1: pretty healthy cut of all sales, so you only get 364 00:21:18,880 --> 00:21:21,280 Speaker 1: a portion of those sales, which makes sense. I mean, 365 00:21:21,320 --> 00:21:24,600 Speaker 1: it's Teespring that's supplying the actual material.
But still, 366 00:21:25,280 --> 00:21:29,000 Speaker 1: um, some people say that these moves on 367 00:21:29,040 --> 00:21:32,200 Speaker 1: YouTube's part to try and create new revenue streams for 368 00:21:32,560 --> 00:21:36,879 Speaker 1: content creators are still not really on the right track. 369 00:21:37,400 --> 00:21:41,360 Speaker 1: One last bit about the YouTube updates. YouTube recently posted 370 00:21:41,400 --> 00:21:44,320 Speaker 1: an apology on Twitter to the l g B t 371 00:21:44,520 --> 00:21:49,119 Speaker 1: Q community regarding how its algorithm unfairly demonetized people in 372 00:21:49,200 --> 00:21:51,960 Speaker 1: that community. You know, creators in the l g B 373 00:21:52,119 --> 00:21:55,959 Speaker 1: t q community who had uploaded videos to their YouTube 374 00:21:56,040 --> 00:21:59,920 Speaker 1: channels started seeing their videos demonetized at a much 375 00:22:00,080 --> 00:22:04,400 Speaker 1: higher rate, even though they were treating the subject 376 00:22:04,480 --> 00:22:10,720 Speaker 1: matter with respect. They were not being racist or misogynistic 377 00:22:10,840 --> 00:22:13,840 Speaker 1: or using hate speech. They were treating real issues, and 378 00:22:13,840 --> 00:22:16,800 Speaker 1: they were seeing their videos get demonetized. And they said, well, 379 00:22:16,840 --> 00:22:19,000 Speaker 1: this is kind of discrimination. And not only that, but 380 00:22:19,200 --> 00:22:23,240 Speaker 1: some of their content got paired with ads that actually 381 00:22:23,320 --> 00:22:29,119 Speaker 1: ran with homophobic language, ads that were, 382 00:22:29,160 --> 00:22:34,560 Speaker 1: essentially, from organizations that view homosexuality as a 383 00:22:34,600 --> 00:22:38,919 Speaker 1: sin or as something that is immoral. And so this 384 00:22:39,080 --> 00:22:41,520 Speaker 1: was like a real slap in the face.
Not only 385 00:22:41,560 --> 00:22:45,040 Speaker 1: were a lot of these videos getting demonetized, the ones 386 00:22:45,080 --> 00:22:48,560 Speaker 1: that weren't demonetized were getting paired with ads that outright 387 00:22:48,640 --> 00:22:53,280 Speaker 1: said the person who created the content was bad in 388 00:22:53,359 --> 00:22:56,600 Speaker 1: some way. The tweet said the company had taken action 389 00:22:56,720 --> 00:22:59,560 Speaker 1: on those ads and would tighten enforcement on its policies, 390 00:23:00,119 --> 00:23:02,679 Speaker 1: which I'm sure a lot of people appreciate, but it's still, 391 00:23:03,280 --> 00:23:05,359 Speaker 1: you know, it's not a great thing to have 392 00:23:05,440 --> 00:23:09,280 Speaker 1: to apologize for. Back in two thousand fourteen, so it's 393 00:23:09,320 --> 00:23:11,560 Speaker 1: going back quite a ways, I did a couple of 394 00:23:11,560 --> 00:23:14,840 Speaker 1: episodes about the automotive company Tesla, and a lot has 395 00:23:14,960 --> 00:23:17,239 Speaker 1: happened since two thousand fourteen. I could probably do a 396 00:23:17,280 --> 00:23:20,600 Speaker 1: full update episode, but I want to concentrate on just 397 00:23:20,640 --> 00:23:23,320 Speaker 1: a couple of little things. In fact, one of 398 00:23:23,359 --> 00:23:27,040 Speaker 1: the things that happened since then was a series of incidents 399 00:23:27,080 --> 00:23:31,000 Speaker 1: involving Tesla's driver assist features, which the company referred to 400 00:23:31,200 --> 00:23:34,879 Speaker 1: as Autopilot, a term that I really didn't care for, 401 00:23:35,040 --> 00:23:37,120 Speaker 1: but I did a full episode about that. Scott Benjamin 402 00:23:37,200 --> 00:23:39,159 Speaker 1: joined me in two thousand and sixteen to talk about that.
403 00:23:39,160 --> 00:23:42,040 Speaker 1: But rather than focus on those tragic stories, I thought 404 00:23:42,040 --> 00:23:43,800 Speaker 1: I'd mention that just a few days before I came 405 00:23:43,840 --> 00:23:46,960 Speaker 1: in to record this episode, Tesla hit a milestone it 406 00:23:47,000 --> 00:23:49,399 Speaker 1: had been striving towards, and that was to produce at 407 00:23:49,480 --> 00:23:53,080 Speaker 1: least five thousand Model three Tesla electric cars in a 408 00:23:53,119 --> 00:23:57,240 Speaker 1: week in its manufacturing facilities. The Model three is Tesla's 409 00:23:57,359 --> 00:24:00,480 Speaker 1: sports sedan that's supposed to be in the more affordable range, 410 00:24:01,600 --> 00:24:04,840 Speaker 1: but affordable means different things to different people. The 411 00:24:04,880 --> 00:24:08,040 Speaker 1: prices on the low end started at thirty six thousand dollars. 412 00:24:08,560 --> 00:24:11,119 Speaker 1: The company unveiled the Model three in two thousand sixteen, 413 00:24:11,440 --> 00:24:14,000 Speaker 1: and they were taking pre orders and reservations for the 414 00:24:14,080 --> 00:24:16,479 Speaker 1: vehicle for quite some time. At one point the company 415 00:24:16,520 --> 00:24:19,560 Speaker 1: had more than five hundred thousand pre orders, five hundred 416 00:24:19,640 --> 00:24:22,920 Speaker 1: eighteen thousand, I believe, for the Model three, but over 417 00:24:22,960 --> 00:24:26,479 Speaker 1: time they lost thousands. They were down to four hundred 418 00:24:26,480 --> 00:24:31,080 Speaker 1: twenty thousand reservations. In June two thousand eighteen, Elon Musk 419 00:24:31,119 --> 00:24:33,760 Speaker 1: had claimed the company would hit ten thousand cars produced 420 00:24:33,800 --> 00:24:36,119 Speaker 1: per week at some point in twenty eighteen, which I 421 00:24:36,119 --> 00:24:38,840 Speaker 1: guess could still happen.
But they just hit five thousand, 422 00:24:38,920 --> 00:24:42,240 Speaker 1: and it's July eighteen now. But the five thousand vehicle 423 00:24:42,280 --> 00:24:44,960 Speaker 1: per week milestone still is a really big one, even 424 00:24:45,000 --> 00:24:47,840 Speaker 1: though Wall Street wasn't as impressed. Now, to be fair, 425 00:24:48,240 --> 00:24:50,679 Speaker 1: the expectation was that by the end of the quarter 426 00:24:51,080 --> 00:24:54,800 Speaker 1: that ended in June eighteen, Tesla would have delivered fifty 427 00:24:54,880 --> 00:24:58,479 Speaker 1: one thousand vehicles across all of its models for that 428 00:24:58,560 --> 00:25:01,680 Speaker 1: fiscal year, but the actual number was forty thousand, seven 429 00:25:01,720 --> 00:25:06,720 Speaker 1: hundred forty, so more than ten thousand vehicles short. The 430 00:25:06,760 --> 00:25:10,240 Speaker 1: company's share prices fell nearly four percent after the quarterly 431 00:25:10,280 --> 00:25:13,720 Speaker 1: report was posted. And on another note, as of July two, 432 00:25:13,800 --> 00:25:17,680 Speaker 1: two thousand eighteen, Tesla has delivered twenty eight thousand, three 433 00:25:18,119 --> 00:25:20,640 Speaker 1: eighty six Model three cars to customers who had put 434 00:25:20,680 --> 00:25:24,600 Speaker 1: down a reservation, so only about four hundred thousand to go, 435 00:25:25,680 --> 00:25:28,600 Speaker 1: I guess. In February eighteen, I did a couple of 436 00:25:28,600 --> 00:25:31,119 Speaker 1: episodes about Uber, and I've got some updates about that 437 00:25:31,200 --> 00:25:34,320 Speaker 1: company as well. One of those updates is that Uber 438 00:25:34,400 --> 00:25:36,880 Speaker 1: is once again allowed to operate in London.
You may 439 00:25:36,920 --> 00:25:41,080 Speaker 1: remember that London had passed a ban on Uber, but 440 00:25:41,160 --> 00:25:43,479 Speaker 1: a judge in the UK overturned the ban, but with 441 00:25:43,520 --> 00:25:47,280 Speaker 1: several conditions. Uber now has a fifteen month license to 442 00:25:47,320 --> 00:25:50,600 Speaker 1: operate within the City of London, but the company must 443 00:25:50,720 --> 00:25:54,399 Speaker 1: submit to an independent review looking into policies, procedures and 444 00:25:54,440 --> 00:25:57,919 Speaker 1: safety records every six months, and the company must notify 445 00:25:58,040 --> 00:26:01,240 Speaker 1: UK government officials of any major changes in processes 446 00:26:01,320 --> 00:26:04,760 Speaker 1: or policies. In New York City, the Taxi and Limousine 447 00:26:04,760 --> 00:26:08,000 Speaker 1: Commission proposed new rules regarding how much Uber drivers get 448 00:26:08,040 --> 00:26:11,639 Speaker 1: to take home. The take home pay, that is, not 449 00:26:11,640 --> 00:26:14,080 Speaker 1: passengers. They're not allowed to take any passengers to 450 00:26:14,160 --> 00:26:17,240 Speaker 1: their home. That's kind of against the rules. Now, 451 00:26:17,280 --> 00:26:20,280 Speaker 1: the study commissioned by this group stated that Uber driver 452 00:26:20,400 --> 00:26:23,399 Speaker 1: salaries have remained low despite what it called the rapid 453 00:26:23,480 --> 00:26:27,040 Speaker 1: growth of the industry, and the rules, if adopted, would 454 00:26:27,080 --> 00:26:30,240 Speaker 1: require Uber to step in and make up shortcomings in 455 00:26:30,359 --> 00:26:35,560 Speaker 1: driver earnings. So the threshold is seventeen dollars twenty two 456 00:26:35,600 --> 00:26:38,800 Speaker 1: cents per hour. If a driver were to average that 457 00:26:39,800 --> 00:26:41,800 Speaker 1: over the course of a week, everything's fine.
But if 458 00:26:41,840 --> 00:26:44,320 Speaker 1: it were to drop below that average in 459 00:26:44,320 --> 00:26:46,440 Speaker 1: the course of a week, Uber would have to come 460 00:26:46,480 --> 00:26:49,280 Speaker 1: in and pay to make up the shortfall. Of course, 461 00:26:49,280 --> 00:26:51,920 Speaker 1: that only applies to times when someone would actually be 462 00:26:51,960 --> 00:26:55,159 Speaker 1: working on the clock for Uber, it's not just the 463 00:26:55,200 --> 00:26:57,760 Speaker 1: week in general. And the study said that Uber could 464 00:26:57,840 --> 00:27:01,920 Speaker 1: lower the amount they take from passenger fares. In other words, 465 00:27:01,960 --> 00:27:04,920 Speaker 1: the company could take a smaller percentage to help make 466 00:27:04,960 --> 00:27:07,560 Speaker 1: this possible. And according to the study, the median net 467 00:27:07,600 --> 00:27:11,920 Speaker 1: hourly earnings in New York was around fourteen dollars, 468 00:27:11,920 --> 00:27:15,200 Speaker 1: which is nearly three dollars per hour below the amount 469 00:27:15,280 --> 00:27:18,679 Speaker 1: the new rules would call for. Nationwide, the company's drivers 470 00:27:18,720 --> 00:27:21,320 Speaker 1: make an average of eleven dollars and seventy seven cents 471 00:27:21,359 --> 00:27:24,359 Speaker 1: per hour once you deduct Uber's cut and car expenses. 472 00:27:24,960 --> 00:27:27,520 Speaker 1: And by the way, I'm saying Uber, but this actually 473 00:27:27,520 --> 00:27:30,840 Speaker 1: applies to all ride hailing services.
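The proposed pay floor works like a weekly top-up, and the arithmetic is simple enough to sketch. In this minimal Python sketch, the seventeen-dollars-and-twenty-two-cents threshold is the one from the proposal as described here, but the example hours and hourly rates are purely illustrative assumptions, not figures from the study.

```python
# Sketch of the proposed NYC pay-floor rule described above: if a driver's
# earnings over a week of on-the-clock hours average below $17.22 per hour,
# the ride-hailing company would have to pay the difference. The threshold
# is from the episode; the hours and rates below are illustrative assumptions.

PAY_FLOOR = 17.22  # dollars per on-the-clock hour

def weekly_top_up(on_clock_hours: float, weekly_earnings: float) -> float:
    """Amount the company would owe the driver for the week, if anything."""
    shortfall = PAY_FLOOR * on_clock_hours - weekly_earnings
    return round(max(0.0, shortfall), 2)

# Forty on-the-clock hours at a hypothetical $14.00/hour average:
print(weekly_top_up(40, 40 * 14.00))   # 128.8
# A driver already averaging above the floor gets no top-up:
print(weekly_top_up(40, 40 * 18.00))   # 0.0
```

Note that only on-the-clock hours count toward the average, which matches the point above that the rule isn't about the week in general.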
It's not just Uber, 474 00:27:30,880 --> 00:27:34,119 Speaker 1: it's also Lyft and others. Um, in New York City, 475 00:27:34,280 --> 00:27:37,080 Speaker 1: by the way, this Taxi and Limousine Commission can actually 476 00:27:37,119 --> 00:27:40,720 Speaker 1: adopt these rules without any involvement from either the City 477 00:27:40,760 --> 00:27:43,720 Speaker 1: Council or the mayor, because it's New York City and 478 00:27:43,720 --> 00:27:47,200 Speaker 1: the Taxi and Limo Commission is a powerful entity there. 479 00:27:47,760 --> 00:27:51,720 Speaker 1: The city is meanwhile considering options to regulate car hailing services 480 00:27:51,720 --> 00:27:54,760 Speaker 1: in general. Traffic has gotten really bad because there are 481 00:27:54,760 --> 00:27:57,360 Speaker 1: a lot more people who have started driving for those services. 482 00:27:57,359 --> 00:27:59,960 Speaker 1: It's actually put more cars on the streets, and there's 483 00:28:00,000 --> 00:28:03,280 Speaker 1: an ongoing concern that driving for the services doesn't really 484 00:28:03,320 --> 00:28:07,040 Speaker 1: amount to a livable wage.
And because a lot 485 00:28:07,160 --> 00:28:10,080 Speaker 1: of the people who are driving for these services come 486 00:28:10,119 --> 00:28:13,520 Speaker 1: from low income families, it can actually trap them in 487 00:28:13,520 --> 00:28:16,239 Speaker 1: a terrible situation in which they have car payments and 488 00:28:16,280 --> 00:28:20,000 Speaker 1: car insurance and they're struggling to meet these extra expenses 489 00:28:20,040 --> 00:28:23,159 Speaker 1: they normally wouldn't have, because they're not taking home as 490 00:28:23,200 --> 00:28:27,359 Speaker 1: much pay as they were originally, well, maybe promised is the 491 00:28:27,359 --> 00:28:30,800 Speaker 1: wrong word, but led to believe they could earn. So 492 00:28:31,200 --> 00:28:33,720 Speaker 1: it ends up being like working for the company store, right, 493 00:28:33,880 --> 00:28:36,800 Speaker 1: Like everything you earn has to be spent at the 494 00:28:36,840 --> 00:28:39,040 Speaker 1: company store to buy the stuff you need to keep 495 00:28:39,040 --> 00:28:41,680 Speaker 1: on living, and because of the prices in the company store, 496 00:28:42,240 --> 00:28:44,920 Speaker 1: you're never going to earn enough to get free. Same 497 00:28:44,960 --> 00:28:46,680 Speaker 1: sort of thing here, except in this case, it's not 498 00:28:46,720 --> 00:28:50,720 Speaker 1: the company store, it's car payments and insurance. Well, I've 499 00:28:50,720 --> 00:28:53,040 Speaker 1: got some more updates to give you about stuff that's 500 00:28:53,080 --> 00:28:55,800 Speaker 1: happened since I've had episodes go live. But before I 501 00:28:55,840 --> 00:29:06,880 Speaker 1: do that, let's take another quick break to thank our sponsor.
Okay, 502 00:29:07,360 --> 00:29:10,560 Speaker 1: I could probably do a full episode with an update 503 00:29:10,640 --> 00:29:13,240 Speaker 1: on what Facebook has been doing since two thousand eleven, 504 00:29:14,040 --> 00:29:16,360 Speaker 1: because that's when Chris Pollette and I originally published the 505 00:29:16,400 --> 00:29:20,280 Speaker 1: Facebook story, which I think was a one episode thing too, 506 00:29:20,320 --> 00:29:23,240 Speaker 1: so it's a pretty high level view of what Facebook 507 00:29:23,280 --> 00:29:26,080 Speaker 1: has been doing. But I did do a follow up 508 00:29:26,080 --> 00:29:28,480 Speaker 1: episode about Facebook's I p O in two thousand twelve, 509 00:29:28,480 --> 00:29:31,280 Speaker 1: and I've done numerous episodes where I've talked about Facebook. 510 00:29:31,560 --> 00:29:33,880 Speaker 1: But I'll probably have to do more episodes about the 511 00:29:33,960 --> 00:29:36,520 Speaker 1: whole company in the near future, give it the full 512 00:29:36,560 --> 00:29:38,760 Speaker 1: treatment that I tend to do for companies these days. 513 00:29:39,280 --> 00:29:41,360 Speaker 1: But for the time being, one story I have to 514 00:29:41,400 --> 00:29:45,480 Speaker 1: talk about is Mark Zuckerberg's appearances before the US government. 515 00:29:46,000 --> 00:29:50,600 Speaker 1: So Facebook was already under scrutiny after reports had surfaced 516 00:29:50,680 --> 00:29:55,080 Speaker 1: that Russian individuals or companies had been funding targeted political 517 00:29:55,160 --> 00:29:59,840 Speaker 1: advertising aimed at influencing the US presidential election back in sixteen, 518 00:30:00,400 --> 00:30:03,800 Speaker 1: and that Facebook had become a platform on which propaganda 519 00:30:03,880 --> 00:30:07,719 Speaker 1: and fake news was spreading rapidly. It was just becoming 520 00:30:08,000 --> 00:30:14,440 Speaker 1: a tool to radicalize people and to mislead people.
And 521 00:30:14,480 --> 00:30:18,280 Speaker 1: then came Cambridge Analytica, and I'll have to do a 522 00:30:18,280 --> 00:30:21,240 Speaker 1: full episode about Cambridge Analytica at some point to really 523 00:30:21,280 --> 00:30:24,720 Speaker 1: go into all the detail about that company. It's become 524 00:30:24,800 --> 00:30:28,479 Speaker 1: such a huge news item in twenty eighteen. But in short, 525 00:30:29,040 --> 00:30:32,320 Speaker 1: it was a British company that specialized in the collection 526 00:30:32,400 --> 00:30:37,000 Speaker 1: and mining of data, specifically for political consulting gigs. It, 527 00:30:37,320 --> 00:30:40,360 Speaker 1: by the way, has filed for insolvency, which is why 528 00:30:40,480 --> 00:30:43,240 Speaker 1: I now use the past tense to refer to it. 529 00:30:43,760 --> 00:30:45,840 Speaker 1: Though the guy who was the director of the company 530 00:30:45,960 --> 00:30:48,320 Speaker 1: is also the head of eight or so other companies 531 00:30:48,360 --> 00:30:51,360 Speaker 1: that are all registered to the same physical address. So 532 00:30:52,600 --> 00:30:57,200 Speaker 1: just because the name Cambridge Analytica is technically quote unquote gone 533 00:30:57,560 --> 00:31:01,720 Speaker 1: doesn't mean the game is over. Anyway, during the election season, 534 00:31:02,040 --> 00:31:06,440 Speaker 1: the Trump campaign hired Cambridge Analytica as a consulting firm 535 00:31:06,520 --> 00:31:10,000 Speaker 1: for their campaign. The firm managed to get hold of 536 00:31:10,040 --> 00:31:13,920 Speaker 1: the private information of more than eighty million Facebook users. 537 00:31:14,440 --> 00:31:18,200 Speaker 1: The firm claimed it could use this data to identify 538 00:31:18,520 --> 00:31:22,640 Speaker 1: user political preferences and then target those users to influence voting, 539 00:31:23,160 --> 00:31:26,680 Speaker 1: right, to kind of guide people.
The firm worked 540 00:31:26,680 --> 00:31:29,560 Speaker 1: with a professor at Cambridge University to develop an app 541 00:31:29,760 --> 00:31:33,440 Speaker 1: that Facebook users were encouraged to download, and that app, 542 00:31:33,560 --> 00:31:37,640 Speaker 1: or really just to install on Facebook, not even fully download, 543 00:31:37,880 --> 00:31:41,040 Speaker 1: but that app would scrape user Facebook profiles for information 544 00:31:41,080 --> 00:31:44,560 Speaker 1: about them and all of their friends. And the US 545 00:31:44,640 --> 00:31:47,520 Speaker 1: Congress ended up calling Zuckerberg in to explain himself and 546 00:31:47,560 --> 00:31:49,400 Speaker 1: the company in an effort to find out just what 547 00:31:49,520 --> 00:31:53,160 Speaker 1: sort of information was used and what Facebook's policies were 548 00:31:53,240 --> 00:31:58,440 Speaker 1: regarding user privacy and security. So Zuckerberg was under questioning 549 00:31:58,480 --> 00:32:02,360 Speaker 1: for five hours over in Congress, and he didn't shy 550 00:32:02,400 --> 00:32:05,680 Speaker 1: away from taking responsibility. At one point he said, quote, 551 00:32:06,240 --> 00:32:09,560 Speaker 1: we didn't take a broad enough view of our responsibility, 552 00:32:09,640 --> 00:32:12,480 Speaker 1: and that was a big mistake, and it was my mistake, 553 00:32:12,680 --> 00:32:15,800 Speaker 1: and I'm sorry. I started Facebook, I run it, and 554 00:32:15,840 --> 00:32:18,920 Speaker 1: I'm responsible for what happens here. End quote. He said 555 00:32:18,960 --> 00:32:22,800 Speaker 1: the company failed to handle the Cambridge Analytica problem correctly.
556 00:32:23,200 --> 00:32:26,320 Speaker 1: He said Facebook was told that Cambridge Analytica had deleted 557 00:32:26,360 --> 00:32:29,560 Speaker 1: the data it had collected, and that Facebook failed to 558 00:32:29,640 --> 00:32:32,120 Speaker 1: follow up to make certain that that was actually the case, 559 00:32:32,640 --> 00:32:34,680 Speaker 1: and that was a big mistake. He also said the 560 00:32:34,680 --> 00:32:38,240 Speaker 1: company was hiring a special counsel to investigate all apps 561 00:32:38,280 --> 00:32:41,440 Speaker 1: that have access to large amounts of user data to 562 00:32:41,520 --> 00:32:45,400 Speaker 1: make certain that those too were not violating Facebook's policies. So, 563 00:32:45,440 --> 00:32:47,520 Speaker 1: in other words, he was saying the company would now 564 00:32:47,600 --> 00:32:50,800 Speaker 1: actually enforce the rules they had created. They had created 565 00:32:50,840 --> 00:32:53,880 Speaker 1: these rules, which sounded good, the idea being that if 566 00:32:53,880 --> 00:32:56,440 Speaker 1: you create an app for Facebook, you're only supposed to 567 00:32:56,480 --> 00:32:59,160 Speaker 1: ask for the information you actually need in order for 568 00:32:59,240 --> 00:33:01,920 Speaker 1: your app to work, and you're not supposed to do 569 00:33:01,960 --> 00:33:05,840 Speaker 1: anything nefarious with that data. Anything you do plan to 570 00:33:05,920 --> 00:33:09,040 Speaker 1: do with that data you're supposed to communicate to the 571 00:33:09,200 --> 00:33:12,040 Speaker 1: end user. So at least in theory, the end user 572 00:33:12,120 --> 00:33:14,560 Speaker 1: has the opportunity to say, I'm not really cool with 573 00:33:14,600 --> 00:33:18,680 Speaker 1: that, and back out. Although very few of us 574 00:33:18,720 --> 00:33:21,520 Speaker 1: tend to read all of that stuff, and that's 575 00:33:21,600 --> 00:33:26,040 Speaker 1: kind of on us. Anyway.
Facebook wasn't really enforcing those 576 00:33:26,120 --> 00:33:29,200 Speaker 1: rules to any great extent. They were just kind of 577 00:33:30,640 --> 00:33:33,920 Speaker 1: codifying the rules so that Facebook could say, well, you know, 578 00:33:34,040 --> 00:33:38,480 Speaker 1: our rules say this. But unless you actually enforce rules, 579 00:33:38,520 --> 00:33:41,840 Speaker 1: they don't really mean much. Anyway, at one point, Senator 580 00:33:41,880 --> 00:33:46,200 Speaker 1: Dick Durbin actually asked Zuckerberg if Zuckerberg would be willing 581 00:33:46,240 --> 00:33:48,360 Speaker 1: to share the name of the hotel he stayed at 582 00:33:48,400 --> 00:33:52,000 Speaker 1: the night before or who he had messaged earlier that week, 583 00:33:52,400 --> 00:33:54,960 Speaker 1: and Zuckerberg said he would rather not do that publicly, 584 00:33:55,200 --> 00:33:59,160 Speaker 1: and Senator Dick Durbin said, quote, I think that may be 585 00:33:59,200 --> 00:34:02,120 Speaker 1: what this is all about, your right to privacy, the 586 00:34:02,240 --> 00:34:05,239 Speaker 1: limits of your right to privacy, and how much you 587 00:34:05,280 --> 00:34:08,320 Speaker 1: give away in modern America in the name of connecting 588 00:34:08,360 --> 00:34:12,320 Speaker 1: people around the world, end quote.
So, in other words, he was saying, 589 00:34:13,520 --> 00:34:16,040 Speaker 1: if you're not willing to give up this basic information, 590 00:34:17,080 --> 00:34:20,680 Speaker 1: and you realize the value of that information, and you 591 00:34:20,719 --> 00:34:23,520 Speaker 1: realize the impact that information could have if it got 592 00:34:23,520 --> 00:34:27,600 Speaker 1: out to other people, why are you allowing your platform 593 00:34:27,640 --> 00:34:31,120 Speaker 1: to do this on a broad scale to billions of people, 594 00:34:31,520 --> 00:34:34,160 Speaker 1: or at least a billion people, since that's how many 595 00:34:34,160 --> 00:34:37,759 Speaker 1: people Facebook has these days. Well, the story continues to 596 00:34:37,840 --> 00:34:40,759 Speaker 1: unfold as of the time I'm researching the show. The 597 00:34:40,800 --> 00:34:43,960 Speaker 1: Washington Post recently said that five people familiar with the 598 00:34:44,000 --> 00:34:48,200 Speaker 1: matter told journalists that the Department of Justice, the FBI, 599 00:34:48,520 --> 00:34:51,600 Speaker 1: the U.S. Securities and Exchange Commission, and the Federal 600 00:34:51,680 --> 00:34:55,400 Speaker 1: Trade Commission were all investigating the Cambridge Analytica matter 601 00:34:55,760 --> 00:34:59,480 Speaker 1: and Facebook's actions and statements over the last several years. 602 00:35:00,000 --> 00:35:02,480 Speaker 1: The agencies are trying to figure out how much Facebook 603 00:35:02,520 --> 00:35:05,200 Speaker 1: employees knew about what was going on, and how truthful 604 00:35:05,239 --> 00:35:08,040 Speaker 1: the communication from Facebook to the authorities has been up 605 00:35:08,040 --> 00:35:11,719 Speaker 1: to this point, and if things have not all been 606 00:35:11,760 --> 00:35:13,759 Speaker 1: on the up and up, I expect we're gonna see 607 00:35:13,760 --> 00:35:17,719 Speaker 1: some massive fines.
In fact, Facebook could face billions of 608 00:35:17,760 --> 00:35:21,520 Speaker 1: dollars in fines if these agencies find that the company 609 00:35:21,600 --> 00:35:25,040 Speaker 1: was culpable in any way in what went on with 610 00:35:25,080 --> 00:35:29,040 Speaker 1: Cambridge Analytica. Now, unrelated to all of that mess, but 611 00:35:29,160 --> 00:35:32,120 Speaker 1: still a Facebook story, is one I wanted to touch on. 612 00:35:32,440 --> 00:35:36,880 Speaker 1: Facebook had been pursuing a project called the Aquila Program. 613 00:35:36,920 --> 00:35:39,280 Speaker 1: The goal of that program was to use solar powered, 614 00:35:39,400 --> 00:35:42,000 Speaker 1: high altitude drones that would act as a backbone for 615 00:35:42,040 --> 00:35:45,440 Speaker 1: wireless internet connectivity in places that have limited or no 616 00:35:45,640 --> 00:35:48,160 Speaker 1: access to the Internet. So think of places like 617 00:35:48,480 --> 00:35:52,680 Speaker 1: the continent of Africa, and imagine these drones flying way, way, 618 00:35:52,680 --> 00:35:56,080 Speaker 1: way up in the atmosphere, and they're powered by the sun. 619 00:35:56,160 --> 00:35:58,279 Speaker 1: They have batteries so that they can continue to fly 620 00:35:58,360 --> 00:36:01,759 Speaker 1: at night, they recharge during the day, and they are 621 00:36:01,800 --> 00:36:06,520 Speaker 1: able to broadcast WiFi signals across to each other, 622 00:36:06,640 --> 00:36:09,920 Speaker 1: using mostly stuff like lasers actually, and then beam that 623 00:36:09,960 --> 00:36:14,080 Speaker 1: information down to ground stations, giving people on the ground 624 00:36:14,280 --> 00:36:17,719 Speaker 1: Internet access. It's a pretty cool idea, but it is 625 00:36:17,760 --> 00:36:21,439 Speaker 1: no more. Facebook had only managed to do two 626 00:36:21,440 --> 00:36:24,680 Speaker 1: test flights of its drone technology.
The first one ended 627 00:36:24,719 --> 00:36:28,120 Speaker 1: in a crash, so not a successful test. The second 628 00:36:28,120 --> 00:36:32,040 Speaker 1: one was a modest success, but still performed below expectations, 629 00:36:32,400 --> 00:36:35,319 Speaker 1: so Facebook decided to shelve it. Google actually had a 630 00:36:35,360 --> 00:36:38,239 Speaker 1: similar drone program it was working on. It also has 631 00:36:38,239 --> 00:36:42,640 Speaker 1: shuttered that effort, but Project Loon, which uses high altitude 632 00:36:42,640 --> 00:36:46,200 Speaker 1: balloons rather than drones, is still in development over at Google. 633 00:36:46,800 --> 00:36:50,240 Speaker 1: Well, that's just a quick series of updates about various 634 00:36:50,239 --> 00:36:53,879 Speaker 1: stories in the tech world. There are clearly tons more 635 00:36:54,000 --> 00:36:56,520 Speaker 1: I could have touched on, but I can only do 636 00:36:57,239 --> 00:37:00,279 Speaker 1: so many per episode, so I'll probably do more of 637 00:37:00,280 --> 00:37:02,440 Speaker 1: these in the future, where I will do like a 638 00:37:02,480 --> 00:37:06,000 Speaker 1: grab bag of updates about various things going on in 639 00:37:06,040 --> 00:37:09,800 Speaker 1: the tech world that maybe again don't merit a full episode. 640 00:37:10,239 --> 00:37:12,959 Speaker 1: Cambridge Analytica will definitely get a full episode, a full 641 00:37:13,000 --> 00:37:16,959 Speaker 1: treatment of the entire story, probably a little bit later, 642 00:37:17,040 --> 00:37:21,200 Speaker 1: once more of these investigations have been wrapped up, so 643 00:37:21,239 --> 00:37:23,680 Speaker 1: I can have kind of a beginning, middle and end to that. 644 00:37:24,040 --> 00:37:26,360 Speaker 1: But expect that in the future.
And if you guys 645 00:37:26,360 --> 00:37:29,480 Speaker 1: have any suggestions for other topics I should cover on 646 00:37:29,560 --> 00:37:33,600 Speaker 1: tech stuff, whether it's a company, a technology, an issue 647 00:37:33,760 --> 00:37:36,240 Speaker 1: in tech, maybe there's someone you want me to interview 648 00:37:36,320 --> 00:37:39,000 Speaker 1: or have on as a guest, make certain to let 649 00:37:39,000 --> 00:37:41,560 Speaker 1: me know. Send me a message. The email address is tech 650 00:37:41,640 --> 00:37:44,640 Speaker 1: stuff at how stuff works dot com, or drop me 651 00:37:44,680 --> 00:37:46,759 Speaker 1: a line on Facebook or Twitter. The handle for both 652 00:37:46,760 --> 00:37:49,800 Speaker 1: of those is tech stuff H s W. Don't forget 653 00:37:49,840 --> 00:37:52,319 Speaker 1: to follow us on Instagram, and I'll talk to you 654 00:37:52,360 --> 00:38:01,239 Speaker 1: again really soon. For more on this and thousands of 655 00:38:01,239 --> 00:38:13,040 Speaker 1: other topics, visit how stuff works dot com.