Speaker 1: Welcome to TechStuff, a production of iHeartRadio's How Stuff Works. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and a love of all things tech, and we are going to continue our retrospective look at how technology has changed over the last ten years. Now, in our last episode, I went from two thousand nine to the end of two thousand thirteen, and we talked about stuff like Apple launching the iPad, all the way up to the emergence of Cambridge Analytica. And in this episode, we'll keep on marching down the path of time to see how tech has evolved and shaped our world in the years since. Now, if you listened to the last episode, you'll remember that in two thousand thirteen, Steve Ballmer, who was at that point the CEO of Microsoft, had announced his retirement. He would actually remain CEO until February of two thousand fourteen, and that's when he would hand over the reins to Satya Nadella.
Nadella, who had been the executive vice president of Microsoft's Cloud and Enterprise group, had been working for Microsoft since nineteen ninety-two, so this was definitely a case of a company promoting from within. Microsoft had been around nearly forty years, and Nadella would mark only the third CEO in the company's history. Nadella's first year wasn't exactly smooth, partly due to his own actions. While at a conference named after tech pioneer Grace Hopper, who many people refer to as the person who popularized the term computer bug, Nadella made some comments that women in the tech industry should rely on good karma rather than asking for a pay raise, essentially saying if your work is good, it'll come back to you.
As you might imagine, the response to this rather tone-deaf statement wasn't terribly positive, and he would subsequently apologize for those comments. He would later explain that he was applying a personal philosophy that he followed, that doing good work leads to rewards, to a broad environment, but he did not take into account stuff like systemic bias, in which an entire population of employees contends with lower salaries, among other problems, just as a matter of course. If you don't address that part, then this whole do-good-work-and-you'll-get-rewarded thing ends up being lip service. Nadella committed to continuing existing policies and enacting new ones to help overcome this systemic bias in general, including launching programs that would help expand diversity within Microsoft. Now, on top of that scandal, Nadella also had to helm Microsoft through some pretty tough layoffs throughout two thousand fourteen. In July, he announced that the company would eliminate eighteen thousand jobs. Yikes. Many of those, about twelve thousand five hundred of them, in fact, were employees who had previously worked at Nokia.
And here's the really rough thing: Microsoft had just acquired Nokia's devices business in April two thousand fourteen. This was all in an effort to create mobile devices, mainly smartphones, that could compete against Google Android devices and the Apple iPhone, and, spoiler alert, this would not really pan out so well. Microsoft would ultimately get out of the smartphone market entirely, though it stuck around longer than a lot of people thought was necessary. On the US government level, the FCC was forced to go back to the drawing board in an effort to establish rules about the concept of net neutrality. If they had been able to look into the future, maybe they wouldn't have even bothered. That's a little commentary; we'll come back to it by the end of this episode. But essentially, the idea behind net neutrality is that all data should be equal on the Internet. So no service provider, that is, the company that actually allows you to connect to the Internet, none of those should be able to give preferential treatment to some data at the expense of other data, particularly for services or data provided by the ISP itself or against competing services.
So, in other words, if you have an ISP, that ISP should not be able to prioritize its own properties at the expense of others in an effort to get you to subscribe to more of their stuff. And also, net neutrality states that you should be able to access your data on whatever Internet-capable device you're using, so that a company would not be able to restrict your access to certain content just because you were using a device that was made by a competitor, for example. The FCC had established regulations in two thousand eleven that would require Internet providers to treat all traffic equally, but the US courts ultimately ruled that the FCC didn't actually have the authority to enforce those rules, which means the rules were meaningless, right? If you can't enforce them, then the rules might as well not exist. So the FCC changed tactics and began to argue that broadband should be reclassified as a regulated utility, which the FCC does have the authority to oversee. Now, we'll revisit this issue again later, because, whoo boy, has it been a mess.
Now, over in the European Union, a ruling made things pretty complicated for companies like Google. The EU passed a rule that informally is referred to as the right to be forgotten. That is, people should have some option to remove their name and details from search results to protect their privacy, or to remove reference to something that really no longer applies to them. For example, let's say that I grew up in the era of the Internet, and as a dumb teenager, I did some stupid act that got me in legal trouble. Nothing too serious, but bad enough to bring my judgment into question, and as a result, someone wrote about it, and perhaps my identity is known as a result of that. Then, as an adult who has grown up and supposedly matured, I would never do such a thing again. But I still have this thing that's in my past, that's in my record, that people can see, that follows me around everywhere. That I might find harmful, and I might want to find a way to wipe that off the Internet, because that's not really who I am anymore.
That was the whole purpose of this rule. But critics of the rule pointed out that this could allow someone to whitewash their own background. They could erase records that have real meaning and are of public importance. They could, in theory, demand that links leading to factually correct articles about them be removed from search results. So let's say you're a local business owner and you want to run for a publicly elected office, but you also have some dark skeletons in your closet, like illegal stuff that people wrote about in the past. You could demand any references to your past activities be removed from search results, making it harder for the voting public to know about the kind of terrible, nefarious person you actually are when they go to vote. Now, as it stands, the rule would require people to submit requests to Google to have their search results removed from general search, and Google would review the request before acting on it, and so it's a pretty messy subject. Speaking of messy subjects, two thousand fourteen was also when more hacking stories made the news.
Many people, including prominent celebrities, found their private stuff that was stored in iCloud on public display. Apple was quick to say that iCloud itself had not been breached. This wasn't an example of hackers getting access to Apple's servers. Rather, the people who were hacked were hacked because their passwords and their security questions were compromised, probably through guesswork and brute-force attacks. Apple promised to boost security in the wake of the hacks, but yeah, this was pretty ugly stuff. On a much larger scale, Sony Pictures was hit with a massive hack attack, which included everything from employee compensation records and benefits data to unreleased movies. Suspicion turned to North Korea, with a possible motivation for the hacks being the upcoming release of a film titled The Interview, which was a comedy in which two characters are charged with assassinating North Korea's leader Kim Jong Un.
Chinese e-commerce company Alibaba held its IPO on the New York Stock Exchange in September of two thousand fourteen, reaching a stock price of ninety-two dollars and seventy cents per share, which was more than thirty-five percent higher than its already pretty hefty sixty-eight-dollar-per-share opening price. It was the biggest IPO in tech up to that point, and it helped show how the Chinese market was becoming an increasingly powerful player in the tech space. Apple announced Apple Pay, the company's mobile payment technology that has found its way into multiple Apple products. To use it, you can tap an NFC-enabled device against a compatible payment terminal, and after authorizing the payment on the device, you can be on your way. Apple also announced, but did not launch, the Apple Watch in two thousand fourteen, and would enter into the wearables market in the hopes of succeeding where many other companies had struggled. One other thing about the Apple Pay thing that I just think is funny: when I was on a trip to San Francisco, Tari was there.
We went past a Girl Scout group that was selling cookies, and they accepted Apple Pay, and I thought, well, of course they do. It's San Francisco. Facebook shocked the tech world with a sixteen-billion-dollar acquisition. This was the messaging service WhatsApp. Since then, it's become clear that the intent was to merge the features of Facebook, WhatsApp, and Instagram together, but that has come under recent scrutiny as various government officials around the world have speculated that breaking up Facebook might be a good idea, that perhaps the company has gotten too big, too powerful, and people were too dependent upon it. Uh, and also that it's just anti-competitive in general, so there have been some moves to perhaps force the company to break into separate companies again. But yes, this was the massive acquisition Facebook made with WhatsApp, and that was the first time a lot of people, including myself, had heard of the app, despite the fact that it was a very, very popular, and still is a very popular, text messaging service used by millions of people around the world.
I just hadn't heard of it because I was ignorant. And two thousand fourteen would be a bad year for Uber, but by far not the worst year for Uber. The company did get hold of an enormous amount of cash in the form of investments, but it also faced increasing resistance in multiple markets around the world as governments and taxicab unions pushed back. In addition, some pretty unethical corporate behavior became public knowledge in two thousand fourteen, including the revelation that Uber corporate employees could access rider logs without any type of consent, opening up the possibility of extortion or blackmail. This wouldn't be the low point for Uber, as we will see later on. Hewlett-Packard would announce in October two thousand fourteen that the company was going to split up, spinning off the PC and printer operations from the corporate enterprise side of the business. And Google acquired smart thermostat company Nest for three point two billion dollars, bolstering Google's smart home technology strategy. Nest would operate as an independent company just a year later, but would then rejoin Google in two thousand eighteen.
Two thousand fourteen was a busy year, all right. Let's get on to two thousand fifteen now. One big thing to happen in two thousand fifteen was a confusing series of corporate maneuvers over at the aforementioned Google. Now, the end result was that the company formed a new umbrella corporation called Alphabet, which would be at the top of the hierarchy of companies that would include stuff like Google and YouTube, and later on Waymo, the automated driving company, as well as others. The logic behind the move was that this would allow each individual company in the family to operate independently of all the others, giving each company the ability to make moves without having to coordinate everything with everyone else. So you wouldn't have one leader at the top of all of these having to consider all these very different initiatives. Each one would have its own leader and thus would be able to move more nimbly. Larry Page moved up as CEO of Alphabet, and Sundar Pichai would take over as CEO of Google, which at this point was a little more stripped down but still an enormous company. So, uh, Apple would officially launch the Apple Watch. It received mixed reviews.
A lot of folks said that it was clearly the best smartwatch on the market. Others said, yes, it is, but the bar is pretty darn low, so that's not a big thing. Now, generally speaking, the watch has a devoted following among Apple fans, but it hasn't really moved on to become a bigger success among a broader audience. That's often looked at as something of a disappointment in tech journalism circles. I'm not as sure that the owners find it as disappointing, but reviewers kind of did. Microsoft launched Windows ten in two thousand fifteen. Now, this was a really big reversal from Windows eight, and it also skipped right the heck over Windows nine. There's no Windows nine; it goes from Windows eight to Windows ten. Seeing as how the iPhone was going to do the exact same thing just a little bit later, it starts to make you wonder if the number nine holds any sort of dark significance in the world of technology, like you just don't want to have version nine. Or maybe it's just that people want to create the perception of a fresh start, and the number ten gives you a fresh start, that one-zero.
It makes you think of it being a big leap forward as opposed to a nine. In some cases it was meant to mark not the tenth version of a property, but rather the anniversary of a product launch. But whatever, I hate it, because it makes tracking versions a pain. When you look over the history and you're skipping over numbers, you start to think, did I miss something? So I hate it. It's dumb. So we're gonna call Windows ten Windows nine. Now everybody write it down: it's Windows nine. Anyway, Windows ten launched, and in general people liked it way more than Windows eight. However, it didn't have a perfect launch. There were reports coming out that the OS incorporated elements of data tracking, and it had some troubling implications regarding user privacy as a result. Eventually Microsoft was able to smooth this over, but initially those were big concerns. In the US, the Federal Aviation Administration, or FAA, released its set of guidelines and rules for drone operation in US airspace.
So, generally speaking, the rules are not that much different from the ones that govern stuff like the operation of model aircraft, though they do require operators to get a permit if the drone they are piloting is over a certain size. Amazon launched its Echo smart speaker in two thousand fifteen. It incorporates a personal digital assistant and, not surprisingly, deep integration with Amazon's e-commerce services. It introduced us to Alexa, the digital voice that can do all sorts of stuff, though you might have to ask her to do it four or five times if you have a particularly strong Southern accent. I base that off listening to my mom try to interact with Alexa. I love you, Mom. Both SpaceX and Blue Origin were able to launch and return to Earth a reusable rocket that year. In fact, Blue Origin did it first, and then SpaceX followed suit a few weeks later. The achievement was a significant step in the effort to reduce the cost of getting stuff out into space. By using the same launch vehicle for multiple launches, the price tag drops significantly.
Now, it's still not exactly cheap, mind you, but it lowers the price tag enough to allow parties that had previously never been able to launch a payload into space to potentially piggyback on a launch, which is pretty cool. Over at Twitter, Dick Costolo, the CEO back at that point, was forced to essentially step out of that role, and Jack Dorsey, the co-founder of Twitter, became the interim CEO and then full-time CEO of the company. One of his first big moves was to oversee an eight percent cut of the workforce, saying that while it was a tough move, he viewed it as absolutely necessary for the company's success. Oh, and hey, in two thousand fifteen, Twitter got rid of favorites, in which you would mark tweets you liked with a star, and replaced them with likes, in which you mark tweets you like with a heart. Now, at the time, this was considered a big deal and some people really hated it. And now I think most people don't even remember when it was different, which kind of sums up a lot about Internet culture in general. We all have memories like a goldfish when it comes to these things.
All right, well, I have a little bit more. A lot of stuff happened in these years; must have been making up for that recession. But I'll get to it in just a second, after we take this quick break. Okay, some more stuff from two thousand fifteen. Volkswagen had to deal with an enormous scandal. It was discovered that the company had installed devices in diesel vehicles being sold in the United States. The purpose of these devices was to detect when the vehicle was going through emissions testing, something that many places in the United States require in order for a vehicle to be registered in that region. So during testing, the device would cause the vehicle to underperform, which would reduce the emissions the vehicle would give off in the process. Once the car's systems detected that the test was over, it would switch itself off and the car would go into full performance mode. But that also meant the car was emitting more pollution.
So, in other words, Volkswagen had installed gadgets that allowed its cars to cheat on official tests, and as you can imagine, that's a big no-no, and it cost the company dearly. HP, after announcing the intent to split its company into two businesses, went forward with that plan, and in the process the company eliminated thirty-three thousand jobs. Yikes. Meg Whitman would remain CEO of the enterprise unit of the company. The other side, the PC and, uh, consumer-facing business, would go to someone else. Now, Tesla introduced the Autopilot feature in its vehicles in two thousand fifteen. The feature has been in the news several times since then due to crashes, including crashes with fatalities, that happened while Tesla cars were allegedly in Autopilot mode. Now, the company has repeatedly stated that this feature is not meant to be an autonomous car mode, and it requires drivers to acknowledge as much before they can activate the feature. But many people, including myself, still criticize the company for calling the darned thing Autopilot in the first place.
Now, not 328 00:19:55,119 --> 00:19:57,560 Speaker 1: to go off on too much of a tangent, but 329 00:19:57,640 --> 00:20:00,000 Speaker 1: I think the company bears at least a little 330 00:20:00,000 --> 00:20:04,199 Speaker 1: responsibility for setting unrealistic expectations with that name, though I 331 00:20:04,240 --> 00:20:08,120 Speaker 1: also agree that ultimately the majority of accountability should fall 332 00:20:08,160 --> 00:20:12,760 Speaker 1: on the shoulders of the drivers, not just on Tesla. 333 00:20:13,119 --> 00:20:15,480 Speaker 1: Over on the data security side, one of the big 334 00:20:15,480 --> 00:20:19,920 Speaker 1: stories that prompted more than a little schadenfreude affected 335 00:20:19,960 --> 00:20:23,359 Speaker 1: the site Ashley Madison, which is a sort of dating 336 00:20:23,400 --> 00:20:27,280 Speaker 1: website for people who want to have extramarital affairs. The 337 00:20:27,320 --> 00:20:31,760 Speaker 1: site was hacked, and the hackers demanded that its owner, Avid Life Media, shut 338 00:20:31,840 --> 00:20:35,040 Speaker 1: down the website or else they were going to release 339 00:20:35,080 --> 00:20:38,080 Speaker 1: the data to the world. Well, the site stayed up, 340 00:20:38,400 --> 00:20:41,359 Speaker 1: and the info got leaked onto the dark web, where 341 00:20:41,359 --> 00:20:44,560 Speaker 1: it quickly circulated and people hurriedly did searches to make 342 00:20:44,560 --> 00:20:49,639 Speaker 1: sure they weren't showing up on any lists. Two thousand 343 00:20:49,680 --> 00:20:52,800 Speaker 1: fifteen was also when so called hoverboards, which is a 344 00:20:52,920 --> 00:20:55,959 Speaker 1: terrible name for what's essentially a variant of a motorized scooter, 345 00:20:56,359 --> 00:20:59,560 Speaker 1: started bursting into flames. Do you remember when that happened?
346 00:21:00,000 --> 00:21:03,560 Speaker 1: Airlines said you couldn't take them on board because they 347 00:21:03,560 --> 00:21:07,639 Speaker 1: were catching on fire so much. Yikes. Okay, finally let's 348 00:21:07,760 --> 00:21:11,280 Speaker 1: move on into twenty sixteen. In twenty sixteen, we had 349 00:21:11,280 --> 00:21:14,760 Speaker 1: to say goodbye to something beloved. I am, of course, 350 00:21:14,880 --> 00:21:18,040 Speaker 1: referring to the three and a half millimeter headphone jack, 351 00:21:18,520 --> 00:21:21,879 Speaker 1: the wired headphone jack on phones. The iPhone seven got 352 00:21:21,960 --> 00:21:24,400 Speaker 1: rid of it, and so did a lot of other smartphones. 353 00:21:24,600 --> 00:21:27,639 Speaker 1: The industry was pushing a move towards Bluetooth headphones or, 354 00:21:28,119 --> 00:21:31,640 Speaker 1: what was even worse, the use of a dongle adapter 355 00:21:31,960 --> 00:21:34,880 Speaker 1: so you could plug your old wired headsets into new 356 00:21:34,920 --> 00:21:38,879 Speaker 1: smartphone devices. And there were a lot of jokes about 357 00:21:38,920 --> 00:21:42,919 Speaker 1: the word dongle back in two thousand sixteen, because we 358 00:21:43,000 --> 00:21:48,000 Speaker 1: are all very mature. Over at Samsung, there was smoke, 359 00:21:48,320 --> 00:21:51,719 Speaker 1: and where there's smoke, there's fire, you know. Well, in 360 00:21:51,720 --> 00:21:56,760 Speaker 1: this case, the Galaxy Note seven, a flagship smartphone from Samsung, 361 00:21:57,080 --> 00:22:01,280 Speaker 1: began to have a pretty disastrous year as reports surfaced 362 00:22:01,280 --> 00:22:06,640 Speaker 1: of Note seven handsets exploding into flames. At fault were 363 00:22:06,680 --> 00:22:10,400 Speaker 1: the batteries, and that prompted a global recall of the device.
364 00:22:10,960 --> 00:22:13,960 Speaker 1: Airlines made announcements that passengers with a Note seven would 365 00:22:13,960 --> 00:22:17,160 Speaker 1: have to surrender their phone before being allowed on a plane. 366 00:22:17,560 --> 00:22:21,760 Speaker 1: It was really bad press, a huge mess, so Samsung 367 00:22:22,040 --> 00:22:25,680 Speaker 1: recalled them and replaced the faulty batteries with new ones, 368 00:22:26,280 --> 00:22:29,680 Speaker 1: but those didn't seem to do much better than the original ones. 369 00:22:29,960 --> 00:22:32,800 Speaker 1: There were more reports of accidents flooding the news, and 370 00:22:32,840 --> 00:22:38,480 Speaker 1: eventually Samsung was forced to discontinue the handset entirely. Yeah. 371 00:22:38,560 --> 00:22:42,600 Speaker 1: In early twenty sixteen, a Google engineer named Anthony Levandowski 372 00:22:42,800 --> 00:22:45,560 Speaker 1: left the company and he went on to found a 373 00:22:45,640 --> 00:22:49,480 Speaker 1: new self driving car company called Otto, spelled O T T O. 374 00:22:50,320 --> 00:22:54,520 Speaker 1: Levandowski had previously founded other companies related to autonomous cars 375 00:22:54,560 --> 00:22:57,200 Speaker 1: that had later been acquired by Google. That would actually 376 00:22:57,320 --> 00:23:01,000 Speaker 1: lead to some people questioning the ethics behind those moves. 377 00:23:01,200 --> 00:23:04,199 Speaker 1: Some people said that it seemed like he was specifically 378 00:23:04,240 --> 00:23:08,480 Speaker 1: creating companies with the intent for Google to acquire them later, 379 00:23:08,720 --> 00:23:11,680 Speaker 1: so essentially printing his own money. That was the implication 380 00:23:11,840 --> 00:23:15,840 Speaker 1: that some people had made against him.
In late twenty sixteen, 381 00:23:16,359 --> 00:23:19,879 Speaker 1: Uber acquired Otto, and Levandowski would move on over to 382 00:23:20,000 --> 00:23:22,800 Speaker 1: Uber and become the head of their attempts to develop 383 00:23:22,840 --> 00:23:26,560 Speaker 1: self driving cars. Now, the following year, Google, mostly in 384 00:23:26,600 --> 00:23:29,600 Speaker 1: the form of Waymo, the self driving startup 385 00:23:29,640 --> 00:23:32,560 Speaker 1: under Google's parent company, Alphabet, went after him. Waymo didn't really exist when 386 00:23:32,640 --> 00:23:36,400 Speaker 1: Levandowski was there. The division that would become Waymo existed, 387 00:23:36,520 --> 00:23:38,560 Speaker 1: but Waymo as a thing didn't exist till a little 388 00:23:38,560 --> 00:23:42,080 Speaker 1: bit later. But Waymo would allege that Levandowski had downloaded 389 00:23:42,119 --> 00:23:46,840 Speaker 1: nearly ten gigs of confidential information, including trade secrets, from 390 00:23:46,840 --> 00:23:52,680 Speaker 1: Google before he left the company. Levandowski, when met with 391 00:23:52,760 --> 00:23:56,160 Speaker 1: questions about this at a trial, exercised his Fifth Amendment rights. 392 00:23:56,720 --> 00:24:00,000 Speaker 1: The Fifth Amendment protects US citizens from having to 393 00:24:00,000 --> 00:24:03,480 Speaker 1: incriminate themselves in a court of law. Uber would subsequently 394 00:24:03,720 --> 00:24:07,960 Speaker 1: fire Levandowski for not cooperating with the investigation. Google and 395 00:24:08,040 --> 00:24:11,200 Speaker 1: Uber would ultimately settle the lawsuit in two thousand eighteen, 396 00:24:11,520 --> 00:24:14,760 Speaker 1: and Levandowski would go on to found another self driving 397 00:24:14,800 --> 00:24:19,320 Speaker 1: car company.
Levandowski would also face another charge personally 398 00:24:19,560 --> 00:24:23,080 Speaker 1: for theft of trade secrets, and after he was indicted, the 399 00:24:23,160 --> 00:24:26,520 Speaker 1: board of directors of his new startup, called Pronto, announced 400 00:24:26,520 --> 00:24:29,640 Speaker 1: that Levandowski was no longer CEO and appointed a new 401 00:24:29,760 --> 00:24:33,159 Speaker 1: leader in his place. Levandowski has pled not guilty to 402 00:24:33,240 --> 00:24:36,000 Speaker 1: the charges, and his trial has not yet happened as 403 00:24:36,040 --> 00:24:40,280 Speaker 1: of the recording of this podcast. Apple also had a 404 00:24:40,280 --> 00:24:42,840 Speaker 1: face off with the U.S. Department of Justice in 405 00:24:42,880 --> 00:24:45,800 Speaker 1: two thousand sixteen, and the crux of the matter was 406 00:24:46,000 --> 00:24:50,359 Speaker 1: a mass shooting that happened in late two thousand fifteen. The perpetrators of 407 00:24:50,400 --> 00:24:55,240 Speaker 1: the shooting had an Apple iPhone five C in their possession, 408 00:24:55,320 --> 00:24:58,520 Speaker 1: and the FBI wanted Apple to unlock the phone and 409 00:24:58,560 --> 00:25:03,280 Speaker 1: decrypt its contents. Apple said no way. Tim Cook, the 410 00:25:03,359 --> 00:25:06,560 Speaker 1: CEO of Apple, said that if they complied with the 411 00:25:06,600 --> 00:25:10,560 Speaker 1: FBI's orders, it would set a dangerous precedent, and that 412 00:25:10,680 --> 00:25:13,800 Speaker 1: if the company acquiesced in the United States, it might 413 00:25:13,840 --> 00:25:16,400 Speaker 1: be forced to do the same thing in other countries, 414 00:25:16,640 --> 00:25:21,080 Speaker 1: including countries that have more authoritarian governments and fewer guaranteed 415 00:25:21,320 --> 00:25:26,359 Speaker 1: human rights for citizens.
Ultimately, the FBI gained access to 416 00:25:26,400 --> 00:25:29,880 Speaker 1: the phone using a third party contractor to break through 417 00:25:29,960 --> 00:25:33,199 Speaker 1: the security, and the entire matter was dropped without a 418 00:25:33,200 --> 00:25:37,040 Speaker 1: final resolution. Something else that had its roots in two 419 00:25:37,040 --> 00:25:39,800 Speaker 1: thousand fifteen, but really played out over the course of 420 00:25:39,840 --> 00:25:43,800 Speaker 1: two thousand sixteen was the precipitous fall of the medical 421 00:25:43,840 --> 00:25:48,560 Speaker 1: startup company Theranos. The expose of Theranos began in October 422 00:25:49,720 --> 00:25:53,879 Speaker 1: of two thousand fifteen and continued throughout the following year. Theranos launched with the 423 00:25:53,960 --> 00:25:56,720 Speaker 1: promise that it was going to develop an amazing piece 424 00:25:56,720 --> 00:25:59,719 Speaker 1: of technology capable of testing a single drop of blood 425 00:26:00,119 --> 00:26:04,000 Speaker 1: for dozens of different diseases and conditions, reducing the need 426 00:26:04,040 --> 00:26:07,600 Speaker 1: for conventional blood draws and creating unprecedented access to medical data. 427 00:26:07,720 --> 00:26:09,800 Speaker 1: You could have one of these devices sitting on a 428 00:26:09,880 --> 00:26:12,639 Speaker 1: desktop at home and do a blood test all on 429 00:26:12,680 --> 00:26:15,919 Speaker 1: your own, just to check, if you wanted to. That was 430 00:26:15,960 --> 00:26:18,480 Speaker 1: the promise.
There were a lot of claims that Theranos 431 00:26:18,560 --> 00:26:20,879 Speaker 1: was using smoke and mirrors to make it seem like 432 00:26:20,960 --> 00:26:23,480 Speaker 1: the company was making progress when in fact development was 433 00:26:23,520 --> 00:26:27,439 Speaker 1: hitting some major roadblocks, and Theranos founder Elizabeth Holmes 434 00:26:27,440 --> 00:26:30,560 Speaker 1: found herself at the center of a massive investigation that 435 00:26:30,720 --> 00:26:33,800 Speaker 1: is still heading towards criminal proceedings as of the recording 436 00:26:33,800 --> 00:26:37,240 Speaker 1: of this episode. And it was also two thousand sixteen 437 00:26:37,280 --> 00:26:40,800 Speaker 1: when Yahoo disclosed a data breach that had happened back 438 00:26:40,840 --> 00:26:43,879 Speaker 1: in two thousand fourteen. They actually disclosed two of them. 439 00:26:44,080 --> 00:26:46,159 Speaker 1: The first time they disclosed one, they said that it 440 00:26:46,280 --> 00:26:50,720 Speaker 1: affected around half a billion accounts, and then a little 441 00:26:50,720 --> 00:26:52,879 Speaker 1: bit later in twenty sixteen, they said, whoops, there was 442 00:26:52,920 --> 00:26:57,040 Speaker 1: also another one, and that one affected at least one 443 00:26:57,359 --> 00:27:00,800 Speaker 1: billion accounts. This was also right around the time that 444 00:27:00,920 --> 00:27:05,520 Speaker 1: Verizon was looking to acquire Yahoo. The acquisition would 445 00:27:05,560 --> 00:27:08,480 Speaker 1: still go through, but it did so at a pretty 446 00:27:08,560 --> 00:27:13,040 Speaker 1: hefty price drop, and that happened in two thousand seventeen. Marissa Mayer, the 447 00:27:13,080 --> 00:27:15,960 Speaker 1: CEO of Yahoo, would find herself out of a job 448 00:27:16,040 --> 00:27:19,960 Speaker 1: not long after the acquisition. We also watched a bizarre 449 00:27:20,040 --> 00:27:23,520 Speaker 1: series of allegations over at a company called Hyperloop One.
450 00:27:23,960 --> 00:27:28,240 Speaker 1: That company had emerged as one of many with the 451 00:27:28,280 --> 00:27:31,960 Speaker 1: intended goal of bringing Elon Musk's vision of high speed 452 00:27:31,960 --> 00:27:35,240 Speaker 1: transportation to life. This one was not directly affiliated with 453 00:27:35,280 --> 00:27:39,639 Speaker 1: Elon Musk himself. The company had built and demonstrated technology 454 00:27:39,680 --> 00:27:42,600 Speaker 1: that would be a foundational part of operations earlier in 455 00:27:44,040 --> 00:27:47,240 Speaker 1: two thousand sixteen. But then there was a big executive kerfuffle that mainly 456 00:27:47,240 --> 00:27:52,119 Speaker 1: involved cofounder Shervin Pishevar and the chief technology officer 457 00:27:52,280 --> 00:27:57,119 Speaker 1: Brogan BamBrogan. Both parties alleged wrongdoing conducted by the 458 00:27:57,160 --> 00:28:01,520 Speaker 1: other, and it got really ugly and weird, and ultimately 459 00:28:01,560 --> 00:28:04,320 Speaker 1: the whole matter would be settled out of court. 460 00:28:04,400 --> 00:28:07,639 Speaker 1: BamBrogan would go on to cofound another maglev train 461 00:28:07,760 --> 00:28:11,640 Speaker 1: company called Arrivo, which would eventually fold due to lack 462 00:28:11,680 --> 00:28:14,520 Speaker 1: of funds. Hyperloop One would go on to become 463 00:28:14,800 --> 00:28:18,160 Speaker 1: Virgin Hyperloop One in two thousand seventeen, after announcing 464 00:28:18,160 --> 00:28:21,800 Speaker 1: a partnership with the Virgin Group, Richard Branson's company.
We 465 00:28:21,880 --> 00:28:24,639 Speaker 1: also got to say hello to three different consumer VR 466 00:28:24,760 --> 00:28:29,520 Speaker 1: headsets in two thousand sixteen. You had the Oculus Rift, the PlayStation VR, 467 00:28:29,680 --> 00:28:33,120 Speaker 1: and the HTC Vive, which all came out that year 468 00:28:33,200 --> 00:28:36,399 Speaker 1: in the hopes that this time virtual reality would become 469 00:28:36,480 --> 00:28:39,440 Speaker 1: you know, real reality, meaning that it would become a 470 00:28:39,520 --> 00:28:43,640 Speaker 1: viable consumer industry. While new VR hardware is still coming 471 00:28:43,640 --> 00:28:47,520 Speaker 1: out today and new experiences arrive every year, the tech 472 00:28:47,800 --> 00:28:51,880 Speaker 1: hasn't really caught on with the mainstream general public, and 473 00:28:52,000 --> 00:28:54,720 Speaker 1: part of the reason might be the relatively high price tag. 474 00:28:55,080 --> 00:28:58,080 Speaker 1: Not only are the headsets expensive, but in many cases 475 00:28:58,120 --> 00:29:02,120 Speaker 1: they require a PC with some fairly hefty specs in 476 00:29:02,240 --> 00:29:06,400 Speaker 1: order to run smoothly. They also require a dedicated space, 477 00:29:07,120 --> 00:29:10,800 Speaker 1: particularly the headsets that allow for full movement within an area. 478 00:29:10,960 --> 00:29:13,640 Speaker 1: You've got to have the space available to actually use it. 479 00:29:13,720 --> 00:29:17,000 Speaker 1: So it's asking a lot of consumers to adopt the technology, 480 00:29:17,240 --> 00:29:20,280 Speaker 1: and so far we've only seen limited interest in the space. 481 00:29:20,760 --> 00:29:23,400 Speaker 1: The people who love it really love it, but it 482 00:29:23,480 --> 00:29:26,840 Speaker 1: remains a pretty hard sell to the general consumer.
And 483 00:29:26,960 --> 00:29:29,160 Speaker 1: one thing we had to say goodbye to in two 484 00:29:29,160 --> 00:29:33,120 Speaker 1: thousand sixteen was Vine, the service that let users record 485 00:29:33,160 --> 00:29:36,880 Speaker 1: and upload six second videos to the Internet. Twitter had 486 00:29:36,920 --> 00:29:39,920 Speaker 1: acquired the service in two thousand twelve, not long after 487 00:29:39,960 --> 00:29:43,960 Speaker 1: it launched, but decided to sunset it in two thousand sixteen. Oh, and 488 00:29:45,040 --> 00:29:48,040 Speaker 1: two thousand sixteen was an election year in the United States, and you 489 00:29:48,120 --> 00:29:51,160 Speaker 1: might have heard about how data was weaponized during that 490 00:29:51,200 --> 00:29:55,760 Speaker 1: whole circus, from misleading news propagating on platforms like Twitter 491 00:29:55,840 --> 00:29:58,280 Speaker 1: and Facebook (that's a big thing, the whole fake news 492 00:29:58,320 --> 00:30:03,280 Speaker 1: stuff) to the hack into the Democratic National Committee servers and beyond. 493 00:30:03,640 --> 00:30:06,920 Speaker 1: It was and is a huge deal. And this isn't 494 00:30:06,920 --> 00:30:09,440 Speaker 1: restricted to just the United States, of course, but the 495 00:30:09,480 --> 00:30:12,000 Speaker 1: effects here in the States are still being revealed as 496 00:30:12,000 --> 00:30:15,920 Speaker 1: of this recording as we head into yet another election year.
Now, 497 00:30:16,080 --> 00:30:19,239 Speaker 1: this entire part of our recent history is one that 498 00:30:19,400 --> 00:30:22,640 Speaker 1: is far too complicated and emotionally charged for me to 499 00:30:22,680 --> 00:30:25,200 Speaker 1: go into in this episode, but it remains one of 500 00:30:25,200 --> 00:30:28,600 Speaker 1: the most important stories of the decade, and perhaps 501 00:30:28,680 --> 00:30:32,440 Speaker 1: one of the biggest consequences is that for many people 502 00:30:32,520 --> 00:30:36,240 Speaker 1: it has undermined their confidence in the democratic process, which 503 00:30:36,280 --> 00:30:40,280 Speaker 1: is a devastating outcome. We've seen companies like Google, Twitter, 504 00:30:40,400 --> 00:30:44,360 Speaker 1: and Facebook all struggle with how to deal with misinformation campaigns. 505 00:30:44,360 --> 00:30:46,880 Speaker 1: But that's something that continues to evolve as I record 506 00:30:46,920 --> 00:30:50,000 Speaker 1: these shows, and probably will continue to evolve for quite 507 00:30:50,040 --> 00:30:54,360 Speaker 1: some time to come. Now, I would say that two thousand sixteen also 508 00:30:54,400 --> 00:30:57,120 Speaker 1: made it pretty obvious how powerful a tool social media 509 00:30:57,200 --> 00:31:00,280 Speaker 1: can be, as well as how easily it can be manipulated, 510 00:31:00,440 --> 00:31:03,360 Speaker 1: Facebook in particular. Now I've said this on other episodes, 511 00:31:03,440 --> 00:31:07,520 Speaker 1: but Facebook's business model is dependent upon people spending time 512 00:31:07,640 --> 00:31:11,520 Speaker 1: scrolling through Facebook and thus being served ads. The best 513 00:31:11,560 --> 00:31:14,360 Speaker 1: way to get people to stay on your site is 514 00:31:14,360 --> 00:31:18,960 Speaker 1: to present engaging material. This doesn't have to be positive material, 515 00:31:19,160 --> 00:31:21,400 Speaker 1: and it doesn't have to be true.
It just has to 516 00:31:21,400 --> 00:31:24,640 Speaker 1: be stuff that gets people engaged, and that can be 517 00:31:24,760 --> 00:31:27,600 Speaker 1: liking the post or sharing it or commenting on it 518 00:31:27,680 --> 00:31:31,720 Speaker 1: or whatever. So inflammatory stuff actually does really well over 519 00:31:31,760 --> 00:31:35,880 Speaker 1: at Facebook because it prompts a great deal of engagement. Again, 520 00:31:35,920 --> 00:31:38,360 Speaker 1: it doesn't have to be positive engagement. It just has 521 00:31:38,400 --> 00:31:40,640 Speaker 1: to get people using the site because that's when they're 522 00:31:40,760 --> 00:31:45,200 Speaker 1: looking at advertising. So if you know that little bit 523 00:31:45,360 --> 00:31:49,800 Speaker 1: of information, then you have all you need to devise 524 00:31:49,880 --> 00:31:52,200 Speaker 1: a misinformation campaign that will get a lot of play 525 00:31:52,240 --> 00:31:55,880 Speaker 1: on Facebook because you can create something that isn't true, 526 00:31:56,360 --> 00:31:59,840 Speaker 1: that's inflammatory, and you can be pretty sure that Facebook 527 00:31:59,880 --> 00:32:03,120 Speaker 1: as a platform will help promote it because once it 528 00:32:03,160 --> 00:32:06,720 Speaker 1: starts getting interaction, once people engage with it, Facebook will 529 00:32:06,720 --> 00:32:09,480 Speaker 1: want to promote it to more people to drive more engagement. 530 00:32:09,840 --> 00:32:12,400 Speaker 1: So you're gaming the system, all right. Now, when we 531 00:32:12,480 --> 00:32:15,320 Speaker 1: come back, we'll talk about two thousand seventeen and two 532 00:32:15,320 --> 00:32:18,520 Speaker 1: thousand eighteen, and then I'm gonna go take a nap, 533 00:32:26,120 --> 00:32:31,160 Speaker 1: all right. So two thousand seventeen, what a year. 
It 534 00:32:31,240 --> 00:32:32,880 Speaker 1: was the year when Facebook would be in the news 535 00:32:32,920 --> 00:32:36,440 Speaker 1: repeatedly as the company and its leadership team were put under scrutiny, 536 00:32:36,680 --> 00:32:39,640 Speaker 1: particularly within the context of promoting the misinformation I was 537 00:32:39,680 --> 00:32:43,880 Speaker 1: just talking about, specifically stories that had been seeded on Facebook 538 00:32:43,880 --> 00:32:45,960 Speaker 1: by a collection of accounts that were traced back to 539 00:32:46,120 --> 00:32:48,840 Speaker 1: Russian sources. But hey, I just talked about all that 540 00:32:48,920 --> 00:32:51,040 Speaker 1: before the break, and I'm gonna be talking about Facebook 541 00:32:51,080 --> 00:32:53,600 Speaker 1: even more for two thousand eighteen, so I'm not gonna 542 00:32:53,840 --> 00:32:56,800 Speaker 1: rehash it all here except to say the company continued 543 00:32:56,840 --> 00:33:01,520 Speaker 1: to squirm under scrutiny while simultaneously making bucketloads 544 00:33:01,560 --> 00:33:06,320 Speaker 1: of cash, like crazy amounts of money, y'all. And remember 545 00:33:06,360 --> 00:33:08,600 Speaker 1: how I said that two thousand fourteen was a rough 546 00:33:08,680 --> 00:33:11,680 Speaker 1: year for Uber. Well, that was a cakewalk compared 547 00:33:11,720 --> 00:33:16,400 Speaker 1: to two thousand seventeen. The year started out rough. US 548 00:33:16,440 --> 00:33:19,760 Speaker 1: President Donald Trump announced a travel ban, which sparked protests at New York's 549 00:33:19,880 --> 00:33:23,840 Speaker 1: John F.
Kennedy Airport, a move that many companies and unions, 550 00:33:23,880 --> 00:33:27,720 Speaker 1: including the New York City Taxi Union, protested, but Uber 551 00:33:28,120 --> 00:33:31,640 Speaker 1: continued to operate at the airport, even turning off surge 552 00:33:31,680 --> 00:33:35,320 Speaker 1: pricing to undercut all competition, which led to people accusing 553 00:33:35,400 --> 00:33:37,719 Speaker 1: Uber of trying to profit off of the situation at 554 00:33:37,760 --> 00:33:40,959 Speaker 1: the expense of taxi cab companies, and it led to 555 00:33:41,120 --> 00:33:45,840 Speaker 1: a movement called hashtag delete Uber. It's also when former 556 00:33:45,960 --> 00:33:49,680 Speaker 1: Uber employee Susan Fowler went public with her allegations of 557 00:33:49,720 --> 00:33:54,680 Speaker 1: sexual harassment and other misconduct at Uber's corporate level. Following 558 00:33:54,680 --> 00:33:57,800 Speaker 1: the publication of her account were several other stories that 559 00:33:57,880 --> 00:34:02,440 Speaker 1: pointed to a truly awful culture at the ride hailing company, 560 00:34:02,480 --> 00:34:06,280 Speaker 1: indicating some pretty deep trends of sexism and a tolerance 561 00:34:06,320 --> 00:34:11,319 Speaker 1: for inappropriate and sometimes criminal behavior. Apparently, if you were 562 00:34:11,400 --> 00:34:14,319 Speaker 1: viewed as a top performer in the company, you were 563 00:34:14,360 --> 00:34:17,040 Speaker 1: given a great deal of leeway as far as your 564 00:34:17,080 --> 00:34:22,680 Speaker 1: behavior was concerned. It was, without question, gross.
However, 565 00:34:22,719 --> 00:34:25,799 Speaker 1: it was also one of the big launching points for 566 00:34:25,840 --> 00:34:29,840 Speaker 1: the hashtag MeToo movement, in which many people spoke 567 00:34:29,880 --> 00:34:34,840 Speaker 1: out against sexism, sexual harassment, racism, and related issues in 568 00:34:34,840 --> 00:34:38,040 Speaker 1: the workplace. Now, I would argue that GamerGate sort 569 00:34:38,040 --> 00:34:41,120 Speaker 1: of set the stage for this, but I believe Fowler's 570 00:34:41,160 --> 00:34:44,279 Speaker 1: post really got things moving, and while the MeToo 571 00:34:44,520 --> 00:34:48,840 Speaker 1: movement goes well beyond technology, it was particularly visible in 572 00:34:48,880 --> 00:34:52,120 Speaker 1: the tech world, which has been dominated mostly by 573 00:34:52,200 --> 00:34:56,280 Speaker 1: men for decades. It was around this time in early 574 00:34:56,920 --> 00:35:01,040 Speaker 1: two thousand seventeen that Alphabet, Google's parent company, began to go 575 00:35:01,160 --> 00:35:05,919 Speaker 1: after Uber, stating that the former Googler Anthony Levandowski had 576 00:35:05,960 --> 00:35:08,720 Speaker 1: stolen trade secrets before making his way over to Uber 577 00:35:08,840 --> 00:35:12,400 Speaker 1: the year earlier. Then there was a video of Uber 578 00:35:12,560 --> 00:35:17,520 Speaker 1: CEO Travis Kalanick berating an Uber driver that went viral 579 00:35:17,560 --> 00:35:21,480 Speaker 1: in February of two thousand seventeen, and that prompted the CEO to say 580 00:35:21,520 --> 00:35:24,960 Speaker 1: he would quote fundamentally change as a leader and grow 581 00:35:25,080 --> 00:35:29,640 Speaker 1: up end quote.
Several Uber executives were either fired or 582 00:35:29,680 --> 00:35:34,080 Speaker 1: opted to leave the company as investigators looked into Fowler's claims, 583 00:35:34,440 --> 00:35:38,719 Speaker 1: indicating that there was indeed a pretty terrible culture at 584 00:35:38,719 --> 00:35:41,759 Speaker 1: the company and that it extended all the way up 585 00:35:41,800 --> 00:35:45,279 Speaker 1: the executive ranks. The senior vice president of Engineering, a 586 00:35:45,280 --> 00:35:48,840 Speaker 1: guy named Amit Singhal, was told to resign after about 587 00:35:48,880 --> 00:35:51,360 Speaker 1: a month on the job because it turned out he 588 00:35:51,440 --> 00:35:55,040 Speaker 1: had a sexual harassment allegation against him from a previous 589 00:35:55,120 --> 00:35:59,600 Speaker 1: employer that he had not divulged. The Uber VP of 590 00:35:59,640 --> 00:36:02,879 Speaker 1: Product and Growth, a guy named Ed Baker, resigned. Now, 591 00:36:02,880 --> 00:36:04,920 Speaker 1: he said he resigned because he wanted to focus on 592 00:36:04,920 --> 00:36:07,359 Speaker 1: the public sector, but there were other people in the 593 00:36:07,400 --> 00:36:11,839 Speaker 1: company who claimed he had engaged in inappropriate behavior while 594 00:36:11,920 --> 00:36:16,160 Speaker 1: at Uber. The president of Uber, Jeff Jones, resigned in 595 00:36:16,239 --> 00:36:20,319 Speaker 1: March of two thousand seventeen, stating he had differences over the approach to 596 00:36:20,440 --> 00:36:25,200 Speaker 1: leadership at the company. By May, Uber fired Levandowski for 597 00:36:25,320 --> 00:36:29,960 Speaker 1: not cooperating with that investigation, which stemmed from allegations that Levandowski 598 00:36:30,040 --> 00:36:34,720 Speaker 1: had stolen gigabytes of confidential information from Google.
In June, 599 00:36:35,200 --> 00:36:38,640 Speaker 1: the company would fire twenty people at various levels within 600 00:36:38,840 --> 00:36:42,240 Speaker 1: Uber as a result of an internal investigation into Fowler's 601 00:36:42,239 --> 00:36:45,920 Speaker 1: claims, which showed that she was being truthful. And it 602 00:36:45,960 --> 00:36:50,760 Speaker 1: didn't stop there. There were more resignations, including CEO Travis Kalanick, 603 00:36:50,880 --> 00:36:54,279 Speaker 1: who, I venture to say, was not helping matters. He 604 00:36:54,320 --> 00:36:57,320 Speaker 1: was essentially forced to resign his position and eventually was 605 00:36:57,400 --> 00:37:00,000 Speaker 1: forced off the board of directors as well. The company 606 00:37:00,040 --> 00:37:03,920 Speaker 1: announced a new commitment to addressing its problems, acknowledging that 607 00:37:03,960 --> 00:37:08,480 Speaker 1: there were in fact problems in the process, but obviously 608 00:37:08,719 --> 00:37:12,720 Speaker 1: more happened in twenty seventeen than just Uber's super crazy 609 00:37:12,800 --> 00:37:17,120 Speaker 1: bad year. Amazon acquired high end grocery store chain Whole 610 00:37:17,200 --> 00:37:22,279 Speaker 1: Foods for around fourteen billion dollars, for example. Apple launched the 611 00:37:22,400 --> 00:37:26,000 Speaker 1: iPhone ten, the tenth anniversary iPhone. They skipped right over 612 00:37:26,040 --> 00:37:28,520 Speaker 1: iPhone nine, just as Microsoft had skipped over Windows nine, 613 00:37:28,560 --> 00:37:30,960 Speaker 1: driving me a little closer to the edge because 614 00:37:31,000 --> 00:37:33,239 Speaker 1: my brain works in a very numeric way, and that's 615 00:37:33,239 --> 00:37:36,279 Speaker 1: on me and I'll stop now. We also had a 616 00:37:36,320 --> 00:37:39,200 Speaker 1: lot of efforts to codify net neutrality in the United 617 00:37:39,239 --> 00:37:43,720 Speaker 1: States that were reversed in two thousand seventeen.
Ajit Pai, the FCC 618 00:37:44,000 --> 00:37:47,680 Speaker 1: chairman during the Trump era, led the effort in reversing 619 00:37:47,680 --> 00:37:51,439 Speaker 1: the regulations that the FCC under the previous US administration 620 00:37:51,480 --> 00:37:55,840 Speaker 1: had established. So essentially he was saying, yeah, all those 621 00:37:56,400 --> 00:38:00,920 Speaker 1: efforts that the FCC made to regulate the broadband industry, 622 00:38:01,080 --> 00:38:05,000 Speaker 1: we're reversing that. And actually, the following year, in two thousand eighteen, 623 00:38:05,360 --> 00:38:12,799 Speaker 1: net neutrality would be declared dead. So there's that. A 624 00:38:12,840 --> 00:38:16,960 Speaker 1: type of ransomware called WannaCry went viral in two thousand seventeen. 625 00:38:17,040 --> 00:38:19,879 Speaker 1: The malware would infect machines and lock them from being 626 00:38:19,960 --> 00:38:23,040 Speaker 1: used by their rightful owners, and those owners would be 627 00:38:23,080 --> 00:38:25,479 Speaker 1: met with a demand to pay a ransom or else 628 00:38:25,560 --> 00:38:29,000 Speaker 1: lose their data forever. A twenty two year old security 629 00:38:29,080 --> 00:38:33,400 Speaker 1: researcher named Marcus Hutchins found and implemented a kill switch 630 00:38:33,680 --> 00:38:36,440 Speaker 1: that prevented WannaCry from being even worse than it 631 00:38:36,520 --> 00:38:40,879 Speaker 1: already was. Hutchins himself would be subsequently arrested by US 632 00:38:41,000 --> 00:38:43,960 Speaker 1: law enforcement and charged with numerous counts of crimes that 633 00:38:44,000 --> 00:38:47,280 Speaker 1: included wire fraud and distributing a device meant to intercept 634 00:38:47,320 --> 00:38:50,480 Speaker 1: electronic communications. This was not related to WannaCry, 635 00:38:50,480 --> 00:38:53,960 Speaker 1: but to earlier malware.
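That kill switch deserves a quick aside, because the mechanism is famously simple: the malware tried to reach a long, gibberish domain name, and as long as that domain did not resolve, it kept going. Hutchins registered the domain, so the check suddenly succeeded everywhere and the worm stood down. A minimal sketch of that check, with a placeholder domain rather than the real one:

```python
# Sketch of a WannaCry-style kill-switch check (illustrative only; the
# domain below is a placeholder, not the actual kill-switch domain).
import socket

KILL_SWITCH_DOMAIN = "example-kill-switch-domain.invalid"  # hypothetical

def should_keep_running(domain: str = KILL_SWITCH_DOMAIN) -> bool:
    """The malware continued only if the domain did NOT resolve, so
    registering the domain acted as a global off switch."""
    try:
        socket.gethostbyname(domain)
        return False  # domain resolves: kill switch is active, stand down
    except OSError:
        return True   # domain unregistered or unreachable: keep going
```

The `.invalid` top-level domain is reserved and never resolves, so the sketch's default always reports "keep running," mirroring the state of the world before the real domain was registered.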
He would plead guilty to those 636 00:38:54,040 --> 00:38:57,080 Speaker 1: charges and would be sentenced to time served plus a 637 00:38:57,160 --> 00:39:01,040 Speaker 1: year of supervised release, so not captivity, but 638 00:39:01,120 --> 00:39:04,160 Speaker 1: supervised activity before being fully free. And that 639 00:39:04,480 --> 00:39:08,000 Speaker 1: sentencing happened in July of two thousand nineteen, so it'll 640 00:39:08,000 --> 00:39:11,719 Speaker 1: be the middle of two thousand twenty when that is up. All right, so 641 00:39:11,760 --> 00:39:15,040 Speaker 1: we're in the home stretch. Let's talk about two thousand eighteen. And remember again, 642 00:39:15,120 --> 00:39:17,680 Speaker 1: I'm not covering twenty nineteen. I just did that ding 643 00:39:17,800 --> 00:39:19,840 Speaker 1: dang durn thing a couple of episodes ago. But I 644 00:39:19,880 --> 00:39:22,200 Speaker 1: will wrap up with a couple of thoughts about the decade. 645 00:39:22,520 --> 00:39:27,319 Speaker 1: So two thousand eighteen was when that whole Cambridge Analytica scandal became 646 00:39:27,360 --> 00:39:30,160 Speaker 1: public knowledge. It was revealed that through a loophole in 647 00:39:30,239 --> 00:39:33,080 Speaker 1: Facebook's API, a developer was able to collect information on 648 00:39:33,160 --> 00:39:37,759 Speaker 1: eighty seven million Facebook accounts without the users' permission. It 649 00:39:37,840 --> 00:39:41,080 Speaker 1: marked yet another really rough year for Facebook's leadership team, 650 00:39:41,080 --> 00:39:43,840 Speaker 1: which was compelled to testify to the United States government 651 00:39:43,920 --> 00:39:46,600 Speaker 1: a couple of times. Now, all that being said, the 652 00:39:46,600 --> 00:39:50,600 Speaker 1: company continued to be massively profitable, so while the testimony 653 00:39:50,640 --> 00:39:54,440 Speaker 1: was certainly uncomfortable, the financial results weren't exactly hurting the 654 00:39:54,480 --> 00:39:58,440 Speaker 1: executive team.
Sure, they might be under some pretty intense 655 00:39:58,560 --> 00:40:01,560 Speaker 1: questioning in front of Congress, but then they could comfort 656 00:40:01,600 --> 00:40:07,000 Speaker 1: themselves with a nice, snuggly, enormous yacht or mansion or whatever. 657 00:40:07,680 --> 00:40:10,600 Speaker 1: So in two thousand eighteen, Apple became the first US 658 00:40:10,680 --> 00:40:16,520 Speaker 1: company to hit a market capitalization of one trillion dollars. However, 659 00:40:16,560 --> 00:40:18,799 Speaker 1: this only lasted about a month before the value of 660 00:40:18,840 --> 00:40:24,719 Speaker 1: the company dropped back below the trillion dollar mark. Losers. 661 00:40:25,560 --> 00:40:29,200 Speaker 1: And yes, that was me being facetious. That was also 662 00:40:29,239 --> 00:40:33,080 Speaker 1: the year we learned about two massive vulnerabilities in CPUs. 663 00:40:33,480 --> 00:40:35,239 Speaker 1: I probably need to do a full episode about this 664 00:40:35,280 --> 00:40:37,360 Speaker 1: at some point to explain exactly what's going on. But 665 00:40:37,400 --> 00:40:41,920 Speaker 1: the vulnerabilities are called Spectre and Meltdown, and ultimately it 666 00:40:41,920 --> 00:40:45,400 Speaker 1: means that these chips that are built on this architecture, 667 00:40:45,760 --> 00:40:49,080 Speaker 1: and that's almost all of them, almost all modern chips, 668 00:40:49,880 --> 00:40:54,880 Speaker 1: have a tendency to leak valuable information. It's just 669 00:40:55,120 --> 00:40:58,400 Speaker 1: an inherent vulnerability in the chip design and 670 00:40:58,480 --> 00:41:01,080 Speaker 1: it's terrible. But I'll probably have to do a full 671 00:41:01,120 --> 00:41:05,120 Speaker 1: episode to explain exactly how that works. Amazon opened its 672 00:41:05,120 --> 00:41:09,000 Speaker 1: first automated supermarket in two thousand eighteen. It's called Amazon Go.
673 00:41:09,680 --> 00:41:12,040 Speaker 1: And the idea is you walk in, you pick up whatever 674 00:41:12,280 --> 00:41:15,280 Speaker 1: stuff you want, and you walk out, and Amazon tracks 675 00:41:15,280 --> 00:41:17,480 Speaker 1: where you are in the store, it tracks what you grab, 676 00:41:17,560 --> 00:41:20,359 Speaker 1: it checks how much of it you take, and whether 677 00:41:20,440 --> 00:41:22,800 Speaker 1: or not you leave the store with it, ideally anyway, 678 00:41:22,840 --> 00:41:26,040 Speaker 1: and then it bills you. Hopefully it does so correctly. 679 00:41:26,080 --> 00:41:28,000 Speaker 1: I know at least one person who had an experience 680 00:41:28,040 --> 00:41:30,080 Speaker 1: where he picked something up, put it back down, walked 681 00:41:30,080 --> 00:41:33,080 Speaker 1: out of the store with other stuff, but was 682 00:41:33,120 --> 00:41:34,839 Speaker 1: also charged for the thing he picked up but did 683 00:41:34,840 --> 00:41:39,400 Speaker 1: not actually take. So it's not foolproof, but it's interesting. Also, 684 00:41:39,480 --> 00:41:42,520 Speaker 1: Amazon announced it would open up its HQ2 in 685 00:41:42,640 --> 00:41:46,279 Speaker 1: two cities, one near Alexandria, Virginia and the other one 686 00:41:46,280 --> 00:41:48,239 Speaker 1: in New York City. If you listened to my 687 00:41:48,280 --> 00:41:51,480 Speaker 1: two thousand nineteen episodes, you know how that all turned out. 688 00:41:53,680 --> 00:41:56,600 Speaker 1: We also started seeing online platforms react to the pressure 689 00:41:56,600 --> 00:41:59,759 Speaker 1: they were feeling in the wake of various misinformation campaigns 690 00:42:00,040 --> 00:42:03,239 Speaker 1: and fake news scandals.
One of those reactions was to 691 00:42:03,280 --> 00:42:06,960 Speaker 1: remove Alex Jones from many of the platforms he used. Jones is a 692 00:42:06,960 --> 00:42:11,120 Speaker 1: guy who has championed various fringe theories for years and 693 00:42:11,480 --> 00:42:15,080 Speaker 1: tends to be pretty hateful in his delivery. From his 694 00:42:15,160 --> 00:42:19,440 Speaker 1: podcasts getting pulled to his accounts on various platforms getting banned, 695 00:42:19,800 --> 00:42:23,680 Speaker 1: he found himself largely silenced on mainstream channels. Now that 696 00:42:23,760 --> 00:42:27,799 Speaker 1: being said, I'm not sure that I've seen similar movements 697 00:42:27,880 --> 00:42:31,719 Speaker 1: for, you know, lesser known personalities, so part of me 698 00:42:31,760 --> 00:42:35,520 Speaker 1: wonders if this particular initiative was mostly meant to placate 699 00:42:35,600 --> 00:42:38,600 Speaker 1: people who were worried about rhetoric on these platforms. In 700 00:42:38,600 --> 00:42:43,719 Speaker 1: other words, we silenced someone who is really notable in 701 00:42:43,760 --> 00:42:46,520 Speaker 1: the field, so maybe you won't notice if we don't 702 00:42:46,600 --> 00:42:52,040 Speaker 1: silence all these other hundreds of lesser known accounts. That's... 703 00:42:52,400 --> 00:42:55,000 Speaker 1: I don't know. Maybe I'm just too cynical. The US 704 00:42:55,000 --> 00:42:59,760 Speaker 1: Supreme Court ruled five to four in two thousand eighteen that law enforcement 705 00:42:59,800 --> 00:43:03,520 Speaker 1: officers must secure a warrant before they are allowed 706 00:43:03,560 --> 00:43:07,680 Speaker 1: to obtain location data from mobile carriers.
The court recognized 707 00:43:07,719 --> 00:43:10,719 Speaker 1: that location data gives a really deep insight into a 708 00:43:10,800 --> 00:43:13,600 Speaker 1: person and their activities, and that it could be a 709 00:43:13,640 --> 00:43:17,359 Speaker 1: major invasion of privacy, and therefore it falls under 710 00:43:17,400 --> 00:43:22,560 Speaker 1: the Fourth Amendment's protection against unreasonable search and seizure. Over at Google, more 711 00:43:22,600 --> 00:43:25,880 Speaker 1: than four thousand employees signed a petition asking the company 712 00:43:25,920 --> 00:43:29,760 Speaker 1: to end its participation in a Pentagon project called Project Maven. 713 00:43:30,120 --> 00:43:33,080 Speaker 1: The purpose of Maven was to develop drone platforms that, 714 00:43:33,440 --> 00:43:37,680 Speaker 1: with the addition of image recognition capabilities, could identify specific 715 00:43:37,760 --> 00:43:42,000 Speaker 1: humans from a really good distance away. And while Maven 716 00:43:42,040 --> 00:43:47,160 Speaker 1: itself wasn't directly connected to weaponization, employees at Google felt 717 00:43:47,200 --> 00:43:49,480 Speaker 1: that it would not take much of a stretch to 718 00:43:49,560 --> 00:43:53,719 Speaker 1: incorporate that sort of technology on a weaponized platform, which 719 00:43:53,760 --> 00:43:57,960 Speaker 1: means you could have a future of flying robotic assassins. 720 00:43:58,640 --> 00:44:02,440 Speaker 1: Now, understandably, these Google employees felt that this approach conflicted 721 00:44:02,800 --> 00:44:06,920 Speaker 1: with the no longer official Google motto of don't be evil. 722 00:44:07,600 --> 00:44:10,040 Speaker 1: At least a dozen or so employees resigned over the 723 00:44:10,080 --> 00:44:12,520 Speaker 1: whole thing, refusing to work for a company that participated 724 00:44:12,560 --> 00:44:15,560 Speaker 1: in the project.
Google said that once the contract was up, 725 00:44:15,600 --> 00:44:17,960 Speaker 1: it would not seek to renew, but it would try 726 00:44:18,000 --> 00:44:21,680 Speaker 1: and find someone else to take over that contract. It does, however, 727 00:44:21,800 --> 00:44:25,280 Speaker 1: continue to pursue and work with the Department of Defense 728 00:44:25,320 --> 00:44:29,200 Speaker 1: on other projects. Google also showed off a project of 729 00:44:29,239 --> 00:44:33,279 Speaker 1: its own called Duplex at its I/O developer conference. The 730 00:44:33,320 --> 00:44:36,240 Speaker 1: AI program called up a restaurant and made a dinner 731 00:44:36,280 --> 00:44:39,759 Speaker 1: reservation automatically over the phone, all without the person on 732 00:44:39,800 --> 00:44:42,080 Speaker 1: the other end at the restaurant realizing that they were 733 00:44:42,120 --> 00:44:46,400 Speaker 1: talking to an AI bot. So that was kind of creepy. 734 00:44:47,320 --> 00:44:50,120 Speaker 1: This was also the year that Google employees around the 735 00:44:50,120 --> 00:44:52,640 Speaker 1: world staged a walkout to protest the way the 736 00:44:52,640 --> 00:44:56,040 Speaker 1: company was handling sexual harassment claims. This was in part 737 00:44:56,080 --> 00:44:58,840 Speaker 1: prompted by reports that Andy Rubin, the guy who created 738 00:44:58,920 --> 00:45:01,880 Speaker 1: Android and who had been forced to resign due to sexual 739 00:45:01,880 --> 00:45:05,880 Speaker 1: harassment allegations, had left the company with a ninety million 740 00:45:06,120 --> 00:45:12,560 Speaker 1: dollar severance package. Yeah. Tragedy struck in Tempe, Arizona, in 741 00:45:12,640 --> 00:45:16,240 Speaker 1: March two thousand eighteen, when an autonomous Uber test vehicle 742 00:45:16,320 --> 00:45:19,840 Speaker 1: collided with a pedestrian.
A video analysis of the incident 743 00:45:19,920 --> 00:45:22,880 Speaker 1: seemed to indicate that the accident was preventable, and the 744 00:45:22,920 --> 00:45:26,440 Speaker 1: incident raised serious questions about the viability of autonomous cars 745 00:45:26,480 --> 00:45:31,320 Speaker 1: moving forward. Over in Europe, Visa experienced a major outage 746 00:45:31,360 --> 00:45:34,600 Speaker 1: that lasted eight hours due to a hardware failure, which 747 00:45:34,640 --> 00:45:36,840 Speaker 1: in turn led to a run on ATMs 748 00:45:36,920 --> 00:45:38,839 Speaker 1: for cash. So that kind of shows you how 749 00:45:38,880 --> 00:45:41,799 Speaker 1: delicate our technological society can be. You can have a 750 00:45:41,840 --> 00:45:46,480 Speaker 1: single point of failure end up causing massive problems. Physicist 751 00:45:46,520 --> 00:45:50,320 Speaker 1: and science communicator Stephen Hawking passed away on March fourteen, 752 00:45:50,400 --> 00:45:53,080 Speaker 1: two thousand eighteen. He was seventy six years old, and 753 00:45:53,160 --> 00:45:56,719 Speaker 1: his work aimed to expand our understanding of the universe. 754 00:45:57,400 --> 00:46:01,440 Speaker 1: He was the sort of science communicator that I found particularly interesting. 755 00:46:02,239 --> 00:46:06,080 Speaker 1: Over in the European Union, the General Data Protection Regulation 756 00:46:06,160 --> 00:46:09,120 Speaker 1: rules went into effect. These rules limit how companies can 757 00:46:09,160 --> 00:46:12,720 Speaker 1: gather and use personal information, and place requirements on companies 758 00:46:12,760 --> 00:46:14,960 Speaker 1: to make sure that users are aware of how and 759 00:46:15,000 --> 00:46:18,480 Speaker 1: when their data is collected. It has prompted some pretty massive 760 00:46:18,560 --> 00:46:21,120 Speaker 1: changes in how companies do business all over Europe as 761 00:46:21,120 --> 00:46:26,840 Speaker 1: a result.
Then we get to SpaceX. SpaceX conducted a 762 00:46:26,920 --> 00:46:30,480 Speaker 1: test of its Falcon Heavy launch vehicle by launching 763 00:46:30,480 --> 00:46:34,080 Speaker 1: a payload that included a Tesla Roadster that belonged to 764 00:46:34,120 --> 00:46:37,239 Speaker 1: Elon Musk. It got pushed into an orbit around the Sun, 765 00:46:37,320 --> 00:46:40,759 Speaker 1: because why not, right? Also, Samsung would show off its 766 00:46:40,800 --> 00:46:44,160 Speaker 1: foldable phone concept, a concept that would become a massive 767 00:46:44,200 --> 00:46:46,400 Speaker 1: headache in twenty nineteen. But I talked about that in 768 00:46:46,400 --> 00:46:49,560 Speaker 1: the last episodes. And let's wrap up with a couple 769 00:46:49,560 --> 00:46:52,680 Speaker 1: of observations about the decade in general. Now, one thing 770 00:46:53,280 --> 00:46:56,239 Speaker 1: that I noticed while looking back on this was the 771 00:46:56,280 --> 00:46:59,960 Speaker 1: evolution of the Internet of Things, because ten years ago 772 00:47:00,360 --> 00:47:03,360 Speaker 1: it was sort of just a concept, almost like a buzzword, 773 00:47:04,120 --> 00:47:07,359 Speaker 1: closer to that than to reality. But over the course 774 00:47:07,400 --> 00:47:10,640 Speaker 1: of ten years we saw that change. Back in 775 00:47:10,680 --> 00:47:13,120 Speaker 1: two thousand nine, you know, you were talking about routers, computers, 776 00:47:13,120 --> 00:47:16,840 Speaker 1: and maybe some smartphones connected to the Internet, and today 777 00:47:17,400 --> 00:47:21,040 Speaker 1: we're looking at things like standalone sensors, home security systems, 778 00:47:21,520 --> 00:47:24,600 Speaker 1: maybe even a homemade dog feeding device, all connected to 779 00:47:24,640 --> 00:47:26,920 Speaker 1: the Internet, and we're just in the early stages of 780 00:47:26,920 --> 00:47:29,920 Speaker 1: this era.
For better or for worse, the evolution has 781 00:47:29,960 --> 00:47:33,080 Speaker 1: taught us many lessons, including that the data we generate 782 00:47:33,120 --> 00:47:36,239 Speaker 1: throughout the day can be used for us or against us, 783 00:47:36,680 --> 00:47:38,440 Speaker 1: and we need to pay close attention to things like 784 00:47:38,520 --> 00:47:41,400 Speaker 1: data security as we connect more devices to the Internet 785 00:47:41,440 --> 00:47:44,799 Speaker 1: at large, unless we're just willing to have all of 786 00:47:44,840 --> 00:47:47,840 Speaker 1: the details of all of our lives on display for everyone, 787 00:47:47,880 --> 00:47:50,480 Speaker 1: all the time, for any purpose, which I don't think 788 00:47:50,560 --> 00:47:54,919 Speaker 1: is necessarily the best course of action. We also saw 789 00:47:54,960 --> 00:47:57,440 Speaker 1: how automated cars went from being a cool idea to 790 00:47:57,560 --> 00:48:00,080 Speaker 1: something being tested in the real world to something that, 791 00:48:00,480 --> 00:48:03,800 Speaker 1: if implemented poorly, can result in tragedy. And perhaps the 792 00:48:03,800 --> 00:48:05,880 Speaker 1: biggest lesson there is that we need to do a 793 00:48:05,920 --> 00:48:08,680 Speaker 1: lot more testing and tweaking before we see a wider 794 00:48:08,719 --> 00:48:12,920 Speaker 1: implementation of the technology, and perhaps we shouldn't take the 795 00:48:13,200 --> 00:48:17,960 Speaker 1: fail fast, you know, approach when it comes to autonomous cars. 796 00:48:18,239 --> 00:48:22,120 Speaker 1: We really can't afford to.
We also saw how data security, privacy, 797 00:48:22,160 --> 00:48:26,839 Speaker 1: and misinformation are all kind of linked together, and they're complicated, 798 00:48:27,200 --> 00:48:29,920 Speaker 1: and we have to pay much closer attention to make 799 00:48:29,960 --> 00:48:32,680 Speaker 1: sure that we stay safe and relatively sure that the 800 00:48:32,719 --> 00:48:37,000 Speaker 1: information that we are encountering is correct and true. It 801 00:48:37,000 --> 00:48:40,600 Speaker 1: puts a lot more accountability on us as consumers to 802 00:48:40,719 --> 00:48:44,240 Speaker 1: make sure that we're seeking out information from the right sources. 803 00:48:44,520 --> 00:48:47,840 Speaker 1: It puts a lot more pressure on platforms like Facebook 804 00:48:47,840 --> 00:48:50,160 Speaker 1: and Twitter and Google to make sure that they're not 805 00:48:50,760 --> 00:48:55,040 Speaker 1: enabling the spread of misinformation and making the world a 806 00:48:55,120 --> 00:48:59,000 Speaker 1: worse place. My goal has always been to leave the world 807 00:48:59,120 --> 00:49:01,120 Speaker 1: a little better than the way it was when you 808 00:49:01,200 --> 00:49:04,480 Speaker 1: showed up. It is becoming increasingly difficult to do 809 00:49:04,520 --> 00:49:06,560 Speaker 1: that because there's only so much I can do and 810 00:49:06,600 --> 00:49:09,640 Speaker 1: there's a lot of bad stuff happening out there. However, 811 00:49:09,680 --> 00:49:14,080 Speaker 1: I do still think that with the right approach, we 812 00:49:14,120 --> 00:49:17,400 Speaker 1: can make a positive change. I am still an optimist. 813 00:49:17,400 --> 00:49:20,560 Speaker 1: I'm just an optimist who acknowledges that we have big, 814 00:49:20,600 --> 00:49:23,080 Speaker 1: big challenges ahead of us in order to make things 815 00:49:23,080 --> 00:49:25,960 Speaker 1: work out. But I think we can do it.
I mean, 816 00:49:26,000 --> 00:49:29,520 Speaker 1: we've sent people to the moon. We've done the impossible before, 817 00:49:29,880 --> 00:49:32,359 Speaker 1: so let's do it again. It might take ten years 818 00:49:32,360 --> 00:49:34,320 Speaker 1: to get to a better place, but I'm willing to 819 00:49:34,360 --> 00:49:36,600 Speaker 1: put in the work if you are. If you guys 820 00:49:36,600 --> 00:49:39,520 Speaker 1: have suggestions for future episodes of tech Stuff, knowing that 821 00:49:39,560 --> 00:49:42,919 Speaker 1: I already covered twenty nineteen, don't send me messages about that. 822 00:49:43,200 --> 00:49:45,880 Speaker 1: Those are two episodes back. Go check those out. They're 823 00:49:45,960 --> 00:49:49,279 Speaker 1: a bummer, but they're there. If you have other suggestions, though, 824 00:49:49,440 --> 00:49:51,560 Speaker 1: reach out, tell me about them. You can do so 825 00:49:51,680 --> 00:49:53,759 Speaker 1: on Facebook or Twitter. The handle for both of those 826 00:49:53,880 --> 00:49:57,360 Speaker 1: is TechStuffHSW, and I'll talk to 827 00:49:57,360 --> 00:50:05,120 Speaker 1: you again really soon. Tech Stuff is a production 828 00:50:05,120 --> 00:50:08,120 Speaker 1: of iHeart Radio's How Stuff Works. For more podcasts 829 00:50:08,160 --> 00:50:10,920 Speaker 1: from iHeart Radio, visit the iHeart Radio app, 830 00:50:11,040 --> 00:50:14,200 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.