Speaker 1: Get in touch with technology with TechStuff from HowStuffWorks dot com.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with How Stuff Works and iHeartRadio, and I love all things tech. And it is that time of the year, the time when I cover the big tech news stories from the previous twelve months and talk about some of the most important developments in the tech industry over the course of the year. So we're looking back on twenty eighteen and some of the big stories. I am not going month by month. Sometimes I do that; sometimes I will look over a year and go through it chronologically. Instead, I decided to kind of group things thematically. So in this first section, I'm going to focus on some of the biggest news stories, especially around the big, big companies that we are all familiar with. And I can't think of any place more appropriate to start than how Facebook's twenty eighteen unfolded. It was a particularly tumultuous year for Facebook, and I have a feeling that twenty nineteen is not going to be a whole lot easier. Anyway, I've covered a lot of the stories about this in episodes of TechStuff over the year, but I want to sum up some of the things that the company has experienced over these last twelve months, because they've been pretty spectacular. In March, The Guardian and The New York Times published pieces that exposed the Cambridge Analytica scandal. That was the scandal in which it was revealed that a political marketing and consulting firm, Cambridge Analytica, based in the UK, had apparently taken advantage of a design flaw that had existed in Facebook several years earlier. And it wasn't even a design flaw so much as it was just sort of a lack of accountability that was in place back then.
Speaker 1: If you were creating apps for Facebook, you could build into your app a point where the person installing the app would essentially give permission for you to view all sorts of information, not just about the person who was installing the app, but possibly about all of their friends. So think of it like this: if you befriend someone on Facebook and they're friends with somebody else, but you are not friends with that third person, you might be able to follow the links back and take a look at that third person's profile. And if they haven't set it to private, if they have public information, you can find out a lot about this stranger, this person you don't know. You just happen to have a common friend. Well, the app in question was doing the same thing, and Cambridge Analytica was able to harvest data from it and leverage it. The app was actually called This Is Your Digital Life, and it was a survey. And not only that, but it was a survey where you could be paid for filling out the survey, so people had an incentive to participate in it. And it generated this enormous amount of data. Originally, Facebook assumed around fifty million people or so would be affected; as it turned out, it would be more than eighty seven million people. They would revise this estimate later on. And so that was a pretty ugly scandal, mostly about user privacy and security, and also a little bit about election interference, although really it was privacy and security over everything else at that point. The reports prompted the Federal Trade Commission in the United States to open an investigation into Facebook.
Speaker 1: Mark Zuckerberg would eventually appear in front of Congress a couple of times to answer questions about this scandal, and one of the things that came out of those hearings was the possibility that perhaps social platforms like Facebook should fall under some sort of official regulation policy, that the government should be able to regulate these sorts of things. Also, while this story exploded in twenty eighteen, I should point out this was not the first time that journalists had discovered evidence of something hinky going on. There were articles about it way back in twenty fifteen, but it wasn't until twenty eighteen that it really got traction. That was when a former Cambridge Analytica employee came forward and gave a lot more information about what was going on inside the company, and that's when it really blew the doors open. Coming into twenty eighteen, Facebook was already neck deep in scandal. Since before the two thousand sixteen election, the company had been dealing with fake accounts that were taking advantage of how Facebook works to post propaganda masquerading as real news. The people behind them understood how Facebook's algorithm would favor certain types of content over others, largely through engagement. If you get a post that has a lot of likes, a lot of comments, and a lot of people sharing it, that post ends up showing up more frequently in various people's news feeds, because Facebook's algorithm assumes, oh, this post must be important because a lot of people are active on it; therefore, I'm going to show it to more people because I'm sure more people will be interested in it. This is from a very high level perspective. Well, knowing that that's how Facebook's algorithm works means that you could engineer content specifically to try and bring about the situation that leads to the sort of viral posts shared across millions of different profiles. And as it turns out, the goal was not just to get shared, but to try and influence the outcome of the election.
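To make that feedback loop concrete, here is a minimal sketch of engagement-weighted ranking. The weights, field names, and scoring function are hypothetical illustrations of the general idea described above, not Facebook's actual News Feed algorithm, which is proprietary and far more complex.

```python
# Minimal sketch of engagement-weighted feed ranking.
# All weights and names here are illustrative assumptions,
# not Facebook's real (and non-public) ranking system.

from dataclasses import dataclass


@dataclass
class Post:
    post_id: str
    likes: int
    comments: int
    shares: int


def engagement_score(post: Post) -> float:
    # Weight deeper interactions (comments, shares) more heavily
    # than likes; the exact weights are made up for illustration.
    return 1.0 * post.likes + 4.0 * post.comments + 8.0 * post.shares


def rank_feed(posts: list) -> list:
    # The highest-scoring posts get shown to more people, which earns
    # them more engagement, which raises their score again. That loop
    # is exactly what engineered "viral" content exploits.
    return sorted(posts, key=engagement_score, reverse=True)


if __name__ == "__main__":
    feed = [
        Post("vacation-photo", likes=120, comments=4, shares=1),
        Post("outrage-bait", likes=80, comments=60, shares=45),
    ]
    for post in rank_feed(feed):
        print(post.post_id, engagement_score(post))
```

In this toy feed, the post engineered to provoke comments and shares outranks the more-liked one, which is the dynamic described above: once ranking rewards raw engagement, content built to provoke reactions surfaces most often.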
Speaker 1: So Facebook was already dealing with that, and that story continued to unfold throughout the year. Less scandalous, but still an important story, is how Facebook would change its algorithms so that Facebook pages, that is, the pages that are run by organizations, businesses, and personalities, not someone's personal page but a page representing something, are getting less engagement. Now, this is Facebook's attempt to make the experience of using Facebook more centered around your family and friends, so that you're seeing more of their posts and fewer posts from businesses and organizations that are trying to connect with a community or a group of customers or whatever. So you can understand where Facebook's coming from. However, for any business that was at least partly dependent upon Facebook to create a connection with customers or a community, it became a huge problem. Suddenly their engagement starts to drop, their businesses start to lose momentum, and they're scrambling to find an alternative. Now, you could say this is yet another point of evidence that strengthens that argument of don't put all your eggs in one basket. In other words, if you are a business, don't count on someone else's platform to be your platform for success, whether that happens to be Facebook or YouTube or anything else. You should build a presence that is independent of those, so that when these other platforms change, you are not left in the lurch as a result because you put it all in on that one. But on the other hand, Facebook was still taking money from businesses to promote pages even when the impressions were on the decline, so that was a problem. Then there are the properties that Facebook owns and the stories going around about those, like WhatsApp. WhatsApp is the messaging service that Facebook purchased way back in twenty fourteen. It was a big story then because of how much money Facebook paid for it, and people in the United States are not terribly familiar with WhatsApp.
Speaker 1: There are people in the US who use it, some of you out there may be using it, and you might be US citizens. Particularly people who travel internationally tend to use it, but really it's much more popular outside the US than it is in the US. Anyway, in twenty eighteen, news broke that people in Brazil were using WhatsApp to spread misinformation and polarizing messaging in the lead up to a contentious election in Brazil, which sounds pretty familiar. I mean, essentially the same thing was happening in the United States two years previously with Facebook in general and Twitter and several other social platforms. So that was another terrible story that's still unfolding as we record this podcast. Then there's also the story about Instagram and its founders. Instagram founders Kevin Systrom and Mike Krieger both resigned from Facebook. The reasons they decided to leave were not spelled out plainly, but the general consensus is that they were becoming exasperated by the way Facebook was interfering with how Instagram was running, and that Facebook was taking features that Instagram was developing specifically for Instagram and incorporating them into Facebook or Facebook Messenger. Now, when Facebook first acquired Instagram, the thought was that they were not going to interfere at all. They were going to allow Instagram to continue to operate independently, and it sounds like that didn't happen. So the founders decided that they wanted to get out. And a lot more happened with Facebook, but those were the really big points I wanted to touch on, and I don't want to make this just an episode about what Facebook went through in twenty eighteen, so I'm going to move on to some other stories. There were other companies that also had to weather controversy in twenty eighteen. Twitter had its share, and continues to, from troll accounts, fake accounts, bot accounts. Jack Dorsey would attend a congressional hearing about whether regulation and moderation of online content was something that needed more attention.
Speaker 1: Topics ranged from the fake accounts to Russian interference in US political events, and whether or not Twitter was suppressing conservative voices online. Dorsey would also come under scrutiny in December for personal reasons. He went to Myanmar and he tweeted about it, and he called it a beautiful country. Meanwhile, the United Nations was focusing on Myanmar because its military was carrying out a campaign of genocide against a minority Muslim population within the country. So there's Dorsey saying Myanmar is a beautiful country, while the rest of the world is looking at Myanmar and these atrocities that are taking place. And so people said, at best, Dorsey's tweets were tone deaf. His response was that he needed to learn more about Myanmar, probably before posting about it. Meanwhile, Evan Williams, who is a co-founder of Twitter, was quietly selling off a lot of his ownership of Twitter throughout the second half of the year, and it would amount to nearly half of his ownership stake. So essentially, you know, he has all these different shares in Twitter, and he sells off almost half of them by the second half of twenty eighteen, which has led to some speculation that perhaps Twitter has officially passed its peak, and it's now on the decline, and eventually it will head toward obsolescence. That's one interpretation. Another is just that he might want to diversify his portfolio. So you could say financial coverage is even more vague than tech coverage. Huzzah for that. Then there's Google, and that company has also had its own share of issues to deal with throughout twenty eighteen. It too came under scrutiny in the various congressional hearings. Google was invited to participate, but declined to send anyone to meet with Congress, while Facebook and Twitter would send their top brass to DC. In December, Google CEO Sundar Pichai did meet with Congress, and this was to talk about political bias in search results.
Speaker 1: If you followed those hearings, you might have been rolling your eyes pretty hard, because it got a little silly. Representatives in Congress were asking if Google was shaping the public perception of various topics through search results. In other words, was Google maybe guiding search results in order to send a specific message to people when they started searching for things? So, for example, they showed that if you searched the word idiot and you did an image search, it would bring up pictures of Donald Trump, and they said, is this your commentary? Are you calling the President of the United States an idiot? Now, Pichai's response was that these search results were a reflection of the public discourse. This is reflecting what's going on in the world. It's giving people what they're asking for. It is not an attempt to guide the discourse, but rather, here's what people are saying. So it's not that Google says a person is an idiot; it's that if people are writing about someone and they call that person an idiot, then that's what's going to come up in search results. And the hearing really punctuated what we already knew: that most politicians are totally ignorant of how the Internet in general works and how search in particular works. We did get some punctuation about this where, at one point, a representative said that if you search the news items for a specific person and you get a bunch of negative stories, it's because that person has done things that have been reported on in a negative way. But if you went out and did lots of wonderful things that got lots of positive coverage, then the news stories that would pop up would reflect that. It's not that Google is shaping this message, but rather the people themselves are shaping it, and if you want to have less bad news written about you, don't go out and do bad things.
Speaker 1: Seems like a pretty solid idea, assuming, of course, that the search engine results are accurate and they're fair and unbiased, and that they're not, say, censoring any articles or stories that would otherwise be positive. This applies to politicians of all political points of view, by the way. I don't mean to say that one political party is guilty of being terrible, terrible, terrible people and the other one's a bunch of saints. I don't believe that for a second. There are people who have great scruples, and they adhere to their own code of ethics, they are not hypocrites, and they should have really great stories written about them and their efforts, whether they happen to be conservative or liberal. And then there are others, both on the conservative and the liberal side, who do dumb things and are hypocrites, and they're going to have bad stories written about them, and they should. So that's kind of what Google was saying: don't shoot the messenger. We're just giving you what exists. We're not making it, we're just serving it up. Another scandal that Google had to handle in two thousand eighteen (didn't mean for that rhyme) was Project Maven, and I talked about that earlier this year in another episode. So this was, and still is, an ongoing project between Google and the Pentagon to create algorithms that can improve computer vision processes, which ultimately could be used to identify and target individuals if it were used in a weaponized form; the military could choose to go down that path. Now, Google insisted that when it signed on to do Project Maven, this was with the understanding that it was a non-weaponized program. They were helping with computer vision, but it was not meant to be used in any kind of weaponized way.
Speaker 1: But people said that's pretty naive, because there's nothing stopping anyone from adapting those protocols and using them in something like a personalized missile that specifically is looking for a particular individual and homing in on that person, whether directly or through satellite coordination or whatever it may be. Google eventually was facing so much pressure, both from outside the company and within the company itself; Google employees started to sign petitions saying that the company should extract itself from this deal. Google said, well, we can't extract ourselves, but we will not renew this contract when it comes up in two thousand nineteen. So the contract will expire next year, and Google executives have said they are not going to pursue a renewal. There were also a lot of disturbing allegations about sexual harassment and discrimination. This was true throughout all of tech, but it was particularly highlighted over at Google. In February there was a Google programmer named Loretta Lee who filed claims against Google, and her allegations were numerous. There are a lot of things that Loretta Lee said had happened at Google that are very, very upsetting. It's a terrible account of treatment that no one should be subjected to. And at the time, Google had a policy of mandatory arbitration within the company. This is something that a lot of tech companies were putting in place. That meant that all of the conflict had to be handled within the company. This would prevent employees from being able to take the matter to the courts. It had to be handled inside of Google, and employees would sign agreements, essentially when they were hired at Google, that said they were agreeing to this: they would give up the right to sue in court. They would have to go through this mandatory arbitration process, which could limit the amount of money a victim could get if they were proven to be telling the truth.
Speaker 1: And not only that, but there was this perception that if the company was handling it, they were more likely to just kind of smooth things over as best they could and ignore that it was happening. If someone who was being accused was also thought of as being a high performer, then it would largely go ignored, which is a terrible thing. Right? If there's somebody who's sexually harassing people in a company, whether it's a man or a woman, whether the people they are harassing are men or women, whatever it may be, that should not be tolerated. And the problem was that, using this mandatory arbitration approach, executives could say, well, yeah, it shouldn't be tolerated; however, ever since this person has been in charge, that department has been performing at an incredible level. So there seemed to be this, at least perceived, habit of not paying the appropriate amount of attention and care to those cases. So Google employees began to protest this. There was a walkout. All the women employees at Google, or a large majority of them, agreed to walk out in protest of this policy and just the general way that Google was handling these sexual harassment cases. Several men also stood with the women in solidarity, saying that they supported this. And ultimately, Google has reversed that policy. There is no longer mandatory arbitration. Arbitration is optional, and employees are allowed to pursue these cases in court. It is not retroactive, so any case that's in arbitration remains there for now. But this is just another example of a systemic problem, not just at Google but in the tech industry, that is slowly changing as more scrutiny is applied to these companies, and the culture is changing to say that this is not appropriate and not acceptable. That is changing, but it is a slow process. I mean, we're talking about a system that's been generations in the making. So that was the other big story out of Google.
Speaker 1: Now I've got all those big scandals out of the way. Hopefully that means that we're going to be able to move on to happier stories. Spoiler alert: not all of them are happy. But before I get into those, let's take a quick break to thank our sponsor.

Speaker 1: In two thousand eighteen, I also did an episode about the General Data Protection Regulation rules, or GDPR; those took effect in twenty eighteen. These are rules that apply to online companies that serve people within the European Union. Whether the company is located in the EU or not doesn't matter. What matters is whether the people using the company's services or products are in the EU, and the rules are meant to protect data and privacy. They call for responsible, transparent, and easily understandable policies from these online companies, whether it's a store or a social media platform, whatever it may be. Now, there are a lot of rules, and I did a full episode on this, so I'm not going to go over them. But the important thing to remember is that the companies that are working in the EU, whether they're located there or not, have to adhere to these rules or they can face penalties. The scope of the penalties depends upon the severity of the violation, but it can be up to twenty million euros or four percent of all the total sales of a company from the year before, whichever amount is greater. So there are some serious penalties there, and companies are paying attention. In fact, some companies have even, temporarily at least, closed up shop, you know, turned off those European Union facing websites, at least until they can conform to the regulations. Here's a quick sketch of how that penalty cap works.
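This is a minimal illustration; the company revenue figures are invented, and the only real GDPR details in it are the twenty million euro floor, the four percent rate, and the whichever-is-greater rule described above.

```python
# Sketch of the GDPR maximum-fine rule: up to 20 million euros
# or 4% of the prior year's total sales, whichever is greater.
# The revenue figures below are made up for illustration.

def gdpr_max_fine_eur(prior_year_sales_eur: float) -> float:
    """Return the maximum possible GDPR fine in euros."""
    return max(20_000_000.0, 0.04 * prior_year_sales_eur)

# For a small shop, the flat 20 million euro cap dominates.
print(gdpr_max_fine_eur(5_000_000))        # 20000000.0

# For a giant with 100 billion euros in sales, the 4% rule
# dominates: a ceiling of 4 billion euros.
print(gdpr_max_fine_eur(100_000_000_000))  # 4000000000.0
```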
Speaker 1: And now, here are some big business news elements of tech in twenty eighteen. First, AT&T initially got approval for its eighty five billion dollar acquisition deal for Time Warner. I'm going to talk a little bit more about that in some upcoming episodes of TechStuff that I've already recorded but have not aired. But as I record this episode, that approval is actually under appeal, and another court is considering whether or not to allow the process to continue. If it does continue, it's another consolidation of communications and entertainment companies. Not everyone thinks that's a great thing, but it could be a big benefit to both of those individual companies. We'll have to wait and see. By the time you hear this episode, all of that may have already been resolved. Disney got approval for its acquisition of Twenty First Century Fox, both from the United States government and from shareholders, and the company had announced that intent way back in two thousand seventeen. The acquisition will be complete on January one, two thousand nineteen. T-Mobile and Sprint are looking to merge into a single company. That got announced in April, and if everything goes as they expect it to, the process will be complete by April two thousand nineteen. Now, at the time I'm recording this, analysts predict that regulators are probably going to allow this merger to go forward. They might require that the companies divest themselves of certain businesses in order to prevent the possibility of having a monopoly, but otherwise they think it's going to happen. And there are a ton of other stories that fall into this category. IBM announced it would sell off Lotus Notes to an Indian firm called HCL for one point eight billion dollars. Of course, IBM had purchased Lotus for three point five billion dollars back in nineteen ninety five. IBM also announced a thirty four billion dollar acquisition plan for Red Hat, which is a software company that specializes in enterprise products. It's also the largest software acquisition ever. And this isn't an acquisition, but Dell Computers is currently scheduled to return to the stock market as a publicly traded company on December twenty eighth. I'm recording this on December twelfth, so that's in the future for me.
Speaker 1: Twenty eighteen also saw some game studios close up shop, and that's sad, not just for gamers, but obviously for the people who worked for those companies. So here's a list of some of the companies in the video game industry that closed up shop in twenty eighteen. Boss Key, which was founded by Cliff Bleszinski and produced the game LawBreakers; that one closed. Telltale Games; that one hurt a lot. They're the creators of The Walking Dead series as well as The Wolf Among Us. They've done a Game of Thrones game, they've done a Minecraft story-based game, and they've done a Borderlands story-based game. They shut down in September twenty eighteen, and it was an incredibly messy and ugly process. Employees were let go without severance or prior warning. It was largely viewed as being very poorly handled. Carbine Studios closed up as well. They produced only one game, a multiplayer game called WildStar. That one also shut down in September twenty eighteen. Also in September twenty eighteen (man, that was a bad month), Capcom closed its Capcom Vancouver studio. That team oversaw the Dead Rising series of games. And Wargaming Seattle closed in May twenty eighteen. They made Dungeon Siege and Supreme Commander. Now, that's terrible news, but it's not to say that video games as a whole are in trouble. The industry has tripled in size over the last decade. Developing video games, particularly if you're talking about large prestige titles like Red Dead Redemption Two, for example, costs millions of dollars, and it requires hundreds of people, maybe up to a thousand for some of these big games, and it also can include some really brutal work.
Speaker 1: Twenty eighteen was also a year in which the world took a closer look at the working conditions that game developers experience, particularly in crunch time while developing a title, and the industry is slowly changing in response to some really tough criticisms, things like there's got to be a better work-life balance, that people should not be expected to work hundred-hour weeks, that kind of stuff. We don't appear to have reached the tipping point quite yet, the point where a triple-A title costs so much to make that you're never going to recover those costs in sales. We haven't reached that point, but it's getting pretty tight, which probably means, well, I wouldn't be surprised to see game prices start to climb again in the near future. We already see more expensive ones for collectors' editions and stuff like that, but I think it's going to get more expensive pretty soon. Super Smash Brothers Ultimate was the best selling game on Amazon in two thousand eighteen, followed by Super Mario Party and then Red Dead Redemption Two for the PS4. Now, to be fair, these facts and figures take into account games by platform, not by title, because if you look at titles, you would say Red Dead Redemption Two is in the lead across all platforms when you add them all up. It's only when you look at it by platform that Super Smash Brothers Ultimate becomes the best selling game. There was no shortage of games that received critical acclaim in twenty eighteen. There were a ton of them: Assassin's Creed Odyssey, Tetris Effect, Forza Horizon Four, God of War. Lots of them were lauded for their art style, their gameplay, and their stories, when applicable. Also, Fortnite's Battle Royale mode became a huge hit in twenty eighteen.
Speaker 1: A lot of people said it was essentially a copy, a complete lift of the game style of PlayerUnknown's Battlegrounds, although there are differences with Fortnite, but they said that, you know, the game essentially lifted the style from Battlegrounds. To be fair, Battlegrounds is not the first battle royale game that's ever come out either, but I see where the accusations come from. But the game became so popular that now there are organizations popping up that are supposed to help wean kids off of playing the darn thing. Not all games got universal acclaim, however. Fallout Seventy Six, the online multiplayer game from Bethesda in the Fallout series, was greeted with, at best, if you're being super generous, a mixed response. I would actually say it was more of a negative response. In fact, it was almost overwhelmingly negative. There are a lot of legitimate complaints leveled at that game, a lot of things where I can't just say, no, that doesn't exist; that game does have real problems. However, I still have fun when I log in. Now, I can't deny the game has major issues. I'm having fun because of the setting and the game world, but I definitely can completely understand the complaints people have about it. I can't deny them. There was, and still is, another online game called Sea of Thieves. It got a lot of early enthusiasm. It's a game where you play as a member of a crew on a pirate ship. You and some friends can join up and grab a ship and go out and plunder. But a lot of people felt that there was a lack of content and that it was very repetitive. After you've done a few missions, you've pretty much played the whole game; it's just doing it over and over again. The game did continue to add more stuff throughout the year. Whether it was enough to satisfy the typical pirate remained a matter of individual preference.
Speaker 1: I've got a little bit more to say about twenty eighteen, but before I get into that, let's take another quick break to thank our sponsor.

Speaker 1: Back in January, we learned about vulnerabilities that are baked into modern microprocessors. This is another huge story; it's an ongoing problem in tech. Now, those vulnerabilities that were built into microprocessors weren't put there on purpose, and we call them Meltdown and Spectre. I think I did an episode about this. But generally speaking, the vulnerabilities allow a malicious application to spy on other applications. The way it does this is different depending upon the vulnerability it's exploiting, but generally speaking, it's this idea of a malicious program siphoning information away, largely based on the information gathered by other programs. And this can be on a computer, it can be on a mobile device; all of these things are running microprocessors that have architecture that's vulnerable to this stuff. So conceptually, applications are supposed to be isolated from each other, supposed to be siloed and protected. That way, it would prevent people from doing this very thing. But these vulnerabilities would allow programmers to build apps that could get around this isolation and gather data from other apps, including stuff like passwords and, in some cases, even more critical information than that. There's been a lot of talk about patches that might address these vulnerabilities, but some analysts say that's not going to be enough. What really needs to happen, they say, is an entirely new microprocessor design, something that would need to happen for many different microprocessors in order to address this vulnerability, which is pretty tough to do, but might be necessary. Also in January, Tim Cook of Apple apologized for not being transparent about Apple's policy of throttling older model phones; in other words, they would run more slowly than they had before.
Speaker 1: Now, supposedly this was to increase the use time per battery charge on older phones, the idea being that later versions of operating systems would put greater demands on a phone's battery, and so without throttling, the older phones would just drain their batteries super fast. But some people said this feels more like a tactic to convince people to upgrade to a newer phone. Like, my old phone's running slow; I must need to get a new one. And the fact that Apple was not transparent about this, that they weren't even admitting they were doing it at first, seems to lend credence to that second theory, the idea that they were doing this in an effort to try and upsell an upgraded phone. So that's an issue. That was a pretty ugly story early on in the year. And I talked about this in my suite of episodes about autonomous cars recently, but back in March two thousand eighteen there were a pair of really bad accidents that brought the whole concept of autonomous cars back into the headlines. First, there was a self-driving Uber vehicle that struck and killed a pedestrian in Arizona. That accident prompted Uber to curtail all autonomous car tests in their various cities, at least temporarily, and to completely withdraw from Arizona with their driverless program. But also in March, a Tesla Model X was put in Autopilot mode and it crashed at high speed into a concrete highway divider. The driver died in the accident. Tesla released information stating that the driver was not paying attention to the road despite the car providing multiple warnings, and this wasn't the first time that a Tesla Autopilot accident had ended in a fatality. But I do have to point out, to be fair to Tesla, that it's very clear that Autopilot is not meant to be a truly driverless type of technology.
Speaker 1: In fact, Tesla requires that drivers read and acknowledge, or at least acknowledge, a warning saying as much, and they're supposed to keep their hands on the wheel at all times and be paying attention to the road even when the car is in control. So there's at least some truth to Tesla saying that this is not really our fault, because the driver wasn't following the actual policy. It's hard to argue against that, although you could argue, well, this autonomous technology is obviously not advanced enough to avoid these kinds of accidents, so that's very worrisome as well. Elon Musk got into all sorts of hot water in two thousand eighteen. Man, that guy, what a year he had. For one thing, he tweeted out messages that said he intended to take Tesla private, that he had secured funding, and that it was at a price of four hundred and twenty dollars per share, which he apparently later said was a joke about, you know, four twenty and marijuana. Funny joke. Here's the problem: it's illegal to share misleading information with shareholders, and it wasn't clear that funding had been secured, so the Department of Justice said that's not cool. Then the SEC announced in September that it was going to sue Elon Musk over this matter, that he was sharing misleading information, that shareholders could see that and have the wrong idea about what was about to happen with their investment, and that that caused problems. Eventually that lawsuit was settled out of court. Elon Musk and Tesla agreed to pay a total of forty million dollars in fines. Musk would later say that he, quote, did not respect the SEC, end quote. I think that was already apparent. But anyway, Musk also was the center of some other stories in twenty eighteen. He sat down for an interview with Joe Rogan for his show, and while on camera for that show, he took a puff off of some marijuana, and folks plumb done lost their minds about it.
There were questions 588 00:36:57,280 --> 00:36:59,760 Speaker 1: as to whether he might have violated his own company's 589 00:36:59,760 --> 00:37:03,640 Speaker 1: code of conduct policy. There were discussions about the public perception 590 00:37:03,680 --> 00:37:07,920 Speaker 1: of marijuana use, which is admittedly complicated and weird and has 591 00:37:07,920 --> 00:37:11,239 Speaker 1: a lot of historical taboo stuff tied to it. Personally, 592 00:37:11,520 --> 00:37:14,960 Speaker 1: I find his marijuana use way less concerning than his 593 00:37:15,040 --> 00:37:18,800 Speaker 1: disregard for the legality of spreading misinformation from a position 594 00:37:18,800 --> 00:37:21,040 Speaker 1: of power. I think that's the thing I would worry 595 00:37:21,040 --> 00:37:24,800 Speaker 1: about more. But I also say that, you know, because 596 00:37:24,880 --> 00:37:30,200 Speaker 1: I think that has the greater capacity to cause harm. 597 00:37:30,280 --> 00:37:32,600 Speaker 1: And I also say this as someone who's a total square. 598 00:37:33,560 --> 00:37:37,160 Speaker 1: You're not gonna catch me smoking marijuana because I'm boring. 599 00:37:37,920 --> 00:37:40,200 Speaker 1: I have no interest in it. But I also don't 600 00:37:40,320 --> 00:37:42,560 Speaker 1: see it as being as big a deal as this 601 00:37:42,640 --> 00:37:46,800 Speaker 1: other issue, the tweeting out about taking Tesla private. 602 00:37:46,840 --> 00:37:49,960 Speaker 1: I think that one causes the greater harm. He also 603 00:37:50,000 --> 00:37:53,279 Speaker 1: insulted a British cave diver who was helping rescue some 604 00:37:53,400 --> 00:37:55,400 Speaker 1: young boys who were trapped in a flooded cave. You 605 00:37:55,480 --> 00:38:00,920 Speaker 1: might remember that story from twenty eighteen. He indicated a 606 00:38:00,920 --> 00:38:02,880 Speaker 1: couple of times during the year that he was working 607 00:38:03,000 --> 00:38:06,319 Speaker 1: such long hours and getting so little sleep that he 608 00:38:06,400 --> 00:38:09,799 Speaker 1: was starting to make pretty bad goofs, you know, like 609 00:38:09,880 --> 00:38:12,440 Speaker 1: insulting someone who was risking his own life to try 610 00:38:12,480 --> 00:38:15,600 Speaker 1: and save other people, for example. I also did an 611 00:38:15,600 --> 00:38:20,120 Speaker 1: episode about tunneling machines and Musk's Boring Company and how 612 00:38:20,160 --> 00:38:21,800 Speaker 1: that was in the news as he was trying to 613 00:38:21,840 --> 00:38:24,239 Speaker 1: work out plans to create underground tunnels for a new 614 00:38:24,280 --> 00:38:27,919 Speaker 1: method of urban transportation, planning on building it out 615 00:38:28,280 --> 00:38:31,759 Speaker 1: under Los Angeles, for example. But he's hit some obstacles, 616 00:38:32,120 --> 00:38:34,839 Speaker 1: and the tunneling machines can't just dig through these. These 617 00:38:34,920 --> 00:38:37,719 Speaker 1: are obstacles like governments saying we've got to work out 618 00:38:37,760 --> 00:38:40,600 Speaker 1: some very important details. You have to secure a lot 619 00:38:40,600 --> 00:38:43,640 Speaker 1: of permissions before you can dig under any city. And 620 00:38:43,680 --> 00:38:48,160 Speaker 1: so now, while the technological side might be solved, there's 621 00:38:48,160 --> 00:38:50,520 Speaker 1: the red tape to get through. And that red tape 622 00:38:50,640 --> 00:38:54,360 Speaker 1: has good purpose. I don't mean to dismiss it.
At 623 00:38:54,440 --> 00:38:58,040 Speaker 1: the Google I/O event in the spring of twenty eighteen, Google's CEO 624 00:38:58,160 --> 00:39:01,040 Speaker 1: showed off a new application called Duplex, which is 625 00:39:01,080 --> 00:39:03,839 Speaker 1: an AI assistant that can actually make phone calls on 626 00:39:03,880 --> 00:39:06,520 Speaker 1: behalf of the user, and it can make arrangements like 627 00:39:06,560 --> 00:39:10,000 Speaker 1: reserving a car or a hotel room, and it sounds 628 00:39:10,040 --> 00:39:14,080 Speaker 1: eerily human. It even incorporates stuff like ums and uhs 629 00:39:14,080 --> 00:39:17,160 Speaker 1: in the speech, and it left some folks wondering, at 630 00:39:17,200 --> 00:39:19,920 Speaker 1: what point will we be unable to tell if we're 631 00:39:19,920 --> 00:39:23,360 Speaker 1: talking to a robot and not a human? News flash: 632 00:39:23,400 --> 00:39:27,880 Speaker 1: we're already there, at least for some limited applications. In October, 633 00:39:27,960 --> 00:39:31,359 Speaker 1: we found out that a vulnerability in Google Plus meant 634 00:39:31,360 --> 00:39:35,640 Speaker 1: that developers were able to get access to users' private data, 635 00:39:35,719 --> 00:39:38,640 Speaker 1: and that the vulnerability had likely been present for maybe 636 00:39:38,640 --> 00:39:41,759 Speaker 1: as long as three years. Google found out about this 637 00:39:41,840 --> 00:39:45,719 Speaker 1: vulnerability in March, but they kept it quiet until October, 638 00:39:46,480 --> 00:39:49,640 Speaker 1: and that made critics concerned about the lack of transparency. 639 00:39:49,880 --> 00:39:52,040 Speaker 1: So Google announced it was going to shut down Google 640 00:39:52,080 --> 00:39:56,440 Speaker 1: Plus in August twenty nineteen. That's when it'll all end. And 641 00:39:56,480 --> 00:40:01,040 Speaker 1: then in December of twenty eighteen another vulnerability was discovered, similar 642 00:40:01,080 --> 00:40:05,120 Speaker 1: to that one, exposing the data of fifty million or so people. 643 00:40:06,120 --> 00:40:09,360 Speaker 1: So Google has now moved the sunset date for Google 644 00:40:09,360 --> 00:40:13,560 Speaker 1: Plus to April twenty nineteen. And I remember when that platform launched. 645 00:40:13,920 --> 00:40:15,840 Speaker 1: It's kind of weird to see it go from birth 646 00:40:15,920 --> 00:40:20,400 Speaker 1: to death. Magic Leap, the augmented reality company that had 647 00:40:20,440 --> 00:40:23,080 Speaker 1: been operating in secrecy for a couple of years but 648 00:40:23,600 --> 00:40:27,800 Speaker 1: was really driving up anticipation about augmented reality, finally showed 649 00:40:27,800 --> 00:40:30,680 Speaker 1: off a headset. It's called the Magic Leap One. This 650 00:40:30,719 --> 00:40:33,400 Speaker 1: is a developer kit. It's not meant for the average consumer, 651 00:40:33,719 --> 00:40:35,920 Speaker 1: but for people who are going to make applications for it. 652 00:40:36,080 --> 00:40:40,400 Speaker 1: It costs two thousand two hundred ninety five dollars, so definitely not priced in 653 00:40:40,400 --> 00:40:43,879 Speaker 1: the realm of consumer technology.
They look like some high 654 00:40:43,880 --> 00:40:46,560 Speaker 1: tech goggles, and some of the reviews I've seen say 655 00:40:46,640 --> 00:40:49,200 Speaker 1: it is really impressive, but it still falls short of 656 00:40:49,239 --> 00:40:53,000 Speaker 1: the expectations that were created by Magic Leap itself, as 657 00:40:53,000 --> 00:40:55,359 Speaker 1: well as the media coverage around it that led up 658 00:40:55,400 --> 00:40:58,480 Speaker 1: to the release. But still, it's pretty encouraging as far 659 00:40:58,520 --> 00:41:01,759 Speaker 1: as advances in AR go. Switching over to cryptocurrency for 660 00:41:01,760 --> 00:41:06,799 Speaker 1: a second, Bitcoin's value went into decline throughout twenty eighteen. The 661 00:41:06,840 --> 00:41:10,200 Speaker 1: currency had hit a value of nearly twenty thousand dollars 662 00:41:10,239 --> 00:41:15,239 Speaker 1: per bitcoin in late twenty seventeen, a princely sum. By early January that had 663 00:41:15,280 --> 00:41:18,840 Speaker 1: dropped to just shy of fifteen thousand dollars per bitcoin. 664 00:41:19,360 --> 00:41:22,160 Speaker 1: By the time you hear this, the numbers will likely be very different. 665 00:41:22,239 --> 00:41:25,279 Speaker 1: Right now, as I record this episode, the value is 666 00:41:25,320 --> 00:41:28,719 Speaker 1: down to three thousand, four hundred dollars or so per bitcoin. 667 00:41:28,880 --> 00:41:31,520 Speaker 1: So the question is: is it going to continue to 668 00:41:31,640 --> 00:41:35,319 Speaker 1: dip down, or is it toward the bottom of its 669 00:41:35,320 --> 00:41:37,600 Speaker 1: trough and about to go through another upswing? 670 00:41:38,000 --> 00:41:39,480 Speaker 1: And I guess I'll have to make a prediction in 671 00:41:39,520 --> 00:41:44,480 Speaker 1: our Predictions episode for twenty nineteen about that. For much 672 00:41:44,480 --> 00:41:48,080 Speaker 1: of twenty eighteen, cities in North America waited anxiously to find out 673 00:41:48,120 --> 00:41:51,319 Speaker 1: where Amazon was going to put its second headquarters. And 674 00:41:51,360 --> 00:41:53,920 Speaker 1: it announced that it was going to choose such a 675 00:41:53,960 --> 00:41:57,480 Speaker 1: location in late twenty seventeen and opened up the opportunity for 676 00:41:57,560 --> 00:42:00,520 Speaker 1: various cities and metropolitan areas to send in proposals. 677 00:42:01,080 --> 00:42:04,520 Speaker 1: And then, after months of speculation, Amazon announced that it 678 00:42:04,600 --> 00:42:09,000 Speaker 1: was going to select an area in northern Virginia very 679 00:42:09,000 --> 00:42:12,520 Speaker 1: close to Washington, D.C., and a section of Queens 680 00:42:12,560 --> 00:42:17,600 Speaker 1: in New York City, splitting the second headquarters into two locations, 681 00:42:17,640 --> 00:42:21,480 Speaker 1: which coincidentally happened to be in places where Jeff Bezos 682 00:42:21,520 --> 00:42:26,600 Speaker 1: already owns homes. I'm equally convinced the whole thing was 683 00:42:26,640 --> 00:42:30,319 Speaker 1: a sincere attempt to find suitable locations for Amazon's headquarters, 684 00:42:30,400 --> 00:42:34,520 Speaker 1: and not just an enormous manipulative attempt to get the 685 00:42:34,520 --> 00:42:38,000 Speaker 1: sweetest deals at two locations that the company had already 686 00:42:38,040 --> 00:42:43,080 Speaker 1: decided upon before making the announcement.
But then, you guys 687 00:42:43,440 --> 00:42:45,360 Speaker 1: heard me talk about that in previous episodes, so 688 00:42:45,360 --> 00:42:47,920 Speaker 1: I'm going to drop it. Finally, in a story that 689 00:42:48,040 --> 00:42:52,040 Speaker 1: is extremely close to home, iHeartMedia, the company 690 00:42:52,080 --> 00:42:55,960 Speaker 1: that was once known as Clear Channel, acquired Stuff Media LLC, 691 00:42:56,600 --> 00:43:00,000 Speaker 1: the company I work for. The acquisition was complete by 692 00:43:00,000 --> 00:43:02,840 Speaker 1: the fall of twenty eighteen, and as I record this, iHeartMedia 693 00:43:02,960 --> 00:43:05,840 Speaker 1: is preparing to emerge from bankruptcy and there are a 694 00:43:05,840 --> 00:43:08,359 Speaker 1: lot of rumors flying around about what might happen next, 695 00:43:08,560 --> 00:43:11,120 Speaker 1: and I have no idea what it might be, nor 696 00:43:11,160 --> 00:43:13,200 Speaker 1: do I have an inside line to any of that. I 697 00:43:13,200 --> 00:43:16,320 Speaker 1: am excited to find out what happens in twenty nineteen, 698 00:43:17,400 --> 00:43:21,680 Speaker 1: and that wraps up this retrospective look at twenty eighteen. There are 699 00:43:21,760 --> 00:43:25,640 Speaker 1: tons of stories I didn't cover. Obviously, a lot happened 700 00:43:25,760 --> 00:43:30,040 Speaker 1: in twenty eighteen. I'm pretty sure that twenty eighteen was four times longer 701 00:43:30,040 --> 00:43:33,480 Speaker 1: than any year that came before it. I think somewhere 702 00:43:33,520 --> 00:43:36,400 Speaker 1: we added extra months in there because it felt like 703 00:43:36,400 --> 00:43:39,760 Speaker 1: it lasted an eternity. I have no idea what twenty 704 00:43:39,880 --> 00:43:43,080 Speaker 1: nineteen has in store, but I'm going to make some guesses anyway. 705 00:43:43,440 --> 00:43:45,600 Speaker 1: You'll have to hear about that in my Predictions episode. 706 00:43:46,600 --> 00:43:49,239 Speaker 1: That's it for this time. In our next episode, I 707 00:43:49,280 --> 00:43:52,200 Speaker 1: will go over how my predictions for this past year went. 708 00:43:53,000 --> 00:43:55,759 Speaker 1: Spoiler alert: I did pretty well, but you'll hear more 709 00:43:55,800 --> 00:43:57,239 Speaker 1: about that in the next one. If you guys have 710 00:43:57,280 --> 00:44:00,000 Speaker 1: suggestions for future episodes, send me an email. The address 711 00:44:00,080 --> 00:44:03,200 Speaker 1: is tech stuff at how stuff works dot com, or 712 00:44:03,280 --> 00:44:06,560 Speaker 1: drop me a line on Facebook or Twitter. I have 713 00:44:06,600 --> 00:44:09,080 Speaker 1: the handle TechStuffHSW. Go to tech stuff 714 00:44:09,080 --> 00:44:11,239 Speaker 1: podcast dot com. That's our website, where you can learn 715 00:44:11,239 --> 00:44:13,239 Speaker 1: more about the show, and you'll also find a link 716 00:44:13,280 --> 00:44:15,879 Speaker 1: to our merchandise store. Remember, every purchase you make there 717 00:44:16,000 --> 00:44:18,200 Speaker 1: goes to help our show, and we greatly appreciate it. 718 00:44:18,640 --> 00:44:27,359 Speaker 1: And I'll talk to you again really soon. For more 719 00:44:27,400 --> 00:44:29,680 Speaker 1: on this and thousands of other topics, visit how 720 00:44:29,719 --> 00:44:40,600 Speaker 1: stuff works dot com.