Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? Welcome to my last Friday as the host of TechStuff. You've almost made it. I've almost made it. It's really exciting. I thought it could be fun to look back on kind of like a big tech story from each year since I've been a host of TechStuff. So that's from mid two thousand and eight to this year, or this past year, twenty twenty four. It's weird for me to say this past year because I'm actually putting this together when it has not yet turned to twenty twenty five. I also figured this would take up a couple of episodes, because I'm a chatty Cathy and a lot has gone on since mid two thousand and eight. So this is part one, and in part two we will pick up and continue on. That's how parts work. Now, to be clear, the entries I submit to you are not necessarily the biggest or even the most important tech stories of each year.
Speaker 1: Instead, it's a story that, for whatever capricious reason, entered into my head as I was looking back over everything that's happened over the last sixteen and a half years in the world of tech. So, we started off TechStuff in early June of two thousand and eight. For that reason, I went to look at the back half of two thousand and eight for the tech story to talk about, so I ignored anything that happened up until June. And that brings us to Apple launching the iPhone App Store. Now, the iPhone had come out the year before, but the App Store would end up becoming a thing one month after we started TechStuff, in July of two thousand and eight. That's when the App Store came online. Now, before that, iPhone users had to be content with the apps that Apple made available on the iPhone, the ones preloaded on the device. Third-party choices were not actually an option yet. Apple had spent a year, well, more than a year technically, but a year since the iPhone had come out, just coming up with a way to create an app store. Like, what was the strategy there?
Speaker 1: They didn't want it to just be a fire hose of content, because then the store would get flooded with all sorts of apps, many of which would be pretty terrible. So their plan included a vetting process for apps, and at the best of times, this process was rather opaque and obtuse. Sometimes a developer could get an app approved and in the store right away without any real revisions. Other times, developers would get pushback from Apple and they would have to tweak their apps for one reason or another, and that was not always clear. Like, it wasn't always apparent why one app would be accepted and another app rejected when they were, at least on the surface, very similar to one another. So it raised a lot of questions and confusion. And because Apple didn't allow for sideloading, which is when users can install apps from outside an official app store, this meant that developers had no alternative pathway to get to their customer base. It was either play by Apple's rules or get shut out entirely. Now, the App Store would have a huge impact on tech and how we interact with it, which is putting it lightly.
Speaker 1: Not to mention it would help propel Apple's revenue into the stratosphere. The company would take a chunk out of every purchase made through an app. That could also include the app itself, if it was one customers had to pay to download. Like, if it wasn't a free app, if it cost five or ten bucks or whatever, Apple got a chunk of that. But they also got a percentage of all in-app purchasing options, assuming it wasn't for ordering a physical thing. Like, if you were using an app to order food from a restaurant, Apple wouldn't get a chunk of that. But if you were using an app and the app had features that were locked behind a paywall, and you paid to get to those features, Apple got a cut. So developers were obliged to use Apple's own payment processing features. They couldn't introduce their own, and Apple took, like I said, a percentage of every transaction. And it was that sort of policy that would eventually land Apple in hot water with antitrust regulators, who would argue that the company was using its dominant position to suppress competition in the space.
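To put some numbers on that commission model, here's a minimal Python sketch. The thirty percent rate below is the commission figure widely reported for the App Store's early years; the function names and the digital-versus-physical flag are my own illustration, not Apple's actual billing system.

```python
def apple_commission(price: float, digital_good: bool, rate: float = 0.30) -> float:
    """Apple's cut of a single App Store transaction.

    Paid downloads and in-app purchases of digital features were
    subject to the commission; purchases of physical goods (say, a
    restaurant order placed through an app) were not.
    """
    return price * rate if digital_good else 0.0


def developer_payout(price: float, digital_good: bool, rate: float = 0.30) -> float:
    """What the developer keeps after Apple's cut."""
    return price - apple_commission(price, digital_good, rate)


# A $9.99 paid app: Apple keeps roughly $3, the developer roughly $7.
# A $25 food order placed through an app: Apple keeps nothing.
```

Under that split, a developer selling a ten-dollar app keeps roughly seven dollars, and it's exactly that arrangement the later antitrust cases would scrutinize.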
Speaker 1: And that's something that still plays out to this day. Like, there are still court cases going on that go one way or the other. Some have been kind of in Apple's favor, some have been against, but it's an ongoing thing. But on top of all that, the App Store would bring about an era in which everyone and their dog began to develop apps. For the web content creators out there, this became a real headache, because it wasn't enough to just optimize your website so that it would look good on mobile devices. That was absolutely a necessity, because trends were showing that more people were starting to rely on mobile devices to access the web every single year, but you also had to go and make your own bespoke app, or at least that was the philosophy. Countless apps followed. Some of these would stand the test of time, but a lot of them ultimately withered and died. I am reminded of the HowStuffWorks app, which we actually mentioned in a recent episode when I had the boys of Stuff They Don't Want You to Know on the show.
Speaker 1: The HowStuffWorks app served up articles and quizzes and such to users, but it was tough to convince people to download it. I mean, a lot of folks would just go to HowStuffWorks if Google pointed them that way when they had a question about something. And it was also really tough to monetize apps. Optimizing articles to display in an app meant that you might have to forgo advertising. So, like, if someone went to the web version of the article, they would get the ads and you would get the impressions, and that would count against whatever the ad deal was. But if it was an app, you might not get that. And ads were a top revenue driver for a lot of companies, including HowStuffWorks. So how do you both serve content up to people in a streamlined way via an app and make money off of that? Is it sponsorships? Is it what? A lot of people hadn't figured that out, and a lot of other outlets encountered issues similar to what we saw at HowStuffWorks.
Speaker 1: For a while, apps were seen as a possible lifeline for content companies that were struggling with the old business model. You know, maybe you could convince people to subscribe through an app and put content behind a paywall. That would help push back against the myriad ways content could be shared outside of just visiting the website itself. Sure, stuff like social media could let you find more readers, so a lot of content companies embraced social media early on. But then people started just skimming content through social media and never bothered to click through to visit the home website, which meant, again, you weren't actually monetizing any of that traffic. All the traffic was going to the social network, not to you. Anyway, the App Store made a truly huge change in the tech sector. Entire companies built around apps were made possible by Apple's move.
Speaker 1: And while Google would launch the Android operating system in two thousand and eight, I don't think even the most hardcore Android fan out there would be able to say with a straight face that Google would have accomplished the same thing that Apple managed to do. Google did some great things and continues to do some great things, along with some not-so-great things. But honestly, without Apple, I don't think that this ecosystem really takes off. Now, let us move on to two thousand and nine, and we've got a whole year to play with this time, not just half a year. So what tech story caught my eye? Well, there were a couple of contenders. One is a very sad one. Steve Jobs's health was in serious decline. And whatever you might have thought of Steve Jobs as a person or a boss, it's still hard to talk about someone's health deteriorating, at least it is for me. And journalists were swarming to find out the truth of what was going on with Steve Jobs, which was kind of icky.
Speaker 1: He initially said he was experiencing a quote unquote hormonal imbalance, but it later turned out that he had secretly had a liver transplant. And part of me is frustrated that he had to deal with journalists who were prying into his personal life at all. But then I also understand that since he was CEO of Apple at the time, there were reasons people needed to know about his health, because if his well-being impacted the company, well, that's a huge thing, and it impacts thousands of other people. So I guess that's one of the trade-offs you have to accept when you become the head of a supremely profitable company. And obviously, while his health was big news in two thousand and nine, his passing two years later would eclipse that. Another big story in two thousand and nine was that Microsoft released Windows seven, which would go on to be one of the more popular versions of the operating system. Microsoft has had an inconsistent track record in that regard. People loved Windows XP, but they weren't fans of Vista, for example.
Speaker 1: People liked Windows seven, but they didn't very much care for Windows eight, and so on. But I feel the story I should really touch on is that in two thousand and nine, Facebook officially passed MySpace as far as the number of visitors going to each site per month within the United States is concerned. It is wild to me that TechStuff was around back when MySpace wasn't just a thing, it was the dominant thing in the social network space here in the States. Of course, by the time TechStuff got started, MySpace was already getting into trouble and Facebook's star was on the rise. In fact, if we look at global numbers, Facebook had already surpassed MySpace in unique visitors worldwide back in April two thousand and eight, which was a couple of months before TechStuff even launched. But, as I'm sure you all know, here in the United States, folks essentially behave as though nothing matters unless it happens here. I'm not saying that's right. In fact, I'm saying it's outright myopic and dumb, but it's still how a lot of people behave here in the United States.
Speaker 1: So the trajectory Facebook was on would propel the company to incredible heights, as well as set itself up for scrutiny and criticism. In two thousand and nine, Facebook changed its terms of use. The new terms had a concerning clause in them that read, and this is a long one, but I'm going to quote it: "You hereby grant Facebook an irrevocable, perpetual, non-exclusive, transferable, fully paid, worldwide license (with the right to sublicense) to (a) use, copy, publish, stream, store, retain, publicly perform or display, transmit, scan, reformat, modify, edit, frame, translate, excerpt, adapt, create derivative works and distribute (through multiple tiers), any User Content you (i) Post on or in connection with the Facebook Service or the promotion thereof subject only to your privacy settings or (ii) enable a user to Post, including by offering a Share Link on your website and (b) to use your name, likeness and image for any purpose, including commercial or advertising, each of (a) and (b) on or in connection with the Facebook Service or the promotion thereof." End quote.
Speaker 1: Whew, okay, that's a mouthful. I should have just read the headline of the Consumerist piece about this particular change, which was, quote, "Facebook's New Terms of Service: We Can Do Anything We Want With Your Content. Forever." End quote. It's actually a great article. So there was a pretty swift and negative reaction to this policy change, understandably, and ultimately Facebook walked it back, putting the changes up for a vote among the user base, saying, hey, you should be able to have a say in this because it's going to impact you. And a very tiny percentage of users actually bothered to vote, but critics applauded Facebook's move to at least take user concerns into consideration. The honeymoon would not last very long, however. In twenty ten, Zuckerberg would claim that privacy was no longer a social norm, which I interpret as a blanket excuse for Facebook to collect as much data from as many people and profit as much as possible. And obviously, in later years, Facebook would be held accountable for all sorts of problems it caused or facilitated.
Speaker 1: But yeah, two thousand and nine is a good year to point to and say: here's where Facebook's journey to becoming a huge company, one of the big five in tech, really began. Okay, we're up to twenty ten. That's the year we'd get the iPad, which I thought would be a flop, because no one had managed to make a tablet computer a viable consumer product up to that point. It's also the year of Antennagate. You might not remember Antennagate because we've had so many iPhones since then, but Antennagate was when folks were complaining about connectivity issues with the brand-new iPhone four. And upon closer inspection, it looked like the issue was that the internal antenna of the iPhone was placed in such a way that if you held the phone like, you know, like a human being holding a phone, you could end up blocking the antenna from getting a solid signal. And we were all essentially told, you're holding it wrong. Thus, Apple pushed the blame onto consumers rather than owning up to the fact that the iPhone four had a design flaw.
Speaker 1: But the story that really jumped out at me in twenty ten was that the world learned of Stuxnet. That's S-T-U-X-N-E-T. It was malware designed for a very specific purpose, specifically to sabotage nuclear centrifuges that were part of Iran's nuclear program. So, in order to create a uranium-based nuclear power plant, or indeed a uranium-based nuclear weapon, you first have to get hold of the right kind of uranium, which is the isotope U-235. Trouble is, that is really rare stuff. It makes up less than a percent of the uranium we actually mine, like zero point seven percent. But you need a concentration somewhere between three to five percent of uranium-235 in your uranium mix to have viable material. So we have to enrich the uranium. We have to get the right concentration. Part of that process typically involves using centrifuges to separate out U-235 from, you know, the other stuff. Well, the Stuxnet virus would do a few things.
Speaker 1: One thing it would do is that it hid itself once it was installed on a target computer, so that you couldn't easily see that malware was on the device. These particular computers were not connected to the Internet, a very wise decision. They had what we would call an air gap, as in no connectivity to an outside network. So in order to get the virus on the computer in the first place, the malware makers would put this code on USB drives, like little thumb drives, and then they tricked Iranian plant workers into installing the malware on these air-gapped machines. So human beings, mostly unknowingly, carried the malware in and ended up installing it on the computers. The code would send commands to centrifuges and cause the centrifuges to rotate faster than what they were rated for, and the intent was to break the machinery and sabotage Iran's nuclear program. And it worked. This was not your run-of-the-mill hacker attack, of course.
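The enrichment numbers from a moment ago, zero point seven percent U-235 in natural uranium and a three-to-five percent target, follow a standard two-equation mass balance, which can be sketched in Python. The zero point three percent tails assay below is an assumed, typical value for illustration, not a figure from the episode.

```python
def feed_per_kg_product(x_product: float,
                        x_feed: float = 0.007,    # natural uranium: ~0.7% U-235
                        x_tails: float = 0.003) -> float:  # assumed tails assay
    """Kilograms of natural-uranium feed per kilogram of enriched product.

    From conservation of total mass and of U-235 mass:
        F = P + T
        F * x_feed = P * x_product + T * x_tails
    which combine to F / P = (x_product - x_tails) / (x_feed - x_tails).
    """
    return (x_product - x_tails) / (x_feed - x_tails)


# Reaching a 4% U-235 concentration takes roughly 9.25 kilograms of
# natural uranium for every kilogram of enriched product.
print(feed_per_kg_product(0.04))
```

That nine-to-one feed ratio is why enrichment is the bottleneck of any uranium program, and why the centrifuge cascades doing that separation made such an attractive sabotage target.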
Speaker 1: While it's never been officially acknowledged, at least as far as I know, the generally accepted story is that Israel and the United States collaborated on this attack for several years before deploying it, and it made history by being the first major malware attack connected with industrial sabotage. That's a heck of a story for twenty ten. We've got a lot more, but it's time for us to take a quick break to thank our sponsors. We're back, and now we're moving on to twenty eleven. Now, as I mentioned earlier in this episode, twenty eleven was the year that Steve Jobs passed away. By the time of his passing, Tim Cook was acting as the new CEO of Apple. That's a position he holds to this very day. And obviously Steve Jobs's passing was huge news. Like, the tech news cycle, and really the mainstream news cycle, was dominated by that at the time. Steve Jobs was kind of, he was the face of Apple. Like, he was the guy who was a co-founder of the company.
Speaker 1: He famously was banished from the company in the mid-eighties, left Apple, went and did his own thing for a while, came back to Apple as a consultant in the mid-to-late nineties, eventually reassumed control of Apple, and turned it around from a company that was on the brink of bankruptcy, it was doing so poorly, to becoming one of the most successful tech companies in the world, propelled by successes like the iMac and the iPod and, of course, later on the iPhone, largely helped by Jony Ive, the incredible designer who helped create the new look of Apple. Like, it is a huge story that he passed away. He was also famous for being a really tough boss, and sometimes a really, really unkind boss. There are some pretty scary stories about encounters between employees and Steve Jobs. But while his passing was huge news, that wasn't the story I really wanted to focus on.
Speaker 1: Another big one that unfolded in twenty eleven was how protesters in countries across the Middle East took to social media to argue for massive social changes. That technically actually started in late twenty ten, but the majority of the activity was really spread across twenty eleven. This became known as the Arab Spring, and there were, you know, varying degrees of success for social change throughout the nations that were part of the Arab Spring. But what that really did was highlight how important platforms like Twitter and other social networks at the time really are for the purposes of social change. Like, we can forget that, or at least I will forget that, because I will often focus on things that are happening at the corporate level, where the people in charge of those platforms are making questionable decisions, or ones that I find ethically troubling. But the people actually using the platforms can often do so in ways that evoke positive social change.
Also in twenty eleven, Google launched 330 00:19:15,040 --> 00:19:18,240 Speaker 1: Google Plus, but that network had shut down 331 00:19:18,280 --> 00:19:20,960 Speaker 1: by twenty nineteen, so it barely merits mention. It was 332 00:19:21,040 --> 00:19:24,920 Speaker 1: just Google making another attempt at creating a social network 333 00:19:25,080 --> 00:19:28,719 Speaker 1: similar to that of Facebook or MySpace before it, and 334 00:19:28,800 --> 00:19:34,280 Speaker 1: Google just never really nailed that. They tried several times, 335 00:19:34,359 --> 00:19:37,000 Speaker 1: and I'm sure they'll try again at some point, but 336 00:19:37,160 --> 00:19:40,040 Speaker 1: it just never stuck. Now, the story I really 337 00:19:40,040 --> 00:19:42,840 Speaker 1: wanted to focus on is one that, at least on 338 00:19:42,880 --> 00:19:45,240 Speaker 1: the face of it, seems like more of a novelty 339 00:19:45,520 --> 00:19:48,080 Speaker 1: and not that important. But I argue it is important, 340 00:19:48,520 --> 00:19:53,000 Speaker 1: and that is when IBM's Watson AI computer famously took on 341 00:19:53,200 --> 00:19:58,399 Speaker 1: two returning champions of the game show Jeopardy in a 342 00:19:58,440 --> 00:20:01,760 Speaker 1: display of natural language processing and artificial intelligence powered 343 00:20:01,840 --> 00:20:07,600 Speaker 1: problem solving. So Watson played against Ken Jennings and Brad Rutter, 344 00:20:07,680 --> 00:20:10,919 Speaker 1: two former champions on Jeopardy. They played a game of 345 00:20:11,000 --> 00:20:13,720 Speaker 1: Jeopardy and Watson came out the winner. Watson was able 346 00:20:13,760 --> 00:20:18,520 Speaker 1: to navigate clues that incorporated wordplay and, you know, references 347 00:20:18,560 --> 00:20:21,440 Speaker 1: and pop culture and history, all without having an external 348 00:20:21,440 --> 00:20:24,639 Speaker 1: connection to the Internet.
So Watson did have access to 349 00:20:24,680 --> 00:20:27,840 Speaker 1: a rather massive database of information that was connected to 350 00:20:27,880 --> 00:20:30,719 Speaker 1: the computer, but it could not, you know, dial out 351 00:20:30,840 --> 00:20:35,440 Speaker 1: of the studio, for example, to get information from the Internet itself. 352 00:20:35,800 --> 00:20:39,520 Speaker 1: Watson essentially would buzz in if it calculated that the 353 00:20:39,560 --> 00:20:43,960 Speaker 1: probability of its answer being the right answer was above 354 00:20:44,000 --> 00:20:47,720 Speaker 1: a certain threshold. And that's interesting too, right? Like, it's 355 00:20:47,720 --> 00:20:50,600 Speaker 1: not that it quote unquote knew the right answer. It 356 00:20:50,680 --> 00:20:54,080 Speaker 1: had an answer. It would analyze the likelihood of that 357 00:20:54,119 --> 00:20:56,920 Speaker 1: answer being the correct one, and if the likelihood were 358 00:20:56,960 --> 00:21:00,359 Speaker 1: high enough, Watson would buzz in. Now, the reason I 359 00:21:00,400 --> 00:21:02,840 Speaker 1: bring it up here is not because a machine beat 360 00:21:02,920 --> 00:21:06,200 Speaker 1: humans in a game, because that's a story we've heard 361 00:21:06,359 --> 00:21:09,880 Speaker 1: many times, like the famous one being Garry Kasparov when 362 00:21:09,960 --> 00:21:14,920 Speaker 1: he faced off against IBM's Deep Blue in various chess matches. 363 00:21:15,960 --> 00:21:17,960 Speaker 1: IBM has a long history of doing this kind of 364 00:21:17,960 --> 00:21:21,280 Speaker 1: stuff too. Instead, the reason I wanted to talk about 365 00:21:21,280 --> 00:21:23,480 Speaker 1: it is because in twenty eleven we would get a 366 00:21:23,600 --> 00:21:28,439 Speaker 1: hint of where artificial intelligence, particularly AI that has natural 367 00:21:28,520 --> 00:21:33,000 Speaker 1: language processing capabilities, would be headed.
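The buzz-in rule described above can be sketched in a few lines. To be clear, this is a hypothetical illustration of the idea, not IBM's actual Watson code: pick the candidate answer with the highest confidence score, and buzz in only if that confidence clears a threshold.

```python
# Hypothetical sketch of the buzz-in rule described above; not IBM's code.
# Watson-style logic: rank candidate answers by confidence and buzz in
# only when the best candidate clears a set threshold.

def should_buzz(candidates, threshold=0.5):
    """candidates: list of (answer, confidence) pairs.
    Returns the answer to buzz in with, or None to stay silent."""
    if not candidates:
        return None
    answer, confidence = max(candidates, key=lambda pair: pair[1])
    return answer if confidence >= threshold else None

# Confident enough in its top answer, so it buzzes:
print(should_buzz([("Chicago", 0.31), ("Toronto", 0.72)]))  # Toronto
# Has an answer, but the likelihood is too low, so it stays quiet:
print(should_buzz([("Chicago", 0.31), ("Toronto", 0.42)]))  # None
```

The interesting part, as noted above, is that the machine never "knows" the answer; it only estimates how likely its best guess is to be right.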
I would say that 368 00:21:33,080 --> 00:21:37,159 Speaker 1: today's generative AI resembles Watson in many ways, and we 369 00:21:37,280 --> 00:21:40,880 Speaker 1: do have to remember that artificial intelligence isn't always right 370 00:21:41,040 --> 00:21:45,240 Speaker 1: or trustworthy. So just as a reminder, you know, this year, 371 00:21:45,280 --> 00:21:48,280 Speaker 1: this past year, back in twenty twenty four, I gotta 372 00:21:48,280 --> 00:21:50,960 Speaker 1: say twenty twenty four. I'm recording in twenty twenty four. 373 00:21:51,720 --> 00:21:56,120 Speaker 1: But I once had AI quote unquote write an episode 374 00:21:56,160 --> 00:21:59,200 Speaker 1: of tech Stuff, and then I did kind of a director's 375 00:21:59,240 --> 00:22:03,920 Speaker 1: commentary on that, I suppose. The AI ended up inventing 376 00:22:04,080 --> 00:22:09,000 Speaker 1: supposed experts who didn't actually exist and was quoting them 377 00:22:09,320 --> 00:22:13,400 Speaker 1: as support for the arguments being made in the episode. 378 00:22:13,840 --> 00:22:17,200 Speaker 1: And that's just not cool. I mean, like any 379 00:22:17,359 --> 00:22:20,600 Speaker 1: teacher could tell you, if your students 380 00:22:20,640 --> 00:22:25,240 Speaker 1: are writing essays and they're quoting made up experts, that's 381 00:22:25,520 --> 00:22:27,919 Speaker 1: unethical and it's going to get you a failing grade. 382 00:22:27,920 --> 00:22:30,720 Speaker 1: But the AI was doing it when I asked it to 383 00:22:30,760 --> 00:22:34,920 Speaker 1: create an episode. That's not great. But let's now move 384 00:22:35,000 --> 00:22:40,840 Speaker 1: on to twenty twelve. Microsoft pushed Windows eight out 385 00:22:40,920 --> 00:22:43,960 Speaker 1: the door. It was redesigned to work in a world 386 00:22:44,200 --> 00:22:47,480 Speaker 1: that was now filling up with touch screens.
But while 387 00:22:47,480 --> 00:22:51,920 Speaker 1: the touch screen layout may have, you know, really made 388 00:22:51,960 --> 00:22:55,560 Speaker 1: sense for people who were on mobile devices, those who 389 00:22:55,600 --> 00:22:59,160 Speaker 1: were still working on laptops and desktops were less enchanted 390 00:22:59,200 --> 00:23:03,600 Speaker 1: by that user interface. Lots of people preferred sticking with 391 00:23:03,680 --> 00:23:06,359 Speaker 1: Windows seven, even though there was a version of Windows 392 00:23:06,359 --> 00:23:11,400 Speaker 1: eight you could activate that was less tile based. Tiles are 393 00:23:11,440 --> 00:23:15,639 Speaker 1: what Windows eight called the various little icons that you 394 00:23:15,680 --> 00:23:19,080 Speaker 1: would use to navigate through the UI. So this was kind of 395 00:23:19,080 --> 00:23:22,240 Speaker 1: another case of Windows Vista all over again, where people 396 00:23:22,880 --> 00:23:27,639 Speaker 1: largely rejected the advancements that Microsoft had made. Whether 397 00:23:28,040 --> 00:23:31,720 Speaker 1: those advancements actually made sense or not, users didn't like them, 398 00:23:32,280 --> 00:23:37,240 Speaker 1: and so there was resistance to adopting Windows eight, 399 00:23:37,359 --> 00:23:40,800 Speaker 1: in large part because of these issues I just mentioned. 400 00:23:41,800 --> 00:23:46,479 Speaker 1: Apple also introduced Apple Maps that year. Apple Maps quickly 401 00:23:46,560 --> 00:23:50,120 Speaker 1: became the subject of ridicule. You might not remember this 402 00:23:50,240 --> 00:23:53,399 Speaker 1: because it's since gotten much better, but when it was 403 00:23:53,400 --> 00:23:57,800 Speaker 1: first released, Apple Maps was pretty bad.
Users noted several 404 00:23:57,840 --> 00:24:00,520 Speaker 1: deficiencies in the mapping features, you know, like guiding you 405 00:24:00,600 --> 00:24:04,280 Speaker 1: to the wrong location or suggesting that you take a 406 00:24:04,359 --> 00:24:08,200 Speaker 1: river instead of the highway. In all seriousness, it was bad. 407 00:24:08,720 --> 00:24:10,560 Speaker 1: Maybe it wasn't like the worst thing in the world, 408 00:24:10,560 --> 00:24:12,919 Speaker 1: but it was bad enough that Tim Cook was actually 409 00:24:12,920 --> 00:24:16,000 Speaker 1: prompted to apologize for it. That's not a good look. 410 00:24:16,040 --> 00:24:18,720 Speaker 1: That's not something that Apple would typically do. I don't 411 00:24:18,760 --> 00:24:22,280 Speaker 1: know that Steve Jobs would have done that, even with 412 00:24:22,440 --> 00:24:25,480 Speaker 1: the negative reaction that there was to Apple Maps. He 413 00:24:25,560 --> 00:24:27,879 Speaker 1: might have said, you know, we were too enthusiastic and 414 00:24:27,920 --> 00:24:29,679 Speaker 1: we released it too early, but I'm not sure he 415 00:24:29,680 --> 00:24:34,200 Speaker 1: would apologize. No way of knowing. But no, the story 416 00:24:34,240 --> 00:24:38,359 Speaker 1: I want to talk about is about Marissa Mayer. So 417 00:24:38,760 --> 00:24:41,320 Speaker 1: she was one of the early employees at Google. She 418 00:24:41,400 --> 00:24:46,200 Speaker 1: was actually employee number twenty way back in nineteen ninety nine. 419 00:24:46,480 --> 00:24:50,640 Speaker 1: But in twenty twelve she had a huge career change. 420 00:24:50,680 --> 00:24:54,560 Speaker 1: She left Google behind and became the president and CEO 421 00:24:54,800 --> 00:24:57,400 Speaker 1: of Yahoo. She also scored a seat on the board 422 00:24:57,400 --> 00:25:01,879 Speaker 1: of directors in the process.
Her old stomping grounds of Google 423 00:25:01,960 --> 00:25:05,040 Speaker 1: had been kind of stomping on Yahoo for years, 424 00:25:05,400 --> 00:25:08,120 Speaker 1: and so Mayer's appointment was intended to shake things up 425 00:25:08,520 --> 00:25:11,600 Speaker 1: at the old search engine and web portal, and Yahoo 426 00:25:11,600 --> 00:25:14,399 Speaker 1: had been through some pretty tough times. From two thousand 427 00:25:14,400 --> 00:25:18,159 Speaker 1: and nine to twenty eleven, Carol Bartz was CEO, and 428 00:25:18,240 --> 00:25:21,359 Speaker 1: her tenure was marked by some really big replacements 429 00:25:21,400 --> 00:25:24,360 Speaker 1: in the executive ranks, as well as a struggle over 430 00:25:24,480 --> 00:25:29,080 Speaker 1: Yahoo's very identity. Was it a media company or was 431 00:25:29,119 --> 00:25:32,160 Speaker 1: it a tech company? Because Yahoo was really getting into 432 00:25:32,200 --> 00:25:36,200 Speaker 1: content creation as well, and it just felt like the 433 00:25:36,960 --> 00:25:40,800 Speaker 1: attention of the company was divided and it wasn't doing 434 00:25:40,920 --> 00:25:45,640 Speaker 1: either thing particularly well. Eventually, Yahoo's board of directors lost 435 00:25:45,720 --> 00:25:49,439 Speaker 1: patience with Carol Bartz and fired her over the phone. 436 00:25:50,240 --> 00:25:55,760 Speaker 1: Yikes, not classy.
Then, in early twenty twelve, an embarrassing 437 00:25:55,800 --> 00:25:58,479 Speaker 1: situation reared its ugly head when it was discovered that 438 00:25:58,560 --> 00:26:03,159 Speaker 1: Bartz's replacement as CEO, a guy named Scott Thompson, not 439 00:26:03,320 --> 00:26:06,240 Speaker 1: the guy from Kids in the Hall, had 440 00:26:06,280 --> 00:26:09,679 Speaker 1: fibbed on his resume, that he had made up some 441 00:26:09,800 --> 00:26:12,000 Speaker 1: information that was on his resume. And this was a 442 00:26:12,000 --> 00:26:14,600 Speaker 1: bit of a black eye situation, and Thompson was very 443 00:26:14,680 --> 00:26:17,920 Speaker 1: quickly given his walking papers just a few months after 444 00:26:17,960 --> 00:26:21,120 Speaker 1: he took the job. Mayer would come in as the 445 00:26:21,160 --> 00:26:24,359 Speaker 1: new head of Yahoo. She said in an interview with 446 00:26:24,440 --> 00:26:29,240 Speaker 1: Wired's Virginia Heffernan that Yahoo as a business constituted all 447 00:26:29,280 --> 00:26:32,040 Speaker 1: the different areas in tech that she had worked within 448 00:26:32,240 --> 00:26:34,680 Speaker 1: over her years at Google, and she thought she might 449 00:26:34,720 --> 00:26:37,040 Speaker 1: be able to bring her skill set to help a 450 00:26:37,080 --> 00:26:41,359 Speaker 1: company that was clearly in need. So Mayer had a 451 00:26:41,400 --> 00:26:44,479 Speaker 1: lot of work to do. Yahoo was in a pretty 452 00:26:44,480 --> 00:26:47,720 Speaker 1: weird space. Morale was shaky, the company lacked a 453 00:26:47,760 --> 00:26:50,840 Speaker 1: solid sense of direction, and the board was eager to 454 00:26:50,880 --> 00:26:54,240 Speaker 1: see change. On top of that, there was this ongoing 455 00:26:54,280 --> 00:26:57,480 Speaker 1: problem that Yahoo had failed to address: the shift of 456 00:26:57,640 --> 00:27:02,200 Speaker 1: user habits to a more mobile-centric online experience.
While 457 00:27:02,240 --> 00:27:05,840 Speaker 1: other services and sites were adapting to mobile, Yahoo had 458 00:27:05,880 --> 00:27:08,679 Speaker 1: lagged behind, and according to that interview I mentioned a 459 00:27:08,680 --> 00:27:11,920 Speaker 1: second ago, Mayer felt that the move to mobile happened 460 00:27:11,960 --> 00:27:15,119 Speaker 1: far too late, like eight years too late, she had said. 461 00:27:15,560 --> 00:27:17,600 Speaker 1: While she says she's proud of the work she did 462 00:27:17,600 --> 00:27:20,439 Speaker 1: at Yahoo, ultimately it would prove to be too little, 463 00:27:20,680 --> 00:27:23,200 Speaker 1: too late, and later on the board of directors would 464 00:27:23,240 --> 00:27:26,800 Speaker 1: decide to sell Yahoo to Verizon, at which point Mayer 465 00:27:26,880 --> 00:27:30,439 Speaker 1: would resign from her position. Mayer's tenure at Yahoo was 466 00:27:30,440 --> 00:27:33,000 Speaker 1: marked by many attempts to right the wrongs that had 467 00:27:33,040 --> 00:27:36,480 Speaker 1: plagued Yahoo over the previous few years, but without the 468 00:27:36,480 --> 00:27:39,280 Speaker 1: support of the board, it was pretty much a doomed endeavor. 469 00:27:39,840 --> 00:27:44,240 Speaker 1: Would it have worked otherwise? Well, I guess we're never 470 00:27:44,280 --> 00:27:47,480 Speaker 1: gonna know. All right, now we're up to twenty thirteen, 471 00:27:47,560 --> 00:27:50,600 Speaker 1: and a lot of stuff happened that year. Twitter became 472 00:27:50,680 --> 00:27:54,160 Speaker 1: a publicly traded company in twenty thirteen and would remain 473 00:27:54,240 --> 00:27:57,119 Speaker 1: so until Elon Musk would announce he was going to 474 00:27:57,160 --> 00:27:59,239 Speaker 1: buy it, try to back out of buying it, and 475 00:27:59,280 --> 00:28:03,119 Speaker 1: then was forced to buy it, many years later.
Bitcoin's 476 00:28:03,200 --> 00:28:05,880 Speaker 1: value took off for the first time in twenty thirteen. 477 00:28:05,920 --> 00:28:08,840 Speaker 1: It would later crash and then rise and crash again. 478 00:28:08,920 --> 00:28:13,399 Speaker 1: We've seen that happen numerous times. New consoles like the 479 00:28:13,560 --> 00:28:16,440 Speaker 1: Xbox One and the PlayStation four would hit the market 480 00:28:16,480 --> 00:28:20,640 Speaker 1: in twenty thirteen. But the big story, at least here 481 00:28:20,640 --> 00:28:24,840 Speaker 1: in the United States, was a huge scandal about how 482 00:28:24,920 --> 00:28:29,960 Speaker 1: an intelligence agency, or actually a few intelligence agencies here 483 00:28:30,000 --> 00:28:33,879 Speaker 1: in the States, had been collecting enormous amounts of information 484 00:28:33,960 --> 00:28:37,760 Speaker 1: about what were otherwise considered to be private activities of 485 00:28:37,960 --> 00:28:43,600 Speaker 1: US citizens and others, and that would end up making 486 00:28:44,040 --> 00:28:47,840 Speaker 1: world news. We'll talk about that more in just a moment, 487 00:28:47,840 --> 00:28:50,080 Speaker 1: but we're going to take another quick break to thank 488 00:28:50,120 --> 00:29:04,400 Speaker 1: our sponsors. All right. Before the break, I alluded to 489 00:29:04,920 --> 00:29:11,280 Speaker 1: this scandalous revelation of how intelligence agencies within the United 490 00:29:11,320 --> 00:29:16,240 Speaker 1: States were apparently collecting truly huge amounts of data. So 491 00:29:16,320 --> 00:29:19,480 Speaker 1: the story is that a contractor with the NSA, a guy 492 00:29:19,560 --> 00:29:23,560 Speaker 1: named Edward Snowden, blew the whistle on these programs that 493 00:29:23,600 --> 00:29:26,720 Speaker 1: seemed to be a massive breach of constitutional protections or 494 00:29:26,760 --> 00:29:31,280 Speaker 1: at the very least a disturbing example of overly intrusive surveillance.
495 00:29:31,760 --> 00:29:36,480 Speaker 1: Snowden revealed the existence of a program called PRISM. Now 496 00:29:36,720 --> 00:29:42,200 Speaker 1: that's PRISM. Essentially, this described a sweeping program in which 497 00:29:42,240 --> 00:29:45,800 Speaker 1: the NSA had, as the Guardian would put it, quote, 498 00:29:45,960 --> 00:29:50,320 Speaker 1: direct access to the systems of Google, Facebook, Apple and 499 00:29:50,560 --> 00:29:56,840 Speaker 1: other US Internet giants end quote. And that alone was alarming. 500 00:29:57,280 --> 00:30:00,840 Speaker 1: So what sort of data did the NSA have access to? Well, again, 501 00:30:00,880 --> 00:30:04,320 Speaker 1: according to the Guardian, it was pretty much everything. I mean, 502 00:30:04,360 --> 00:30:09,200 Speaker 1: it was everything from search histories to emails, to file transfers, 503 00:30:09,600 --> 00:30:15,920 Speaker 1: to voice over Internet Protocol calls and beyond. All this information came directly 504 00:30:16,040 --> 00:30:20,160 Speaker 1: from the providers' servers. Apparently, the documents that Snowden shared 505 00:30:20,200 --> 00:30:23,200 Speaker 1: claimed that the NSA had the cooperation of the various companies 506 00:30:23,200 --> 00:30:26,719 Speaker 1: in this endeavor, although when The Guardian reached out for comment, 507 00:30:26,920 --> 00:30:31,400 Speaker 1: representatives from the various companies denied knowing about PRISM. Curiouser 508 00:30:31,560 --> 00:30:35,400 Speaker 1: and curiouser. As it would turn out, this was top 509 00:30:35,440 --> 00:30:40,840 Speaker 1: secret stuff and companies weren't allowed to reveal that they 510 00:30:40,840 --> 00:30:42,280 Speaker 1: were part of it. It was all part of the 511 00:30:42,320 --> 00:30:47,120 Speaker 1: clandestine nature. So were they lying that they didn't know 512 00:30:47,160 --> 00:30:49,240 Speaker 1: about it?
I guess it depends on who was actually 513 00:30:49,280 --> 00:30:51,800 Speaker 1: making the statement, because there were probably people who didn't 514 00:30:51,840 --> 00:30:53,520 Speaker 1: know about it who were told hey, you got to 515 00:30:53,520 --> 00:30:55,800 Speaker 1: tell them we don't know anything about this, and others 516 00:30:55,840 --> 00:30:58,000 Speaker 1: who maybe did know about it, but they 517 00:30:58,040 --> 00:31:00,640 Speaker 1: still had to say they didn't. Later documents would show 518 00:31:00,680 --> 00:31:04,240 Speaker 1: that the intelligence community was actually paying millions of dollars 519 00:31:04,600 --> 00:31:09,760 Speaker 1: collectively to these various platforms, and the program originated with 520 00:31:09,960 --> 00:31:12,800 Speaker 1: a couple of pieces of legislation that were signed into 521 00:31:12,880 --> 00:31:17,080 Speaker 1: law years earlier, namely the Protect America Act, which was 522 00:31:17,120 --> 00:31:19,440 Speaker 1: signed into law in two thousand and seven, and the 523 00:31:19,480 --> 00:31:22,440 Speaker 1: FISA Amendments Act of two thousand and eight. Both of 524 00:31:22,480 --> 00:31:26,120 Speaker 1: those were signed into law by former President George W. Bush, 525 00:31:26,160 --> 00:31:28,720 Speaker 1: And it would turn out that the NSA didn't exactly 526 00:31:29,120 --> 00:31:32,600 Speaker 1: have its own backdoor access to all these different platforms. 527 00:31:32,600 --> 00:31:34,680 Speaker 1: So it wasn't like, you know, it's a slow day 528 00:31:34,720 --> 00:31:36,640 Speaker 1: at the NSA, let's go see what people have been 529 00:31:36,640 --> 00:31:40,760 Speaker 1: posting on Facebook to each other in private messages. It 530 00:31:40,800 --> 00:31:44,080 Speaker 1: wasn't quite like that.
Instead, the way it works, from 531 00:31:44,120 --> 00:31:48,800 Speaker 1: a very, very high level, is that the NSA sends a 532 00:31:48,840 --> 00:31:52,640 Speaker 1: request to the FBI. The FBI sends a directive to 533 00:31:53,160 --> 00:31:57,000 Speaker 1: various Internet service providers, and by law, those providers have 534 00:31:57,080 --> 00:32:01,680 Speaker 1: to hand over essentially raw communications data, and the NSA just 535 00:32:01,720 --> 00:32:06,200 Speaker 1: gets this massive amount of information that encompasses all 536 00:32:06,240 --> 00:32:09,320 Speaker 1: the data sent over a given amount of time. 537 00:32:09,760 --> 00:32:12,200 Speaker 1: The FBI then sends this huge amount of information to 538 00:32:12,240 --> 00:32:17,920 Speaker 1: the NSA, which stores it in various databases. And presumably the reason 539 00:32:17,920 --> 00:32:19,880 Speaker 1: there is that you have all this data that you 540 00:32:19,920 --> 00:32:24,640 Speaker 1: can then sift through and search and look for clues 541 00:32:24,720 --> 00:32:29,200 Speaker 1: for stuff that potentially threatens national security. So we're talking 542 00:32:29,200 --> 00:32:32,920 Speaker 1: about things like terrorist plans, that kind of thing. The 543 00:32:33,200 --> 00:32:38,520 Speaker 1: NSA may try to decrypt encrypted information, which can take 544 00:32:38,640 --> 00:32:41,720 Speaker 1: quite some time, depending upon the level of encryption. But 545 00:32:42,080 --> 00:32:46,120 Speaker 1: in theory, any investigation would first need to secure a 546 00:32:46,240 --> 00:32:50,360 Speaker 1: warrant from a court before anyone would be allowed to 547 00:32:50,440 --> 00:32:54,640 Speaker 1: search through all that communications data.
However, in practice, well, 548 00:32:54,720 --> 00:32:59,080 Speaker 1: journalist Glenn Greenwald argued that analysts at the NSA can 549 00:33:00,000 --> 00:33:04,040 Speaker 1: pretty much access communications anytime they like without securing any 550 00:33:04,080 --> 00:33:05,920 Speaker 1: kind of warrant first. Like, they could just, you know, 551 00:33:06,040 --> 00:33:09,280 Speaker 1: scan through it if they want to. They have that capability. 552 00:33:09,280 --> 00:33:12,200 Speaker 1: There's nothing preventing them from doing this. And in fact, 553 00:33:12,240 --> 00:33:15,120 Speaker 1: this is something that was, and perhaps still is, 554 00:33:15,160 --> 00:33:19,560 Speaker 1: happening at the NSA. This is according again to Glenn Greenwald. 555 00:33:19,840 --> 00:33:22,840 Speaker 1: And you know, people are people, there are people who 556 00:33:22,840 --> 00:33:26,760 Speaker 1: will do dumb stuff, even though they might be burdened 557 00:33:26,800 --> 00:33:31,480 Speaker 1: with great responsibility. So there have certainly been examples of 558 00:33:31,560 --> 00:33:35,880 Speaker 1: people within the intelligence community making use of official tools 559 00:33:36,160 --> 00:33:40,000 Speaker 1: for more personal reasons, like checking up on what an 560 00:33:40,040 --> 00:33:43,320 Speaker 1: ex partner has been doing since a messy breakup, for example. 561 00:33:43,640 --> 00:33:47,920 Speaker 1: That kind of stuff. Like, you give someone the keys 562 00:33:48,000 --> 00:33:51,280 Speaker 1: to a really powerful tool like that, I suppose the 563 00:33:51,320 --> 00:33:56,240 Speaker 1: temptation to misuse it must be pretty great. PRISM has 564 00:33:56,320 --> 00:34:00,000 Speaker 1: not gone away. Now, I would say that one lesson 565 00:34:00,120 --> 00:34:04,120 Speaker 1: to take home is that encryption is your friend.
It 566 00:34:04,160 --> 00:34:07,560 Speaker 1: doesn't guarantee that the communications you participate in will be 567 00:34:07,640 --> 00:34:10,120 Speaker 1: hidden away, but it makes it much harder for folks 568 00:34:10,200 --> 00:34:13,640 Speaker 1: to just snoop on you. Of course, we're rapidly approaching 569 00:34:13,719 --> 00:34:17,440 Speaker 1: a quantum computing future in which all previously reliable methods 570 00:34:17,440 --> 00:34:21,040 Speaker 1: of encryption could potentially just become child's play to unravel. 571 00:34:21,760 --> 00:34:24,480 Speaker 1: But that's the future. Our concern today right now is 572 00:34:24,880 --> 00:34:30,560 Speaker 1: the past. So yeah, Snowden, obviously his story is still ongoing. 573 00:34:30,600 --> 00:34:36,239 Speaker 1: He's been living in exile for the last decade, and 574 00:34:36,719 --> 00:34:43,120 Speaker 1: some view him as an important whistleblower who alerted citizens 575 00:34:43,200 --> 00:34:47,800 Speaker 1: to a program that appears to have a crazy amount 576 00:34:48,040 --> 00:34:55,000 Speaker 1: of depth in surveillance, even if you accept the statements 577 00:34:55,040 --> 00:34:58,359 Speaker 1: of the intelligence community that oh, sure we're collecting all 578 00:34:58,360 --> 00:35:00,160 Speaker 1: of it, but we're not searching it, so we don't 579 00:35:00,160 --> 00:35:02,239 Speaker 1: even know what we have, so you have nothing to 580 00:35:02,239 --> 00:35:06,080 Speaker 1: worry about. That doesn't make me feel better.
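The "encryption is your friend" point above can be illustrated with a toy one-time pad, the simplest cipher that makes intercepted data useless without the key. To be clear, this sketch is my own illustrative example, not anything from the episode, and real systems should rely on vetted ciphers like AES rather than anything hand-rolled:

```python
import secrets

# Toy one-time pad: XOR the message with a random key of equal length.
# Illustrative only; production systems should use vetted ciphers (e.g. AES).

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """Return (key, ciphertext). The key is random, single-use,
    and exactly as long as the message."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def otp_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    # XOR with the same key reverses the encryption.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

key, ct = otp_encrypt(b"meet at noon")
print(otp_decrypt(key, ct))  # b'meet at noon'
```

Without the key, an eavesdropper who captures the ciphertext in transit sees only random-looking bytes, which is exactly the property that makes bulk collection of encrypted traffic so much less useful.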
581 00:35:06,320 --> 00:35:09,080 Speaker 1: But not only that, like, he's also 582 00:35:09,200 --> 00:35:13,320 Speaker 1: obviously had to deal with finding a totally new life 583 00:35:13,840 --> 00:35:16,600 Speaker 1: outside of the United States, because there are some people who 584 00:35:16,680 --> 00:35:19,640 Speaker 1: view him as a traitor to the country and he 585 00:35:19,760 --> 00:35:24,520 Speaker 1: could face serious repercussions were he ever to return. So, yeah, 586 00:35:24,760 --> 00:35:29,279 Speaker 1: a pretty dramatic story in tech. In twenty fourteen, we 587 00:35:29,320 --> 00:35:32,160 Speaker 1: had a whole year of corporate changes happening in tech. 588 00:35:32,239 --> 00:35:35,880 Speaker 1: Steve Ballmer, who had led Microsoft as the CEO for 589 00:35:35,960 --> 00:35:39,359 Speaker 1: fourteen years at that point, stepped down to make way 590 00:35:39,400 --> 00:35:44,040 Speaker 1: for Satya Nadella to take over. Google bought Nest Labs, 591 00:35:44,320 --> 00:35:48,239 Speaker 1: the makers of the smart thermostat, and, just 592 00:35:48,480 --> 00:35:51,920 Speaker 1: at the end of twenty thirteen, had acquired Boston Dynamics. 593 00:35:51,920 --> 00:35:56,600 Speaker 1: That's the robotics company that makes adorably terrifying four legged robots, 594 00:35:56,600 --> 00:36:01,239 Speaker 1: among other things. Google would later divest itself of Boston Dynamics. 595 00:36:01,239 --> 00:36:04,960 Speaker 1: They did so fairly recently; it found a new home at Hyundai. 596 00:36:05,840 --> 00:36:09,960 Speaker 1: Apple announced it would acquire Beats Electronics for a whopping 597 00:36:10,080 --> 00:36:14,520 Speaker 1: three billion, with a B, dollars. Facebook would make even 598 00:36:14,640 --> 00:36:17,399 Speaker 1: larger purchases. Well, they made one that was smaller.
They 599 00:36:17,400 --> 00:36:21,879 Speaker 1: bought Oculus VR for two billion, but they bought WhatsApp 600 00:36:22,040 --> 00:36:26,640 Speaker 1: for twenty two billion dollars. At this stage, I think 601 00:36:26,719 --> 00:36:30,279 Speaker 1: Facebook was still looking to solidify everything around the tentpole 602 00:36:30,440 --> 00:36:35,880 Speaker 1: Facebook social network platform. Like, Facebook itself, the platform, 603 00:36:36,040 --> 00:36:39,640 Speaker 1: was the core of its strategy. But in future years 604 00:36:39,640 --> 00:36:42,759 Speaker 1: the company, especially once it changed its name to Meta, 605 00:36:42,880 --> 00:36:45,920 Speaker 1: would take a less focused approach to creating what it 606 00:36:46,000 --> 00:36:49,400 Speaker 1: hopes will be the future of online life. They're like, well, 607 00:36:49,880 --> 00:36:52,880 Speaker 1: Facebook is just one part of our strategy as opposed 608 00:36:52,920 --> 00:36:56,120 Speaker 1: to the central part. However, the story I want to 609 00:36:56,120 --> 00:36:59,759 Speaker 1: focus on from twenty fourteen is the Sony Pictures hack.
610 00:37:00,000 --> 00:37:02,520 Speaker 1: This was a big one, partly because of the sheer 611 00:37:02,520 --> 00:37:05,600 Speaker 1: amount of data that was lifted during the hack, partly 612 00:37:05,640 --> 00:37:09,200 Speaker 1: because of the sensitive nature of that information, which included 613 00:37:09,239 --> 00:37:13,640 Speaker 1: everything from personal information about Sony employees, including stuff like 614 00:37:13,719 --> 00:37:17,600 Speaker 1: salary information, all the way up to data about and 615 00:37:17,719 --> 00:37:22,480 Speaker 1: even digital prints of unreleased Sony Pictures films. But the 616 00:37:22,600 --> 00:37:26,240 Speaker 1: hack would also end up becoming a case study in cybersecurity, 617 00:37:26,600 --> 00:37:31,680 Speaker 1: with numerous researchers publishing papers on everything from how organizations 618 00:37:31,719 --> 00:37:35,400 Speaker 1: can better protect themselves from intrusions to works detailing the 619 00:37:35,440 --> 00:37:39,600 Speaker 1: appropriate way to address a crisis once it happens. Because 620 00:37:39,640 --> 00:37:43,480 Speaker 1: Sony did a lot of things poorly. So what actually 621 00:37:43,480 --> 00:37:47,240 Speaker 1: did happen? Well, in twenty fourteen, Sony Pictures was preparing 622 00:37:47,280 --> 00:37:50,840 Speaker 1: to release a film titled The Interview. In this film, 623 00:37:50,960 --> 00:37:55,080 Speaker 1: a pair of doofuses, a celebrity interview show host and 624 00:37:55,280 --> 00:37:58,719 Speaker 1: his producer, are recruited by the CIA to use an 625 00:37:58,760 --> 00:38:02,439 Speaker 1: opportunity to interview North Korea's leader Kim Jong un 626 00:38:02,880 --> 00:38:06,600 Speaker 1: and then assassinate him. So this is a satire. It 627 00:38:06,719 --> 00:38:10,239 Speaker 1: makes fun of Kim Jong Un a lot.
It also 628 00:38:10,480 --> 00:38:13,920 Speaker 1: comments on the nation of North Korea in general and 629 00:38:13,960 --> 00:38:18,080 Speaker 1: its government in particular, and apparently all that was enough 630 00:38:18,120 --> 00:38:22,319 Speaker 1: to prompt North Korea to direct state backed hackers to 631 00:38:22,719 --> 00:38:28,080 Speaker 1: attack Sony Pictures. Now, I say apparently because definitive evidence 632 00:38:28,120 --> 00:38:31,080 Speaker 1: pointing to North Korea has never actually been made public. 633 00:38:31,840 --> 00:38:35,120 Speaker 1: That doesn't mean the evidence doesn't exist. And it's not 634 00:38:35,560 --> 00:38:39,399 Speaker 1: like there are any other prime suspects that we could 635 00:38:39,400 --> 00:38:42,000 Speaker 1: point at, right? Like, there's no one who looks 636 00:38:42,120 --> 00:38:47,240 Speaker 1: as good as North Korea in this. North Korea certainly 637 00:38:47,320 --> 00:38:50,200 Speaker 1: had the capability to pull something like this off. The 638 00:38:50,239 --> 00:38:54,640 Speaker 1: country has been implicated in other state backed cyber attacks. 639 00:38:55,320 --> 00:38:56,719 Speaker 1: But I do want to make it clear that the 640 00:38:56,760 --> 00:39:00,239 Speaker 1: working assumption is that North Korea was responsible, but there 641 00:39:00,320 --> 00:39:03,799 Speaker 1: is a little bit of ambiguity there. A group called 642 00:39:03,840 --> 00:39:09,080 Speaker 1: the Guardians of Peace ultimately claimed responsibility and claimed that 643 00:39:09,120 --> 00:39:12,640 Speaker 1: this was not in direct connection with North Korea's government, 644 00:39:12,680 --> 00:39:16,839 Speaker 1: but rather on behalf of North Korea. That is a 645 00:39:16,880 --> 00:39:22,000 Speaker 1: claim that I think most security experts have dismissed as 646 00:39:22,040 --> 00:39:24,600 Speaker 1: being unlikely.
They think it really is more of a 647 00:39:24,640 --> 00:39:29,280 Speaker 1: state backed attack. But muddying the waters further is Sony's 648 00:39:29,320 --> 00:39:33,680 Speaker 1: own history with cybersecurity, which hasn't been great. Earlier 649 00:39:33,719 --> 00:39:36,280 Speaker 1: in the two thousands, the company got into hot water 650 00:39:36,360 --> 00:39:39,240 Speaker 1: when it was found that the music division within Sony 651 00:39:39,600 --> 00:39:43,320 Speaker 1: had been using a digital rights management, or DRM, strategy 652 00:39:43,640 --> 00:39:47,160 Speaker 1: on their compact discs. And the idea was that if 653 00:39:47,200 --> 00:39:49,720 Speaker 1: you took a compact disc and you put it into 654 00:39:49,880 --> 00:39:53,840 Speaker 1: the optical drive of a computer, because, children, back in 655 00:39:53,880 --> 00:39:57,440 Speaker 1: those days our computers had optical drives, meaning that you 656 00:39:57,440 --> 00:40:00,440 Speaker 1: would put a disc in the drive. I know that 657 00:40:00,440 --> 00:40:02,520 Speaker 1: sounds strange, but that's how we used to do 658 00:40:02,600 --> 00:40:04,960 Speaker 1: things. Before that, it was floppy disks. But I don't 659 00:40:05,000 --> 00:40:08,080 Speaker 1: want to take you into prehistory. Anyway, if you were 660 00:40:08,120 --> 00:40:09,680 Speaker 1: to do this, you might do it so that you 661 00:40:09,680 --> 00:40:12,759 Speaker 1: could rip the music and make a digital copy for 662 00:40:12,800 --> 00:40:16,120 Speaker 1: the purposes of, say, putting it on like an iPod 663 00:40:16,239 --> 00:40:19,600 Speaker 1: or something. And you might also do it so that 664 00:40:19,640 --> 00:40:24,640 Speaker 1: you could burn new CDs with the music that you 665 00:40:24,800 --> 00:40:27,759 Speaker 1: just got from the one you bought. So Sony wanted 666 00:40:27,800 --> 00:40:31,120 Speaker 1: to prevent people from doing that second thing.
They didn't 667 00:40:31,160 --> 00:40:34,879 Speaker 1: want people to make unauthorized copies of music. Never mind 668 00:40:34,880 --> 00:40:36,760 Speaker 1: the fact that, at least here in the United States, 669 00:40:37,080 --> 00:40:40,400 Speaker 1: you are legally allowed to make a copy of a 670 00:40:40,440 --> 00:40:43,800 Speaker 1: CD for the purposes of a personal backup. That's completely 671 00:40:43,840 --> 00:40:47,239 Speaker 1: within your rights. So if you bought a CD and 672 00:40:47,440 --> 00:40:49,200 Speaker 1: you made a copy of it so that you could 673 00:40:49,239 --> 00:40:51,920 Speaker 1: listen to it in case your original CD gets scratched 674 00:40:52,000 --> 00:40:56,320 Speaker 1: or stolen or broken or something, that's fine. Oddly enough, however, 675 00:40:56,560 --> 00:40:59,719 Speaker 1: it is against the law to try and circumvent the 676 00:41:00,000 --> 00:41:04,280 Speaker 1: protective measures that people might place on things like CDs 677 00:41:04,280 --> 00:41:06,600 Speaker 1: to prevent you from doing that. So, yeah, you have 678 00:41:06,640 --> 00:41:08,600 Speaker 1: the right to make a copy, but you don't have 679 00:41:08,680 --> 00:41:12,919 Speaker 1: the right to break the copy protection on a disc. 680 00:41:13,560 --> 00:41:18,520 Speaker 1: It's odd, right? It's a catch twenty two situation. Anyway, 681 00:41:18,760 --> 00:41:24,879 Speaker 1: Sony did a thing that really shook people up. They 682 00:41:24,960 --> 00:41:28,759 Speaker 1: had installed DRM that would put what was essentially a rootkit 683 00:41:29,400 --> 00:41:32,960 Speaker 1: on your computer and allow someone to get remote access 684 00:41:33,120 --> 00:41:36,920 Speaker 1: to your computer through the Internet. That obviously was a 685 00:41:36,920 --> 00:41:39,080 Speaker 1: big no no.
It did not go well. And later 686 00:41:39,120 --> 00:41:41,680 Speaker 1: on, in twenty eleven, Sony found itself the target of 687 00:41:41,719 --> 00:41:45,239 Speaker 1: several cyber attacks, with its networks being breached more than twenty 688 00:41:45,280 --> 00:41:49,040 Speaker 1: times that year, and the company then declared a war 689 00:41:49,239 --> 00:41:52,960 Speaker 1: on hackers, which I think many hackers viewed as both 690 00:41:53,000 --> 00:41:57,080 Speaker 1: amusing and potentially a personal challenge. And so there's an 691 00:41:57,200 --> 00:41:59,960 Speaker 1: argument to be made that Sony's own approach to cyber 692 00:42:00,080 --> 00:42:04,439 Speaker 1: security was one that incited hackers to find new ways 693 00:42:04,480 --> 00:42:07,200 Speaker 1: to exploit the company's systems. Meanwhile, Sony was not doing 694 00:42:07,200 --> 00:42:12,200 Speaker 1: a very good job at all of actually improving their security. 695 00:42:13,040 --> 00:42:15,759 Speaker 1: The twenty fourteen attack ended up locking employees out of their computers, 696 00:42:15,800 --> 00:42:19,520 Speaker 1: at least temporarily, and the hackers stole truly huge amounts 697 00:42:19,560 --> 00:42:23,360 Speaker 1: of data. They then released that information online, and that 698 00:42:23,440 --> 00:42:27,120 Speaker 1: info included stuff like executive salaries, which is pretty sensitive 699 00:42:27,120 --> 00:42:30,440 Speaker 1: information that companies typically don't want out in the open.
700 00:42:31,239 --> 00:42:33,840 Speaker 1: An investigation showed that the hackers didn't have to do 701 00:42:33,880 --> 00:42:37,080 Speaker 1: anything particularly spectacular to pull all this off, because again, 702 00:42:37,160 --> 00:42:39,439 Speaker 1: Sony Pictures was not exactly up to date with good 703 00:42:39,440 --> 00:42:43,520 Speaker 1: security hygiene, and employees had not been properly trained and 704 00:42:43,560 --> 00:42:47,560 Speaker 1: they weren't really vigilant. The lack of multi factor authentication 705 00:42:47,760 --> 00:42:50,040 Speaker 1: made it much easier for the hackers to find a 706 00:42:50,120 --> 00:42:54,200 Speaker 1: compromised login for systems without having to worry about 707 00:42:54,200 --> 00:42:57,520 Speaker 1: additional layers of security. The hackers apparently tricked a few 708 00:42:57,520 --> 00:43:00,880 Speaker 1: select employees with a fake Apple ID login page to 709 00:43:00,880 --> 00:43:04,000 Speaker 1: get hold of their credentials. Then it was just off 710 00:43:04,040 --> 00:43:06,800 Speaker 1: to the races. So this was kind of an example 711 00:43:06,840 --> 00:43:11,680 Speaker 1: of spear phishing. Sony was obviously deeply embarrassed by this 712 00:43:11,880 --> 00:43:15,120 Speaker 1: whole thing, and it was impossible for the company to 713 00:43:15,200 --> 00:43:17,799 Speaker 1: cover it up. The hackers had dumped a ton of 714 00:43:17,840 --> 00:43:22,360 Speaker 1: proprietary and private information online, and the mess cost Sony 715 00:43:22,960 --> 00:43:25,839 Speaker 1: a whole lot of money, though determining just how much 716 00:43:25,840 --> 00:43:29,759 Speaker 1: money is actually pretty tricky.
On the low end of things, 717 00:43:29,880 --> 00:43:34,720 Speaker 1: Sony set aside around fifteen million dollars to address security concerns, 718 00:43:34,760 --> 00:43:37,280 Speaker 1: which is not much at all in the grand scheme 719 00:43:37,280 --> 00:43:40,600 Speaker 1: of things. Later on, the company also had to agree 720 00:43:40,640 --> 00:43:43,279 Speaker 1: to pay around eight million dollars in response to a 721 00:43:43,320 --> 00:43:46,000 Speaker 1: class action lawsuit that was brought against the company by 722 00:43:46,080 --> 00:43:48,759 Speaker 1: employees who had been affected by this. You know, their 723 00:43:48,800 --> 00:43:51,160 Speaker 1: personal information had been stolen as part of this hack. 724 00:43:52,160 --> 00:43:54,920 Speaker 1: But because of the nature of what happened, with all 725 00:43:54,960 --> 00:43:59,120 Speaker 1: that proprietary information leaked and then shared online, it's actually 726 00:43:59,160 --> 00:44:02,200 Speaker 1: really hard to put a figure on how much 727 00:44:02,200 --> 00:44:05,920 Speaker 1: it ultimately cost. I've seen estimates with a wild range, 728 00:44:05,960 --> 00:44:08,759 Speaker 1: from around thirty five million dollars on the low end 729 00:44:08,800 --> 00:44:13,120 Speaker 1: to almost two hundred million on the high end. 730 00:44:13,440 --> 00:44:16,160 Speaker 1: I honestly don't know how much it cost. But it 731 00:44:16,320 --> 00:44:19,640 Speaker 1: was embarrassing for Sony, and it was a real wake 732 00:44:19,719 --> 00:44:22,680 Speaker 1: up call, I think, for a lot of companies and 733 00:44:22,800 --> 00:44:29,040 Speaker 1: organizations that they have to take cybersecurity seriously, because something 734 00:44:29,080 --> 00:44:34,520 Speaker 1: as small as a perceived slight might prompt a nation 735 00:44:34,800 --> 00:44:39,080 Speaker 1: backed group of hackers to try and breach your systems.
736 00:44:39,600 --> 00:44:42,360 Speaker 1: I mean, in this case, like I said, the straw 737 00:44:42,400 --> 00:44:46,400 Speaker 1: that appears to have broken the proverbial camel's back was 738 00:44:46,480 --> 00:44:48,800 Speaker 1: the fact that Sony Pictures really was going to release 739 00:44:48,840 --> 00:44:52,359 Speaker 1: The Interview, which, y'all, in my opinion, was not even 740 00:44:52,440 --> 00:44:55,120 Speaker 1: that funny of a comedy. But that's just me. I 741 00:44:55,200 --> 00:44:56,680 Speaker 1: think there were a lot of funny people in it, 742 00:44:57,120 --> 00:45:01,040 Speaker 1: but I didn't personally find it that entertaining. Then again, 743 00:45:01,120 --> 00:45:03,160 Speaker 1: I'm probably the wrong audience. I'm such a stick in 744 00:45:03,200 --> 00:45:06,600 Speaker 1: the mud. But that is our first part of our 745 00:45:06,640 --> 00:45:09,279 Speaker 1: look back on some of the big tech stories that 746 00:45:09,360 --> 00:45:12,520 Speaker 1: have unfolded since we launched tech Stuff in June two 747 00:45:12,560 --> 00:45:15,480 Speaker 1: thousand and eight. Part two will be coming up next week, 748 00:45:15,680 --> 00:45:18,239 Speaker 1: and then my final episode as tech Stuff host will 749 00:45:18,440 --> 00:45:21,360 Speaker 1: follow that. So, while I still have a chance to 750 00:45:21,400 --> 00:45:30,680 Speaker 1: say this, I'll talk to you again really soon. Tech 751 00:45:30,760 --> 00:45:35,320 Speaker 1: Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, 752 00:45:35,640 --> 00:45:39,360 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you listen 753 00:45:39,360 --> 00:45:43,719 Speaker 1: to your favorite shows.