1 00:00:04,400 --> 00:00:12,360 Speaker 1: Welcome to tech Stuff, a production from iHeartRadio. Hey there, 2 00:00:12,360 --> 00:00:15,680 Speaker 1: and welcome to tech Stuff. I'm your host, jonvan Strickland. 3 00:00:15,680 --> 00:00:18,680 Speaker 1: I'm an executive producer with iHeart Podcasts and How the 4 00:00:18,760 --> 00:00:23,440 Speaker 1: tech Are. Yeah, welcome to my penultimate episode of hosting 5 00:00:23,480 --> 00:00:26,960 Speaker 1: tech Stuff. I mean, you know, second to last normal 6 00:00:27,120 --> 00:00:30,760 Speaker 1: tech Stuff. Anyway, I might occasionally host a special episode 7 00:00:30,800 --> 00:00:33,400 Speaker 1: that drops into the feed now and then I'll still 8 00:00:33,400 --> 00:00:36,760 Speaker 1: be around at iHeart Podcasts. I'll just be executive producer 9 00:00:36,840 --> 00:00:39,720 Speaker 1: rising for the most part. But anyway, we're going to 10 00:00:39,760 --> 00:00:42,800 Speaker 1: continue what we started in our last episode and talk 11 00:00:42,840 --> 00:00:46,000 Speaker 1: about some of the biggest tech stories that have unfolded 12 00:00:46,040 --> 00:00:49,160 Speaker 1: since tech Stuff first launched in mid two thousand and eight. 13 00:00:49,520 --> 00:00:52,080 Speaker 1: And in our last episode, I went through the second 14 00:00:52,080 --> 00:00:54,720 Speaker 1: half of two thousand and eight up through twenty fourteen. 15 00:00:55,000 --> 00:00:57,840 Speaker 1: Today we're gonna go twenty fifteen up to present day. 16 00:00:58,480 --> 00:01:02,800 Speaker 1: So that's more in years and I wrote more. So 17 00:01:03,000 --> 00:01:05,320 Speaker 1: this is going to be one of those long episodes. 18 00:01:05,480 --> 00:01:08,600 Speaker 1: But hey, I'm hanging up my boots, so you won't 19 00:01:08,640 --> 00:01:10,360 Speaker 1: be having to put up with me for too much longer. 20 00:01:10,400 --> 00:01:12,680 Speaker 1: So let's just go on this journey together, shall we. 21 00:01:13,160 --> 00:01:16,279 Speaker 1: So we're up to twenty fifteen. That's when Apple first 22 00:01:16,319 --> 00:01:19,959 Speaker 1: introduced the Apple Watch, and it would take a bit 23 00:01:20,000 --> 00:01:24,160 Speaker 1: of time, no pun intended to get momentum on this 24 00:01:24,280 --> 00:01:27,480 Speaker 1: product compared to other Apple products. But these days it 25 00:01:27,600 --> 00:01:30,240 Speaker 1: is the most popular brand of smartwatch here in the 26 00:01:30,360 --> 00:01:33,160 Speaker 1: United States and it holds like a thirty percent of 27 00:01:33,200 --> 00:01:36,800 Speaker 1: the market share globally. Wearables and smart watches had been 28 00:01:36,840 --> 00:01:39,400 Speaker 1: around for a while. It's not like Apple invented them. 29 00:01:39,720 --> 00:01:43,199 Speaker 1: And though Apple created a popular smart watch, I would 30 00:01:43,240 --> 00:01:46,800 Speaker 1: say that it didn't have the same sort of Cinderella 31 00:01:47,000 --> 00:01:51,000 Speaker 1: energy as the original iPod, or the iPhone or even 32 00:01:51,040 --> 00:01:54,920 Speaker 1: the iPad, at least not initially. So just as a reminder, 33 00:01:55,080 --> 00:01:58,200 Speaker 1: Apple didn't invent the smartphone, it didn't invent the MP 34 00:01:58,280 --> 00:02:00,960 Speaker 1: three player, it didn't invent the tablet computer. But it 35 00:02:01,000 --> 00:02:04,200 Speaker 1: did succeed where everyone else had not in that it 36 00:02:04,280 --> 00:02:08,200 Speaker 1: was able to create products that appealed to a wide 37 00:02:08,440 --> 00:02:12,880 Speaker 1: audience and not just like a narrow tech geek niche. 38 00:02:13,160 --> 00:02:16,720 Speaker 1: So there was every reason to expect that they would 39 00:02:16,720 --> 00:02:18,519 Speaker 1: do the same with smart watches. It would just take 40 00:02:18,560 --> 00:02:22,079 Speaker 1: a little bit longer for those to really become popular, 41 00:02:22,120 --> 00:02:25,400 Speaker 1: And I honestly don't know how popular they are in 42 00:02:25,400 --> 00:02:28,960 Speaker 1: comparison to Apple's other product lines. But another innovation that 43 00:02:29,040 --> 00:02:33,480 Speaker 1: was introduced in twenty fifteen was Tesla's autopilot feature. That's 44 00:02:33,560 --> 00:02:37,760 Speaker 1: the not really self driving mode that has made headlines 45 00:02:37,880 --> 00:02:41,960 Speaker 1: many times, often for tragic or unfortunate reasons. This year, 46 00:02:42,120 --> 00:02:45,240 Speaker 1: Miles Klee of Rolling Stone Magazine wrote a piece titled, 47 00:02:45,400 --> 00:02:49,520 Speaker 1: Tesla has highest rate of deadly accidents among car brands, 48 00:02:49,600 --> 00:02:54,000 Speaker 1: study fines. That's not great news. Meanwhile, President elect Trump's 49 00:02:54,040 --> 00:02:57,919 Speaker 1: transition team has recently recommended that the new regime in 50 00:02:57,960 --> 00:03:01,720 Speaker 1: the US repeal a rule that otherwise requires car companies 51 00:03:01,720 --> 00:03:05,720 Speaker 1: that have self driving technologies incorporated into their vehicles to 52 00:03:05,840 --> 00:03:09,560 Speaker 1: report on car crashes. So yeah, now the team is saying, hey, 53 00:03:09,840 --> 00:03:11,959 Speaker 1: we should get rid of that rule. And you know, 54 00:03:12,120 --> 00:03:14,280 Speaker 1: Elon Musk also happens to be heading up one of 55 00:03:14,320 --> 00:03:18,840 Speaker 1: the government departments in the new administration. So I suspect 56 00:03:18,880 --> 00:03:21,840 Speaker 1: this is just the beginning of a lot of these 57 00:03:22,000 --> 00:03:26,359 Speaker 1: kinds of arguments that are are designed to remove restrictions 58 00:03:26,360 --> 00:03:30,680 Speaker 1: and regulations and accountability to allow businesses to operate with 59 00:03:31,160 --> 00:03:34,120 Speaker 1: little to know oversight. In other news, Google did a 60 00:03:34,120 --> 00:03:38,880 Speaker 1: big old corporate restructuring by creating a company called Alphabet, 61 00:03:39,080 --> 00:03:42,160 Speaker 1: which acts as a holding company for Google and its 62 00:03:42,240 --> 00:03:46,960 Speaker 1: various spinoffs like YouTube and Android. In space news, Jeff 63 00:03:47,000 --> 00:03:52,320 Speaker 1: Bezos's private space company, Blue Origin successfully launched and landed 64 00:03:52,440 --> 00:03:56,600 Speaker 1: a reusable rocket called New Shepherd. SpaceX at that point 65 00:03:56,800 --> 00:03:59,320 Speaker 1: had not yet managed to do the same thing. They 66 00:03:59,360 --> 00:04:03,240 Speaker 1: had launched some spacecraft, but the landing had not really, 67 00:04:04,000 --> 00:04:06,200 Speaker 1: they hadn't stuck the landing. I guess it's safe to 68 00:04:06,240 --> 00:04:09,440 Speaker 1: say like sometimes their landings could be a little exclusive. 69 00:04:09,880 --> 00:04:13,360 Speaker 1: But obviously SpaceX would catch up to Blue Origin and 70 00:04:13,400 --> 00:04:16,480 Speaker 1: then like leave Blue Origin in the dust as far 71 00:04:16,560 --> 00:04:19,520 Speaker 1: as private space industry stuff is concerned. I mean, you're 72 00:04:19,520 --> 00:04:21,960 Speaker 1: going to see way more stories about SpaceX than Blue 73 00:04:21,960 --> 00:04:25,000 Speaker 1: Origin typically. But the big story that I want to 74 00:04:25,080 --> 00:04:30,160 Speaker 1: highlight from twenty fifteen came to be known as diesel Gate, 75 00:04:30,560 --> 00:04:34,240 Speaker 1: and it involved the car company Volkswagen. So in September 76 00:04:34,640 --> 00:04:38,920 Speaker 1: of twenty fifteen, the United States Environmental Protection Agency or EPA, 77 00:04:39,279 --> 00:04:45,120 Speaker 1: accused Volkswagen of violating the US Clean Air Act through trickery. Essentially, 78 00:04:45,279 --> 00:04:48,480 Speaker 1: the issue was that Volkswagen was using software, you know, 79 00:04:48,600 --> 00:04:52,640 Speaker 1: software technology in order to fool emissions testing equipment regarding 80 00:04:52,720 --> 00:04:56,800 Speaker 1: its diesel engine vehicles and the emissions that they put out. 81 00:04:57,200 --> 00:04:59,440 Speaker 1: So here's kind of how it worked from a very 82 00:04:59,520 --> 00:05:04,680 Speaker 1: high level. When a Volkswagen diesel vehicle detected that it 83 00:05:04,720 --> 00:05:08,760 Speaker 1: was being put into testing mode, it would artificially limit 84 00:05:08,800 --> 00:05:11,640 Speaker 1: the output of the engine so as not to exceed 85 00:05:11,800 --> 00:05:15,720 Speaker 1: emissions standards. Now it would look like the engine was 86 00:05:15,839 --> 00:05:20,120 Speaker 1: operating at normal capacity, but in fact it was purposefully 87 00:05:20,160 --> 00:05:23,560 Speaker 1: holding itself back. Once it detected it was out of 88 00:05:23,720 --> 00:05:27,440 Speaker 1: testing mode, the engine would switch back into normal operations, 89 00:05:27,520 --> 00:05:30,000 Speaker 1: at which point the vehicle would be emitting more than 90 00:05:30,080 --> 00:05:34,120 Speaker 1: restrictions would legally allow. So the investigation had actually been 91 00:05:34,160 --> 00:05:36,120 Speaker 1: going on for a couple of years at this point. 92 00:05:36,279 --> 00:05:39,680 Speaker 1: It started with some researchers at West Virginia's University and 93 00:05:39,800 --> 00:05:42,119 Speaker 1: they were finding some pretty odd readings with a couple 94 00:05:42,120 --> 00:05:45,880 Speaker 1: of Volkswagen diesel vehicles. Now they were using testing equipment 95 00:05:45,920 --> 00:05:49,080 Speaker 1: that allowed for on road tests. That meant the vehicle 96 00:05:49,160 --> 00:05:52,279 Speaker 1: was actually in use on roads at the time it 97 00:05:52,320 --> 00:05:55,440 Speaker 1: was traveling. Most emissions tests are done with a vehicle 98 00:05:55,480 --> 00:05:58,039 Speaker 1: that's stationary, it's not actually on the road, So the 99 00:05:58,080 --> 00:06:02,000 Speaker 1: researchers were seeing unusually high emissions ratings on the two 100 00:06:02,080 --> 00:06:05,680 Speaker 1: Volkswagen vehicles, and that merited further investigation because you know, 101 00:06:05,720 --> 00:06:09,920 Speaker 1: when they were routinely tested in normal emissions testing facilities, 102 00:06:10,200 --> 00:06:13,080 Speaker 1: they were under the limit, So something was going on here. 103 00:06:13,480 --> 00:06:15,960 Speaker 1: Their findings found their way to regulators at both the 104 00:06:16,000 --> 00:06:19,640 Speaker 1: California state level and the US federal level, and ultimately 105 00:06:19,720 --> 00:06:24,160 Speaker 1: led to the EPA accusing Volkswagen of deception. Other countries 106 00:06:24,200 --> 00:06:28,159 Speaker 1: subsequently investigated the matter and found similar results, and Volkswagen's 107 00:06:28,200 --> 00:06:33,559 Speaker 1: stock prices plummeted. The CEO resigned, and the company apologized for, 108 00:06:34,040 --> 00:06:38,359 Speaker 1: you know, essentially defrauding everybody. The gambit turned out to 109 00:06:38,400 --> 00:06:41,920 Speaker 1: be a costly one for Volkswagen. Estimates put the total 110 00:06:42,000 --> 00:06:45,919 Speaker 1: cost at nearly thirty five billion with a B dollars. 111 00:06:46,279 --> 00:06:49,360 Speaker 1: That's a hefty amount of fines and legal expenses and 112 00:06:49,400 --> 00:06:52,120 Speaker 1: replacement costs and that kind of thing. But beyond that, 113 00:06:52,520 --> 00:06:56,600 Speaker 1: there were some considerable non financial costs. In fact, they 114 00:06:56,600 --> 00:07:00,919 Speaker 1: were incalculable. For example, you can't really to fy the 115 00:07:00,960 --> 00:07:03,600 Speaker 1: amount of damage that was just done to the environment 116 00:07:04,040 --> 00:07:07,800 Speaker 1: or people's health, right like. That certainly was a factor, 117 00:07:07,880 --> 00:07:09,960 Speaker 1: but we don't really have a way of putting a 118 00:07:10,160 --> 00:07:13,000 Speaker 1: number on it. All in all, Diesel gait was an 119 00:07:13,000 --> 00:07:16,960 Speaker 1: example of corporate subterfuge and greed. And here's a fun 120 00:07:16,960 --> 00:07:20,120 Speaker 1: thing to think about. It's possible that organizations like the 121 00:07:20,160 --> 00:07:22,960 Speaker 1: EPA here in the United States are going to face 122 00:07:23,000 --> 00:07:26,640 Speaker 1: some pretty drastic cuts in funding or possibly even dissolution, 123 00:07:27,080 --> 00:07:30,119 Speaker 1: setting the stage for companies to pull similar stunts without 124 00:07:30,160 --> 00:07:32,440 Speaker 1: having to worry so much about being called to answer 125 00:07:32,480 --> 00:07:35,240 Speaker 1: for transgressions, you know, the way that Volkswagen did back 126 00:07:35,280 --> 00:07:39,000 Speaker 1: in twenty fifteen. If there are no regulatory agencies out 127 00:07:39,000 --> 00:07:41,600 Speaker 1: there to oversee it, then it's almost like you're not 128 00:07:41,680 --> 00:07:46,520 Speaker 1: doing it at all. Moving swiftly onwards to twenty sixteen, Okay, 129 00:07:46,920 --> 00:07:51,480 Speaker 1: twenty sixteen was a huge year for tech news. In 130 00:07:51,480 --> 00:07:54,080 Speaker 1: most of these years, when I was looking back, I'd 131 00:07:54,080 --> 00:07:56,040 Speaker 1: be like, oh, there's like two or three stories that 132 00:07:56,080 --> 00:07:59,280 Speaker 1: are contenders. But twenty sixteen just it was like the 133 00:07:59,400 --> 00:08:02,080 Speaker 1: year that just kept on giving. It was an election year, 134 00:08:02,240 --> 00:08:04,360 Speaker 1: and that was part of it here in the United States. 135 00:08:04,680 --> 00:08:09,680 Speaker 1: But in twenty sixteen, we got an early smart watch 136 00:08:10,040 --> 00:08:14,160 Speaker 1: that had been boosted through crowdfunding and boasted an e 137 00:08:14,360 --> 00:08:17,720 Speaker 1: ink display that maximized battery life. It kicked the bucket. 138 00:08:18,040 --> 00:08:21,960 Speaker 1: In twenty sixteen. This was the Pebble smart Watch, which 139 00:08:22,080 --> 00:08:27,400 Speaker 1: was an ambitious project, independent project. Ultimately it was not 140 00:08:27,520 --> 00:08:30,360 Speaker 1: able to succeed as a business. That was a bummer 141 00:08:30,400 --> 00:08:33,520 Speaker 1: for me personally. I had actually backed it after the 142 00:08:33,640 --> 00:08:37,640 Speaker 1: kickstarting campaign was already over it had already successfully completed 143 00:08:37,640 --> 00:08:39,920 Speaker 1: the Kickstart campaign. In fact, it was one of the 144 00:08:39,960 --> 00:08:44,640 Speaker 1: most successful campaigns at that time, and I had held 145 00:08:44,720 --> 00:08:47,200 Speaker 1: back on backing it. But then I went to Cees 146 00:08:47,320 --> 00:08:50,280 Speaker 1: and I saw a presentation about the Pebble smart Watch, 147 00:08:50,520 --> 00:08:52,679 Speaker 1: and that convinced me to put in a pre order, 148 00:08:52,960 --> 00:08:55,400 Speaker 1: and even though I got in too late, I ended 149 00:08:55,480 --> 00:08:58,680 Speaker 1: up getting a Kickstarter edition watch. It's one of the 150 00:08:58,720 --> 00:09:03,120 Speaker 1: red ones. I'm guessing that the people who were backing 151 00:09:03,160 --> 00:09:05,760 Speaker 1: it didn't order as many of the red ones, although 152 00:09:05,800 --> 00:09:07,719 Speaker 1: I thought the red one was really striking, which is 153 00:09:07,760 --> 00:09:09,959 Speaker 1: why I got it, so I guess that's why I 154 00:09:09,960 --> 00:09:12,679 Speaker 1: got a Kickstarter version. Anyway, I thought it was nifty, 155 00:09:13,000 --> 00:09:16,920 Speaker 1: but Pebble ultimately didn't have staying power and it folded 156 00:09:17,280 --> 00:09:21,120 Speaker 1: in twenty sixteen. Other notable tech stories from twenty sixteen 157 00:09:21,160 --> 00:09:25,520 Speaker 1: included the tragic reveal that a Tesla vehicle operating in 158 00:09:25,559 --> 00:09:29,040 Speaker 1: autopilot mode was involved in a fatal car accident that 159 00:09:29,080 --> 00:09:31,760 Speaker 1: would sadly not be the last time we would hear 160 00:09:32,040 --> 00:09:36,559 Speaker 1: something like that happening. The Hyperloop One company was plagued 161 00:09:36,600 --> 00:09:41,000 Speaker 1: by a crazy power struggle and various legal battles, which 162 00:09:41,040 --> 00:09:44,760 Speaker 1: meant we were not getting any closer to the science 163 00:09:44,840 --> 00:09:47,800 Speaker 1: fiction vision that Elon Musk had laid out a couple 164 00:09:47,800 --> 00:09:50,800 Speaker 1: of years earlier. If you don't remember what hyperloop is, 165 00:09:51,160 --> 00:09:54,600 Speaker 1: the concept is it's a transportation system meant to go 166 00:09:54,720 --> 00:09:57,520 Speaker 1: between like cities there may be a couple hundred or 167 00:09:57,600 --> 00:09:59,800 Speaker 1: few hundred miles away from each other, and it can 168 00:10:00,000 --> 00:10:05,920 Speaker 1: cysts in its original form as a tunnel that has 169 00:10:06,000 --> 00:10:08,400 Speaker 1: had most or all of the air pumped out of it, 170 00:10:08,480 --> 00:10:11,040 Speaker 1: so you have very very very low air pressure inside 171 00:10:11,080 --> 00:10:14,800 Speaker 1: the tunnel. You've got a magnet system or air bearings, 172 00:10:14,880 --> 00:10:17,400 Speaker 1: one of the two, or maybe both. It allows a 173 00:10:17,480 --> 00:10:21,680 Speaker 1: train to essentially hover above the floor of the tunnel 174 00:10:22,000 --> 00:10:25,560 Speaker 1: and travel with very little friction, so very little resistance, 175 00:10:25,880 --> 00:10:29,839 Speaker 1: and it allows the train to reach incredible top speeds 176 00:10:29,880 --> 00:10:33,720 Speaker 1: and thus make the trip in pretty short order while 177 00:10:33,760 --> 00:10:37,240 Speaker 1: being more efficient and less costly than air travel, less 178 00:10:37,320 --> 00:10:45,000 Speaker 1: environmentally pollutive, I guess than air travel as well, but Unfortunately, 179 00:10:45,480 --> 00:10:49,240 Speaker 1: hyper Loop one, because of this big battle, ended up 180 00:10:49,280 --> 00:10:53,559 Speaker 1: kind of getting again not to use a pun, but derailed. 181 00:10:53,920 --> 00:10:58,199 Speaker 1: Hyper Loop one would stick around, but would severely cut 182 00:10:58,320 --> 00:11:01,480 Speaker 1: back on its vision as far as what it's setting 183 00:11:01,520 --> 00:11:05,360 Speaker 1: out to do. Another big story that happened in twenty 184 00:11:05,440 --> 00:11:10,640 Speaker 1: sixteen is the explosive and that is intended controversy around 185 00:11:10,640 --> 00:11:14,800 Speaker 1: the Samsung Galaxy seven Note smartphones. So there were multiple 186 00:11:14,840 --> 00:11:18,800 Speaker 1: reports that detailed how these smartphones would spontaneously burst into 187 00:11:18,800 --> 00:11:22,520 Speaker 1: fire and explode. It led to lots of restrictions on 188 00:11:22,559 --> 00:11:25,520 Speaker 1: the phones. There were airlines that said, you cannot board 189 00:11:25,520 --> 00:11:27,280 Speaker 1: a plane if you're carrying one of these. You have 190 00:11:27,320 --> 00:11:29,640 Speaker 1: to leave it here. You can't take it on board 191 00:11:30,200 --> 00:11:35,319 Speaker 1: because these things had proven to be unstable. And it 192 00:11:35,400 --> 00:11:40,040 Speaker 1: turned out that the battery manufacturing process, which was actually 193 00:11:40,080 --> 00:11:42,520 Speaker 1: being carried out with two different companies and had two 194 00:11:42,559 --> 00:11:45,560 Speaker 1: separate issues, but they resulted in the same problem in 195 00:11:45,600 --> 00:11:48,920 Speaker 1: that they created short circuits inside the battery that would 196 00:11:49,000 --> 00:11:52,240 Speaker 1: lead to thermal runaway, meaning the battery would heat up 197 00:11:52,240 --> 00:11:54,360 Speaker 1: and heat up and heat up until it ultimately failed 198 00:11:54,679 --> 00:11:58,000 Speaker 1: and would explode. This was huge news, obviously, and it 199 00:11:58,080 --> 00:12:01,480 Speaker 1: cost Samsung millions of dollars in recalls and lawsuits. It 200 00:12:01,920 --> 00:12:05,319 Speaker 1: was a big story in twenty sixteen. There's so much more. Though, 201 00:12:05,440 --> 00:12:07,760 Speaker 1: Yahoo admitted that it had been the victim of a 202 00:12:07,800 --> 00:12:11,559 Speaker 1: massive data breach that year. That breach affected hundreds of 203 00:12:11,600 --> 00:12:14,520 Speaker 1: millions of people. We would find out later it essentially 204 00:12:14,559 --> 00:12:18,240 Speaker 1: affected anyone who had a Yahoo account at that time, Like, 205 00:12:18,320 --> 00:12:20,520 Speaker 1: even if you hadn't used it in a while, you 206 00:12:20,600 --> 00:12:25,200 Speaker 1: were affected. Also another cyber warfare kind of story, hackers 207 00:12:25,240 --> 00:12:28,400 Speaker 1: brought down a domain name server host in the fall 208 00:12:28,559 --> 00:12:32,840 Speaker 1: of that year, which in turn affected numerous major websites 209 00:12:32,880 --> 00:12:36,920 Speaker 1: and services like Netflix and Amazon, so they were unavailable 210 00:12:37,000 --> 00:12:40,360 Speaker 1: on again and off again in October twenty first, twenty sixteen. Like, 211 00:12:40,400 --> 00:12:43,120 Speaker 1: that was just a bad day if you were one 212 00:12:43,160 --> 00:12:45,360 Speaker 1: of those companies. That's just the tip of the iceberg. 213 00:12:45,400 --> 00:12:48,120 Speaker 1: A ton of stuff happened in twenty sixteen and tech, 214 00:12:48,160 --> 00:12:51,400 Speaker 1: but none of that is what I wanted to focus 215 00:12:51,480 --> 00:12:55,800 Speaker 1: on in this episode. Instead, for me, the story that 216 00:12:55,960 --> 00:12:59,800 Speaker 1: caught my attention most in twenty sixteen was kind of 217 00:12:59,840 --> 00:13:04,640 Speaker 1: the official beginning of the end for the biotech company, Thearrahnose. 218 00:13:04,920 --> 00:13:07,120 Speaker 1: It was literally the end of Therahnose that year. But 219 00:13:07,280 --> 00:13:10,200 Speaker 1: then the beginning of the end for the freedom of 220 00:13:10,240 --> 00:13:13,600 Speaker 1: the people who are leading Therahnose. So this was a 221 00:13:13,640 --> 00:13:16,839 Speaker 1: company that was founded by Elizabeth Holmes, and it was 222 00:13:16,880 --> 00:13:21,120 Speaker 1: a company that aimed to democratize medical testing. The idea 223 00:13:21,160 --> 00:13:23,800 Speaker 1: being that you would end up with a device the 224 00:13:23,840 --> 00:13:26,960 Speaker 1: size of a desktop printer that would allow the average 225 00:13:26,960 --> 00:13:29,280 Speaker 1: person the chance to screen themselves for more than one 226 00:13:29,400 --> 00:13:33,200 Speaker 1: hundred different diseases, all while just parting with the tiniest 227 00:13:33,320 --> 00:13:37,840 Speaker 1: micro drop of blood. Because Elizabeth Holmes said she was 228 00:13:38,040 --> 00:13:40,080 Speaker 1: scared of needles, and that was one of the reasons 229 00:13:40,120 --> 00:13:43,760 Speaker 1: she didn't, you know, she would kind of avoid medical checkups. 230 00:13:43,760 --> 00:13:45,400 Speaker 1: But she thought, said, well, if you could do it 231 00:13:45,480 --> 00:13:48,160 Speaker 1: where you could only need the smallest drop of blood 232 00:13:48,360 --> 00:13:51,400 Speaker 1: and you could run, you know, potentially hundreds of different 233 00:13:51,480 --> 00:13:53,920 Speaker 1: or more than one hundred different medical tests on that 234 00:13:53,960 --> 00:13:56,240 Speaker 1: little drop of blood, on a device that could sit 235 00:13:56,320 --> 00:13:58,520 Speaker 1: on top of your desk and find out you know, 236 00:13:59,040 --> 00:14:03,600 Speaker 1: what you need to look out for and what you know, 237 00:14:03,640 --> 00:14:05,640 Speaker 1: what your health was, and maybe if there were any 238 00:14:05,679 --> 00:14:07,839 Speaker 1: lifestyle changes you needed to make in order to live 239 00:14:07,880 --> 00:14:11,120 Speaker 1: a healthier life. It would open up options for people 240 00:14:11,120 --> 00:14:14,319 Speaker 1: that otherwise never would exist. That was the dream. When 241 00:14:14,360 --> 00:14:17,960 Speaker 1: we come back, I'll talk about the problem that dream 242 00:14:18,320 --> 00:14:20,400 Speaker 1: had when it came to meeting up with a thing 243 00:14:20,440 --> 00:14:22,760 Speaker 1: I like to call reality. But first, let's take a 244 00:14:22,840 --> 00:14:35,080 Speaker 1: quick break to thank our sponsors. Okay, we're back. So 245 00:14:35,320 --> 00:14:38,960 Speaker 1: I left off talking about twenty sixteen and Fairhos. So 246 00:14:39,800 --> 00:14:42,280 Speaker 1: here's an issue. This company's story was kind of the 247 00:14:42,280 --> 00:14:45,200 Speaker 1: stuff of Silicon Valley fairy tales, Like it was the 248 00:14:45,240 --> 00:14:48,760 Speaker 1: sort of thing that investors really get excited about. So 249 00:14:48,880 --> 00:14:52,880 Speaker 1: Holmes had been attending Stanford University before she dropped out 250 00:14:52,920 --> 00:14:57,320 Speaker 1: to create her company. She courted various investors, including really famous, 251 00:14:57,400 --> 00:15:01,240 Speaker 1: high profile ones like Oracle found under Larry Ellison was 252 00:15:01,240 --> 00:15:06,600 Speaker 1: one of them. The media conglomerate owner Rupert Murdoch was another. 253 00:15:06,920 --> 00:15:09,760 Speaker 1: And so she had raised hundreds of millions of dollars 254 00:15:09,880 --> 00:15:12,840 Speaker 1: to make her dream of becoming an iconic tech leader 255 00:15:12,880 --> 00:15:14,720 Speaker 1: come true. And I want to be clear about that. Like, 256 00:15:15,040 --> 00:15:18,440 Speaker 1: the technology was something that she thought would be really cool. 257 00:15:18,760 --> 00:15:23,160 Speaker 1: She seemed to be pretty passionate about it. But my interpretation, 258 00:15:23,360 --> 00:15:25,720 Speaker 1: which I admit is fully biased and I could be 259 00:15:26,080 --> 00:15:29,280 Speaker 1: entirely wrong, was more that she was interested in being 260 00:15:29,640 --> 00:15:34,040 Speaker 1: an icon than really, you know, having a belief and 261 00:15:34,120 --> 00:15:37,760 Speaker 1: knowledge about this technology. She wanted to be another Steve Jobs, 262 00:15:38,000 --> 00:15:40,360 Speaker 1: you know, be on the cover of magazines and be 263 00:15:40,640 --> 00:15:44,440 Speaker 1: talked about as like this visionary leader. The trouble was 264 00:15:44,760 --> 00:15:47,840 Speaker 1: her idea proved to be much less attainable than she 265 00:15:48,000 --> 00:15:51,760 Speaker 1: initially thought. In late twenty fifteen, Wall Street Journal's John 266 00:15:51,920 --> 00:15:56,000 Speaker 1: Kerrieru whose name I always butcher anyway, he broke the 267 00:15:56,080 --> 00:15:59,560 Speaker 1: story that Darrenos the company was actually built. On top 268 00:15:59,640 --> 00:16:04,359 Speaker 1: of this, that the company's supposed demonstrations of its technology 269 00:16:04,520 --> 00:16:09,160 Speaker 1: were in actuality frauds. They were relying heavily on existing 270 00:16:09,200 --> 00:16:13,360 Speaker 1: blood analysis technologies from other companies, so actual machines made 271 00:16:13,440 --> 00:16:17,440 Speaker 1: by other companies that did standard blood tests. They were 272 00:16:17,560 --> 00:16:21,080 Speaker 1: using those, but then claiming that the results came from 273 00:16:21,120 --> 00:16:25,880 Speaker 1: their own technology, and that they would take tiny blood 274 00:16:25,920 --> 00:16:29,080 Speaker 1: samples larger than a micro drop but still you know, 275 00:16:29,120 --> 00:16:32,920 Speaker 1: pretty small. Then they would dilute those samples and then 276 00:16:33,200 --> 00:16:37,520 Speaker 1: run tests on them using these other companies, you know, machines. 277 00:16:37,920 --> 00:16:41,400 Speaker 1: Then they would pass off whatever the results were as 278 00:16:41,520 --> 00:16:44,480 Speaker 1: their own work, while desperately trying to get their technology 279 00:16:44,520 --> 00:16:47,880 Speaker 1: to operate properly. The story didn't end there either. There 280 00:16:48,000 --> 00:16:51,160 Speaker 1: got worse, Like it unfolded that there had been advisors 281 00:16:51,160 --> 00:16:54,680 Speaker 1: who had told Holmes early on that her idea was 282 00:16:54,760 --> 00:16:59,080 Speaker 1: unrealistic from a technological and biological perspective, that just because 283 00:16:59,160 --> 00:17:01,160 Speaker 1: she wanted it to work doesn't mean it would work. 284 00:17:01,520 --> 00:17:03,760 Speaker 1: So essentially it turned out to be kind of an 285 00:17:03,760 --> 00:17:07,959 Speaker 1: Emperor has No Clothes situation, and the company would totally 286 00:17:08,040 --> 00:17:11,960 Speaker 1: unravel throughout twenty sixteen. In the process, Holmes, who had 287 00:17:11,960 --> 00:17:16,240 Speaker 1: developed a reputation for extravagance and peculiar habits, found her 288 00:17:16,280 --> 00:17:19,480 Speaker 1: net worth plunge to zero. She went from being one 289 00:17:19,520 --> 00:17:23,199 Speaker 1: of the wealthiest women on the planet to having no 290 00:17:23,600 --> 00:17:26,960 Speaker 1: net worth at all. The SEC opened an investigation into 291 00:17:27,000 --> 00:17:30,120 Speaker 1: the company. Holmes and her business partner and on again 292 00:17:30,200 --> 00:17:34,560 Speaker 1: off again romantic partner, Ramesh Malwani, would ultimately stand trial 293 00:17:34,600 --> 00:17:38,320 Speaker 1: for multiple criminal charges of fraud and related crimes. Holmes 294 00:17:38,359 --> 00:17:41,520 Speaker 1: then would be found guilty and sentenced to eleven years 295 00:17:41,560 --> 00:17:45,320 Speaker 1: in prison, although that did get reduced a couple of times. Currently, 296 00:17:45,359 --> 00:17:48,639 Speaker 1: I believe she will be eligible for release in the 297 00:17:48,640 --> 00:17:52,120 Speaker 1: summer of twenty thirty two. Balwani would get a similar 298 00:17:52,160 --> 00:17:55,240 Speaker 1: sentence of nearly thirteen years for his part in the 299 00:17:55,280 --> 00:17:58,639 Speaker 1: deception and the reason I wanted to highlight therahos in 300 00:17:58,720 --> 00:18:01,520 Speaker 1: this episode is because I think it really shines a 301 00:18:01,640 --> 00:18:05,040 Speaker 1: light on some real issues in tech in general. So 302 00:18:05,520 --> 00:18:10,000 Speaker 1: Issue one, companies overhype the capabilities of their technologies just 303 00:18:10,520 --> 00:18:13,080 Speaker 1: par for the course. Here. Some folks refer to this 304 00:18:13,119 --> 00:18:17,560 Speaker 1: as marketing. I call it lying. Issue two. Their nose 305 00:18:17,640 --> 00:18:20,360 Speaker 1: really strips bear the dangers of the fake it till 306 00:18:20,400 --> 00:18:23,200 Speaker 1: you make it philosophy that you know, folks will assume 307 00:18:23,560 --> 00:18:27,320 Speaker 1: that if they can imagine it, that it's actually possible, 308 00:18:27,560 --> 00:18:31,120 Speaker 1: and they'll build a business idea or an actual business 309 00:18:31,160 --> 00:18:35,000 Speaker 1: around this concept before finding out if it really is 310 00:18:35,440 --> 00:18:38,840 Speaker 1: possible or not, and by then it's too late. So 311 00:18:38,880 --> 00:18:41,400 Speaker 1: you sell yourself on this idea, and then you scramble 312 00:18:41,480 --> 00:18:43,520 Speaker 1: to try and make the idea actually work in the 313 00:18:43,560 --> 00:18:46,800 Speaker 1: real world. It is not a good approach. I mean, 314 00:18:46,840 --> 00:18:50,440 Speaker 1: when it works, it's great. It just very rarely works. 315 00:18:50,760 --> 00:18:53,640 Speaker 1: Issue three, and this is relating to the last one. 316 00:18:53,880 --> 00:18:57,119 Speaker 1: It reminds us that while technology is incredible and it 317 00:18:57,280 --> 00:18:59,960 Speaker 1: can let us do stuff that previous generations would have 318 00:19:00,080 --> 00:19:04,520 Speaker 1: thought was impossible, technology is not without limitations. It's not 319 00:19:04,640 --> 00:19:06,920 Speaker 1: some sort of magic get out of jail free card 320 00:19:07,000 --> 00:19:12,520 Speaker 1: for all situations. Issue for the trappings of the tech 321 00:19:12,960 --> 00:19:17,959 Speaker 1: icon identity are really tempting, you know, from wealth or 322 00:19:18,520 --> 00:19:22,040 Speaker 1: influence to the public perception that you're someone who's just 323 00:19:22,119 --> 00:19:25,960 Speaker 1: really innovative and savvy. There's a lot to envy there, 324 00:19:26,359 --> 00:19:28,679 Speaker 1: so you can understand why someone would want to be 325 00:19:28,760 --> 00:19:33,240 Speaker 1: in that category. Issue five. Investors can sometimes propel a 326 00:19:33,280 --> 00:19:36,879 Speaker 1: bad idea into the world despite the fact it's a 327 00:19:36,920 --> 00:19:40,680 Speaker 1: bad idea, and that wouldn't be an issue except that 328 00:19:40,960 --> 00:19:44,280 Speaker 1: in the course of things, other people who are innocent 329 00:19:44,400 --> 00:19:46,800 Speaker 1: can end up suffering as a result of this. So 330 00:19:46,920 --> 00:19:51,240 Speaker 1: with the Theradouse case, that included patients who were relying 331 00:19:51,400 --> 00:19:55,040 Speaker 1: on what ended up being unreliable medical results. So they 332 00:19:55,040 --> 00:19:58,120 Speaker 1: are like actual people, sick people who were you know, 333 00:19:58,320 --> 00:20:02,119 Speaker 1: making use of this supposed tech and there's a chance 334 00:20:02,200 --> 00:20:04,119 Speaker 1: that some of the stuff they got back was not 335 00:20:04,280 --> 00:20:08,439 Speaker 1: at all reliable, which that's life endangering. That's terrible. So 336 00:20:08,720 --> 00:20:10,960 Speaker 1: I think Fraans is going to be a case study 337 00:20:11,200 --> 00:20:13,639 Speaker 1: for many, many years. It's a lesson on what not 338 00:20:13,920 --> 00:20:16,240 Speaker 1: to do and how not to do it, and I 339 00:20:16,280 --> 00:20:19,360 Speaker 1: hope a lot of people take that to heart and they, 340 00:20:19,480 --> 00:20:23,240 Speaker 1: you know, they learn from those mistakes. Let's move on 341 00:20:23,800 --> 00:20:27,200 Speaker 1: to twenty seventeen. Holy cow, this is taking forever. It's 342 00:20:27,200 --> 00:20:30,760 Speaker 1: almost as long as living through those years in real time. Anyway, 343 00:20:30,920 --> 00:20:35,400 Speaker 1: twenty seventeen, mega venture capital fund company soft Bank invested 344 00:20:35,560 --> 00:20:38,480 Speaker 1: hundreds of millions of dollars into a company that was 345 00:20:38,760 --> 00:20:41,760 Speaker 1: portrayed as if it were a tech company, but it 346 00:20:41,800 --> 00:20:45,639 Speaker 1: really wasn't. In reality, all it was doing was little 347 00:20:45,640 --> 00:20:49,560 Speaker 1: more than subleasing office space to clients, and that company 348 00:20:49,840 --> 00:20:53,200 Speaker 1: was we Work. Now, over the following years, soft Bank 349 00:20:53,240 --> 00:20:56,200 Speaker 1: would increase this investment to the point where it's spent 350 00:20:56,320 --> 00:21:01,200 Speaker 1: around sixteen billion with a B dollars on work, not 351 00:21:01,280 --> 00:21:04,600 Speaker 1: just in investments, but you know, overall. We Work meanwhile, 352 00:21:04,920 --> 00:21:08,800 Speaker 1: would prep itself for an initial public offering or IPO 353 00:21:09,119 --> 00:21:14,320 Speaker 1: in twenty nineteen. Unfortunately, that public offering imploded before it 354 00:21:14,320 --> 00:21:18,679 Speaker 1: could happen. It was kind of an absurd situation, like 355 00:21:18,720 --> 00:21:23,280 Speaker 1: there were corporate documents that just read like new age, 356 00:21:23,640 --> 00:21:29,320 Speaker 1: you know, ridiculous meaningless babble. Anyway, we Work did not 357 00:21:29,560 --> 00:21:33,080 Speaker 1: go public, and it then started to go into a 358 00:21:33,080 --> 00:21:36,840 Speaker 1: period of decline and actually went into bankruptcy protection in 359 00:21:36,880 --> 00:21:39,920 Speaker 1: twenty twenty three. And it didn't go away, It didn't 360 00:21:39,920 --> 00:21:42,680 Speaker 1: go out of business. It did go into bankruptcy, but 361 00:21:42,720 --> 00:21:45,280 Speaker 1: it's still around. Some view we work as sort of 362 00:21:45,320 --> 00:21:50,000 Speaker 1: a cautionary tale about the startup ecosystem, and as crazy 363 00:21:50,000 --> 00:21:53,000 Speaker 1: as that story is, it's still not the one I 364 00:21:53,080 --> 00:21:56,920 Speaker 1: picked to focus on this episode. Other things that happened 365 00:21:56,920 --> 00:21:59,320 Speaker 1: in twenty seventeen. There were a lot more data breaches 366 00:21:59,359 --> 00:22:02,680 Speaker 1: that year, including equa Fax, which is a credit reporting 367 00:22:02,720 --> 00:22:06,360 Speaker 1: company that was huge like bad bad news. I mean, 368 00:22:06,400 --> 00:22:10,600 Speaker 1: you expect an institution like a credit reporting company to 369 00:22:10,640 --> 00:22:15,880 Speaker 1: have really strong security measures because that's incredibly sensitive information 370 00:22:15,920 --> 00:22:18,400 Speaker 1: that affects millions of people, Like we're all lumped into 371 00:22:18,440 --> 00:22:21,640 Speaker 1: that system here in the United States. You know, everyone's 372 00:22:21,680 --> 00:22:24,560 Speaker 1: got their credit rating and everything, so having that information 373 00:22:24,680 --> 00:22:29,400 Speaker 1: get breached very damaging. The Yahoo breach I mentioned earlier 374 00:22:30,000 --> 00:22:34,720 Speaker 1: would end up becoming more clear in twenty seventeen. That's 375 00:22:34,760 --> 00:22:38,119 Speaker 1: when we learned that essentially everybody was whoever had a 376 00:22:38,200 --> 00:22:40,800 Speaker 1: Yahoo account was affected by it. And it also had 377 00:22:40,840 --> 00:22:45,359 Speaker 1: a huge impact on an acquisition because Verizon was buying Yahoo, 378 00:22:46,040 --> 00:22:49,680 Speaker 1: and it ended up affecting that acquisition deal by knocking 379 00:22:49,720 --> 00:22:52,760 Speaker 1: off around three hundred and fifty million dollars off Yahoo's 380 00:22:52,840 --> 00:22:57,080 Speaker 1: sales price. So yikes. Facebook's mark Zuckerberg was called to 381 00:22:57,080 --> 00:22:59,760 Speaker 1: face questioning from Congress, and not for the last time, 382 00:23:00,280 --> 00:23:04,200 Speaker 1: after Facebook had served up political ads that turned out 383 00:23:04,240 --> 00:23:07,720 Speaker 1: to have been paid for by Russian groups, and these 384 00:23:07,720 --> 00:23:10,199 Speaker 1: were political ads for the United States, so they were 385 00:23:10,240 --> 00:23:13,320 Speaker 1: intended to sway Americans who were voting in the twenty 386 00:23:13,320 --> 00:23:16,480 Speaker 1: sixteen presidential elections, and this started up a conversation in 387 00:23:16,520 --> 00:23:20,600 Speaker 1: America about how to protect against foreign interference in American elections, 388 00:23:20,800 --> 00:23:25,159 Speaker 1: something that hadn't really taken like the headlines by storm 389 00:23:25,200 --> 00:23:28,520 Speaker 1: in previous years, but this time it really was. Fortunately, 390 00:23:28,800 --> 00:23:31,400 Speaker 1: we have all of that sorted out today and there's 391 00:23:31,440 --> 00:23:36,440 Speaker 1: no more problem. That was me being sarcastic. Cryptocurrency boomed 392 00:23:36,440 --> 00:23:39,560 Speaker 1: in twenty seventeen and Ethereum became more of a known 393 00:23:39,680 --> 00:23:44,000 Speaker 1: quantity and introduced the possibility of other blockchain based cryptocurrencies 394 00:23:44,000 --> 00:23:47,639 Speaker 1: based off the Ethereum blockchain. At this point, Ethereum was 395 00:23:47,680 --> 00:23:51,760 Speaker 1: a proof of work cryptocurrency, just like Bitcoin is, but 396 00:23:51,960 --> 00:23:54,320 Speaker 1: would later turn to be a proof of stake, but 397 00:23:54,359 --> 00:23:57,280 Speaker 1: that would take a few years. Arguably this was really 398 00:23:57,320 --> 00:23:59,960 Speaker 1: when proof of work currencies began, making it very hard 399 00:24:00,280 --> 00:24:03,840 Speaker 1: for non crypto customers to get hold of graphics processing 400 00:24:03,920 --> 00:24:08,160 Speaker 1: units or GPUs, particularly in the video game space. GPUs 401 00:24:08,200 --> 00:24:11,639 Speaker 1: are important. Crypto miners were relying on these cards because 402 00:24:11,680 --> 00:24:15,280 Speaker 1: they have great parallel processing capabilities and that helps when 403 00:24:15,280 --> 00:24:19,880 Speaker 1: you're mining cryptocurrency and a proof of work system. So yeah, 404 00:24:19,960 --> 00:24:22,280 Speaker 1: this was the beginning of a tough time. If you 405 00:24:22,320 --> 00:24:24,320 Speaker 1: were a gamer and you wanted to get hold of 406 00:24:24,359 --> 00:24:27,399 Speaker 1: the latest in GPUs, it was just as hard to 407 00:24:27,400 --> 00:24:29,439 Speaker 1: get hold of them for a few years. But the 408 00:24:29,440 --> 00:24:31,760 Speaker 1: big story I really wanted to talk about is the 409 00:24:32,119 --> 00:24:35,720 Speaker 1: no good, very bad year that Uber had in twenty seventeen. Now, 410 00:24:35,760 --> 00:24:38,240 Speaker 1: to be clear, a lot of the stuff that happened 411 00:24:38,440 --> 00:24:41,040 Speaker 1: to Uber in twenty seventeen had actually been happening for 412 00:24:41,080 --> 00:24:43,240 Speaker 1: a while. It's just that it all came to a 413 00:24:43,359 --> 00:24:46,919 Speaker 1: head in twenty seventeen. So starting off, we've got some 414 00:24:46,960 --> 00:24:51,239 Speaker 1: political events that landed Uber in a critical place. They 415 00:24:51,440 --> 00:24:54,320 Speaker 1: got a lot of criticism for this. So in January 416 00:24:54,320 --> 00:24:58,080 Speaker 1: twenty seventeen, Donald Trump passed some executive actions that placed 417 00:24:58,119 --> 00:25:02,120 Speaker 1: hefty restrictions on foreign nation being allowed into the United States, 418 00:25:02,359 --> 00:25:05,919 Speaker 1: and largely these restrictions affected people who were traveling from 419 00:25:06,000 --> 00:25:10,240 Speaker 1: countries that have a large Muslim population. So people got 420 00:25:10,320 --> 00:25:14,600 Speaker 1: very upset They were enraged that Trump was apparently engaging 421 00:25:14,840 --> 00:25:18,159 Speaker 1: in profiling because it's not like these other countries that 422 00:25:18,240 --> 00:25:21,200 Speaker 1: don't have large Muslim populations were affected. So yeah, it 423 00:25:21,560 --> 00:25:25,040 Speaker 1: was it was hard to justify. But in response to 424 00:25:25,359 --> 00:25:29,120 Speaker 1: his move, taxi cab drivers in New York City refused 425 00:25:29,160 --> 00:25:32,920 Speaker 1: to work in protest of these travel restrictions, and they 426 00:25:32,920 --> 00:25:37,520 Speaker 1: did this the day following when Trump signed those executive actions, 427 00:25:37,760 --> 00:25:43,080 Speaker 1: and Uber became news because it lifted surge pricing in 428 00:25:43,200 --> 00:25:45,760 Speaker 1: New York in an apparent attempt to take advantage of 429 00:25:45,800 --> 00:25:50,280 Speaker 1: a lack of competition due to that taxi driver work stoppage, 430 00:25:50,520 --> 00:25:52,320 Speaker 1: and it was seen as a move to undermine a 431 00:25:52,359 --> 00:25:56,479 Speaker 1: political protest, and in response, a hashtag called delete Uber 432 00:25:56,600 --> 00:25:59,400 Speaker 1: began to circulate. As it would turn out, there would 433 00:25:59,440 --> 00:26:01,560 Speaker 1: be lots of other reasons folks would want to delete 434 00:26:01,680 --> 00:26:04,320 Speaker 1: Uber as the year went on that didn't involve travel 435 00:26:04,359 --> 00:26:07,600 Speaker 1: restrictions at all. So next up. In February twenty seventeen, 436 00:26:07,880 --> 00:26:12,360 Speaker 1: a former Uber engineer named Susan Fowler came forward posting 437 00:26:12,359 --> 00:26:16,000 Speaker 1: a blog post that made public accusations regarding gender discrimination 438 00:26:16,359 --> 00:26:20,879 Speaker 1: and sexual harassment within Uber, and she detailed a devastatingly 439 00:26:21,119 --> 00:26:25,480 Speaker 1: toxic work culture that was particularly harmful toward women within 440 00:26:25,520 --> 00:26:29,040 Speaker 1: the company. Fowler's move would inspire other women within the 441 00:26:29,080 --> 00:26:32,000 Speaker 1: tech industry to come forward with their own stories, and 442 00:26:32,080 --> 00:26:35,600 Speaker 1: when paired with the blossoming Me Too movement in late 443 00:26:35,720 --> 00:26:38,800 Speaker 1: twenty seventeen, this would build into a more unified call 444 00:26:38,880 --> 00:26:41,440 Speaker 1: for change and accountability. And I wish I could say 445 00:26:41,440 --> 00:26:44,800 Speaker 1: that we got there, that things were all better now, 446 00:26:44,840 --> 00:26:47,920 Speaker 1: But over the following years we would get more stories 447 00:26:48,160 --> 00:26:51,840 Speaker 1: about companies and toxic work cultures and sexual harassment that 448 00:26:51,880 --> 00:26:55,560 Speaker 1: had been codified into company culture. So we got a 449 00:26:55,560 --> 00:26:58,280 Speaker 1: long way to go. But also in February twenty seventeen, 450 00:26:58,400 --> 00:27:02,600 Speaker 1: a video went viral showing then Uber CEO Travis Kalanik, 451 00:27:02,760 --> 00:27:05,240 Speaker 1: who co founded the company, getting into an argument with 452 00:27:05,320 --> 00:27:08,199 Speaker 1: an Uber driver, and the driver protested to Kalanick that 453 00:27:08,440 --> 00:27:11,520 Speaker 1: the company was putting drivers in a squeeze because it 454 00:27:11,560 --> 00:27:13,880 Speaker 1: was raising standards for stuff like, you know, the types 455 00:27:13,920 --> 00:27:17,639 Speaker 1: of cars that could be used on various Uber services, 456 00:27:17,760 --> 00:27:21,960 Speaker 1: while also dropping prices for customers, and drivers were having 457 00:27:22,040 --> 00:27:25,080 Speaker 1: to pay more and they were getting paid less. Kalan 458 00:27:25,160 --> 00:27:28,560 Speaker 1: Nick got really argumentative, but the whole exchange was captured 459 00:27:28,600 --> 00:27:31,520 Speaker 1: on a dash cam. The driver later uploaded it to YouTube. 460 00:27:31,800 --> 00:27:35,240 Speaker 1: I found quite a bit of popularity. Kalanik later issued 461 00:27:35,280 --> 00:27:38,840 Speaker 1: an apology in an internal email in Uber. He sensed 462 00:27:38,880 --> 00:27:42,880 Speaker 1: that perhaps he didn't come off looking really sympathetic because 463 00:27:42,880 --> 00:27:45,639 Speaker 1: he was sitting in the back of an Uber black 464 00:27:45,800 --> 00:27:50,560 Speaker 1: vehicle between two women sitting next to him and arguing 465 00:27:50,600 --> 00:27:53,280 Speaker 1: with his driver about the fact that the driver needed 466 00:27:53,280 --> 00:27:56,400 Speaker 1: to take responsibility for his own financial situation. That Uber 467 00:27:56,560 --> 00:27:59,920 Speaker 1: was blameless in this particular batter, You know, considering how 468 00:28:00,040 --> 00:28:03,359 Speaker 1: Uber was also trying to dodge the responsibility for accusations 469 00:28:03,400 --> 00:28:06,040 Speaker 1: brought against the company from Vowler. The fact that he 470 00:28:06,160 --> 00:28:08,600 Speaker 1: was arguing that, hey, you need to take responsibility for 471 00:28:08,640 --> 00:28:11,800 Speaker 1: your life while simultaneously the company was trying to dodge 472 00:28:11,800 --> 00:28:14,520 Speaker 1: responsibility struck a lot of people as being perhaps on 473 00:28:14,600 --> 00:28:17,360 Speaker 1: the touch ironic side, and we're not done with their year. 474 00:28:17,560 --> 00:28:20,119 Speaker 1: That's just the beginning of the year. In March, the 475 00:28:20,160 --> 00:28:23,119 Speaker 1: New York Times published an article revealing that Uber was 476 00:28:23,119 --> 00:28:26,000 Speaker 1: making use of a tool called gray Ball. Now. The 477 00:28:26,000 --> 00:28:29,240 Speaker 1: purpose of this tool was to fool local authorities in 478 00:28:29,320 --> 00:28:33,440 Speaker 1: regions where Uber's operations were unsanctioned. In other words, gray 479 00:28:33,480 --> 00:28:36,040 Speaker 1: Ball was meant to trick police so that Uber drivers 480 00:28:36,040 --> 00:28:39,480 Speaker 1: could operate in places where they weren't technically allowed. The 481 00:28:39,520 --> 00:28:43,400 Speaker 1: gray Ball tool would identify law enforcement that was attempting 482 00:28:43,440 --> 00:28:46,360 Speaker 1: to hail a ride. So let's say that police are 483 00:28:46,400 --> 00:28:49,120 Speaker 1: trying to prove that Uber is operating in a place 484 00:28:49,120 --> 00:28:51,880 Speaker 1: where they're not supposed to, so they're using their phone 485 00:28:51,880 --> 00:28:54,120 Speaker 1: to call a ride. Well, grey Ball was meant to 486 00:28:54,200 --> 00:28:57,480 Speaker 1: identify when it was a police officer, and then the 487 00:28:57,520 --> 00:29:01,240 Speaker 1: police officer would see cars on the app, but no 488 00:29:01,360 --> 00:29:04,320 Speaker 1: car would ever come to actually pick them up, so 489 00:29:04,960 --> 00:29:06,760 Speaker 1: you know, there was no way for them to use 490 00:29:06,800 --> 00:29:10,520 Speaker 1: that as evidence that Uber was picking up or dropping 491 00:29:10,560 --> 00:29:15,080 Speaker 1: off in that particular region. So Uber spokespeople claimed that 492 00:29:15,160 --> 00:29:17,280 Speaker 1: the tool was actually meant to protect drivers. It was 493 00:29:17,320 --> 00:29:20,080 Speaker 1: meant to prevent them from picking up a fair from 494 00:29:20,120 --> 00:29:24,600 Speaker 1: somebody who was abusing Uber's systems in other words, and 495 00:29:25,080 --> 00:29:27,800 Speaker 1: said that you know, the drivers are independent contractors and 496 00:29:27,840 --> 00:29:30,840 Speaker 1: that they should also not be persecuted or They also 497 00:29:31,120 --> 00:29:34,560 Speaker 1: suggested that law enforcement was somehow in collusion with taxi 498 00:29:34,600 --> 00:29:38,200 Speaker 1: companies to muscle ride hailing services out of various regions. 499 00:29:38,640 --> 00:29:40,360 Speaker 1: I think it is good to point out that even 500 00:29:40,360 --> 00:29:43,600 Speaker 1: if that were true, it doesn't mean that Uber gets 501 00:29:44,200 --> 00:29:46,800 Speaker 1: a free pass to break the law, Like the law 502 00:29:46,880 --> 00:29:49,320 Speaker 1: is the law, and a law can be unjust, but 503 00:29:49,400 --> 00:29:53,280 Speaker 1: breaking an unjust law still comes with consequences. And I 504 00:29:53,320 --> 00:29:56,600 Speaker 1: know it's an easy pat answer to say that the 505 00:29:56,640 --> 00:29:59,320 Speaker 1: solution is to change the law, because you know, I 506 00:29:59,360 --> 00:30:01,840 Speaker 1: know our system here in the United States. I know, 507 00:30:02,240 --> 00:30:06,160 Speaker 1: at best it's byzantine and complicated, and at worst it's 508 00:30:06,240 --> 00:30:09,440 Speaker 1: broken and biased. But you know, it's still a law. 509 00:30:09,960 --> 00:30:12,160 Speaker 1: That's the problem. Like you can't just choose which laws 510 00:30:12,200 --> 00:30:14,520 Speaker 1: to obey and which ones not to obey. Now, that 511 00:30:14,600 --> 00:30:17,600 Speaker 1: still just gets us into to March of twenty seventeen, 512 00:30:17,600 --> 00:30:19,600 Speaker 1: and Uber's very bad, no good year still has a 513 00:30:19,640 --> 00:30:22,040 Speaker 1: long way to go. In May, the company had to 514 00:30:22,040 --> 00:30:25,200 Speaker 1: come clean about how it was paying drivers. So, in 515 00:30:25,240 --> 00:30:27,840 Speaker 1: an example that g Nome Shiber used in the New 516 00:30:27,920 --> 00:30:32,200 Speaker 1: York Times, if a fare cost twenty dollars and taxes 517 00:30:32,280 --> 00:30:35,760 Speaker 1: on that fare were two dollars, Uber was only supposed 518 00:30:35,800 --> 00:30:39,720 Speaker 1: to take its commission, which is twenty five percent off 519 00:30:39,800 --> 00:30:44,080 Speaker 1: the eighteen dollars that were left over so after taxes. However, 520 00:30:44,160 --> 00:30:47,680 Speaker 1: it was discovered that Uber appeared to instead be taking 521 00:30:47,760 --> 00:30:52,120 Speaker 1: its twenty five percent share before taxes, so in the 522 00:30:52,520 --> 00:30:55,520 Speaker 1: example that Schiber gave, it would take twenty five percent 523 00:30:55,520 --> 00:30:58,240 Speaker 1: of twenty bucks, which is five dollars, instead of twenty 524 00:30:58,520 --> 00:31:01,520 Speaker 1: five percent of eighteen bucks, which is four dollars fifty cents, 525 00:31:01,800 --> 00:31:04,160 Speaker 1: and it meant that drivers were taking home less money 526 00:31:04,200 --> 00:31:07,560 Speaker 1: than what they were actually owed. An advocacy group argued 527 00:31:07,560 --> 00:31:10,680 Speaker 1: that what Uber was actually doing was passing the cost 528 00:31:10,800 --> 00:31:14,280 Speaker 1: of state taxes onto drivers, particularly in the state of 529 00:31:14,320 --> 00:31:18,280 Speaker 1: New York, and Uber was not assuming the responsibility of 530 00:31:18,360 --> 00:31:21,960 Speaker 1: paying taxes as a company, and this was a tough 531 00:31:22,080 --> 00:31:24,880 Speaker 1: argument against Uber, and Uber executives said that the company 532 00:31:24,920 --> 00:31:27,360 Speaker 1: was dedicated to paying all drivers the money that they 533 00:31:27,360 --> 00:31:30,280 Speaker 1: were owed due to this practice, that they were sorry 534 00:31:30,320 --> 00:31:32,760 Speaker 1: for it, and you know, et cetera, et cetera. So 535 00:31:32,880 --> 00:31:35,240 Speaker 1: this ended up costing the company as well, gave it 536 00:31:35,280 --> 00:31:38,520 Speaker 1: another black eye, which was unfortunate because at this point 537 00:31:38,680 --> 00:31:41,680 Speaker 1: Uber had more black eyes than eyes. At the halfway 538 00:31:41,720 --> 00:31:44,480 Speaker 1: mark of the year in June, and internal investigation in 539 00:31:44,600 --> 00:31:48,280 Speaker 1: Uber led to twenty people in the company, some of 540 00:31:48,280 --> 00:31:52,880 Speaker 1: them in leadership positions, being fired for inappropriate behavior. And 541 00:31:52,960 --> 00:31:57,720 Speaker 1: by inappropriate behavior, I'm mostly referring to sexual harassment. In addition, 542 00:31:57,760 --> 00:32:00,960 Speaker 1: the board of directors pressured Travis Kalanick to resign, which 543 00:32:00,960 --> 00:32:03,240 Speaker 1: he did, though he still had many close friends on 544 00:32:03,280 --> 00:32:05,400 Speaker 1: the board of directors, so that was a bit of 545 00:32:05,400 --> 00:32:09,320 Speaker 1: a struggle too, and we're not done so. In Singapore, 546 00:32:09,920 --> 00:32:13,880 Speaker 1: it was discovered that Uber had been renting dangerous vehicles 547 00:32:13,880 --> 00:32:17,840 Speaker 1: to drivers. These were vehicles that had known safety issues, namely, 548 00:32:18,160 --> 00:32:20,720 Speaker 1: once in a while they would catch on fire, and 549 00:32:21,120 --> 00:32:24,680 Speaker 1: Uber apparently knew this and still use those vehicles to 550 00:32:24,760 --> 00:32:27,360 Speaker 1: rent to drivers in Singapore to drive for the company, 551 00:32:27,400 --> 00:32:31,400 Speaker 1: which seems in my mind a tiny bit negligent. Uber 552 00:32:31,480 --> 00:32:34,120 Speaker 1: also lost its license to operate within the City of 553 00:32:34,160 --> 00:32:37,959 Speaker 1: London for failure to adhere to certain corporate responsibility requirements, 554 00:32:38,120 --> 00:32:40,000 Speaker 1: and near the end of the year, the company had 555 00:32:40,000 --> 00:32:42,200 Speaker 1: to come clean about a data breach that had happened 556 00:32:42,240 --> 00:32:45,720 Speaker 1: in twenty sixteen and allowed hackers to access information about 557 00:32:45,800 --> 00:32:48,680 Speaker 1: drivers and customers alike. Oh and also that Uber had 558 00:32:48,720 --> 00:32:51,360 Speaker 1: paid a ransom to keep the whole thing quiet. And 559 00:32:51,400 --> 00:32:54,680 Speaker 1: as I always point out, paying ransoms is bad because 560 00:32:54,720 --> 00:32:57,960 Speaker 1: it reinforces to the attackers that ransoms work, and it 561 00:32:58,040 --> 00:33:00,320 Speaker 1: just means we get more of them. So it's better 562 00:33:00,400 --> 00:33:02,840 Speaker 1: not to do that anyway. Yeah, not a good year 563 00:33:02,880 --> 00:33:04,880 Speaker 1: for Uber, especially since you know, they paid to keep 564 00:33:04,920 --> 00:33:07,040 Speaker 1: that thing quiet and it came out in the open anyway. 565 00:33:07,080 --> 00:33:10,360 Speaker 1: So twenty seventeen rough year. All Right, we got a 566 00:33:10,360 --> 00:33:13,400 Speaker 1: whole lot to get through. So let's take another quick 567 00:33:13,400 --> 00:33:15,080 Speaker 1: break and when we come back, we'll pick up with 568 00:33:15,160 --> 00:33:28,040 Speaker 1: twenty eighteen. We're back. So in twenty eighteen here in 569 00:33:28,080 --> 00:33:30,400 Speaker 1: the United States, we were coming to grips with how 570 00:33:30,440 --> 00:33:34,640 Speaker 1: to navigate city streets while all these folks on rental 571 00:33:34,680 --> 00:33:38,520 Speaker 1: electric scooters were whizzing around on sidewalks and roads, and 572 00:33:38,720 --> 00:33:40,920 Speaker 1: it was a mess. It was a mess because you know, 573 00:33:40,960 --> 00:33:43,560 Speaker 1: you had all these people just getting in the way 574 00:33:43,600 --> 00:33:46,080 Speaker 1: of pedestrians, traffic and stuff. It was a mess because 575 00:33:46,120 --> 00:33:50,200 Speaker 1: you would have folks abandoning their electric scooter wherever they wanted, 576 00:33:50,240 --> 00:33:52,800 Speaker 1: including like in the middle of the road sometimes or 577 00:33:52,840 --> 00:33:55,480 Speaker 1: some people would get upset about the electric scooters and 578 00:33:55,520 --> 00:33:57,280 Speaker 1: would round up as many as they could and like 579 00:33:57,400 --> 00:33:59,840 Speaker 1: throw them in a pond or whatever. It was just 580 00:33:59,840 --> 00:34:02,360 Speaker 1: a a huge mess and an example of a tech 581 00:34:02,440 --> 00:34:06,960 Speaker 1: oriented business model that hit the streets before local laws 582 00:34:06,960 --> 00:34:10,160 Speaker 1: and regulations could really weigh in on how this should 583 00:34:10,360 --> 00:34:15,680 Speaker 1: interact with the existing infrastructure. So example here in Atlanta. Technically, 584 00:34:15,880 --> 00:34:18,200 Speaker 1: the law is that if you're driving one of these things, 585 00:34:18,239 --> 00:34:20,200 Speaker 1: you're supposed to be on the road. You're not supposed 586 00:34:20,200 --> 00:34:22,480 Speaker 1: to be on a sidewalk. And technically I believe you're 587 00:34:22,480 --> 00:34:25,279 Speaker 1: supposed to be wearing a helmet, though nobody does, and 588 00:34:25,320 --> 00:34:28,040 Speaker 1: as far as I've seen, no one really enforces that either, 589 00:34:28,239 --> 00:34:30,760 Speaker 1: But that's what the rule is supposed to be anyway. 590 00:34:30,800 --> 00:34:33,520 Speaker 1: Twenty eighteen was also the year when five G wireless 591 00:34:33,520 --> 00:34:37,520 Speaker 1: technologies first began to take headlines by storm. And it 592 00:34:37,600 --> 00:34:40,359 Speaker 1: turns out five G is a little more complicated than 593 00:34:40,400 --> 00:34:43,160 Speaker 1: just being one G better than four G. There are 594 00:34:43,200 --> 00:34:46,200 Speaker 1: different flavors, if you will, of five G. So on 595 00:34:46,200 --> 00:34:49,080 Speaker 1: one end, you've got like the ultra high frequency five 596 00:34:49,160 --> 00:34:51,600 Speaker 1: G bands. These are the ones that are great for 597 00:34:51,760 --> 00:34:54,760 Speaker 1: super high throughput, so you're able to make huge data 598 00:34:54,800 --> 00:34:58,879 Speaker 1: transfers in very short amounts of time. But it has 599 00:34:58,920 --> 00:35:03,200 Speaker 1: a limited transmission range, and also the frequencies aren't really 600 00:35:03,200 --> 00:35:06,279 Speaker 1: good at penetrating solid matter, so if you happen to be, 601 00:35:06,400 --> 00:35:09,719 Speaker 1: say on the other side of a wall as a transmitter, 602 00:35:09,920 --> 00:35:13,160 Speaker 1: you might not actually be able to make use of 603 00:35:13,200 --> 00:35:17,920 Speaker 1: that five G transmitter. Lower frequency five G bands allow 604 00:35:17,960 --> 00:35:23,000 Speaker 1: for longer distance communications, and they penetrate through stuff like walls, 605 00:35:23,080 --> 00:35:25,799 Speaker 1: but they also have lower data throughput values, so you 606 00:35:25,840 --> 00:35:30,239 Speaker 1: don't have the quote unquote speed of the other version 607 00:35:30,280 --> 00:35:33,120 Speaker 1: of five G. But communicating all of this to customers 608 00:35:33,200 --> 00:35:36,520 Speaker 1: is hard, Like how do you say, Hey, under certain circumstances, 609 00:35:36,560 --> 00:35:38,839 Speaker 1: you're going to get this kind of experience, but under 610 00:35:38,880 --> 00:35:42,240 Speaker 1: every other circumstance it's going to be like this, that's muddy. 611 00:35:42,560 --> 00:35:45,000 Speaker 1: So a lot of the marketing that I saw really 612 00:35:45,040 --> 00:35:50,320 Speaker 1: focused on the ideal hypothetical situation you could find yourself 613 00:35:50,360 --> 00:35:53,960 Speaker 1: in where you have this incredible throughput with these ultrai 614 00:35:54,040 --> 00:35:58,880 Speaker 1: frequency five G transmitters and ignored everything else, and that 615 00:35:59,000 --> 00:36:02,640 Speaker 1: just set people up for disappointment unless they just happened 616 00:36:02,640 --> 00:36:04,920 Speaker 1: to be in an area where they could take advantage 617 00:36:05,120 --> 00:36:08,520 Speaker 1: of those ultra high frequency transmitter stations, in which case 618 00:36:08,560 --> 00:36:10,919 Speaker 1: they had a great experience as long as a wall 619 00:36:10,920 --> 00:36:13,879 Speaker 1: didn't come between them and the tower. Rumors first began 620 00:36:14,000 --> 00:36:16,840 Speaker 1: to circulate in this year that Apple had been working 621 00:36:16,880 --> 00:36:19,600 Speaker 1: on a mixed reality headset, with the original goal of 622 00:36:19,640 --> 00:36:22,320 Speaker 1: having it come to market by twenty twenty. Of course, 623 00:36:22,880 --> 00:36:26,120 Speaker 1: we would realize later that the Apple Vision pro would 624 00:36:26,160 --> 00:36:29,760 Speaker 1: not debut until early twenty twenty four, and the somewhat 625 00:36:29,800 --> 00:36:33,360 Speaker 1: tep in response to that headset has put the future 626 00:36:33,400 --> 00:36:35,640 Speaker 1: of the product line into question. I have read a 627 00:36:35,640 --> 00:36:38,920 Speaker 1: lot of analysts that have said that Apple is likely 628 00:36:39,400 --> 00:36:43,120 Speaker 1: going to release a lower cost, stripped down version of 629 00:36:43,200 --> 00:36:47,560 Speaker 1: the Vision pro that they haven't fully abandoned the tech 630 00:36:47,640 --> 00:36:50,680 Speaker 1: product line. But you know, we'll just have to see 631 00:36:50,719 --> 00:36:52,680 Speaker 1: what happens, and by the time you hear this, maybe 632 00:36:52,719 --> 00:36:56,040 Speaker 1: we'll already have seen it. That's also twenty eighteen was 633 00:36:56,040 --> 00:36:58,520 Speaker 1: the year when Elon Musk would get into trouble with SEC, 634 00:36:58,680 --> 00:37:01,560 Speaker 1: not for the last time. He tweeted out that he 635 00:37:01,719 --> 00:37:04,719 Speaker 1: intended to take Tesla private for four hundred and twenty 636 00:37:04,760 --> 00:37:09,160 Speaker 1: dollars a share and that some presumably some unnamed investors 637 00:37:09,200 --> 00:37:11,640 Speaker 1: would front him the funds to do it. But turns 638 00:37:11,680 --> 00:37:14,640 Speaker 1: out that was what we call a lie, and the 639 00:37:14,680 --> 00:37:18,000 Speaker 1: SEC wasn't so big on the whole. Hey, it was 640 00:37:18,120 --> 00:37:22,280 Speaker 1: just a weed joke. Defense Musk, as CEO of Tesla, 641 00:37:22,680 --> 00:37:27,839 Speaker 1: has a responsibility to Tesla's shareholders, and the SEC said, hey, 642 00:37:27,880 --> 00:37:31,200 Speaker 1: you can't go around defrauding everyone, even if it's a joke, 643 00:37:31,400 --> 00:37:33,839 Speaker 1: because you're the CEO. You can't say that you're going 644 00:37:33,920 --> 00:37:36,480 Speaker 1: to take the company private if you don't actually have 645 00:37:36,560 --> 00:37:39,320 Speaker 1: plans to do that. Musk would go on to build 646 00:37:39,360 --> 00:37:43,400 Speaker 1: a very contentious relationship with the SEC, and I'm very 647 00:37:43,480 --> 00:37:45,239 Speaker 1: curious to see how that plays out over the next 648 00:37:45,280 --> 00:37:49,200 Speaker 1: few years. Okay, here's the story I actually wanted to 649 00:37:49,320 --> 00:37:53,600 Speaker 1: focus on for twenty eighteen. That was the year when Facebook, 650 00:37:54,200 --> 00:37:58,360 Speaker 1: specifically Mark Zuckerberg, was called before Congress again, this time 651 00:37:58,840 --> 00:38:03,000 Speaker 1: to talk about the Cambridge Analytica scandal, which had been 652 00:38:03,080 --> 00:38:05,360 Speaker 1: broiling around the tech world for a couple of years 653 00:38:05,719 --> 00:38:09,240 Speaker 1: but really only became public knowledge and a big story 654 00:38:09,360 --> 00:38:11,719 Speaker 1: in twenty eighteen. So in case you don't remember what 655 00:38:11,800 --> 00:38:17,040 Speaker 1: this was all about, Cambridge Analytica was a political consulting 656 00:38:17,200 --> 00:38:21,400 Speaker 1: firm that worked on various political campaigns, including Trump's first 657 00:38:21,719 --> 00:38:24,440 Speaker 1: candidacy for president back in twenty sixteen here in the 658 00:38:24,560 --> 00:38:28,200 Speaker 1: United States. And it turned out that Cambridge Analytica worked 659 00:38:28,239 --> 00:38:31,879 Speaker 1: with a researcher, a data scientist, who had designed an 660 00:38:31,920 --> 00:38:36,440 Speaker 1: app essentially a personality test slash survey, and this app 661 00:38:36,520 --> 00:38:40,400 Speaker 1: would scrape data off of Facebook without the knowledge or 662 00:38:40,440 --> 00:38:43,279 Speaker 1: consent of most of the people who were affected by it. 663 00:38:43,640 --> 00:38:46,640 Speaker 1: So here's how it kind of worked. So essentially, some 664 00:38:46,880 --> 00:38:51,760 Speaker 1: users were invited to take this little personality test slash survey, 665 00:38:52,040 --> 00:38:55,040 Speaker 1: and in return they would receive a small amount of money, 666 00:38:55,160 --> 00:38:57,800 Speaker 1: so they were being paid to do it. The survey 667 00:38:58,160 --> 00:39:02,120 Speaker 1: involved installing this app extension in Facebook that let the 668 00:39:02,200 --> 00:39:06,120 Speaker 1: developer look at all the profiles connected to the users 669 00:39:06,160 --> 00:39:10,200 Speaker 1: who were taking the test as if the developer were 670 00:39:10,719 --> 00:39:14,080 Speaker 1: was that that user. So let me use an example 671 00:39:14,160 --> 00:39:17,640 Speaker 1: to explain this. Let's say that I'm on Facebook and 672 00:39:17,680 --> 00:39:21,120 Speaker 1: I'm friends with you, and I've decided to take this survey, 673 00:39:21,400 --> 00:39:24,560 Speaker 1: and let's say that you had set your Facebook profiles 674 00:39:24,640 --> 00:39:27,200 Speaker 1: so that only your friends could see it, like the 675 00:39:27,200 --> 00:39:30,279 Speaker 1: public cannot see what you post, only your friends can 676 00:39:30,320 --> 00:39:33,960 Speaker 1: see that. Well, because of the loophole that existed in 677 00:39:34,040 --> 00:39:38,120 Speaker 1: Facebook's app environment at the time when this data scientist 678 00:39:38,200 --> 00:39:41,040 Speaker 1: was doing this research, the data scientists could view your 679 00:39:41,080 --> 00:39:44,200 Speaker 1: profile as if the data scientist were me, like it 680 00:39:44,239 --> 00:39:46,279 Speaker 1: would be like if it were in my place, and 681 00:39:46,320 --> 00:39:48,799 Speaker 1: they would just see everything you had posted because I'm 682 00:39:48,840 --> 00:39:52,000 Speaker 1: your friend, and now the data scientist is posing against 683 00:39:52,040 --> 00:39:54,760 Speaker 1: me and can see everything that you've posted. And thus 684 00:39:54,960 --> 00:39:57,520 Speaker 1: they were able to get hold of tons of information 685 00:39:57,840 --> 00:40:01,319 Speaker 1: that involved people who had never consented to having their 686 00:40:01,360 --> 00:40:05,520 Speaker 1: information collected or used. Now, Cambridge Analytica was aiming to 687 00:40:05,680 --> 00:40:11,400 Speaker 1: take that information and create various targeted campaigns in various regions, 688 00:40:11,440 --> 00:40:15,520 Speaker 1: including doing stuff like discouraging people from voting. You know, 689 00:40:15,719 --> 00:40:18,000 Speaker 1: if it turned out that there was a region that 690 00:40:18,120 --> 00:40:21,000 Speaker 1: was not likely to support one of the candidates that 691 00:40:21,160 --> 00:40:24,560 Speaker 1: had hired Cambridge Analytica, they would try to discourage people 692 00:40:24,560 --> 00:40:28,480 Speaker 1: from voting at all to help boost their clients' numbers. So, 693 00:40:28,880 --> 00:40:31,040 Speaker 1: in the Trump example, if it turns out like there's 694 00:40:31,040 --> 00:40:34,240 Speaker 1: a county in a state that's not likely to support Trump, 695 00:40:34,480 --> 00:40:37,320 Speaker 1: but maybe Trump could carry the state. If that county 696 00:40:37,440 --> 00:40:40,480 Speaker 1: just didn't participate in the election, they would target that 697 00:40:40,520 --> 00:40:43,840 Speaker 1: county that kind of thing. Now, their effectiveness in actually 698 00:40:44,560 --> 00:40:47,319 Speaker 1: accomplishing this goal was a matter of debate. I saw 699 00:40:47,320 --> 00:40:49,360 Speaker 1: a lot of people kind of dismissed, saying like, yeah, 700 00:40:49,400 --> 00:40:53,160 Speaker 1: you know, what they did was unethical and potentially illegal, 701 00:40:53,480 --> 00:40:56,640 Speaker 1: but they also were not really good at it. But still, 702 00:40:56,800 --> 00:40:59,520 Speaker 1: the data collection thing was huge, Like it violated the 703 00:40:59,560 --> 00:41:04,160 Speaker 1: privacy of millions of people. So Facebook also faced a 704 00:41:04,160 --> 00:41:07,960 Speaker 1: ton of criticism because it facilitated this. It allowed it 705 00:41:08,000 --> 00:41:11,480 Speaker 1: to happen through having this lax policy, which, to be 706 00:41:11,560 --> 00:41:14,920 Speaker 1: fair to Facebook, the company had already addressed. They had 707 00:41:14,960 --> 00:41:19,080 Speaker 1: already closed that loophole before anyone had come to the 708 00:41:19,080 --> 00:41:21,720 Speaker 1: public with Cambridge Analytica. Like, I don't even think anyone 709 00:41:21,719 --> 00:41:25,240 Speaker 1: at Facebook necessarily knew about Cambridge Analytica at that point. 710 00:41:25,560 --> 00:41:28,920 Speaker 1: They just had been made aware of this loophole and 711 00:41:28,960 --> 00:41:33,080 Speaker 1: they had fixed it. Also unrelated to that, but the 712 00:41:33,080 --> 00:41:36,840 Speaker 1: company's chief operating officer at the time, Cheryl Sandberg, landed 713 00:41:36,840 --> 00:41:39,359 Speaker 1: in the hot seat because apparently she had hired a 714 00:41:39,440 --> 00:41:43,160 Speaker 1: PR firm to investigate people who are criticizing Facebook, which 715 00:41:43,160 --> 00:41:46,399 Speaker 1: seems like an intimidation tactic to me, right, Like, sure 716 00:41:46,440 --> 00:41:49,400 Speaker 1: would be a shame if you said something bad about 717 00:41:49,400 --> 00:41:52,960 Speaker 1: Facebook and then I don't know, people found out you 718 00:41:53,000 --> 00:41:56,000 Speaker 1: were having an affair on your spouse or something like that. 719 00:41:56,160 --> 00:42:01,120 Speaker 1: As very skullduggery and cloak and daggery not attract. Okay, 720 00:42:01,560 --> 00:42:04,400 Speaker 1: we are now up to twenty nineteen. We still have 721 00:42:04,440 --> 00:42:06,160 Speaker 1: several years to go, but don't worry. Like once I 722 00:42:06,200 --> 00:42:10,200 Speaker 1: hit twenty twenty, things go real fast. However, because of 723 00:42:10,200 --> 00:42:12,040 Speaker 1: the length of this episode, we do need to take 724 00:42:12,080 --> 00:42:14,480 Speaker 1: another quick break to thank our sponsors. We'll be back 725 00:42:14,560 --> 00:42:16,839 Speaker 1: to wrap all this up. Stick with me because I'm 726 00:42:16,840 --> 00:42:27,879 Speaker 1: not going to be doing this much longer. All right, 727 00:42:27,920 --> 00:42:31,440 Speaker 1: we're back, and we're up to twenty nineteen. So Tesla 728 00:42:31,640 --> 00:42:35,960 Speaker 1: introduced the cyber truck that year. It was a sometimes 729 00:42:36,320 --> 00:42:41,800 Speaker 1: a fairly embarrassing unveiling. It included a moment when windows 730 00:42:41,800 --> 00:42:45,840 Speaker 1: that were supposed to be reinforced and difficult to shatter shattered. 731 00:42:46,320 --> 00:42:49,240 Speaker 1: That was a whoopsie. But it would actually take several 732 00:42:49,320 --> 00:42:51,960 Speaker 1: more years before any cyber trucks would be ready to 733 00:42:52,000 --> 00:42:54,640 Speaker 1: ship to customers, so you could argue this was one 734 00:42:54,680 --> 00:42:58,120 Speaker 1: of those events that happened way too early. I also 735 00:42:58,160 --> 00:43:00,279 Speaker 1: think that's going to be the case for some other 736 00:43:00,320 --> 00:43:04,440 Speaker 1: Tesla products, like their robots. Like a fully fledged robot 737 00:43:04,520 --> 00:43:08,080 Speaker 1: capable of doing what their robots appeared to do at 738 00:43:08,080 --> 00:43:10,560 Speaker 1: a recent Tesla event, I imagine is going to take 739 00:43:10,560 --> 00:43:13,200 Speaker 1: a few more years. And also, I'm sure no one 740 00:43:13,400 --> 00:43:17,000 Speaker 1: anticipated someone blowing up a cyber truck outside of Trump's 741 00:43:17,000 --> 00:43:20,719 Speaker 1: Casino in Las Vegas, Nevada. So as I record this episode, 742 00:43:20,760 --> 00:43:23,200 Speaker 1: we still don't have all the details about the Las 743 00:43:23,320 --> 00:43:26,400 Speaker 1: Vegas incident. We know that there was a fatality, the 744 00:43:26,440 --> 00:43:29,400 Speaker 1: person who was in the car and seven injuries, and 745 00:43:29,440 --> 00:43:31,320 Speaker 1: it appears to have been an attack. I mean, the 746 00:43:31,800 --> 00:43:34,560 Speaker 1: news articles I've read have suggested that the car was 747 00:43:34,640 --> 00:43:38,360 Speaker 1: laiden down with potentially a mixture of fireworks and fuel, 748 00:43:38,680 --> 00:43:42,160 Speaker 1: maybe other explosive devices, and so it appears that it 749 00:43:42,200 --> 00:43:47,560 Speaker 1: was purposefully exploded outside of Trump's casino in Nevada. But 750 00:43:47,600 --> 00:43:49,799 Speaker 1: we don't know anything more than that at the time 751 00:43:49,840 --> 00:43:52,239 Speaker 1: that I'm recording this. So if more details have come 752 00:43:52,239 --> 00:43:54,840 Speaker 1: out by the time you hear this, what a tragic 753 00:43:55,040 --> 00:43:58,080 Speaker 1: and baffling sort of thing. Like I never get these 754 00:43:58,719 --> 00:44:05,000 Speaker 1: acts of public violence. It's absolutely alien to me. Twenty 755 00:44:05,120 --> 00:44:07,799 Speaker 1: nineteen is also the year that Disney Plus launched, which 756 00:44:07,800 --> 00:44:10,240 Speaker 1: I don't think merrit's a whole lot of conversation, except 757 00:44:10,239 --> 00:44:12,239 Speaker 1: to say I think of it as almost like a 758 00:44:12,280 --> 00:44:15,120 Speaker 1: tipping point where the number of streaming services started to 759 00:44:15,160 --> 00:44:18,560 Speaker 1: get out of control. I mean, we already had Hulu 760 00:44:18,960 --> 00:44:23,640 Speaker 1: and Netflix and Amazon Prime Video and HBO Max and 761 00:44:23,760 --> 00:44:26,920 Speaker 1: Disney Plus joining the ranks would make things more crowded, 762 00:44:26,960 --> 00:44:29,160 Speaker 1: and we were just getting started. I mean, Apple TV 763 00:44:29,320 --> 00:44:32,520 Speaker 1: Plus also debuted in twenty nineteen that was more of 764 00:44:32,600 --> 00:44:36,560 Speaker 1: a prestige television angle. That Apple was going for but 765 00:44:36,680 --> 00:44:41,640 Speaker 1: still adding even more options there. And today we live 766 00:44:41,640 --> 00:44:44,360 Speaker 1: in a world where there's so many different streaming services 767 00:44:44,360 --> 00:44:47,359 Speaker 1: that we're starting to see them make bundles because they 768 00:44:47,400 --> 00:44:50,600 Speaker 1: recognize that they're too many and that if they all 769 00:44:50,719 --> 00:44:53,400 Speaker 1: try and stand on their own, they're not going to 770 00:44:53,400 --> 00:44:55,920 Speaker 1: get a big slice of the pie because customers have 771 00:44:55,960 --> 00:44:58,359 Speaker 1: a limit to how much they're willing to subscribe to, 772 00:44:58,560 --> 00:45:01,879 Speaker 1: or at least most do. We're getting bundles and consolidation 773 00:45:02,560 --> 00:45:04,799 Speaker 1: because of that, because there's just so many of these 774 00:45:04,840 --> 00:45:08,120 Speaker 1: gosh darn streaming services out there, and most of them 775 00:45:08,160 --> 00:45:10,920 Speaker 1: aren't even nearly as good as dropout TV. And I 776 00:45:10,960 --> 00:45:13,000 Speaker 1: don't know anyone on dropout TV. I just know that 777 00:45:13,080 --> 00:45:15,880 Speaker 1: dropout TV is really good, all right, Speaking of a 778 00:45:15,880 --> 00:45:19,279 Speaker 1: flood of media. It's funny because in twenty nineteen we 779 00:45:19,320 --> 00:45:23,959 Speaker 1: saw what we thought was a big booming period in podcasting. 780 00:45:24,200 --> 00:45:27,760 Speaker 1: And I say it's funny because the next year would 781 00:45:27,800 --> 00:45:30,640 Speaker 1: put twenty nineteen to shame. But yeah, twenty nineteen was 782 00:45:31,000 --> 00:45:33,160 Speaker 1: an important year, Like we saw a lot of people 783 00:45:33,239 --> 00:45:37,840 Speaker 1: getting into podcasting, including high profile personalities like cod and 784 00:45:37,880 --> 00:45:41,560 Speaker 1: O'Brien starting his podcast that year. Meanwhile, there were folks 785 00:45:41,560 --> 00:45:43,480 Speaker 1: like yours truly, who had been going strong for more 786 00:45:43,480 --> 00:45:45,720 Speaker 1: than a decade to that point. And all I'm saying 787 00:45:45,800 --> 00:45:48,400 Speaker 1: is I was podcasting before it was cool now. I 788 00:45:48,440 --> 00:45:50,799 Speaker 1: think the story of twenty nineteen that I want to 789 00:45:50,800 --> 00:45:54,640 Speaker 1: focus on is that the various US regulatory agencies and 790 00:45:54,680 --> 00:45:57,440 Speaker 1: the Department of Justice here in the United States began 791 00:45:57,520 --> 00:46:02,759 Speaker 1: to consider anti trust investigations into big tech companies, primarily 792 00:46:03,120 --> 00:46:07,479 Speaker 1: members of the Big Five. And those are Facebook, which 793 00:46:07,520 --> 00:46:12,799 Speaker 1: is now Meta Alphabet, which is you know, Google's parent company, Apple, Amazon, 794 00:46:13,200 --> 00:46:16,160 Speaker 1: and Microsoft. And those stories are still playing out today, 795 00:46:16,200 --> 00:46:19,239 Speaker 1: with Google in particular looking ahead to a ruling from 796 00:46:19,280 --> 00:46:21,920 Speaker 1: a judge that could determine the future of Google Chrome, 797 00:46:22,040 --> 00:46:24,920 Speaker 1: among other things. Twenty nineteen was also the year that 798 00:46:25,040 --> 00:46:31,080 Speaker 1: TikTok exploded in popularity, so it already launched, but before 799 00:46:31,160 --> 00:46:34,799 Speaker 1: twenty nineteen, it was largely just this obscure app that 800 00:46:34,920 --> 00:46:38,040 Speaker 1: teens were using to lip sync to various music tracks 801 00:46:38,239 --> 00:46:40,719 Speaker 1: and create, you know, fun little videos that way. But 802 00:46:40,760 --> 00:46:43,719 Speaker 1: in twenty nineteen it became a fully fledged short form 803 00:46:43,840 --> 00:46:48,080 Speaker 1: video social network, and it also quickly became a concern 804 00:46:48,360 --> 00:46:52,000 Speaker 1: among people who were worried about stuff like privacy and security, 805 00:46:52,200 --> 00:46:55,879 Speaker 1: and obviously that escalated in later years and ultimately would 806 00:46:55,960 --> 00:46:59,319 Speaker 1: lead to the United States government banning the app in 807 00:46:59,400 --> 00:47:03,360 Speaker 1: twenty twenty before unless it separates from its Chinese parent company, 808 00:47:03,400 --> 00:47:06,319 Speaker 1: Byte Dance. However, I should add that the Trump administration 809 00:47:06,560 --> 00:47:10,279 Speaker 1: has asked the Supreme Court to delay the deadline for this. 810 00:47:10,440 --> 00:47:12,960 Speaker 1: It was set for January nineteenth, which is the day 811 00:47:13,000 --> 00:47:16,840 Speaker 1: before Trump takes office. He's now asking the Supreme Court 812 00:47:16,880 --> 00:47:21,960 Speaker 1: to delay that deadline, and presumably Trump would then attempt 813 00:47:21,960 --> 00:47:26,640 Speaker 1: to change the thing like it's already been signed into law. 814 00:47:26,760 --> 00:47:29,040 Speaker 1: So I don't actually know what the process is at 815 00:47:29,040 --> 00:47:31,600 Speaker 1: this point. I mean, he's the president of the United States, 816 00:47:31,640 --> 00:47:35,240 Speaker 1: but like, I don't think he just has the power 817 00:47:35,280 --> 00:47:40,080 Speaker 1: to undo something for a previous president had signed into law. 818 00:47:40,400 --> 00:47:43,400 Speaker 1: But I'm no constitutional expert, so what do I know? 819 00:47:43,640 --> 00:47:45,759 Speaker 1: All Right, So, now we're up to twenty twenty, and 820 00:47:46,280 --> 00:47:48,880 Speaker 1: nothing happened that year, At least I'd like to pretend 821 00:47:49,280 --> 00:47:53,040 Speaker 1: nothing happened in twenty twenty. The COVID nineteen pandemic obviously 822 00:47:53,080 --> 00:47:56,920 Speaker 1: made an enormous impact on everything in the world, including 823 00:47:56,960 --> 00:48:00,840 Speaker 1: the tech sector. It pushed some companies to the spotlight, 824 00:48:00,960 --> 00:48:05,000 Speaker 1: so tools like Zoom and Slack became crucial to businesses 825 00:48:05,200 --> 00:48:07,800 Speaker 1: that needed their teams to be able to work together 826 00:48:07,920 --> 00:48:10,719 Speaker 1: even when they weren't allowed to like go to the office. 827 00:48:11,000 --> 00:48:14,400 Speaker 1: And as I mentioned a moment ago, podcasts had another 828 00:48:14,560 --> 00:48:17,759 Speaker 1: explosive moment of growth in twenty twenty. Everyone and their 829 00:48:17,800 --> 00:48:20,560 Speaker 1: Dog launched a show that year. Most of those shows 830 00:48:20,680 --> 00:48:23,760 Speaker 1: ended within three episodes, but some of them stuck around. 831 00:48:24,040 --> 00:48:27,480 Speaker 1: Some tech companies or companies connected to tech, found themselves 832 00:48:27,760 --> 00:48:30,640 Speaker 1: in deep trouble in twenty twenty. You know, like we work, 833 00:48:30,920 --> 00:48:33,480 Speaker 1: we work not technically being a tech company, but treated 834 00:48:33,560 --> 00:48:36,759 Speaker 1: that way. It was already having issues before the pandemic, 835 00:48:37,040 --> 00:48:39,279 Speaker 1: but the pandemic meant that customers had no need for 836 00:48:39,360 --> 00:48:41,800 Speaker 1: office space in the meantime, and we work with sitting 837 00:48:41,800 --> 00:48:45,000 Speaker 1: on all these leases that it needed to you know, 838 00:48:45,320 --> 00:48:48,200 Speaker 1: pay off, So that put we work in a really 839 00:48:48,200 --> 00:48:51,879 Speaker 1: tough position, set it up for the path to bankruptcy 840 00:48:51,920 --> 00:48:55,080 Speaker 1: in twenty twenty three. Now, one huge story connected to 841 00:48:55,120 --> 00:48:58,319 Speaker 1: the pandemic was how the Internet facilitated the spread of 842 00:48:58,480 --> 00:49:02,360 Speaker 1: misinformation that year, twenty twenty election year in the United States. 843 00:49:02,600 --> 00:49:05,400 Speaker 1: We always see an uptick in misinformation on those years, 844 00:49:05,600 --> 00:49:08,400 Speaker 1: and boy howdy, did we see it in twenty twenty, Like, 845 00:49:08,520 --> 00:49:13,000 Speaker 1: remember stories like how five G wireless technologies were supposedly 846 00:49:13,000 --> 00:49:16,440 Speaker 1: carrying the coronavirus in some inexplicable way, Like there were 847 00:49:16,480 --> 00:49:19,479 Speaker 1: actual people who believe that. I don't understand how that's 848 00:49:19,560 --> 00:49:22,520 Speaker 1: the case. I guess if you aren't aware of how 849 00:49:22,640 --> 00:49:26,879 Speaker 1: wireless technologies work and the way or lack of way 850 00:49:26,920 --> 00:49:32,200 Speaker 1: that they interact with organisms, I suppose you could be convinced. 851 00:49:32,239 --> 00:49:35,800 Speaker 1: But like it just takes the smallest amount of education 852 00:49:35,960 --> 00:49:39,879 Speaker 1: to learn that just isn't a viable possibility. I still 853 00:49:39,920 --> 00:49:42,880 Speaker 1: can't wrap my head around that one. It's been four years, y'all. 854 00:49:43,160 --> 00:49:45,640 Speaker 1: And that's just one example of one of the weird 855 00:49:45,719 --> 00:49:49,200 Speaker 1: fringe theories that went around related to tech and the coronavirus. 856 00:49:49,200 --> 00:49:52,160 Speaker 1: There's so many. And because it was an election year, 857 00:49:52,200 --> 00:49:54,560 Speaker 1: and while twenty sixteen gave us a taste of how 858 00:49:54,600 --> 00:49:58,520 Speaker 1: misinformation and propaganda campaigns could work, largely funded from foreign 859 00:49:58,840 --> 00:50:02,279 Speaker 1: sources and primarily from Russia, honestly, twenty sixteen felt like 860 00:50:02,320 --> 00:50:06,240 Speaker 1: an informal rehearsal compared to what happened in twenty twenty plus. 861 00:50:06,520 --> 00:50:09,200 Speaker 1: We were starting to see how emerging technologies like deep 862 00:50:09,239 --> 00:50:12,760 Speaker 1: fakes could potentially play a part in misinformation campaigns down 863 00:50:12,800 --> 00:50:15,399 Speaker 1: the road. It was in twenty twenty when Trump as 864 00:50:15,440 --> 00:50:20,000 Speaker 1: president first attempted to ban TikTok through an executive order. Interestingly, 865 00:50:20,120 --> 00:50:23,160 Speaker 1: now that TikTok is actually facing a legal ban here 866 00:50:23,200 --> 00:50:25,279 Speaker 1: in the United States, not just an executive order but 867 00:50:25,400 --> 00:50:28,959 Speaker 1: a law that's been passed, Trump now opposes this move, 868 00:50:29,360 --> 00:50:32,000 Speaker 1: perhaps because one of his big financial supporters during the 869 00:50:32,040 --> 00:50:35,040 Speaker 1: twenty twenty four campaign also happened to be a big 870 00:50:35,040 --> 00:50:38,200 Speaker 1: investor in byte Dance, which is TikTok's parent company. But 871 00:50:38,280 --> 00:50:42,160 Speaker 1: who's to say, okay. There's also a move to take 872 00:50:42,239 --> 00:50:44,800 Speaker 1: the protections out of section two thirty of the nineteen 873 00:50:44,880 --> 00:50:49,239 Speaker 1: ninety six Communications Decency Act in twenty twenty. That's the 874 00:50:49,920 --> 00:50:53,760 Speaker 1: measure that protects platforms online platforms from being held legally 875 00:50:53,880 --> 00:50:57,640 Speaker 1: liable for content that was posted by users. So if 876 00:50:57,680 --> 00:51:01,520 Speaker 1: a user posts something that's illegal, the platform is not 877 00:51:01,840 --> 00:51:07,319 Speaker 1: held responsible for that, particularly if the platform responds appropriately 878 00:51:07,480 --> 00:51:10,960 Speaker 1: by removing the content and restricting the person from being 879 00:51:11,000 --> 00:51:13,640 Speaker 1: able to do that again. I suspect that we're going 880 00:51:13,719 --> 00:51:16,000 Speaker 1: to see a much harder push to strip Section two 881 00:51:16,080 --> 00:51:19,960 Speaker 1: thirty protections moving forward. In twenty twenty five. I would 882 00:51:20,000 --> 00:51:23,080 Speaker 1: be amazed if section two thirty remains a thing by 883 00:51:23,120 --> 00:51:26,279 Speaker 1: the end of Trump's presidency. We'll see, all right. So 884 00:51:26,320 --> 00:51:29,200 Speaker 1: twenty twenty is also the year that Quibi debuted. Do 885 00:51:29,239 --> 00:51:32,840 Speaker 1: you remember Quibi? Quibi debuted, it sputtered, and it died, 886 00:51:33,000 --> 00:51:34,879 Speaker 1: all in the span of a year. If you don't 887 00:51:34,920 --> 00:51:38,319 Speaker 1: remember Quibi. It was a short form video platform that 888 00:51:38,360 --> 00:51:41,560 Speaker 1: poured a whole lot of money into very highly produced, 889 00:51:41,880 --> 00:51:45,400 Speaker 1: professional video projects. So this wasn't user generated. This was 890 00:51:45,480 --> 00:51:49,280 Speaker 1: meant to be like Netflix or Amazon Prime or whatever, 891 00:51:49,360 --> 00:51:52,480 Speaker 1: except for short form video. It was made with an 892 00:51:52,560 --> 00:51:55,720 Speaker 1: audience watching on mobile devices in mind. In fact, every 893 00:51:55,760 --> 00:51:58,640 Speaker 1: production was made so that you could view it portrait 894 00:51:58,760 --> 00:52:02,239 Speaker 1: or landscape, to the point where like directors had to 895 00:52:02,280 --> 00:52:06,279 Speaker 1: frame their shots so that it would work in either orientation. 896 00:52:06,600 --> 00:52:08,360 Speaker 1: But the project did not work out so well in 897 00:52:08,400 --> 00:52:10,880 Speaker 1: a world where everybody was stuck at home because of 898 00:52:10,920 --> 00:52:13,880 Speaker 1: the coronavirus and they would just watch stuff on larger 899 00:52:13,920 --> 00:52:17,799 Speaker 1: screens instead of their phones. Most analysts that I've read 900 00:52:17,920 --> 00:52:20,360 Speaker 1: up on seemed to agree that even if the pandemic 901 00:52:20,360 --> 00:52:22,800 Speaker 1: hadn't been a thing, Quibi's success was a long shot 902 00:52:22,840 --> 00:52:26,480 Speaker 1: at best. And that you couldn't say that the coronavirus 903 00:52:26,600 --> 00:52:29,600 Speaker 1: made Quibi fail, but sure to help. I think the 904 00:52:29,640 --> 00:52:32,480 Speaker 1: story to reflect on in twenty twenty is the solar 905 00:52:32,600 --> 00:52:35,040 Speaker 1: Winds hack, which opened their eyes to the dangers of 906 00:52:35,080 --> 00:52:38,600 Speaker 1: supply chain attacks. Now that's a complicated story, but i'll 907 00:52:38,600 --> 00:52:41,880 Speaker 1: sum it up. A company called solar Winds creates software 908 00:52:41,920 --> 00:52:46,279 Speaker 1: for enterprise clients, so that is other businesses. Hackers were 909 00:52:46,320 --> 00:52:50,160 Speaker 1: able to hide some malware into a solar Winds update 910 00:52:50,440 --> 00:52:54,080 Speaker 1: to software called Orion. So rather than focus on the 911 00:52:54,200 --> 00:52:57,640 Speaker 1: end targets such as you know, specific agencies in the 912 00:52:57,760 --> 00:53:01,840 Speaker 1: US government, the hackers implant the malware in a trusted 913 00:53:01,880 --> 00:53:05,960 Speaker 1: partner's software package, which then got pushed out to the 914 00:53:06,000 --> 00:53:09,960 Speaker 1: customers all around the world and infected those customers. So 915 00:53:10,040 --> 00:53:13,839 Speaker 1: this was a huge and very effective cyber attack. While 916 00:53:13,880 --> 00:53:17,640 Speaker 1: thousands of organizations were impacted, it appeared as though the 917 00:53:17,640 --> 00:53:21,560 Speaker 1: attackers had a few specific targets in mind. It's just 918 00:53:21,880 --> 00:53:25,440 Speaker 1: their attack was super effective across all of solar Winds 919 00:53:25,480 --> 00:53:29,160 Speaker 1: customers using the Orion platforms. So the reason I included 920 00:53:29,239 --> 00:53:32,239 Speaker 1: in this episode isn't because of how devastating the attack was, 921 00:53:32,280 --> 00:53:35,600 Speaker 1: although it was quite devastating, but rather it illustrates how 922 00:53:35,640 --> 00:53:38,880 Speaker 1: effective a supply chain attack can be. So, if you 923 00:53:39,080 --> 00:53:42,920 Speaker 1: have identified that your target has really strong defenses and 924 00:53:42,960 --> 00:53:45,400 Speaker 1: you're not likely to be able to penetrate their systems, 925 00:53:45,880 --> 00:53:49,840 Speaker 1: maybe you target one of their trusted partners so you 926 00:53:49,840 --> 00:53:53,680 Speaker 1: can find a way in. It's scary stuff. In twenty 927 00:53:53,800 --> 00:53:57,960 Speaker 1: twenty one, we had a few interesting things happen. Antivirus 928 00:53:58,000 --> 00:54:02,600 Speaker 1: software developer and totally bunked's personality, John McAfee passed away. 929 00:54:03,000 --> 00:54:05,919 Speaker 1: I know it's ghost to speak ill of the dead, 930 00:54:05,960 --> 00:54:10,520 Speaker 1: but this guy really was wild. So in a normal world, 931 00:54:10,880 --> 00:54:13,360 Speaker 1: McAfee would best be known as a guy who created 932 00:54:13,480 --> 00:54:16,319 Speaker 1: McAfee antivirus. But we don't live in a normal world. 933 00:54:16,400 --> 00:54:18,000 Speaker 1: We live in this one, and this is the world 934 00:54:18,040 --> 00:54:21,080 Speaker 1: where McAfee was perhaps best known for living a kind 935 00:54:21,120 --> 00:54:24,720 Speaker 1: of sex, drugs, and rock and roll lifestyle, which comes 936 00:54:24,760 --> 00:54:27,359 Speaker 1: complete with him living in a different country and at 937 00:54:27,400 --> 00:54:32,480 Speaker 1: one point being under suspicion for a committing murder on 938 00:54:32,480 --> 00:54:34,960 Speaker 1: one of his neighbors, something that may or may not 939 00:54:35,000 --> 00:54:38,000 Speaker 1: have involved him, we don't know. He was wanted in 940 00:54:38,040 --> 00:54:41,240 Speaker 1: the United States for tax evasion, and he was arrested 941 00:54:41,280 --> 00:54:45,680 Speaker 1: while in Barcelona Spain, and then he was found dead 942 00:54:45,840 --> 00:54:48,520 Speaker 1: in his prison cell not long after. The Spanish courts 943 00:54:48,520 --> 00:54:52,720 Speaker 1: that approved his extradition to the United States absolutely wild, 944 00:54:53,160 --> 00:54:56,479 Speaker 1: and the implication is that he did take his own 945 00:54:56,480 --> 00:54:59,640 Speaker 1: life while he was in jail rather than be extradited 946 00:54:59,680 --> 00:55:03,720 Speaker 1: to the un to stand trial. Microsoft launched Windows eleven 947 00:55:03,760 --> 00:55:06,560 Speaker 1: and twenty twenty one, and it has generally received a 948 00:55:06,600 --> 00:55:11,520 Speaker 1: more positive reception. There are some notable exceptions that would 949 00:55:11,840 --> 00:55:15,680 Speaker 1: unfold later on. Microsoft had skipped over the number nine. 950 00:55:16,120 --> 00:55:19,000 Speaker 1: It went from Windows eight to Windows ten. Windows ten 951 00:55:19,080 --> 00:55:21,279 Speaker 1: got a fair reception. Windows eleven seemed to be an 952 00:55:21,280 --> 00:55:25,280 Speaker 1: improvement in many ways, but some recent developments have critics 953 00:55:25,360 --> 00:55:29,600 Speaker 1: upset at Microsoft, namely the recall or recall feature, which 954 00:55:29,640 --> 00:55:32,680 Speaker 1: takes snapshots of user activities on their computers, so like, 955 00:55:32,880 --> 00:55:36,600 Speaker 1: you can search through these snapshots to find times when 956 00:55:36,640 --> 00:55:39,120 Speaker 1: you've visited certain sites or use certain apps or whatever. 957 00:55:39,320 --> 00:55:42,839 Speaker 1: Advocates have argued that this represents a privacy and security vulnerability, 958 00:55:43,040 --> 00:55:46,640 Speaker 1: and Microsoft did tweak it quite a bit in response 959 00:55:46,680 --> 00:55:49,799 Speaker 1: to those criticisms. But it's opt in, so at least 960 00:55:49,800 --> 00:55:52,480 Speaker 1: you're not like using it by default. But I guess 961 00:55:52,600 --> 00:55:54,759 Speaker 1: the big story that year was really a pair of 962 00:55:54,800 --> 00:55:57,600 Speaker 1: news items that arguably had some connection to one another, 963 00:55:57,680 --> 00:56:00,920 Speaker 1: and both of them involved Facebook. So in the first story, 964 00:56:00,960 --> 00:56:04,319 Speaker 1: a whistleblower named Francis Hogan, who had previously worked as 965 00:56:04,320 --> 00:56:07,319 Speaker 1: a data scientist for Facebook, came forward with the accusations 966 00:56:07,320 --> 00:56:10,759 Speaker 1: and documentation showing that Facebook's policies did a lot of 967 00:56:10,800 --> 00:56:14,080 Speaker 1: bad stuff, like putting children at risk, for example, and 968 00:56:14,120 --> 00:56:17,680 Speaker 1: that Facebook executives further know about this, and the company 969 00:56:17,719 --> 00:56:20,120 Speaker 1: had worked hard to hide it from the public while 970 00:56:20,160 --> 00:56:26,600 Speaker 1: still pursuing these variously harmful or potentially harmful policies, and 971 00:56:26,640 --> 00:56:28,719 Speaker 1: that really what was all about was the company was 972 00:56:28,760 --> 00:56:32,080 Speaker 1: seeking profit over anything else, even in the wake of 973 00:56:32,120 --> 00:56:35,200 Speaker 1: potentially hurting not just children, which is bad enough all 974 00:56:35,239 --> 00:56:39,800 Speaker 1: by itself, but also hurting things like institutions like democracy. 975 00:56:40,120 --> 00:56:43,760 Speaker 1: And Haugen's testimony to Congress was brutal. It is wild 976 00:56:43,840 --> 00:56:46,040 Speaker 1: to me that we have largely forgotten the things she 977 00:56:46,160 --> 00:56:49,200 Speaker 1: brought forward back in twenty twenty one, because it was 978 00:56:49,280 --> 00:56:53,200 Speaker 1: extensive and I thought pretty damning, But you know, we've 979 00:56:53,280 --> 00:56:56,120 Speaker 1: largely moved on since then. Now. Not long after her 980 00:56:56,200 --> 00:57:00,440 Speaker 1: accusations became headlines, Facebook added to the chaos by announcing 981 00:57:00,440 --> 00:57:03,640 Speaker 1: the company was changing its name to Meta Now. Ostensibly 982 00:57:03,719 --> 00:57:06,160 Speaker 1: this was because Facebook would be investing heavily in the 983 00:57:06,360 --> 00:57:10,120 Speaker 1: metaverse concept, which is a vague vision of the future 984 00:57:10,120 --> 00:57:13,880 Speaker 1: of online interactivity that typically involves some form of mixed 985 00:57:13,920 --> 00:57:17,400 Speaker 1: reality component. Critics argued that it was an attempt for 986 00:57:17,440 --> 00:57:20,520 Speaker 1: the company to distance itself from the very real accusations 987 00:57:20,560 --> 00:57:24,000 Speaker 1: flying from the testimony and documentation that had come out 988 00:57:24,000 --> 00:57:26,520 Speaker 1: earlier in the year. However, here we are in twenty 989 00:57:26,600 --> 00:57:29,120 Speaker 1: twenty four and Meta is still very much a thing. 990 00:57:29,280 --> 00:57:32,080 Speaker 1: Investors sometimes get a little miffed at Mark Zuckerberg for 991 00:57:32,120 --> 00:57:34,720 Speaker 1: continuing to pour so much money and R and D 992 00:57:34,960 --> 00:57:38,800 Speaker 1: into mixed reality AI and the metaverse concepts, but other 993 00:57:38,880 --> 00:57:43,680 Speaker 1: than that, the Meta ship continues to sail fairly smoothly 994 00:57:43,760 --> 00:57:46,680 Speaker 1: all things considered. All Right, now, we're up to twenty 995 00:57:46,720 --> 00:57:49,720 Speaker 1: twenty two. Man, let's power through it saw a lot 996 00:57:49,760 --> 00:57:52,960 Speaker 1: of stuff happen in twenty twenty two. Non fungible tokens 997 00:57:53,000 --> 00:57:57,200 Speaker 1: aka NFTs had become mainstream headlines in twenty twenty one. 998 00:57:57,360 --> 00:57:59,280 Speaker 1: They'd been around a little longer than that, but twenty 999 00:57:59,320 --> 00:58:02,480 Speaker 1: twenty one was when the mainstream public really became aware 1000 00:58:02,520 --> 00:58:06,000 Speaker 1: of them, and that's when we saw a speculative market 1001 00:58:06,040 --> 00:58:09,720 Speaker 1: form around these blockchain based digital markers, and I think 1002 00:58:09,720 --> 00:58:12,680 Speaker 1: a lot of the speculation was rooted deeply in ignorance 1003 00:58:12,760 --> 00:58:16,200 Speaker 1: as to what NFTs actually are, or maybe more accurately, 1004 00:58:16,240 --> 00:58:19,240 Speaker 1: what they are not. Anyway, it was early in twenty 1005 00:58:19,280 --> 00:58:22,600 Speaker 1: twenty two when that speculative market collapsed in on itself, 1006 00:58:23,040 --> 00:58:26,240 Speaker 1: and NFTs as a whole lost nearly all of their 1007 00:58:26,640 --> 00:58:30,520 Speaker 1: value in the process. So what had been the gold 1008 00:58:30,560 --> 00:58:33,520 Speaker 1: mine of twenty twenty one became kind of a punchline 1009 00:58:33,960 --> 00:58:36,880 Speaker 1: in twenty twenty two, and suddenly some people found themselves 1010 00:58:36,880 --> 00:58:40,600 Speaker 1: in possession of a digital marker that was worthless after 1011 00:58:40,640 --> 00:58:44,160 Speaker 1: they had spent perhaps thousands or tens of thousands of 1012 00:58:44,200 --> 00:58:47,320 Speaker 1: dollars on them. One big story that spanned nearly all 1013 00:58:47,360 --> 00:58:51,160 Speaker 1: of twenty twenty two was Elon Musk's quest to purchase Twitter. 1014 00:58:51,640 --> 00:58:54,440 Speaker 1: His intentions first became clear in the spring of twenty 1015 00:58:54,480 --> 00:58:57,200 Speaker 1: twenty two, but not long after he announced his plan 1016 00:58:57,320 --> 00:59:00,520 Speaker 1: to buy it, he began to backpedal, and he tried 1017 00:59:00,520 --> 00:59:02,200 Speaker 1: to get out of the deal, but doing so would 1018 00:59:02,200 --> 00:59:04,880 Speaker 1: have cost him an awful lot of money because he 1019 00:59:04,920 --> 00:59:08,560 Speaker 1: had already entered into the negotiations in good faith and 1020 00:59:08,640 --> 00:59:12,720 Speaker 1: backing out just wasn't an option, at least not without consequences. 1021 00:59:13,080 --> 00:59:15,920 Speaker 1: At one point, the United States court system got involved, 1022 00:59:16,040 --> 00:59:19,840 Speaker 1: and Musk capitulated and agreed to buy Twitter for a 1023 00:59:19,840 --> 00:59:24,040 Speaker 1: whopping forty four billion dollars before he was forced to 1024 00:59:24,080 --> 00:59:29,600 Speaker 1: do so, so he did it by choice technically. One 1025 00:59:29,640 --> 00:59:32,720 Speaker 1: of the investors in Twitter, a company called Fidelity, would 1026 00:59:32,840 --> 00:59:36,480 Speaker 1: later estimate in twenty twenty four that Twitter's value had 1027 00:59:36,760 --> 00:59:39,520 Speaker 1: gone all the way down to nine point four billion dollars. 1028 00:59:39,600 --> 00:59:41,840 Speaker 1: So he spent forty four billion and now it's worth 1029 00:59:41,960 --> 00:59:44,200 Speaker 1: nine point four billion. That is a heck of a 1030 00:59:44,240 --> 00:59:47,840 Speaker 1: decline in just two years. Russia's invasion of Ukraine would 1031 00:59:47,880 --> 00:59:51,080 Speaker 1: also have a huge impact on technology. For one thing, 1032 00:59:51,080 --> 00:59:53,720 Speaker 1: the US would start to restrict business dealings with Russian 1033 00:59:53,760 --> 00:59:57,560 Speaker 1: based companies that included tech companies like Kaspersky, which is 1034 00:59:57,600 --> 01:00:01,280 Speaker 1: a cybersecurity firm, and many a American based tech companies 1035 01:00:01,320 --> 01:00:05,000 Speaker 1: would pull out of operations within Russia. So you had 1036 01:00:05,160 --> 01:00:07,960 Speaker 1: companies just picking up stakes and moving out of Russia, 1037 01:00:08,000 --> 01:00:12,360 Speaker 1: sometimes even suspending customer access to platforms within Russia in 1038 01:00:12,400 --> 01:00:15,040 Speaker 1: the process. YouTube has been in hot water with the 1039 01:00:15,120 --> 01:00:19,439 Speaker 1: Russian government on and off again since this, because they 1040 01:00:19,560 --> 01:00:24,080 Speaker 1: have done things like block state backed misinformation channels because 1041 01:00:24,120 --> 01:00:28,439 Speaker 1: it's misinformation and Russia's government doesn't like that very much. Also, 1042 01:00:28,600 --> 01:00:31,320 Speaker 1: as the tech sector began to settle after the chaos 1043 01:00:31,320 --> 01:00:33,680 Speaker 1: that had followed the COVID outbreak, we started seeing the 1044 01:00:33,720 --> 01:00:37,440 Speaker 1: first of many rounds of layoffs. Companies that had grown 1045 01:00:37,560 --> 01:00:40,960 Speaker 1: very quickly to respond to changing conditions during the pandemic 1046 01:00:40,960 --> 01:00:44,120 Speaker 1: found themselves ready to slim down again once things were 1047 01:00:44,160 --> 01:00:47,320 Speaker 1: starting to settle, and we still see that happening today. 1048 01:00:47,800 --> 01:00:50,720 Speaker 1: There have been numerous cuts in the sector and it's 1049 01:00:50,760 --> 01:00:54,400 Speaker 1: really demoralizing and hard to see those headlines month after month. 1050 01:00:54,920 --> 01:00:59,200 Speaker 1: Also in twenty twenty two, we got open AI's chat GPT. 1051 01:00:59,560 --> 01:01:02,880 Speaker 1: Arguably that should be my big story, because generative AI 1052 01:01:03,040 --> 01:01:05,640 Speaker 1: would continue to be an enormous topic in the following years. 1053 01:01:05,920 --> 01:01:07,680 Speaker 1: But I think the big story I want to focus 1054 01:01:07,760 --> 01:01:11,000 Speaker 1: on from twenty twenty two is the implosion of FTX, 1055 01:01:11,080 --> 01:01:14,760 Speaker 1: the cryptocurrency exchange. So as a reminder, the face of 1056 01:01:14,800 --> 01:01:18,280 Speaker 1: this company, the co founder Sam Bankman Freed or SBF, 1057 01:01:18,680 --> 01:01:21,320 Speaker 1: would be accused of running a kind of fraud in 1058 01:01:21,320 --> 01:01:25,640 Speaker 1: which he was taking customer deposits in FTX and then 1059 01:01:25,760 --> 01:01:28,640 Speaker 1: using those to cover investments that were being made by 1060 01:01:28,680 --> 01:01:32,600 Speaker 1: a sister company called Alameda Research. So Alameda was kind 1061 01:01:32,600 --> 01:01:35,120 Speaker 1: of a hedge fund company that would invest in businesses 1062 01:01:35,160 --> 01:01:38,440 Speaker 1: related to the crypto space, but to cover investments that 1063 01:01:38,480 --> 01:01:42,520 Speaker 1: were made there. SBF and team were siphoning money away 1064 01:01:42,520 --> 01:01:44,760 Speaker 1: from FTX in the hopes that no one would really 1065 01:01:44,800 --> 01:01:46,720 Speaker 1: notice and that maybe in the long run they would 1066 01:01:46,720 --> 01:01:49,200 Speaker 1: be able to cover all their bases. But an expose 1067 01:01:49,440 --> 01:01:52,840 Speaker 1: in late twenty twenty two led to investigations, which in 1068 01:01:52,880 --> 01:01:55,800 Speaker 1: turn led to SBF getting the boot from the company 1069 01:01:55,840 --> 01:01:59,400 Speaker 1: and ultimately to his arrest. He has since been sentenced 1070 01:01:59,400 --> 01:02:02,040 Speaker 1: to twenty five years in prison and ordered to pay 1071 01:02:02,160 --> 01:02:06,160 Speaker 1: eleven billion with a B dollars in forfeiture, which is 1072 01:02:06,200 --> 01:02:08,760 Speaker 1: something I suspect will not ever be paid off, but heck, 1073 01:02:08,800 --> 01:02:12,280 Speaker 1: who knows, maybe he'll get a pardon. This administration lot 1074 01:02:12,320 --> 01:02:15,000 Speaker 1: to see now. The collapse of FDx added to the 1075 01:02:15,040 --> 01:02:18,040 Speaker 1: instability in general in the crypto market. It led to 1076 01:02:18,120 --> 01:02:21,960 Speaker 1: other crypto exchanges and other companies falling apart as a result, 1077 01:02:22,080 --> 01:02:24,640 Speaker 1: including a couple of banks that had been instrumental in 1078 01:02:24,640 --> 01:02:27,800 Speaker 1: the crypto world, and for a short while, it looked 1079 01:02:27,840 --> 01:02:30,560 Speaker 1: like crypto in general, could end up sinking as a 1080 01:02:30,560 --> 01:02:34,120 Speaker 1: result of the FTX failure, but ultimately the industry did 1081 01:02:34,200 --> 01:02:36,760 Speaker 1: weather the storm and would prove to be as volatile 1082 01:02:36,800 --> 01:02:38,920 Speaker 1: as ever in the long run. But yeah, the failure 1083 01:02:38,960 --> 01:02:41,959 Speaker 1: of FTX was a reminder that sometimes tech isn't there 1084 01:02:42,000 --> 01:02:45,760 Speaker 1: to enable new things, but maybe just mask attempts to 1085 01:02:45,760 --> 01:02:48,440 Speaker 1: commit fraud, you know, just make it confusing enough and 1086 01:02:48,520 --> 01:02:53,760 Speaker 1: nobody asks any questions. Okay. Twenty twenty three this was 1087 01:02:53,800 --> 01:02:56,040 Speaker 1: the year that Musk changed the name of Twitter to 1088 01:02:56,400 --> 01:02:59,840 Speaker 1: X while making sweeping changes that alienated many users and 1089 01:03:00,080 --> 01:03:03,240 Speaker 1: encouraged a bunch of others. The users who were alienated 1090 01:03:03,320 --> 01:03:07,400 Speaker 1: often left and ultimately would go to some other alternative 1091 01:03:07,680 --> 01:03:12,560 Speaker 1: short form messaging or social network services. The whole pay 1092 01:03:12,640 --> 01:03:15,800 Speaker 1: for a verified check mark strategy must introduce chased off 1093 01:03:15,800 --> 01:03:18,080 Speaker 1: a ton of people who had previously received a check 1094 01:03:18,080 --> 01:03:21,200 Speaker 1: mark from being a notable personality of some sort. I 1095 01:03:21,240 --> 01:03:23,000 Speaker 1: was one of those people. I had a checkmark and 1096 01:03:23,040 --> 01:03:24,800 Speaker 1: I was on Twitter. I would argue I am not 1097 01:03:24,960 --> 01:03:27,640 Speaker 1: actually notable, but I did have a check mark. But 1098 01:03:27,800 --> 01:03:30,200 Speaker 1: you know, twenty twenty three was really a year that 1099 01:03:30,320 --> 01:03:34,400 Speaker 1: was all about AI. So Chad GBT had debuted in 1100 01:03:34,440 --> 01:03:36,960 Speaker 1: twenty twenty two, but it was twenty twenty three that 1101 01:03:37,080 --> 01:03:40,840 Speaker 1: saw artificial intelligence make headlines pretty much every single day 1102 01:03:40,840 --> 01:03:43,960 Speaker 1: of the year. Sometimes it was because of the incredible 1103 01:03:43,960 --> 01:03:48,800 Speaker 1: display of AI's capabilities, some of which really are you know, amazing. 1104 01:03:49,000 --> 01:03:53,280 Speaker 1: There were also tons of think pieces about the technology's shortcomings, 1105 01:03:53,320 --> 01:03:59,240 Speaker 1: everything from hallucinations or confabulations to the possible impact on 1106 01:03:59,480 --> 01:04:03,520 Speaker 1: creative like you know, whether or not AI is plagiarizing 1107 01:04:03,560 --> 01:04:05,840 Speaker 1: the work of actual human beings or putting them out 1108 01:04:05,840 --> 01:04:08,520 Speaker 1: of work because folks will use AI to I don't know, 1109 01:04:08,640 --> 01:04:13,400 Speaker 1: generate marketing slogans as opposed to hiring marketers or create 1110 01:04:13,480 --> 01:04:15,720 Speaker 1: artwork instead of hiring an artist, that kind of stuff. 1111 01:04:16,000 --> 01:04:19,040 Speaker 1: We got lots of warnings from critics who were worried 1112 01:04:19,080 --> 01:04:22,560 Speaker 1: about deploying a technology before it was thoroughly tested to 1113 01:04:22,560 --> 01:04:24,919 Speaker 1: make sure it was safe. You know, we just could 1114 01:04:25,000 --> 01:04:27,959 Speaker 1: not get away from AI in twenty twenty three. Whether 1115 01:04:28,000 --> 01:04:32,000 Speaker 1: it was Open AI or Google or Meta or Microsoft 1116 01:04:32,080 --> 01:04:37,120 Speaker 1: or Amazon or some other company. Artificial intelligence dominated headlines 1117 01:04:37,200 --> 01:04:39,960 Speaker 1: throughout twenty twenty three, and we asked a lot of 1118 01:04:40,040 --> 01:04:43,000 Speaker 1: questions and we only got a few answers, and none 1119 01:04:43,080 --> 01:04:46,480 Speaker 1: of them were relevant or good. So we're still working 1120 01:04:46,520 --> 01:04:50,160 Speaker 1: on that related to this. In twenty twenty three, Sam Altman, 1121 01:04:50,240 --> 01:04:52,680 Speaker 1: who was one of the co founders and the CEO 1122 01:04:52,880 --> 01:04:56,280 Speaker 1: of open Ai, got the boot from the Board of directors. 1123 01:04:56,560 --> 01:05:00,640 Speaker 1: He was fired, but then he was a to return 1124 01:05:00,680 --> 01:05:03,920 Speaker 1: to the company after enormous pressure was put on the 1125 01:05:03,920 --> 01:05:07,160 Speaker 1: Board of directors to reverse their decision. The board was 1126 01:05:07,240 --> 01:05:10,360 Speaker 1: concerned that Altman was leading open Ai in a direction 1127 01:05:10,440 --> 01:05:13,600 Speaker 1: that was in opposition to its founding principles, which were 1128 01:05:14,120 --> 01:05:19,560 Speaker 1: to develop AI in a safe and accountable and transparent way. 1129 01:05:20,000 --> 01:05:22,560 Speaker 1: So the board directors were saying, that's not what's happening, 1130 01:05:22,680 --> 01:05:24,480 Speaker 1: and we need to get rid of this guy because 1131 01:05:24,520 --> 01:05:27,360 Speaker 1: he's pushing the company to do something that's fundamentally in 1132 01:05:27,400 --> 01:05:31,120 Speaker 1: opposition to what it's supposed to be about. But the 1133 01:05:31,120 --> 01:05:34,920 Speaker 1: board's decision was reversed. The board was then kind of 1134 01:05:35,240 --> 01:05:38,000 Speaker 1: given the boot and replaced, and Sam Altman to this 1135 01:05:38,120 --> 01:05:41,520 Speaker 1: day remains the CEO of open Ai, And considering the 1136 01:05:41,560 --> 01:05:45,520 Speaker 1: stories involving open Ai since then, I think the Board 1137 01:05:45,520 --> 01:05:51,880 Speaker 1: of Directors' concerns were merited. I'll say that. And then finally, 1138 01:05:52,360 --> 01:05:54,840 Speaker 1: after all this time, we get to twenty twenty four, 1139 01:05:55,120 --> 01:05:58,000 Speaker 1: and a lot happened in twenty twenty four, but let's 1140 01:05:58,000 --> 01:06:01,080 Speaker 1: be honest, there's only one story that we really need 1141 01:06:01,080 --> 01:06:04,160 Speaker 1: to talk about. The big story in tech is that 1142 01:06:04,320 --> 01:06:08,360 Speaker 1: Jonathan Strickland, co founder and host of the popular podcast 1143 01:06:08,400 --> 01:06:11,080 Speaker 1: tech Stuff, announced he would be stepping down from the 1144 01:06:11,120 --> 01:06:16,160 Speaker 1: show in early twenty twenty five. In fact, next episode 1145 01:06:16,240 --> 01:06:20,120 Speaker 1: will be my last as host of the show officially, 1146 01:06:20,440 --> 01:06:23,160 Speaker 1: though again who knows. I may occasionally pop in with 1147 01:06:23,240 --> 01:06:27,600 Speaker 1: a special episode here or there, depending upon circumstances, but 1148 01:06:28,080 --> 01:06:30,240 Speaker 1: I should add I don't have any plans to do 1149 01:06:30,280 --> 01:06:33,760 Speaker 1: that just yet. I'm just saying it's a possibility. I'm 1150 01:06:33,800 --> 01:06:38,000 Speaker 1: not leaving it off the table. It could happen. We 1151 01:06:38,120 --> 01:06:40,200 Speaker 1: just don't have plans for it. But yeah, the curtain 1152 01:06:40,240 --> 01:06:42,640 Speaker 1: is closing in on me and I've got one episode 1153 01:06:42,680 --> 01:06:45,960 Speaker 1: left before I shuffle off backstage. And in our next episode, 1154 01:06:46,160 --> 01:06:49,000 Speaker 1: you will meet the new hosts of Tech Stuff, the 1155 01:06:49,000 --> 01:06:51,520 Speaker 1: people who will bring the show to new heights that 1156 01:06:51,600 --> 01:06:55,360 Speaker 1: I can only dream about. I'm genuinely excited to see 1157 01:06:55,360 --> 01:06:58,000 Speaker 1: where they take it and how they share their own 1158 01:06:58,120 --> 01:07:01,560 Speaker 1: view of technology. And yeah, the show will be different, 1159 01:07:01,720 --> 01:07:04,120 Speaker 1: but I have no doubts it will be even better 1160 01:07:04,560 --> 01:07:08,760 Speaker 1: than before, and now so I can say it one 1161 01:07:08,840 --> 01:07:12,959 Speaker 1: last time to y'all, I'll talk to you again really soon. 1162 01:07:19,240 --> 01:07:23,880 Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, 1163 01:07:24,200 --> 01:07:27,919 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you listen 1164 01:07:27,960 --> 01:07:32,640 Speaker 1: to your favorite shows.