Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio. And how the tech are you? You know, predicting the future is really hard, even though that's where we're going to spend the rest of our lives. Thank you, Plan 9 from Outer Space. So I used to start every year off with a big tech predictions episode, and I would sit down and try and guess what could unfold in the following year. And I think I had a pretty dismal average on those predictions. Some years I was, you know, mostly correct, but in a very lame way, and a lot of times I felt like it was because the predictions I made were very, very safe ones, so I was never particularly happy with them.
Speaker 1: Some of the time I would get big predictions partly right. Very, very rarely I would get one right on the money, but it wouldn't, again, be impressive, because the writing would already be on the wall, right? Like, saying that a company is going to go out of business when the company is currently massively struggling is not a big prediction. So yeah, most of the time I was off the mark, and sometimes by a significant margin. However, I have learned I should not beat myself up over that, because, as it turns out, a lot of people have made really bad tech predictions over the years. Some of those folks were, or are, way, way smarter than I am. So today we're going to talk about a few predictions that were notably incorrect. Now, let us first remind ourselves, and I'm going to touch on this a couple of times in this episode, that often we end up being wrong in our predictions because we are projecting from what we know is possible today, right? And that's understandable.
Speaker 1: But obviously you can't bring into the picture anything like breakthroughs in fields that make the previously impossible now possible. Or, on the flip side, we can't imagine the hurdles we'll encounter that will slow or stop our progress toward a lofty goal. See also driverless cars, which have proven to be much more complicated than most folks believed a decade ago. Or, if you want to be really cynical, you can take things like Theranos and say, well, of course people believed it was possible, even though it would later turn out that no, the technology was not possible or not practical. All right. We are also really good at misattributing statements to folks. So several of the claims that I'm going to talk about today did not come from the people who often see their names attached to those statements, and this is a real problem. As I was researching this episode, I would come across a prediction and I'd think, wow, did they really say that?
Speaker 1: Then I would do some more research, and I would start digging in, and I would start looking for the history of a particular statement, and ultimately find out that the person who was doing the supposed prognostication never actually said the ding-dang-darned thing in the first place. Sometimes someone else said it and actually was trying to predict it. Sometimes the prediction just appeared to be an invention meant to make a famous person seem foolish. So we'll talk about a few of those, too, in this episode. Now, to kick us off, I thought I would talk about a very likely apocryphal story. In fact, I'll just say I think this one is fake. The person who allegedly made the prediction would later frequently deny that it ever happened, and there's no reason to disbelieve this person. So this relates to Bill Gates, the co-founder of Microsoft. Now, according to the story, supposedly back in the early eighties, Bill Gates proclaimed that six hundred and forty kilobytes of memory quote "ought to be enough for anybody" end quote.
Speaker 1: Now, this line pops up again and again if you start looking for examples of people making bad predictions or outright dumb statements about technology. And, you know, there's this delicious irony: this person who is really influential in the tech space making an outright incorrect statement or prediction. So, if you're not aware, six hundred and forty kilobytes was a decent amount of memory back in nineteen eighty-one, when this story supposedly took place. But it's a minuscule amount of memory these days, like not even worth talking about. You know, with computers today, if you've got a computer that is on the low side, you're still talking something like two gigabytes of memory, maybe up to sixty-four gigabytes, and then you could maybe expand it up to one hundred twenty-eight gigabytes if you've got a sixty-four-bit system. So remember, a kilobyte is one thousand bytes. Okay, well, really we should be doing this in base two, so it should really be one thousand twenty-four bytes. But it really depends upon the company. Like, some companies will just round it off, and some companies will use the base two.
Speaker 1: So either a thousand or one thousand twenty-four bytes, that's what a kilobyte is, and six hundred and forty kilobytes would be six hundred forty times that. A gigabyte is a billion bytes, or if we're going base two, it's technically one billion, seventy-three million, seven hundred forty-one thousand, eight hundred twenty-four bytes. So yeah, gigabytes are orders of magnitude larger than kilobytes. So obviously the story seems to paint Bill Gates as extremely short-sighted. To assume that six hundred and forty kilobytes would be enough for anybody would be an enormous whiff when these days even a bargain computer will have orders of magnitude more memory as standard. But the thing is, Gates says he never actually made this claim. In fact, he said that he was always pushing to create systems that could take advantage of more memory, which is pretty much the opposite of what the claim says. On top of that, while the general story is that Gates said this at some point at some trade show in nineteen eighty-one, there's no actual record of him saying that.
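As a quick sanity check on the unit arithmetic above, here's a minimal Python sketch comparing the 640-kilobyte figure against modern gigabyte capacities. The variable names are mine, and the "2 gigabyte low-end machine" is just the example size mentioned in the discussion:

```python
# Byte-unit arithmetic: "kilobyte" has two conventions, decimal (1000)
# and binary (1024, strictly a "kibibyte"); same for gigabyte.
KB_DECIMAL = 1_000
KB_BINARY = 2**10      # 1,024 bytes
GB_BINARY = 2**30      # 1,073,741,824 bytes

dos_limit = 640 * KB_BINARY        # the famous 640 KB figure: 655,360 bytes
low_end_today = 2 * GB_BINARY      # a low-end machine with 2 GB of memory

print(KB_BINARY)                   # 1024
print(GB_BINARY)                   # 1073741824
print(low_end_today // dos_limit)  # 3276: thousands of times more memory
```

Even the low-end example holds over three thousand times the supposed "enough for anybody" limit, which is what "orders of magnitude" cashes out to here.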
Speaker 1: There's no account from that year that says at this event, during this conversation, Bill Gates said this thing. Gates would later say in an interview, quote, "I've said some stupid things and some wrong things, but not that. No one involved in computers would ever say that a certain amount of memory is enough for all time," end quote. So if Gates had said this, he would certainly qualify as someone making a wrong prediction or statement about tech. But it doesn't seem like that ever happened. There's lots of stuff we could say about Gates that is terrible and deeply disturbing, but when it comes to making this particular prediction, that appears to just be made up out of whole cloth. There's a similar story that I want to touch on that also paints a tech leader in a foolish light, but this one is due to a lack of context. This leader would be Ken Olsen. He was a co-founder of a company called Digital Equipment Corporation, or DEC. Later on, Compaq would acquire DEC, and then even later, Hewlett-Packard would acquire Compaq. So there's always a bigger fish.
Speaker 1: But the story goes that Olsen, back in nineteen seventy-seven, gave a presentation at the World Future Society in which he proclaimed, quote, "There is no reason for any individual to have a computer in his home," end quote. Now, if we take that quote on the face of it, it sounds like what Olsen was saying is that the very idea of a home personal computer is ludicrous. And considering that the late nineteen seventies were the launching ground for the home computer, Olsen's words appear to be indefensible, like he was just totally wrong. The personal computer would become a huge deal. Today it's a market that's nearly two hundred billion dollars in value. But here's the thing. Olsen explained that the problem was people were lifting that statement out of his presentation without the benefit of context. He later defended what he said. He said he wasn't talking about personal computers. He wasn't talking about little desktop computers that let us do all sorts of stuff.
Speaker 1: He obviously believed that those would be a thing, because DEC was in that business itself, so there's no reason why his company would be pursuing that line of business if he didn't believe that it was viable. Rather, what Olsen was talking about was that you were not going to see people get a mainframe-like computer system installed in their home for the purposes of automating everything, like having a computer-run household. That's what he was talking about: we're not going to see people get a computer to be the central operating system of the home. So we're talking about functions like controlling your lights or climate controls, or the sort of stuff that we can now do with networked products like smart thermostats and light bulbs. Olsen was saying he didn't see a future where people were going to buy and install these hefty computer systems, and also that people wouldn't want their lives to be run by computers.
Speaker 1: And this would have been back in the nineteen seventies when he made this statement, and he was mostly right, right? Like, we didn't see people pay ridiculous amounts of money to automate their homes. Now, these days we do have lots of home automation products out there, and depending upon how deeply integrated your computer calendar is with your life, maybe you do feel like your life is being run by a computer, which is a possibility. But it's not the same thing as holding his statement up and saying, oh, he was totally wrong about home computers. So Olsen's point was aimed more at pie-in-the-sky futurists who had imagined the fully automated home, which is a vision that actually dates back decades. Some of my favorite cartoons as a kid were cartoons about the home of tomorrow, and the cartoonists and writers found goofy ways to poke fun at basic automation concepts.
Speaker 1: Olsen was saying that was the sort of system no one would be buying for their home, and he was right about that, but a misinterpretation of his meaning led people to say that he was absolutely wrong and that he was talking about home computers. Okay, next up, I want to talk about The Hitchhiker's Guide to the Galaxy and how Douglas Adams dreamed up an outlandish technology in the titular Guide. Adams wrote the first version of the story, which was actually a radio play for BBC Radio, back in nineteen seventy-eight. There are like half a dozen different versions of The Hitchhiker's Guide to the Galaxy story, and no two are exactly alike. You know, you have the radio play, you have the novels, you have a TV series, you've got a movie, and each version tells the story slightly differently. Even the vinyl record, which took the radio play scripts, changed things. So there's no definitive version of The Hitchhiker's Guide to the Galaxy. Anyway, when he wrote this back in nineteen seventy-eight, personal computers were really new, right? They had not been around for very long.
Speaker 1: So in his story he has a character that has this Guide to the Galaxy, and it is essentially a digital book. It's about the size of a book, and it contains enormous amounts of information on pretty much anything you could encounter in the great big galaxy out there, although some entries warranted longer descriptions than others. Earth's entry, for example, was just "mostly harmless," and even before that it was just "harmless." While Adams's work reveled in absurdity and comedy, this idea of a sort of portable device that could have access to vast amounts of information would persist beyond the pages of fiction. And as companies found ways to build smaller components and then cram those components into microchips, computers got more powerful and capable of storing more information. Add the ability to network these devices, and things really would start to take off. This brings me to Andy Grove, the former CEO of Intel.
Speaker 1: Back in nineteen ninety-two, Andy Grove famously dismissed the notion that before long, executives would be walking around with a wireless digital personal communications device capable of doing things like sending and receiving email. Or, can you just imagine being able to pull up real-time local maps, complete with traffic information and a suggested route to get to your next destination? Essentially, Grove thought this vision, which would slowly coalesce into the smartphone, was a pipe dream, and that it was something being hyped up by companies that were greedy but unrealistic. Of course, Grove was wrong. The invention of the personal digital assistant and then the gradual convergence of the PDA with the cell phone would give birth to the modern smartphone, and it wouldn't just be executives who would carry them around. It would be tons of people, hundreds of millions of people. Something that was once in the realm of science fiction would now be a reality. Now we all have access to a vast database of information. Some of the information is really useful, some of it is diverting, some of it's outright harmful.
Speaker 1: We can send and receive emails or instant messages, even photos and videos. We can jump online and interact with various platforms. We can shop from our phones, we can use them as navigation devices, we can just play with them like toys. It turns out that the pipe dream was in fact a possibility and then a reality. But back in nineteen ninety-two, you could probably understand Grove's skepticism. Apple announced the Newton in ninety-two, and that became the first product that would be called a personal digital assistant, or PDA. But that particular device had a lot of limitations and quirks, plus it lacked wireless connectivity. Still, IBM released a PDA with analog cell phone connectivity in nineteen ninety-four, and Nokia followed with a PDA that had digital cell phone connectivity in nineteen ninety-six, so it did not take very long for Grove's prediction to fall flat. Okay, we're gonna take a quick break, and then we'll be back with some more bad tech predictions. We're back, and next up: sometimes the guy who helped build the thing ends up being very wrong about the thing.
Speaker 1: So in this case, I'm talking about Robert, or Bob, Metcalfe. When he was a graduate student, Metcalfe worked on ARPANET. For those who are unfamiliar with ARPANET, you can think of it as sort of the predecessor to the Internet. You had a lot of engineers and scientists and researchers who worked to create the means to network different computers together, even if those computers were far apart from one another. This was a non-trivial task. You know, these different computers ran different operating systems; you could think of it as if they communicated in different languages. So you had to create a way, a common ground, for these different machines to be able to send and receive information in a useful way with the other machines. And then, how does that information travel across communications lines? You had to come up with ways for that to be foolproof, or at least as close to foolproof as you could get, because if it were something where it was just a solid connection and there was an interruption in that connection, then what happens to the process?
Speaker 1: These were all practical problems that the ARPANET folks had to solve, and the work would become a foundational component for the Internet, which would follow after ARPANET. Metcalfe wrote an early definitive work describing different ways to use the ARPANET. He also included information on resources and instructions to make use of ARPANET. He would then go on to take a job at Xerox PARC. That's a facility that I've talked about in recent TechStuff episodes, very important in various technological innovations, although Xerox itself had a reputation for failing to capitalize on the developments that came out of PARC. And while he was at PARC, he developed Ethernet. That's the cable-based technology that allows data transfers between connected computers. He based it off of ALOHAnet, which was used by the University of Hawaii and relied on radio waves rather than cables to send signals back and forth. But he built upon the technologies of ALOHAnet to develop Ethernet. Okay, but let's flash forward to nineteen ninety-five. This is five years after the US government had already decommissioned ARPANET.
Speaker 1: ARPANET got decommissioned in nineteen ninety, but the Internet itself was actually really taking off, helped in large part by the development of the World Wide Web, which wasn't a thing when the Internet was first coalescing but became a thing in the early nineties. And on December fourth, nineteen ninety-five, the magazine InfoWorld published an article written by Metcalfe in which the network visionary said, quote, "I predict the Internet, which only just recently got this section here in InfoWorld, will soon go spectacularly supernova and in nineteen ninety-six catastrophically collapse," end quote. So Metcalfe thought the Internet was growing beyond the technological and economic capacities that would be needed to support it. He envisioned a scenario in which the money just wouldn't be there to build out the infrastructure that would be required to allow for the explosive growth. He didn't deny that the Internet was growing.
Speaker 1: He just said it's going to reach a tipping point where we're not able to supply it with the technology needed to let it run, and it's going to collapse under its own weight. He also predicted that we were going to see a lot more vulnerabilities in the Internet that would facilitate security breaches, and that would convince folks that the Internet would be too dangerous. Right? Once you have a couple of big security breaches, people would say, oh, we can't rely on the Internet, because if we do, we're going to potentially lose everything. So he proclaimed that he would even eat his words if he were proven wrong. In April nineteen ninety-seven, Bob Metcalfe proved to be a man of his word. While at a tech conference, Metcalfe had a cake wheeled out. His column was printed in icing on the cake.
Some versions of the story 317 00:19:49,240 --> 00:19:51,639 Speaker 1: say that the crowd kind of turned against him, saying 318 00:19:51,640 --> 00:19:54,000 Speaker 1: that he was taking the easy way out, and so 319 00:19:54,080 --> 00:19:56,159 Speaker 1: at least one version of the story says that he 320 00:19:56,280 --> 00:20:00,000 Speaker 1: then had the actual physical article on paper brought 321 00:20:00,080 --> 00:20:02,880 Speaker 1: out and then put the article in a blender with 322 00:20:02,880 --> 00:20:05,560 Speaker 1: some water and blended it into a kind of slurry, and 323 00:20:05,600 --> 00:20:09,280 Speaker 1: then he ate the goop. But either way, he reportedly 324 00:20:09,359 --> 00:20:11,960 Speaker 1: did in fact eat his own words and admitted that 325 00:20:12,000 --> 00:20:16,879 Speaker 1: he had been wrong, which I totally respect. Metcalfe failed 326 00:20:16,880 --> 00:20:20,560 Speaker 1: to predict the innovations that would drive the Internet's expansion, 327 00:20:21,240 --> 00:20:23,639 Speaker 1: and again that really gets back to the heart of 328 00:20:23,720 --> 00:20:26,440 Speaker 1: a lot of wrong predictions: that we lean so heavily 329 00:20:26,600 --> 00:20:29,960 Speaker 1: on basing our guess of what comes next by looking 330 00:20:30,000 --> 00:20:33,879 Speaker 1: at how we do things currently. But obviously this fails 331 00:20:33,880 --> 00:20:37,360 Speaker 1: to take into account new techniques and technologies and ideas, which, 332 00:20:37,440 --> 00:20:41,040 Speaker 1: let's be fair, makes sense, because if we could predict 333 00:20:41,640 --> 00:20:45,720 Speaker 1: new techniques and technologies and ideas, we would already have them. Like, 334 00:20:46,560 --> 00:20:50,040 Speaker 1: you can't fault people for not guessing something that hasn't 335 00:20:50,080 --> 00:20:52,840 Speaker 1: been thought up yet, because otherwise they would have thought 336 00:20:52,840 --> 00:20:57,080 Speaker 1: it up. 
Of course, technologists aren't the only ones who 337 00:20:57,080 --> 00:21:01,080 Speaker 1: get tech predictions wrong. Economists can do a real fine 338 00:21:01,200 --> 00:21:06,080 Speaker 1: job of getting stuff wrong too, while also referencing a technologist, 339 00:21:06,240 --> 00:21:09,959 Speaker 1: perhaps, in the process. See, in nineteen ninety eight, an 340 00:21:10,040 --> 00:21:13,840 Speaker 1: economist named Paul Krugman had a dire prediction for the Internet, 341 00:21:14,200 --> 00:21:17,840 Speaker 1: and he ended up referencing our previous example of Robert Metcalfe. 342 00:21:18,440 --> 00:21:21,560 Speaker 1: Krugman wrote an article in the magazine Red Herring 343 00:21:22,080 --> 00:21:25,919 Speaker 1: and said, quote, the growth of the Internet will slow 344 00:21:26,040 --> 00:21:30,199 Speaker 1: drastically as the flaw in Metcalfe's law, which states that 345 00:21:30,240 --> 00:21:33,760 Speaker 1: the number of potential connections in a network is proportional 346 00:21:33,800 --> 00:21:37,160 Speaker 1: to the square of the number of participants, becomes apparent. 347 00:21:37,720 --> 00:21:40,360 Speaker 1: Most people have nothing to say to each other. By 348 00:21:40,359 --> 00:21:42,520 Speaker 1: two thousand and five or so, it will become clear 349 00:21:42,720 --> 00:21:45,520 Speaker 1: that the Internet's impact on the economy has been no 350 00:21:45,640 --> 00:21:50,760 Speaker 1: greater than the fax machine's, end quote. All right, so 351 00:21:51,440 --> 00:21:54,640 Speaker 1: this does raise the question about Metcalfe's law. What is that? 
352 00:21:54,800 --> 00:21:57,280 Speaker 1: It actually comes from an observation that Robert Metcalfe had 353 00:21:57,280 --> 00:21:59,600 Speaker 1: made way back in nineteen eighty, which was that the 354 00:21:59,600 --> 00:22:03,560 Speaker 1: financial value of a telecommunications network is proportional to the 355 00:22:03,600 --> 00:22:07,480 Speaker 1: square of the number of connected communication devices on that network. 356 00:22:07,520 --> 00:22:10,240 Speaker 1: Sometimes we simplify this to say the number of users 357 00:22:10,280 --> 00:22:12,880 Speaker 1: on a network, but it's really more fair to say 358 00:22:13,080 --> 00:22:18,000 Speaker 1: nodes or connected devices. Essentially, Metcalfe was saying that 359 00:22:18,080 --> 00:22:21,600 Speaker 1: the more devices you have connected within a network, the 360 00:22:21,680 --> 00:22:26,560 Speaker 1: more possible connections exist between those devices, and we can 361 00:22:26,600 --> 00:22:30,879 Speaker 1: express this mathematically with the equation of n times n 362 00:22:30,920 --> 00:22:34,400 Speaker 1: minus one divided by two. N in this case would 363 00:22:34,400 --> 00:22:38,119 Speaker 1: be the number of users or connected devices or nodes, 364 00:22:38,119 --> 00:22:39,879 Speaker 1: however you want to think of it. So if we 365 00:22:39,920 --> 00:22:42,080 Speaker 1: only have two devices, right, let's say that we've got 366 00:22:42,119 --> 00:22:44,520 Speaker 1: a direct connection between device one and device two. But 367 00:22:44,560 --> 00:22:47,359 Speaker 1: that's it. Well, we would fill in our equation. We 368 00:22:47,400 --> 00:22:49,800 Speaker 1: would use two in place of n. So our equation 369 00:22:49,880 --> 00:22:53,520 Speaker 1: will be two times, in parentheses, two minus one, which is one, 370 00:22:53,800 --> 00:22:56,080 Speaker 1: and then divided by two. So then that means we 371 00:22:56,119 --> 00:22:58,560 Speaker 1: get two times one divided by two. 
That means we 372 00:22:58,600 --> 00:23:01,720 Speaker 1: eventually just get one. That's the number of possible connections 373 00:23:01,760 --> 00:23:04,840 Speaker 1: between these two connected devices. You only have one possible connection. 374 00:23:05,400 --> 00:23:10,320 Speaker 1: But let's say we've got twenty connected devices on this network. Well, 375 00:23:10,320 --> 00:23:14,000 Speaker 1: that means now our equation is twenty times twenty minus 376 00:23:14,000 --> 00:23:17,879 Speaker 1: one, divided by two. So that means it's twenty times 377 00:23:17,960 --> 00:23:21,280 Speaker 1: nineteen, then divided by two, or we get one hundred 378 00:23:21,320 --> 00:23:25,600 Speaker 1: and ninety possible connections. So as you add more users 379 00:23:25,640 --> 00:23:29,800 Speaker 1: or devices to a network, the network's value increases significantly. 380 00:23:30,240 --> 00:23:33,240 Speaker 1: But Krugman was saying that if no one has anything 381 00:23:33,280 --> 00:23:35,320 Speaker 1: interesting to say to each other, then you don't really 382 00:23:35,359 --> 00:23:37,840 Speaker 1: have any added value there, and then growth is going 383 00:23:37,880 --> 00:23:40,399 Speaker 1: to slow down, and this is going to show that 384 00:23:40,520 --> 00:23:44,919 Speaker 1: Metcalfe's law is flawed. Clearly, the era of social media 385 00:23:44,960 --> 00:23:49,239 Speaker 1: has proved Krugman way wrong. People spend all day not 386 00:23:49,400 --> 00:23:52,760 Speaker 1: saying anything to each other, at least nothing of substance, 387 00:23:53,400 --> 00:23:57,760 Speaker 1: and it is going like gangbusters. Even as some platforms 388 00:23:57,800 --> 00:24:01,480 Speaker 1: are slowing down, others are picking up. 
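The arithmetic above is easy to check for yourself. Here's a minimal Python sketch of that connection-count formula (my own illustration, not anything from the episode; the function name is made up):

```python
def possible_connections(n: int) -> int:
    """Number of distinct pairwise links among n nodes: n * (n - 1) / 2."""
    return n * (n - 1) // 2

# The two worked examples from above:
print(possible_connections(2))   # 1 connection between two devices
print(possible_connections(20))  # 190 connections among twenty devices
```

Note this counts the possible links; Metcalfe's law proper says the network's value grows roughly with the square of n, which this pairwise count approximates for large n.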
Krugman himself 389 00:24:01,480 --> 00:24:03,719 Speaker 1: had said that he was just trying to be provocative, 390 00:24:04,040 --> 00:24:06,679 Speaker 1: and sometimes when you do try to be provocative, you 391 00:24:06,760 --> 00:24:09,440 Speaker 1: end up just being very wrong, and he just happened 392 00:24:09,440 --> 00:24:13,720 Speaker 1: to be very, very wrong about this. Now, sometimes we 393 00:24:13,760 --> 00:24:16,800 Speaker 1: get predictions of doom and gloom from someone who appears 394 00:24:16,800 --> 00:24:19,240 Speaker 1: to be driven by having a vested interest in a 395 00:24:19,359 --> 00:24:23,439 Speaker 1: competing technology. They say, oh, that technology is going to fail, 396 00:24:23,800 --> 00:24:28,000 Speaker 1: partly because they are supporting a different technology. There's a 397 00:24:28,080 --> 00:24:32,879 Speaker 1: quote frequently, and also incorrectly, attributed to a Hollywood movie 398 00:24:32,920 --> 00:24:35,720 Speaker 1: producer named Darryl F. Zanuck. He was one of the 399 00:24:35,760 --> 00:24:39,240 Speaker 1: folks responsible for creating the film company Twentieth Century Pictures, 400 00:24:39,320 --> 00:24:42,240 Speaker 1: among a lot of other things. But the story goes 401 00:24:42,640 --> 00:24:49,160 Speaker 1: that Zanuck famously and incorrectly dismissed the impact of television, saying, quote, 402 00:24:49,680 --> 00:24:53,040 Speaker 1: video isn't able to hold on to the market it captures. 403 00:24:53,240 --> 00:24:56,560 Speaker 1: After the first six months, people soon get tired of 404 00:24:56,600 --> 00:25:00,600 Speaker 1: staring at a plywood box every night. End quote. Now, 405 00:25:00,640 --> 00:25:05,800 Speaker 1: clearly the movie studios saw television as an existential threat. 
406 00:25:05,840 --> 00:25:08,359 Speaker 1: I mean, why would people go to the cinema to 407 00:25:08,400 --> 00:25:11,720 Speaker 1: spend a few hours watching movies and cartoons and newsreels 408 00:25:12,000 --> 00:25:14,080 Speaker 1: if they could get access to many of those things 409 00:25:14,480 --> 00:25:17,399 Speaker 1: just from home through the television. For a while, film 410 00:25:17,440 --> 00:25:21,520 Speaker 1: studios saw television as being a true threat to their 411 00:25:21,640 --> 00:25:25,359 Speaker 1: very existence, at least until movie studios started to consolidate 412 00:25:25,400 --> 00:25:28,480 Speaker 1: with TV studios. So, of course it would be delicious 413 00:25:28,880 --> 00:25:31,720 Speaker 1: to point to a movie mogul who stuck his neck 414 00:25:31,720 --> 00:25:33,840 Speaker 1: out to proclaim that television would be no more than 415 00:25:33,840 --> 00:25:37,199 Speaker 1: a passing fad, only to be proven very wrong and 416 00:25:37,280 --> 00:25:42,359 Speaker 1: television would become an incredibly important component in communications. Now, 417 00:25:42,720 --> 00:25:46,520 Speaker 1: I'm not saying that no one at all ever made 418 00:25:47,080 --> 00:25:50,080 Speaker 1: these quotes apart from like like, I'm not saying they 419 00:25:50,119 --> 00:25:53,800 Speaker 1: were just invented, but it certainly doesn't appear to have 420 00:25:53,800 --> 00:25:56,680 Speaker 1: been Zanak. He is not the person who said these things. 421 00:25:57,640 --> 00:26:00,920 Speaker 1: The website quote investigator actually looked into this particular statement. 
422 00:26:01,359 --> 00:26:06,119 Speaker 1: The earliest version they found from the statements about video 423 00:26:06,200 --> 00:26:09,040 Speaker 1: and the Plywood box actually came from a Wall Street 424 00:26:09,119 --> 00:26:13,560 Speaker 1: Journal article back in nineteen fifty one, and it appeared 425 00:26:13,560 --> 00:26:18,439 Speaker 1: as two separate statements from two different people. So the 426 00:26:18,520 --> 00:26:21,720 Speaker 1: video isn't able to hold onto the market it captures 427 00:26:21,760 --> 00:26:24,720 Speaker 1: after the first six months. Statement supposedly came from a 428 00:26:24,760 --> 00:26:28,679 Speaker 1: movie executive based out of New York. The phrase people 429 00:26:28,760 --> 00:26:31,600 Speaker 1: soon get tired of staring at plywood box every night 430 00:26:31,880 --> 00:26:35,280 Speaker 1: came from quote a San Franciscan end quote. In fact, 431 00:26:35,320 --> 00:26:37,840 Speaker 1: it doesn't even get specific enough to say it's a 432 00:26:37,880 --> 00:26:41,280 Speaker 1: movie executive in San Francisco, although we can assume that 433 00:26:41,400 --> 00:26:45,480 Speaker 1: was the case. However, either way, Xanik, the person who 434 00:26:45,520 --> 00:26:49,560 Speaker 1: often gets attributed with this these pair of quotes that 435 00:26:49,640 --> 00:26:52,879 Speaker 1: are combined into a single quote, he was based in Hollywood, 436 00:26:53,160 --> 00:26:58,000 Speaker 1: so presumably he was neither of the unnamed individuals who 437 00:26:58,040 --> 00:27:01,720 Speaker 1: provided the Wall Street Journal with these quotes. So Xanuuk 438 00:27:01,800 --> 00:27:05,520 Speaker 1: is in the clear. 
Still, assuming the Wall Street Journal 439 00:27:05,560 --> 00:27:09,480 Speaker 1: reporter was not inventing quotations out of thin air, there 440 00:27:09,480 --> 00:27:12,840 Speaker 1: were two people in the movie business who were brazenly 441 00:27:12,880 --> 00:27:15,920 Speaker 1: predicting the downfall of television. Now, this was during a 442 00:27:16,000 --> 00:27:19,520 Speaker 1: time when movie theaters were starting to see a rise 443 00:27:19,560 --> 00:27:24,200 Speaker 1: in attendance. There had been multiple years of audience drop off, 444 00:27:24,640 --> 00:27:27,520 Speaker 1: so like four years in a row they saw smaller 445 00:27:27,640 --> 00:27:32,119 Speaker 1: audiences for movie theaters, and then things were picking up 446 00:27:32,119 --> 00:27:35,960 Speaker 1: in nineteen fifty one. So it's possible that movie executives 447 00:27:36,000 --> 00:27:39,080 Speaker 1: were chomping their cigars and saying, ha, TV took a 448 00:27:39,119 --> 00:27:42,040 Speaker 1: swing at us, but it's not staying around. Light my 449 00:27:42,119 --> 00:27:45,439 Speaker 1: cigar with another hundred-dollar bill or something. I 450 00:27:45,520 --> 00:27:49,560 Speaker 1: admittedly have a very cartoonish imagination. They probably weren't saying that, 451 00:27:49,680 --> 00:27:53,920 Speaker 1: but it does make it sound like movie executives were thinking, oh, 452 00:27:54,320 --> 00:27:59,000 Speaker 1: television took a temporary bite out of our business. But 453 00:27:59,320 --> 00:28:02,800 Speaker 1: as it turns out, people prefer their experience in the theaters, 454 00:28:02,720 --> 00:28:05,280 Speaker 1: TV is too expensive for the average person, etc. etc. 455 00:28:06,080 --> 00:28:08,359 Speaker 1: And so now we're just dismissing it. And of course 456 00:28:08,359 --> 00:28:10,480 Speaker 1: it would turn out that television was not just a 457 00:28:10,480 --> 00:28:13,600 Speaker 1: passing fad, at least not a short one. 
You could 458 00:28:13,720 --> 00:28:16,560 Speaker 1: argue that maybe, because of cord cutting and stuff, 459 00:28:17,119 --> 00:28:21,399 Speaker 1: it was a very long passing fad, but certainly at 460 00:28:21,440 --> 00:28:23,560 Speaker 1: the time it wasn't just a passing fad, and 461 00:28:23,760 --> 00:28:26,560 Speaker 1: predicting that people would get tired of TV and come 462 00:28:26,600 --> 00:28:31,040 Speaker 1: back to the movie theaters was just an optimistic prediction 463 00:28:31,200 --> 00:28:34,480 Speaker 1: on the part of the executives. Another example of someone making 464 00:28:34,480 --> 00:28:37,480 Speaker 1: a prediction when he had a vested interest in the 465 00:28:37,520 --> 00:28:42,800 Speaker 1: outcome is Steve Ballmer, the former CEO of Microsoft. Ballmer 466 00:28:42,880 --> 00:28:46,080 Speaker 1: was actually employee number thirty at Microsoft when he joined 467 00:28:46,080 --> 00:28:48,880 Speaker 1: in nineteen eighty, and he became CEO of the company 468 00:28:48,880 --> 00:28:52,040 Speaker 1: in two thousand. His presentations are the stuff of legend. 469 00:28:52,240 --> 00:28:54,960 Speaker 1: If you've not had the experience of digging up a 470 00:28:55,000 --> 00:28:59,200 Speaker 1: clip of Steve Ballmer on stage at some event over 471 00:28:59,240 --> 00:29:02,040 Speaker 1: on YouTube, you need to give that a go. There 472 00:29:02,040 --> 00:29:05,520 Speaker 1: are a lot of different ones. The famous one is Developers, Developers, Developers, 473 00:29:05,520 --> 00:29:08,680 Speaker 1: but that's just one. You know, you should check them out. 474 00:29:08,840 --> 00:29:10,760 Speaker 1: Just make sure your volume is turned down a bit, 475 00:29:10,800 --> 00:29:14,760 Speaker 1: because that dude loves to yell and scream, and intense is 476 00:29:14,800 --> 00:29:17,840 Speaker 1: a good way to describe him. Anyway. 
In two thousand 477 00:29:17,840 --> 00:29:21,520 Speaker 1: and seven, Ballmer reacted to something that, unbeknownst to pretty 478 00:29:21,560 --> 00:29:24,000 Speaker 1: much everyone at the time, was actually going to lead 479 00:29:24,040 --> 00:29:27,200 Speaker 1: to enormous changes in the tech space. And that was 480 00:29:27,240 --> 00:29:31,960 Speaker 1: the debut of the Apple iPhone. Ballmer said, quote, there's 481 00:29:32,080 --> 00:29:34,800 Speaker 1: no chance that the iPhone is going to get any 482 00:29:34,840 --> 00:29:39,280 Speaker 1: significant market share. No chance, end quote, and he went 483 00:29:39,360 --> 00:29:42,360 Speaker 1: on to call the iPhone a, quote, five hundred dollars 484 00:29:42,360 --> 00:29:47,360 Speaker 1: subsidized item, end quote. He predicted that Microsoft's software would 485 00:29:47,360 --> 00:29:49,840 Speaker 1: be in most phones on the market, and that while 486 00:29:49,880 --> 00:29:53,560 Speaker 1: Apple could make a lot of money selling phones, they 487 00:29:53,600 --> 00:29:56,040 Speaker 1: would not make up a significant amount of the market share. 488 00:29:56,080 --> 00:29:58,480 Speaker 1: They would maybe have two or three percent of the 489 00:29:58,480 --> 00:30:02,560 Speaker 1: market at best. So Microsoft's strategy was a lot like 490 00:30:02,640 --> 00:30:04,960 Speaker 1: what we would see from Google a little later on, 491 00:30:05,360 --> 00:30:07,960 Speaker 1: which was to create the operating system and the software 492 00:30:08,160 --> 00:30:12,720 Speaker 1: for smartphones, but to leave the manufacturing to the handset companies. 493 00:30:13,320 --> 00:30:16,240 Speaker 1: Apple was taking an all-in approach rather than licensing 494 00:30:16,280 --> 00:30:20,320 Speaker 1: software to businesses that made the hardware. 
Ballmer was convinced 495 00:30:20,640 --> 00:30:22,840 Speaker 1: that was the wrong way to go, that it made 496 00:30:22,840 --> 00:30:25,360 Speaker 1: way more sense to just focus on the software and 497 00:30:25,440 --> 00:30:28,080 Speaker 1: license it out to the hardware companies. But as it 498 00:30:28,120 --> 00:30:31,840 Speaker 1: turned out, Apple's strategy worked like gangbusters. In two thousand 499 00:30:31,880 --> 00:30:35,520 Speaker 1: and seven, Apple sold around one point three nine million iPhones. 500 00:30:35,800 --> 00:30:37,760 Speaker 1: That was the year that they introduced it. They didn't 501 00:30:38,120 --> 00:30:40,600 Speaker 1: offer it for sale until the back half of the year. 502 00:30:41,160 --> 00:30:44,080 Speaker 1: In two thousand and eight, Apple sold eleven point three 503 00:30:44,200 --> 00:30:48,880 Speaker 1: six million iPhones, so around ten million more 504 00:30:48,880 --> 00:30:51,080 Speaker 1: than they had the year before. By the end of 505 00:30:51,080 --> 00:30:54,480 Speaker 1: the next year, that doubled again, to around twenty point 506 00:30:54,560 --> 00:30:58,360 Speaker 1: seven three million units sold. In fact, Apple saw sales 507 00:30:58,440 --> 00:31:02,960 Speaker 1: numbers increase every year until you get to twenty sixteen, 508 00:31:03,280 --> 00:31:05,560 Speaker 1: because in twenty fifteen the company sold two hundred and 509 00:31:05,560 --> 00:31:09,200 Speaker 1: thirty one point two two million iPhones, and in twenty 510 00:31:09,240 --> 00:31:12,760 Speaker 1: sixteen it sold, quote unquote, only two hundred and eleven point 511 00:31:12,760 --> 00:31:15,960 Speaker 1: eight eight million units, still more than two hundred 512 00:31:15,960 --> 00:31:18,600 Speaker 1: million units, but a drop of around twenty million. 
As 513 00:31:18,600 --> 00:31:22,280 Speaker 1: for Microsoft, the company pushed hard to try and establish 514 00:31:22,280 --> 00:31:25,880 Speaker 1: a foothold in smartphone operating systems, but it just never 515 00:31:25,920 --> 00:31:29,120 Speaker 1: really worked out for the company. Companies stopped making Windows 516 00:31:29,120 --> 00:31:33,160 Speaker 1: Phone devices in twenty seventeen, and the company completely ended 517 00:31:33,200 --> 00:31:37,160 Speaker 1: support for the operating system in twenty twenty two. Okay, 518 00:31:37,320 --> 00:31:39,160 Speaker 1: we're going to take another quick break, but we still 519 00:31:39,200 --> 00:31:52,320 Speaker 1: have some more bad predictions to get through. We're back. 520 00:31:52,360 --> 00:31:54,280 Speaker 1: And just before the break, I was talking about Steve 521 00:31:54,320 --> 00:31:57,880 Speaker 1: Ballmer dismissing the iPhone, and of course it turned out 522 00:31:57,920 --> 00:32:01,320 Speaker 1: that he was totally off. I mean, granted, it's not 523 00:32:01,360 --> 00:32:03,520 Speaker 1: like he could have said that the iPhone's going to 524 00:32:03,520 --> 00:32:06,440 Speaker 1: be a huge hit. He was leading a major competitor 525 00:32:06,480 --> 00:32:09,960 Speaker 1: to Apple at the time. So whether he believed that 526 00:32:10,000 --> 00:32:12,920 Speaker 1: the iPhone truly was just gonna be a failure or not, 527 00:32:13,040 --> 00:32:15,680 Speaker 1: I can't say, but I certainly don't think he could 528 00:32:15,680 --> 00:32:19,920 Speaker 1: have said anything different. Anyway, Apple also was not immune 529 00:32:20,160 --> 00:32:24,520 Speaker 1: to making bad predictions. 
Steve Jobs, a man who, through force 530 00:32:24,560 --> 00:32:28,000 Speaker 1: of personality and a famous intolerance for deviation from his vision, 531 00:32:28,720 --> 00:32:31,800 Speaker 1: returned to a struggling Apple in the nineteen nineties and 532 00:32:31,840 --> 00:32:33,640 Speaker 1: set it on a path to become a company that 533 00:32:33,760 --> 00:32:37,440 Speaker 1: is today worth more than two point eight trillion dollars 534 00:32:37,560 --> 00:32:41,080 Speaker 1: at the time of this recording. Anyway, back in two 535 00:32:41,120 --> 00:32:44,480 Speaker 1: thousand and three, Apple introduced the iTunes Music Store for 536 00:32:44,520 --> 00:32:46,840 Speaker 1: the first time. So the company had already introduced the 537 00:32:46,920 --> 00:32:50,000 Speaker 1: iPod a couple of years earlier, but now it was 538 00:32:50,040 --> 00:32:53,200 Speaker 1: introducing an online digital music store where you could buy 539 00:32:53,320 --> 00:32:57,000 Speaker 1: albums and tracks, either to port over to an iPod 540 00:32:57,520 --> 00:33:01,640 Speaker 1: or to listen from your computer. Jobs believed that customers 541 00:33:01,680 --> 00:33:06,080 Speaker 1: wanted to own their music. He was dismissive of the 542 00:33:06,080 --> 00:33:09,320 Speaker 1: business model that was being used by Rhapsody and by 543 00:33:09,400 --> 00:33:13,200 Speaker 1: Pressplay, both of which offered subscription services to customers 544 00:33:13,320 --> 00:33:15,960 Speaker 1: to get access to music. So you pay a certain 545 00:33:15,960 --> 00:33:18,200 Speaker 1: amount of money each month and then you're able to 546 00:33:18,360 --> 00:33:23,000 Speaker 1: listen to the music those companies have licensed. 547 00:33:23,400 --> 00:33:27,520 Speaker 1: Jobs said, quote, we think subscriptions are the wrong path. 
548 00:33:28,080 --> 00:33:30,320 Speaker 1: One of the reasons we think this is because people 549 00:33:30,360 --> 00:33:33,000 Speaker 1: bought their music for as long as we can remember. 550 00:33:33,360 --> 00:33:36,440 Speaker 1: We bought our music on LPs. We bought our music 551 00:33:36,520 --> 00:33:40,280 Speaker 1: on cassettes, we bought our music on CDs. And we 552 00:33:40,320 --> 00:33:43,160 Speaker 1: think people want to buy their music on the internet 553 00:33:43,400 --> 00:33:47,320 Speaker 1: by buying downloads, just like they bought LPs, just like 554 00:33:47,360 --> 00:33:51,120 Speaker 1: they bought cassettes, just like they bought CDs. They're used 555 00:33:51,280 --> 00:33:53,880 Speaker 1: to buying their music, and they're used to getting a 556 00:33:53,920 --> 00:33:56,840 Speaker 1: broad set of rights with it. When you own your music, 557 00:33:57,240 --> 00:34:00,560 Speaker 1: it never goes away. When you own your music, you 558 00:34:00,600 --> 00:34:03,160 Speaker 1: have a broad set of personal use rights. You can 559 00:34:03,200 --> 00:34:06,840 Speaker 1: listen to it however you want. End quote. And it's 560 00:34:06,840 --> 00:34:10,120 Speaker 1: not like Jobs was wrong, right. People do like to 561 00:34:10,239 --> 00:34:12,439 Speaker 1: own stuff. I think it's safe to say that most 562 00:34:12,440 --> 00:34:17,160 Speaker 1: people definitely prefer owning music to losing access to something 563 00:34:17,200 --> 00:34:21,279 Speaker 1: because a licensing deal has expired. 
I'm sure everyone out 564 00:34:21,320 --> 00:34:24,440 Speaker 1: there has had that experience where something that used to 565 00:34:24,440 --> 00:34:28,239 Speaker 1: be covered on one of the streaming services you either 566 00:34:28,320 --> 00:34:31,759 Speaker 1: listen to or watch or whatever goes away, because a 567 00:34:31,840 --> 00:34:34,759 Speaker 1: licensing deal expired, or because the company that was in 568 00:34:34,840 --> 00:34:36,840 Speaker 1: charge of it decided to get rid of it because 569 00:34:37,000 --> 00:34:39,680 Speaker 1: of, you know, sticky residual deals. I'm looking at you, 570 00:34:40,320 --> 00:34:44,440 Speaker 1: Max and Zaslav, who would remove stuff so that he 571 00:34:44,480 --> 00:34:48,040 Speaker 1: wouldn't have to worry about paying residuals to people. Anyway, 572 00:34:48,080 --> 00:34:51,240 Speaker 1: we know that people prefer being able to access their stuff. 573 00:34:51,239 --> 00:34:53,800 Speaker 1: They hate it when the stuff goes away. But despite 574 00:34:53,840 --> 00:34:58,840 Speaker 1: all that, the subscription based business model has seen incredible success. 575 00:34:59,200 --> 00:35:01,919 Speaker 1: It's a very convenient thing: instead of buying track 576 00:35:01,960 --> 00:35:04,520 Speaker 1: by track or album by album, you get access to 577 00:35:04,600 --> 00:35:08,520 Speaker 1: a huge library of material. In fact, it was so 578 00:35:08,600 --> 00:35:12,680 Speaker 1: successful that Apple would introduce its own music subscription service 579 00:35:12,680 --> 00:35:16,640 Speaker 1: in twenty fifteen. Notably, Steve Jobs had passed away in 580 00:35:16,680 --> 00:35:21,279 Speaker 1: twenty eleven, so it didn't happen within his lifetime, and 581 00:35:21,320 --> 00:35:23,799 Speaker 1: again, he was famously dismissive of it when he 582 00:35:23,880 --> 00:35:28,120 Speaker 1: introduced the music store. 
But in twenty fifteen we got 583 00:35:28,160 --> 00:35:31,680 Speaker 1: Apple Music, which would expand to include not just music 584 00:35:31,719 --> 00:35:36,520 Speaker 1: tracks but also video. Oh, also, journalists get stuff wrong 585 00:35:36,680 --> 00:35:40,279 Speaker 1: a lot too. Goodness knows, I've gotten a lot wrong, 586 00:35:40,320 --> 00:35:43,399 Speaker 1: although I really shouldn't reference myself as a journalist. I'm 587 00:35:43,400 --> 00:35:46,600 Speaker 1: not really a journalist. I don't have those qualifications. But 588 00:35:47,200 --> 00:35:49,960 Speaker 1: David Pogue wrote a piece for The New York Times 589 00:35:49,960 --> 00:35:53,960 Speaker 1: about Apple in September two thousand and six. That piece 590 00:35:54,040 --> 00:35:57,560 Speaker 1: was called iPhone Rumors, and it starts off with, quote, 591 00:35:57,920 --> 00:36:00,560 Speaker 1: everyone's always asking me when Apple will come out with 592 00:36:00,600 --> 00:36:04,360 Speaker 1: a cell phone. My answer is probably never. End quote. 593 00:36:04,520 --> 00:36:08,200 Speaker 1: And of course Apple introduced the iPhone the very next year. 594 00:36:08,560 --> 00:36:10,560 Speaker 1: But if you read Pogue's piece, he lays out some 595 00:36:10,600 --> 00:36:13,440 Speaker 1: really good arguments about why it would make sense to 596 00:36:13,480 --> 00:36:16,880 Speaker 1: be skeptical that Apple would release a phone. One of 597 00:36:16,920 --> 00:36:20,480 Speaker 1: his really big points is that telecommunications carriers, you know, 598 00:36:20,520 --> 00:36:24,200 Speaker 1: the companies that actually own the infrastructure that allows communication 599 00:36:24,239 --> 00:36:27,480 Speaker 1: across devices, your AT&Ts, your Verizons, 600 00:36:27,480 --> 00:36:30,200 Speaker 1: et cetera, they have a lot of power when it 601 00:36:30,280 --> 00:36:34,520 Speaker 1: comes to hardware. 
The telecommunications companies can actually approve or 602 00:36:34,560 --> 00:36:39,440 Speaker 1: deny features on devices. Essentially, they do this by saying, okay, 603 00:36:39,480 --> 00:36:41,719 Speaker 1: we're not going to let your hardware work on 604 00:36:41,840 --> 00:36:46,120 Speaker 1: our network if you include that feature. We don't want 605 00:36:46,160 --> 00:36:49,080 Speaker 1: to support that feature, and we will not allow you to 606 00:36:49,200 --> 00:36:52,040 Speaker 1: use that device on our network, if they don't like something. 607 00:36:52,320 --> 00:36:55,320 Speaker 1: So Pogue's point was that Apple was not the type 608 00:36:55,320 --> 00:36:59,160 Speaker 1: of company that would compromise or allow some other business 609 00:36:59,680 --> 00:37:03,520 Speaker 1: that kind of control over its processes, and that was reasonable. 610 00:37:03,600 --> 00:37:07,279 Speaker 1: Like, you can't imagine Steve Jobs being told in no 611 00:37:07,440 --> 00:37:09,840 Speaker 1: uncertain terms, we're not going to allow that, you 612 00:37:09,960 --> 00:37:12,759 Speaker 1: have to design it this way. So it seemed to 613 00:37:12,800 --> 00:37:15,920 Speaker 1: be a reasonable conclusion to say that Apple was not 614 00:37:16,000 --> 00:37:17,719 Speaker 1: going to release a phone in the first place. But 615 00:37:17,760 --> 00:37:21,000 Speaker 1: as it turned out, Apple worked very closely with 616 00:37:21,120 --> 00:37:22,920 Speaker 1: AT&T for the launch of the iPhone. It was 617 00:37:22,960 --> 00:37:25,200 Speaker 1: an AT&T exclusive here in the United States 618 00:37:25,200 --> 00:37:27,840 Speaker 1: when it first launched. 
But common sense would have suggested 619 00:37:27,880 --> 00:37:30,400 Speaker 1: that Apple would not enter such a relationship and 620 00:37:30,440 --> 00:37:33,359 Speaker 1: that the company would have instead focused on technologies where 621 00:37:33,400 --> 00:37:37,480 Speaker 1: it would maintain near total control of the user experience. 622 00:37:37,600 --> 00:37:41,280 Speaker 1: So you can understand why Pogue made that particular statement. 623 00:37:41,600 --> 00:37:45,560 Speaker 1: It just turned out to be completely wrong. But again, 624 00:37:45,640 --> 00:37:47,800 Speaker 1: just based on the information that was available, it was 625 00:37:47,800 --> 00:37:51,160 Speaker 1: an understandable one. So yeah, it is really fun to 626 00:37:51,239 --> 00:37:53,799 Speaker 1: go back over these kinds of old statements and old 627 00:37:53,880 --> 00:37:57,200 Speaker 1: predictions and see with the benefit of hindsight how off 628 00:37:57,400 --> 00:37:59,560 Speaker 1: they were. Or at least it's fun to me, because 629 00:37:59,640 --> 00:38:03,040 Speaker 1: again, I used to make predictions, and I was often 630 00:38:03,280 --> 00:38:05,799 Speaker 1: just as wrong, or sometimes far more wrong, than any 631 00:38:05,840 --> 00:38:09,480 Speaker 1: of the examples I've cited here. I think that some 632 00:38:09,520 --> 00:38:13,560 Speaker 1: of y'all might even remember one of those. I famously predicted... 633 00:38:13,920 --> 00:38:16,840 Speaker 1: I don't know, famous, that's giving myself too much credit. 634 00:38:17,280 --> 00:38:22,120 Speaker 1: I very much predicted that the iPad was going to 635 00:38:22,120 --> 00:38:25,239 Speaker 1: be a flop. I could not see the iPad succeeding, 636 00:38:25,560 --> 00:38:29,280 Speaker 1: and that was because tablet computers had been around for ages. 
637 00:38:29,800 --> 00:38:33,000 Speaker 1: Even touchscreen tablet computers had been around for quite some time, 638 00:38:33,320 --> 00:38:36,440 Speaker 1: but no one had managed to make one that appealed 639 00:38:36,800 --> 00:38:40,640 Speaker 1: to the broader consumer market. The tablet computers that were 640 00:38:40,680 --> 00:38:44,319 Speaker 1: in use were niche products. They were used in very 641 00:38:44,320 --> 00:38:48,080 Speaker 1: specific applications. Like, you had some in the sciences, you 642 00:38:48,160 --> 00:38:50,640 Speaker 1: had some in medicine, but you didn't really have a 643 00:38:50,680 --> 00:38:55,000 Speaker 1: consumer tablet that had seen great success. And I just 644 00:38:55,040 --> 00:38:59,200 Speaker 1: couldn't imagine people wanting something of that form factor: too 645 00:38:59,200 --> 00:39:03,320 Speaker 1: big to be easily portable unless you're carrying a bag around, 646 00:39:04,280 --> 00:39:07,359 Speaker 1: too small and too limited to be really useful if 647 00:39:07,360 --> 00:39:10,520 Speaker 1: you wanted it for something like productivity, because typing on 648 00:39:10,560 --> 00:39:13,120 Speaker 1: a screen is far slower than typing on a keyboard. 649 00:39:13,360 --> 00:39:16,839 Speaker 1: So I just assumed that even Apple wouldn't be able 650 00:39:16,840 --> 00:39:21,040 Speaker 1: to make the tablet computer a commercial success for the consumer market, 651 00:39:21,200 --> 00:39:26,360 Speaker 1: and I was totally wrong. I doubted Steve Jobs's marketing ability, 652 00:39:26,400 --> 00:39:30,880 Speaker 1: I doubted Apple's engineering in making a product that had 653 00:39:31,400 --> 00:39:36,640 Speaker 1: a very compelling user interface. And my prediction was one 654 00:39:36,719 --> 00:39:42,200 Speaker 1: hundred percent incorrect, and I own it.
It was, you know, 655 00:39:42,440 --> 00:39:45,360 Speaker 1: I felt like I had based it on some solid ground, 656 00:39:45,560 --> 00:39:48,200 Speaker 1: but it all turned out to be quicksand, I guess. 657 00:39:48,520 --> 00:39:51,000 Speaker 1: So it can happen to anyone. I don't think I'll 658 00:39:51,040 --> 00:39:54,680 Speaker 1: be bringing the predictions episodes back anytime soon. They would 659 00:39:54,719 --> 00:39:58,520 Speaker 1: cause me huge amounts of stress because it's hard, right? 660 00:39:59,160 --> 00:40:02,960 Speaker 1: It involves doing a lot of work to just look 661 00:40:03,000 --> 00:40:07,120 Speaker 1: at what is the current state of technology. And even 662 00:40:07,480 --> 00:40:10,720 Speaker 1: working from that, I have an incomplete picture, because obviously 663 00:40:10,760 --> 00:40:13,960 Speaker 1: there are people and companies working on things that are 664 00:40:14,000 --> 00:40:18,440 Speaker 1: not yet publicly known, and so I have an incomplete 665 00:40:18,520 --> 00:40:22,839 Speaker 1: picture from that respect. And basing predictions off of an 666 00:40:22,880 --> 00:40:26,239 Speaker 1: incomplete picture is even more shoddy than just, you know, 667 00:40:26,920 --> 00:40:31,080 Speaker 1: having to concede that you can't anticipate the innovation that's 668 00:40:31,120 --> 00:40:33,600 Speaker 1: going to follow in the months ahead. So I don't 669 00:40:33,800 --> 00:40:36,880 Speaker 1: think I'll bring it back. We'll see. Maybe toward the 670 00:40:36,960 --> 00:40:38,560 Speaker 1: end of the year, I'll think, ah, heck, I'll give 671 00:40:38,560 --> 00:40:41,120 Speaker 1: it another shot, and I'll see if I can predict 672 00:40:41,520 --> 00:40:45,319 Speaker 1: what will happen in twenty twenty four.
But honestly, when 673 00:40:45,360 --> 00:40:47,920 Speaker 1: I look back at the last, like, three years, since 674 00:40:47,960 --> 00:40:51,400 Speaker 1: I stopped doing predictions episodes, I see so many examples 675 00:40:51,440 --> 00:40:54,600 Speaker 1: of stuff I never would have predicted. I definitely wouldn't 676 00:40:54,640 --> 00:40:57,719 Speaker 1: have predicted Elon Musk purchasing Twitter, for example. That would 677 00:40:57,760 --> 00:41:01,480 Speaker 1: not have been on my list. I'm not sure what 678 00:41:01,520 --> 00:41:04,120 Speaker 1: I would have predicted. You know, everyone knows my opinion 679 00:41:04,160 --> 00:41:06,239 Speaker 1: of Elon Musk is pretty dodgy, but I don't think 680 00:41:06,280 --> 00:41:08,520 Speaker 1: I would have predicted that Elon Musk taking over Twitter 681 00:41:08,800 --> 00:41:14,439 Speaker 1: would lead to such a train wreck, a slowly degrading 682 00:41:16,000 --> 00:41:19,040 Speaker 1: situation for Twitter at this point, as the company appears 683 00:41:19,080 --> 00:41:21,160 Speaker 1: to be falling apart. I don't know that I would have 684 00:41:21,200 --> 00:41:26,600 Speaker 1: predicted that either. So yeah, we'll see. If I'm feeling 685 00:41:26,880 --> 00:41:29,920 Speaker 1: spunky at the end of the year, maybe I'll give 686 00:41:29,960 --> 00:41:33,879 Speaker 1: it another go. But it is interesting to keep an 687 00:41:33,880 --> 00:41:37,080 Speaker 1: eye out for these. Maybe I'll also do another episode 688 00:41:37,120 --> 00:41:41,600 Speaker 1: where I'll take good predictions, stuff that people thought was 689 00:41:41,640 --> 00:41:44,040 Speaker 1: coming across the horizon and it turned out they were, 690 00:41:44,480 --> 00:41:47,359 Speaker 1: you know, mostly right, or maybe completely right. That would 691 00:41:47,360 --> 00:41:49,000 Speaker 1: be fun too.
It's fun to look at the ones 692 00:41:49,040 --> 00:41:51,440 Speaker 1: where we got it totally wrong because it kind of 693 00:41:51,440 --> 00:41:54,000 Speaker 1: brings a little humility into the situation. But once in 694 00:41:54,040 --> 00:41:56,759 Speaker 1: a while people will make a prediction, and boy, do 695 00:41:56,800 --> 00:41:59,680 Speaker 1: they get it bang on the money. So maybe 696 00:41:59,680 --> 00:42:02,160 Speaker 1: I should try and do an episode that's based on 697 00:42:02,239 --> 00:42:05,799 Speaker 1: that too. In the meantime, I hope you are all 698 00:42:05,920 --> 00:42:09,640 Speaker 1: well, and I will talk to you again really soon. 699 00:42:15,719 --> 00:42:20,360 Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, 700 00:42:20,680 --> 00:42:24,400 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you listen 701 00:42:24,440 --> 00:42:29,040 Speaker 1: to your favorite shows.