Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts. And how the tech are you? I read the news today, oh boy. I woke up to really glum global financial news as various news outlets reported on how a fear that the US market was slowing down is now having massive repercussions around the world. According to the New York Times, quote, "Japan's benchmark index logged its worst single day point decline," end quote. That index fell by more than twelve percent. Europe's markets saw investors freaking out and selling off assets, causing prices to tumble further. And while I'm writing this episode before the markets opened in the US, in fact, I think they just opened as I started recording this, the assumption is that our markets are going to follow suit after stock futures took a tumble here in America. Now, tech companies were hit harder than other sectors were. Everyone was hit, but tech companies were hit particularly hard. But then, tech has also been driving some pretty crazy growth in the recent past. They were kind of surging past everybody else, so they had further to fall as well. The semiconductor fabrication company TSMC, which is responsible for more than half the global market share in the semiconductor foundry industry, saw its stock price fall by ten percent. Samsung Electronics, same story: a ten percent decline. Bitcoin dropped by more than ten percent as well. So what caused this? Well, the instigating factor appears to be a US jobs report that showed unemployment rose to four point three percent, which is the highest it's been since twenty twenty one. There are fears of a recession in the United States. Seems like there have been fears ever since the pandemic. This would continue to affect world markets, obviously. And one other culprit was mentioned, at least in a CNN article on this topic.
Speaker 1: I do not know how realistic it is or how much weight we should give it, but that is our good old friend artificial intelligence. Now, to be clear, it's not really AI's fault. This is not a case where some AI process has cast the world economy into chaos, right, like some artificial intelligence algorithm played hanky panky with the stock market and now everything's crashing. That's not what happened. This is not some science fiction Black Mirror episode. Rather, it's the perception of AI, the marketing around AI, the swell of greed around artificial intelligence, and the inevitable reaction when investors discover that perhaps the goose ain't so golden, or at least it ain't golden yet. By that, I mean we're gonna talk about the good old hype cycle again, and I thought it would be a good idea to revisit the topic as we are seeing now how it can make a bad situation worse when hype is allowed to proliferate. So, first, to be clear, I don't think AI hype is the main reason for the economic crisis, right? I'm not saying that the excitement around AI got us to where we are right now. I think at best, it's just a contributing factor. It's exacerbating something that was already going to happen. But it is true that investors have started to back off of AI assets after seeing that there isn't a fast track toward profit, as investors have had to grapple with the fact that yes, AI is incredibly exciting and it has insane potential. It is not, however, ready to be a massive revenue generator. It's like it's not fully cooked yet, and investors were trying to rush it out to the store before it was done. There are tons of headlines out there about how folks have become a bit disillusioned when they realize that while AI could be poised to change everything, it's not currently doing that. Throwing AI into your business plan doesn't immediately net you insane returns.
Speaker 1: AI has limitations, including a very high cost of operation, and it's unrealistic to say it's somehow just magically going to cause revenue and profit to surge. So investors could be cooling down from their initial excitement around AI, which means that businesses will have less incentive to just shove AI anywhere they can, because it's not going to get their investors excited. In turn, this could just be one contribution to this market instability, and ultimately it may be a relatively small contribution. But it is certain that tech companies are seeing some of the biggest losses right now, and it's also true that a lot of tech companies jumped right on that hype train for AI. Now, a cohesive look at all the factors contributing to the global economic situation is beyond the scope of the show. It is way beyond my ability to talk about it. I am by no means an economic expert, but we can definitely examine this smaller piece of that bigger puzzle. So let's do a quick overview of what the hype cycle is. It's really the process that new ideas, and often specifically technology, typically follow, and it seems ridiculous that no one ever seems to learn the lessons, though more on that in a second. So first up, what we're specifically talking about has the formal name of the Gartner hype cycle. Gartner is a US-based technology consulting firm that got into business back in nineteen seventy nine. It was founded by Gideon Gartner, who was a business analyst, and the Gartner hype cycle, which I believe was first proposed in the mid nineties, is kind of an observation. I think of it like Moore's law. We call it Moore's law, but Gordon Moore, the guy who actually came up with the idea, didn't call it that. He was making an observation and then making predictions based off that observation. The Gartner hype cycle is kind of similar.
Speaker 1: It's really more of a way of framing and contextualizing an observation about the path a new technology can take once it starts to reach a certain level of visibility, and the Gartner hype cycle describes five phases with regard to how the customer base perceives and uses this technology. Now, I should add, a lot of folks have called out the hype cycle for having some major flaws. One of the big flaws is that it's not actually a cycle, because, you know, a cycle's a circle. You end up back where you began eventually, because that's what circles do. The hype cycle is more like a wave, and the wave has a high peak of, you know, your expectations. In fact, it's called the peak of inflated expectations. We'll get to it, but that's where your perception of what this technology can do is far above what the technology is actually capable of doing. There's a gap. And then you have a trough that's almost as deep as the peak was high, the trough of disillusionment. That's where you come to grips with, oh, this technology isn't as capable as I first imagined. And then after that you have a slow climb to a steady plateau, the plateau of, you know, of productivity. But that's not a cycle, that's just a path. Beyond that criticism, other critiques include challenges to the observation's validity in the first place, because there's a distinct lack of data supporting the hype cycle. It's one of those things that when you think about it, it kind of makes sense. It feels like it falls into the realm of common sense. But if you don't have any actual firm data to support these observations, just seeming like it's right isn't good enough. Not really. I'm sure you've encountered situations where your own common sense told you one thing but it turned out that you were wrong. Well, that could also be the case with the Gartner hype cycle.
Speaker 1: And also, as we go through this cycle, one really big flaw is that it doesn't give us much useful information on either how ideas move from one phase of the cycle into the next, or what we can actually do with this information. It could just be, hey, we keep doing this thing, and I don't have any information on how this thing happens other than it has happened many, many times, but I can't give you anything useful beyond that. That's one way to think about the limitations of the Gartner hype cycle. All right, so let's walk through it. The cycle identifies five phases of a technology. First up is the technology trigger. This is the initial event that introduces a technology, ultimately, to the general public. It doesn't have to be a brand new technology. It could be a new way to apply an existing technology, or it could be some new aspect of an existing technology that gets added in, but generally we're talking about a pretty new idea here. Now, at first, not many people are going to know about this idea, so visibility of the tech is low. Your group of initial folks who are talking about this technology start to get other people excited about it, and that's probably going to first include colleagues and peers, and then immediately after that investors, because you always want to try and get money for the cool idea you have. And then enthusiasm gradually starts to build. The tech's visibility increases as more people become aware of it and become excited about it and talk it up even more. The people who invested in it have an incentive to talk it up, right? If you invest in something, you want other people to invest in it too, so that your investment has a better chance of paying off. So you get your money in first, right, because you're smart. You want to get in when the bar is at its lowest. You pour in as much money as you feel comfortable with.
Speaker 1: Then you talk it up, trying to get other people excited. And either the technology is going to ultimately deliver on what it promised and you're going to get a payout because of that, or, you know, you're going to fake it till you make it and you get a payout down the line. Maybe someone bigger comes along and buys up the company that you invested in, and then you get paid out. You just want to get paid out, and there are a lot of opportunities to get paid out. So that's why getting in early is a big deal. But we start to see a rapid ascension in visibility as media begins to report on it. Folks are talking about it, and ultimately the tech hits kind of a saturation point for awareness, and the tech then is said to be at phase two, which is the peak of inflated expectations. It's where people are the most excited for this new technology and speculation is running wild about how this tech is going to change everything. And it's at this stage where people expect this new technology to do things it simply will not be able to do. Maybe one day it can do some version of that thing, but it certainly can't do it right away. So with AI, you could say the belief that AI is ready to transform business across every industry and produce instantaneous results, that would be the peak of inflated expectations. It's not that the tech may not ultimately get to that point, but it can't do it now. Businesses are still grappling with how to integrate AI in ways that make sense for their workflow and operations. To use an analogy, if I were a hiring manager and I hired on a brand new employee, a human being, and this human being is incredibly skilled, and they're knowledgeable, and they have an amazing work ethic, as the hiring manager I would expect this employee to make significant contributions to corporate success down the line.
Speaker 1: But I wouldn't think that they would just transform the organization overnight, unless this was like some weird production of the musical How to Succeed in Business Without Really Trying or something. Yeah, if it's a fantasy film or something like that, sure, but in the real world, no. Even the best hire in the world isn't going to transform the business overnight. It's going to take time. Well, we should hold AI to the same set of standards. You know, it's not something that's just magically going to transform everything. Okay, that's just the first two phases of the hype cycle. We've got three more to get through. But before we do that, let's take a quick break to thank our sponsors.

Speaker 1: Okay, we're back. So the world realizes that the hyped technology from the first two phases cannot meet the peak of inflated expectations, and folks start to get disappointed, and excitement and enthusiasm begin to drain away. So, according to the hype cycle, visibility now goes into decline. People move on to do other things. Some folks will stick with the technology, typically. I mean, it's rare that everyone abandons ship. If they did, then that would just be the end of that story, right? The technology just wouldn't have any support. No one would be putting any money into R and D. It would just be a dead end. Most of the time, though, there are those who stick with it, especially those who never really hyped up what it was capable of in the first place. They had more measured expectations. The initial groundswell of support, however, is gone. The tech then moves into the trough of disillusionment. In this phase you see a lot of sad stuff happen, because startups that launched during the earlier hype phase face a really tough reality. Unless they can convince stakeholders to hang on, they may find themselves out of investment money, and they just fade away. So a lot of startups fail during this phase, if you subscribe to the Gartner hype cycle model.
Speaker 1: This really did happen with a lot of VR companies back in the nineteen nineties. I'll talk more about VR a little bit later in this episode, but we actually saw this kind of play out. In fact, you could argue VR was one of those technologies that inspired the creation of the hype cycle in the beginning anyway. But even larger companies that existed before the tech had hit the scene may find themselves in trouble if they invested too heavily in whatever the technology was. So with AI, companies like Intel, Meta, Google, and Microsoft kind of fall into this category. They're huge, and they're much bigger than just artificial intelligence. That's not the only business they're in, right? It's not like they're a startup that's totally focused on AI. But they've also spent an enormous amount of money on AI research, and for interest and enthusiasm around AI to kind of fade is bad news for them, because that's one less thing they can hype up to their investors when it comes to things like earnings calls and such to get them excited and reinvest in the company. Gradually, according to the Gartner hype cycle, folks start to figure out the best uses for the technology in question, whatever it may be. These uses might not be as transformative and impactful as what people believed, especially at the height of the peak of inflated expectations, but assuming the technology has any utility and value to it, enough folks will stick with it. It'll find its place, and gradually, and with much less hoopla, folks will adopt this technology. This Gartner calls the slope of enlightenment. People take a more grounded approach to implementing this tech. The slope of enlightenment then leads to the final phase in the Gartner hype cycle, called the plateau of productivity, where folks and organizations make regular use of this technology in ways that just make sense. And this technology might not boost company profits into overdrive, but it would improve results in various ways.
Speaker 1: That's the hype cycle, which again is not a cycle. Now, as I mentioned earlier, this observation is really just that: it's not a law, and some argue it's not even an accurate observation, that technologies don't necessarily follow this pattern that Gartner has laid out. Some may argue that it's more accurate to say it's the marketing around technologies that follows at least a variation of this story, but that the tech itself really should be considered separately. Also, the terminology used in the Gartner hype cycle gets pretty wishy-washy because it's not quantitative. You cannot measure it, right? Inflated expectations, disillusionment, enlightenment, these words have no objective meaning or means of measurement when it comes to a technology's success or even acceptance. Also, it's pretty hard, or even impossible, to tell where any technology might be along this cycle until it moves on to the next phase. Right? How can you say, oh, we're at the peak, if then next week it goes even higher? Like, oh, I was wrong, now we're at the peak. And you might be doing that over and over again. And as I mentioned before, another really big issue is that this doesn't identify anything that actually causes the technology to transition from one phase to the next. So, at least according to some critics, not only can you not tell where along this supposed hype cycle technologies may fall, you don't know when they're going to make the turn to the next stage, and you don't know how dramatic that next stage is going to be. It's possible that the trough of disillusionment isn't that big a dip, or it could be a much deeper one. NFTs, I would argue, went through a really big one. After they hit a really tall peak, they went to a really deep trough. And it's not like NFTs don't exist now, but boy, they are not anywhere close to the level of popularity that they were at the height of the NFT frenzy. You could also argue some technologies don't seem to follow this path at all.
Speaker 1: I think consumer smartphones fall into this camp. Like, when Apple launched the iPhone, smartphones had existed before the iPhone. Apple is known as a company that launches refined products, as in products that have already been on the market in some other form, but Apple has put its own refinement on that technology, and that's where a ton of the value comes from. So when Apple launched the iPhone, even though the iPhone was not the first smartphone to ever hit the market, it was the first one that I think was really marketed to the average consumer as opposed to executives and early adopters. Well, I think the hype just went up and up and up with the iPhone and with each subsequent iPhone release for quite some time, until we reached a point where the improvements from generation to generation were bound to be incremental rather than revolutionary, which was gonna happen, right? You can't have every phone reinvent the phone every single time. It might work for the first few generations simply because of the rapid development of technology, but you eventually start getting diminishing returns, and so then you get the incremental improvements. But I don't think we ever really entered a trough of disillusionment with consumer smartphones. I don't think that happened. I think people just sort of entered into a realm of managed expectations. For the most part, you still see people get excited every fall, hoping that the next Apple iPhone event is going to be one that blows us all out of the water by introducing some feature no one ever even imagined, which is an unrealistic expectation. It may still happen on occasion, but it's unrealistic. So I don't think consumer smartphones followed the hype cycle. So not every technology goes along this path.
Speaker 1: Still, while I share skepticism that the hype cycle is the be-all, end-all description of the phases in technological acceptance and adoption, I do think it provides a useful starting point for discussions around technologies that have experienced a skyrocketing early ascent followed by a quick dip once folks realize that the tech perhaps is not fully baked. I already mentioned VR, virtual reality, as an example of that. In the late nineteen eighties, virtual reality was just starting to get attention. You know, Jaron Lanier, who is typically credited as the person who coined the term virtual reality, did that somewhere around nineteen eighty seven, so it's a pretty recent tech, and the concept from the get-go was intriguing. Instead of staring at a computer screen and interacting through mouse and keyboard, the future of computing would put you inside the programs. It was like Tron, but for real. Okay, for people who aren't old, the original Tron film came out way back in nineteen eighty two, and it featured a human character getting digitized and uploaded into a computer as a program, so he was inside the computer. That's kind of how people thought of virtual reality in the early days. So imaginations ran wild with the idea of VR. You'd be able to navigate operating systems and programs the same way as you would walk around a building or even a city landscape. Never mind that that isn't really efficient or practical. The concept really appealed to people, and you can see reflections of this to this day in other ideas like the metaverse. You still have people holding on to this idea that somehow navigating computer programs as if they were physical landmarks is appealing. I'm no longer convinced that it is. But anyway, you'd be put into the middle of all this stuff, whether it was to blast polygonal aliens or perform surgery on a patient who's half a world away. The sky was the limit with virtual reality.
Speaker 1: There was a ton of money thrown at VR in those days. R and D departments were flush with cash, and a lot of folks started doing really cool research in VR. But all was not well. A few companies rushed to develop consumer VR experiences. The equipment was far too large and expensive for anyone other than the very wealthy to own for themselves, so the idea wasn't to create consumer products. Instead, the idea was to create VR arcade experiences. So you had companies like Virtuality building these enormous gaming rigs that included bulky head-mounted displays and pedestals with a railing built in so that it would prevent players from stepping off and falling over. Arcade operators would charge players to play games. Sometimes it was a flat fee for a specific game title. Sometimes you were paying for, like, five minutes of game time per go, which seemed expensive, but then if you spent five minutes in VR, you'd be like, no, I'm good. But players who took the plunge, like yours truly, I was one of these people. There was a mall. It doesn't really exist anymore. I mean, the building does, but there are hardly any businesses in it. But back then, that place was the big mall within an hour's drive of where I grew up, and we would go there on occasion, and they had a VR arcade in that mall. And people who tried it out, including myself, often we would be impressed with the user interface experience, because being able to look around and change your point of view in game by physically moving your body, that was a really big deal. You know, first-person shooters weren't really a genre yet when these VR systems first hit the market in, like, nineteen ninety one.
Speaker 1: Wolfenstein 3D, which really set the stage for first-person shooters, came out in nineteen ninety two, and you couldn't even really aim up or down in that, just kind of left and right. So being able to move around a virtual environment and to control your perspective just by turning your head or squatting down or whatever, that was a big deal. But the graphics were really primitive, and that's putting it lightly. They were blocky, and on-screen characters typically had only a few points of articulation. This was necessary at the time because of the massive amount of processing power needed to make everything work. By massive, by the way, I mean relative to the capabilities of the time. Today, the requirements of a nineteen ninety one-era VR game would be trivial. You could probably run it on your phone. But back in nineteen ninety one, it required a lot. As folks realized the limitations of VR at this stage, excitement kind of puffed out of existence. So there had been all this media hype, especially in movies, where the concept of VR became an integral part of the plot of films, and then people realized, oh, that's not actually where VR is. It's nothing close to what we've been thinking about. And with excitement and enthusiasm fading away, the money followed. So research labs that had been enjoying support suddenly were scrambling to stay afloat, and a lot of companies and labs would either go out of business or they had to pivot to something else.
Speaker 1: There were folks who were determined to use VR to tackle issues like treating mental health. I know of systems that were used for psychology purposes and used to treat things like phobias. You would use VR to do a kind of immersion therapy, where the trigger the person was afraid of could be introduced virtually, and the person would still have the reaction to whatever their fear was, but they would know that they were ultimately in a safe space. It was a way to have immersion therapy without a real danger, or even a perceived danger, actually being present. And it was an incredibly interesting area of research, but the money started to go away. Folks who were determined to keep using VR had to scrape for every penny and would often have to repurpose gear that was made for other purposes, primarily stuff like gaming systems, in order to keep going. And it would stay like that until the two thousands, when VR would experience a more measured amount of support. Now, the VR story gets more complicated, and it doesn't really fit the hype cycle format easily once we get past that initial part of the story, the peak of inflated expectations and the trough of disillusionment. I'll talk more about that in just a moment, but first let's take another quick break.

Speaker 1: Okay, before the break, I said the VR story doesn't fully fit the hype cycle, at least not perfectly. The initial part of it seems to fit quite well, right? VR as a concept starts to break into mainstream consciousness after being kind of the realm of engineers and computer scientists and various technical conferences, and then people get really excited, Hollywood gets really excited, and then folks get to experience it, and that excitement goes away and the investment goes away, and VR nearly died as a result.
Speaker 1: But despite the fact that you had a lot of hype and you had this rug pull moment where support rapidly disappeared, the technology did build its way up again, kind of like that slope of enlightenment story with the hype cycle. But VR's relationship with other technologies, like mixed reality, which brings augmented reality into it, makes VR too complex for the cycle to accommodate, because it's not like it's a single story. It's part of a multi-branch story. Most things in our lives fall into this, right? I like to think of history as a series of stories, but in reality things are so complicated you rarely ever have a true beginning, middle, and end, which is unfortunate for people like me who really like to have that kind of structure and closure. But that's not how reality works. Well, that's not how VR works either. VR is deeply integrated into other disciplines, and other disciplines are a big part of VR. So you can't really talk about VR as a specific technology, right? It depends upon a lot of other technologies, many of which are far more mature than VR is, and these technologies have proven themselves. So it is too complicated to just reduce down to "VR is a technology." Critics could argue that the hype cycle is fine for contextualizing the initial era of VR, but that doesn't really hold up once you get past that. And the same might be true for artificial intelligence. Certainly, AI is not a single technology, right? We often talk about it, even I often talk about AI, as if it were just a single tech, like you could go into a big-box store and buy a package of AI. That's not accurate, it's not realistic. Often it's because we oversimplify in an effort to try and tackle a really complicated and diverse discipline. That's what AI is. It's a discipline, and in fact, many other disciplines feed into or overlap with AI.
Speaker 1: And there are a lot of different implementations for artificial intelligence. Generative AI gets a lot of the attention right now because it's flashy and it's easier to demonstrate to the average person than a lot of other AI applications. Your typical human being can easily get a grip on what generative AI is all about. They can play with it online. You can have conversations with chatbots, you can have an AI artist generate an image based upon your prompts. You can even have AI-generated video and music. And I'm sure you've all seen cases where someone seemed to equate generative AI with all artificial intelligence technologies, but that's extremely reductive. To say that generative AI is the whole of artificial intelligence is just wrong. There are tons of different implementations and applications of artificial intelligence, and many of them have nothing to do with generating any kind of content the way generative AI does. Now, I can't speak for all tech communicators, but personally, I have found the hype around AI, this most recent round of it, frustrating. Because, again, AI is a discipline that's been around for more than half a century at this point, so to call it new is crazy. It's been around for longer than I've been alive. But I guess if you're looking specifically at generative AI, even though that's been around for quite a long time too, the most recent focus is relatively recent. I found it really frustrating because it's not that I think AI isn't useful or interesting. I do. It's just, I'm so tired of seeing AI evangelists talk about artificial intelligence as if it's already a mature discipline capable of instantly transforming the world. Parts of artificial intelligence, I would argue, are very much a mature discipline, but they might have limited practical application for real-world results.
Speaker 1: They might be more interesting in a computer science context than in a practical application context, which isn't to say that they won't be incredibly important, just that you're talking about foundational building blocks that are going to be used to build the next really cool implementation. But I'm really tired of companies rushing into AI implementations without actually considering what value, if any, those implementations add, and how best to integrate them so that they actually enhance the company's business. In most cases, I think AI ends up being a distraction at best and harmful at worst. Now, that's not to say there are no businesses out there doing it right. I think there are businesses that are doing this the right way. But doing it the right way is hard. It takes a lot of planning, and I feel like a lot of companies are trying to take shortcuts out of fear of being left behind if they drag their feet on artificial intelligence, and that does not play out well very often. Honestly, it's really making me think of stuff like Web3 and the metaverse and NFTs, to go back to those. I think all three of those concepts have already gone through at least one round of the trough of disillusionment, if we were to apply the Gartner model here. NFTs, I would argue, had the most spectacular fall, because unlike the metaverse or Web3, NFTs were actually a thing and you could implement them. People are still arguing about what Web3 or the metaverse even means, or what they will look like when fully realized. But NFTs existed, they were inflated like crazy, you know, hype reached a frenzy, and then the bottom dropped out when people seemingly woke up and said, what the hell are we doing? For the metaverse and Web3, the decline in the excitement has meant that companies that are determined to work on those projects are doing so in an increasingly skeptical and sometimes outright hostile environment.
Speaker 1: Meta has spent billions of dollars in the area of developing the metaverse, and investors have made it no secret that they are not convinced that the metaverse is going to be the next big thing. But since the digital ads business has recently been doing much better, I think investors are willing to look the other way with Meta. I imagine a lot of them wish Meta would just stop with the whole metaverse thing, but as long as it's not actually harming the bottom line too much, then they're okay with it. But maybe Meta's long-term plan will actually pay off, because goodness knows, I criticize companies for sacrificing the long term in favor of short-term results. So if Meta is able to create something that people actually want to use, then I think in retrospect people will say that Meta's investment was worthwhile, despite the fact that it made a lot of investors antsy. They'll say Meta was visionary and the investors were short-sighted. And you can say that in retrospect. Right now, it's a lot harder to say that for sure. We don't know if the metaverse gambit is going to ultimately pay off, or if it's just going to end up being a curiosity that cost billions of dollars. Well, I think the same thing could be true for AI. I think that in the long run, AI is going to be incredibly helpful in many ways, but it may not be as transformative as folks were saying earlier this year and last year, and it may require a lot more customization to optimize AI applications for specific companies and functions. In other words, it might not be as easy as just throwing the AI switch and letting the money flood in. In the meantime, it looks like we'll be in for more rough waters. With the fears about the economy, I personally worry about more industry layoffs across tech. We've seen so many already in the past couple of years.
Intel 593 00:35:48,000 --> 00:35:50,480 Speaker 1: recently announced that it's going to let go of thousands 594 00:35:50,520 --> 00:35:54,560 Speaker 1: of people, like fifteen to nineteen thousand employees, over the 595 00:35:54,560 --> 00:35:57,480 Speaker 1: course of layoffs in the very near future. We see 596 00:35:57,520 --> 00:35:59,840 Speaker 1: it all over the place in the video game industry, 597 00:36:00,080 --> 00:36:04,279 Speaker 1: with some entire studios getting shut down. If the tech 598 00:36:04,320 --> 00:36:08,640 Speaker 1: companies see their stock prices tumble, and that is what's happening, 599 00:36:09,120 --> 00:36:11,560 Speaker 1: we could be in for more of the same across 600 00:36:11,680 --> 00:36:14,799 Speaker 1: multiple companies. I would be shocked if we don't have 601 00:36:15,280 --> 00:36:17,719 Speaker 1: a lot more stories in the back half of this 602 00:36:17,880 --> 00:36:23,200 Speaker 1: year about layoffs and more cutbacks. And I think in 603 00:36:23,280 --> 00:36:29,640 Speaker 1: large part it's because of this diminished enthusiasm and faith 604 00:36:29,920 --> 00:36:33,279 Speaker 1: in the tech industry in general and the economy in 605 00:36:33,320 --> 00:36:36,399 Speaker 1: general as well. And that's terrible news. I really hope 606 00:36:36,440 --> 00:36:39,680 Speaker 1: anyone who's affected by layoffs lands on their feet very quickly. 607 00:36:40,200 --> 00:36:43,560 Speaker 1: It stinks. I've been there. It is a terrible feeling 608 00:36:43,960 --> 00:36:48,960 Speaker 1: and it can really make you question everything, because, especially 609 00:36:48,960 --> 00:36:51,640 Speaker 1: for people who have really poured a lot of their 610 00:36:52,000 --> 00:36:55,840 Speaker 1: time and effort into a job, making it a career, 611 00:36:56,160 --> 00:36:58,640 Speaker 1: for that to come to an end can be devastating. 612 00:36:59,040 --> 00:37:01,960 Speaker 1: So my heart goes out to anyone affected by these 613 00:37:02,080 --> 00:37:06,200 Speaker 1: kinds of things, and I'm hopeful that it won't be 614 00:37:06,360 --> 00:37:11,120 Speaker 1: as bad as I fear it will be. So I 615 00:37:11,120 --> 00:37:14,200 Speaker 1: have the angel on one shoulder 616 00:37:14,400 --> 00:37:17,880 Speaker 1: being hopeful and the devil on the other shoulder being fearful. 617 00:37:18,360 --> 00:37:21,319 Speaker 1: And I really hope the angels win on this one. 618 00:37:22,600 --> 00:37:25,279 Speaker 1: They're taking the outfield, from what I understand. I don't know, 619 00:37:25,320 --> 00:37:28,239 Speaker 1: I never saw that movie, so I'm pretty sure that 620 00:37:28,320 --> 00:37:33,719 Speaker 1: was a baseball film. That's it for this retrospective on 621 00:37:33,840 --> 00:37:37,160 Speaker 1: the Gartner hype cycle, how it applies to artificial intelligence, 622 00:37:37,239 --> 00:37:41,239 Speaker 1: or even if it applies, and I'll be interested to 623 00:37:41,280 --> 00:37:45,560 Speaker 1: see how things play out in the near future. I 624 00:37:45,600 --> 00:37:50,840 Speaker 1: really do question whether AI is going to be cited 625 00:37:50,920 --> 00:37:55,160 Speaker 1: as a massive factor for any kind of economic uncertainty 626 00:37:55,160 --> 00:37:58,399 Speaker 1: in the tech sphere.
I think it'll be a scapegoat 627 00:37:58,640 --> 00:38:00,800 Speaker 1: for some of that, because I think these are issues 628 00:38:00,840 --> 00:38:04,600 Speaker 1: that are bigger than just the disillusionment around AI. I 629 00:38:04,680 --> 00:38:09,000 Speaker 1: do think that it's contributing. I personally don't think it's 630 00:38:09,040 --> 00:38:13,239 Speaker 1: like a linchpin of what we're seeing unfold now. 631 00:38:13,800 --> 00:38:17,360 Speaker 1: All right, y'all stay safe out there. I wish you 632 00:38:17,400 --> 00:38:20,480 Speaker 1: all health and happiness, and I will talk to you 633 00:38:20,520 --> 00:38:31,360 Speaker 1: again really soon. Tech Stuff is an iHeartRadio production. For 634 00:38:31,520 --> 00:38:36,359 Speaker 1: more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 635 00:38:36,480 --> 00:38:42,200 Speaker 1: or wherever you listen to your favorite shows.