Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and I love all things tech, and I figured for the last new episode of twenty twenty-one, we should talk predictions for twenty twenty-two. This is actually gonna be a pretty short episode because it's hard. It's always hard to predict tech, full stop. Like, whether it's a good year or a bad year, it's always tough to predict tech. In the era of a pandemic and semiconductor shortages and all that stuff, it has grown increasingly difficult to predict tech, at least in any way that isn't gloomy. So this is also gonna be short because it is kind of a gloomy episode, and I apologize for that. It may very well just be a reflection of my own mental state as opposed to, you know, an actual vision of what will become. So take solace in that: it could very well be that I am just being incredibly pessimistic with some of these predictions.
Also, I should say I have a pretty strong track record for not being right, so that should also be some comfort, because I'm the guy who said I was skeptical the iPad was ever going to be a hit. I thought it was gonna be a flop, and obviously I was completely wrong on that. So, you know, with that kind of bold prognosticating, I think we can all rest comfortably at the end of this episode. All right. I actually thought at first about making this sort of a wild and inaccurate predictions episode as a joke, you know, like predicting stuff like personal jet packs and all that kind of stuff. Except obviously that joke would get old within a minute, and then I would just have to keep doing it, you know, and commit to the joke until we had an episode on our hands, and that would be miserable. So to spare all of us that terrible fate, here are some basic predictions I think we'll see come to pass in twenty twenty-two. And I mentioned the semiconductor shortage: I think the semiconductor shortage will continue throughout next year.
There are some tech analysts and some leaders who think that we might emerge from this shortage sometime in twenty twenty-two, either by mid-year or maybe late in the year, but I tend to be a bit more pessimistic about that. I agree with Intel's CEO Pat Gelsinger, who believes it will stretch into twenty twenty-three. Now, there are a lot of companies out there that are rushing to build out production capacity at these manufacturing facilities, factories in other words, but those are still years out from coming online. Like, that process is not fast; it's gonna take a long time. And again, we're operating in a pandemic. Things like giant construction projects are hard to do. Also, chip production is just one part of this challenge. There's also the packaging side of the business. Most of the packaging happens in places like China, which receives the freshly produced semiconductors and then slaps them into packaging before they are shipped off.
It is kind of crazy to think about how it is more economically viable to produce chips in one country, ship them to another country to be packaged, and then have them go off to warehouses around the world and stuff. That is actually cheaper than doing it all in the same country. But when you're talking about operating at scale, and when you're talking about countries that have not-great worker practices, that's the way it kind of pans out. Anyway, the packaging part is another issue, right? Like, even if we ramp up production, we still have to have the other parts of this moving machine working. So it's possible that even should we see production capacity rise, we'll have a bottleneck on the packaging side of the chain. And then we have other general supply chain issues that we still have to deal with. And then there's the continuing uncertainty that just goes along with COVID in general. I suspect we will have moved beyond Omicron sometime early in twenty twenty-two, at least I hope so. But then who knows what variant could be next.
Like, I would love to see COVID go from pandemic to endemic, to have that shift happen in twenty twenty-two, and I really hope it happens. But you know, I had hoped it would happen by the end of twenty twenty-one, and here we are. So I'm sad to say I feel we won't see the other side of the semiconductor shortage in twenty twenty-two, and that is going to affect numerous other industries. The computer and electronics industries are obviously going to be hit by this, but it also extends to other kinds of tech, including, you know, automobiles. It might be a while before we start seeing super futuristic options available again, or even just some of those cool options you saw on some of the higher-end vehicles, things like really advanced entertainment systems and things like that. Some of that's gonna get scaled back, because carmakers will have to kind of dial back a little bit in order to continue producing cars. Otherwise they may find that their production quotas are held up because they don't have enough semiconductors to make these other, you know, tertiary systems. All right.
So that prediction in general is actually going to factor into a lot of other predictions I have. I like to look at other people who are making predictions as well and see what I think about them. One prediction I saw is from the chief strategy officer of Luxexcel, a person named Guido Groet, who predicts that mixed reality glasses and other products in that area are going to have a big year in twenty twenty-two. So these would be devices that tap into stuff like virtual reality and augmented reality experiences. Presumably these kinds of things would be really important for many of the proposed incarnations of a future metaverse, which obviously is in the hype cycle right now. I would love to see mixed reality headsets really take off, but my prediction is they're gonna flop. Now, I hope I'm actually wrong about that. Like I said, I want to see them succeed. I want to see them emerge from the niche market they're in now. I mean, when Microsoft first started showing off the HoloLens a few years ago, I was really intrigued. But this technology has mostly been kept to the business sector.
It's not like a consumer electronics product, and the stuff we have seen in the consumer space hasn't really established a considerable user base. I worry there aren't going to be nearly enough applications for this technology to really make it seem worthwhile to the average consumer. I mean, you're gonna have your early adopters. They'll rush out and get it. You'll have a few tech enthusiasts who are going to jump in there. I mean, I'm one of the people who actually owned a pair of Google Glass, so I get that early adopter tech enthusiast thing. But unless there are some really compelling experiences and software that go along with the hardware, I don't think we're gonna see this catch on. And even with really good applications, I still think this is an uphill battle. Folks have traditionally been a bit unimpressed by or uninterested in headsets, from the failure of 3D television to the relatively small VR gamer user base.
I mean, I just feel like this is a tech that just isn't clicking with the mainstream, and without mainstream money, I'm not sure how many companies will stick around the mixed reality space for the long haul. Okay, but I do have one huge caveat to all of this. If Apple actually debuts its rumored mixed reality headset this year, not necessarily offering it for sale, but at least showing it off, that might be enough to get the ball rolling, because Apple has proven a few times now that it can take a tech that has failed to hook the mainstream and then turn it around. That being said, it has been a while since Apple has done that, right? I mean, they did it with the iPod, they did it with the iPhone, they did it with the iPad. They've done it to some extent with Mac computers, but it's been a while. So I don't know if Apple still has the same magic. But I would say that if anyone can get this thing moving in the right direction, that is, getting people to adopt mixed reality, I would imagine Apple would be sort of the gateway.
It'll also be incredibly expensive, but they might open up doors for others. I do think mixed reality has a valid place in technology. I think it's possible to make something really cool and exciting and useful. I saw potential in Google Glass, but that was a very early version of what I hope to see in the future. But I am skeptical that we're going to see any mixed reality headset hardware or software really establish itself as a core technology in twenty twenty-two, at least not for the average person. I also mentioned the metaverse just then, and I feel like that's something else we're gonna hear some more hype about, but I don't expect that to work out anytime soon either. The computational power required to make a really robust, immersive, and immense metaverse, something that can support thousands or even tens of thousands of people being on at the same time? That's just something that we don't have. We don't have that computational power at our disposal, at least not on a scale that would be really impressive.
And when you think about all the challenges involved, like building the tech foundation, figuring out what standards you want to use. Like, is your system going to work with people who don't have, say, a VR setup? If it works with VR, is it going to work with all of them? Is it going to work with a subset of them? If it works with a subset, then you're automatically cutting out people who don't have that particular hardware. These are all, you know, basic problems that you have to tackle. You also have to support a coherent and persistent virtual landscape, and that might mean building in redundancy to handle issues like when the system might go down. We've seen a lot of issues with Internet integrity in twenty twenty-one. Obviously, you don't want the quote-unquote future of the Internet to be unstable or unreliable. So these are really big engineering problems, made more complicated by the fact that at least most concepts of the metaverse include being in this very convincing and immersive experience. I mean, that just requires a lot of computing power.
Now, I don't think that these problems are insurmountable, but I also think we're a long way from actually rising to meet the challenge. Not that we won't eventually do it, but I don't think it's going to happen in twenty twenty-two. I actually worry that if someone rushes out to try and get people excited about the idea and they release, you know, like an alpha version that is really stripped down, then the people who try it are probably gonna end up being disappointed when they see that what is actually possible doesn't measure up to what they thought was going to be possible, what they were promised, which is essentially how people felt about VR back in the day. So I think the metaverse is just going to be a dud in twenty twenty-two. I'm not saying it's never gonna happen. I just don't think we'll see any substantial progress next year. Now, I could be totally wrong about that. And as to whether or not I think the metaverse is a good idea, I'm skeptical about that too. I worry that we're going to see a bunch of very wealthy, out-of-touch folks,
Mark Zuckerberg among them, heading the design of virtual worlds that are gonna put the wealth and digital divides under an extremely bright spotlight. And I imagine accessing the metaverse, whenever we do get one, is going to require some pretty hefty hardware and a really fast Internet connection, and that those requirements are going to cut off a substantial amount of the world's population. They will not be able to participate in this. And so what you end up with is potentially the tech elite playing a part in a world of digital conspicuous consumption, and no one else can participate. Honestly, if it's just conspicuous consumption, I'm okay with being left out. It'll just be gross. But that's what I think it's going to turn out to be. And again, I don't think we'll actually see that in twenty twenty-two. All right, we're gonna take a quick break. When we come back, a few more predictions.

Something I do think we're gonna see in twenty twenty-two, and this is not necessarily a good thing, is an explosion of tech in the healthcare space.
You better believe there are tons of folks out there eyeing the medical world and looking for any angle they can exploit. So my guess is we're going to see a lot of startups taking aim at various aspects of healthcare and medicine. I mean, that's already been the case. Like, there's been a healthy, if you will, healthcare tech sector for a long time. I think it's going to explode in twenty twenty-two. Some of the startups are, you know, gonna be good. They're gonna take aim at various aspects of medicine and make them better. But others, I fear, might fall more along the lines of Theranos, you know, companies that tackle what turns out to be an impossible task. With Theranos, it was this idea of creating a desktop device that could test a micro-drop of blood for hundreds of different potential signs of disease and conditions. Turned out that really wasn't viable.
I'm worried we're gonna see a lot more of that, because there's this tendency for folks to look at technology and see it as a general-purpose solution that could fix any problem as long as you throw enough R and D and money at it. But as we've seen, that's not necessarily the case. Anyway, I'm certain by the end of twenty twenty-two we will see a lot of companies that are a blend of tech and medicine. I hope more of them are helpful and fewer of them are a waste of time, money, and resources. And I really hope we don't see another case like Theranos, which was a company that not only burnt through investment capital with nothing to really show for it, but in the process caused harm to innocent human beings who were relying on products or services from that company. That's the real tragedy of Theranos. I think twenty twenty-two might be the year where we see companies actually put blockchain to an effective use. Maybe. I don't think it's gonna be as grandiose as a fully fledged Web three, largely because no one's really nailed down what that's gonna look like.
But some companies are bound to find at least some useful purpose for blockchain. This, by the way, is another story that frustrates me, because I think a lot of folks out there saw blockchain as, again, kind of like a cure-all solution, sort of the way the cult of technology looks at tech. They think of blockchain and say, that's our solution, but they didn't define the problem that needed to be solved by blockchain, and that led to folks saying that blockchain could underpin all sorts of systems, but they never actually defined the first problem. So, in other words, this felt like a case where you have a solution and you're looking for a problem to fix. That is not the best way to go about problem solving. In fact, it kind of reminds me of the as-seen-on-TV type gadgets that I used to see on late night television, you know, back when I still watched television. And I'm pretty sure I heard some variation of "it solves a problem you never even knew you had" on those kinds of things. And a problem you don't know you have is usually not a problem at all.
I mean, if you have a problem, usually you're aware of it. It's possible that you have some problems you're not aware of, but for this kind of stuff, yeah, you kind of say, like, oh, there's gotta be a better way to do this. And often this is not a better way, it's just a different way. Potentially you're actually making the process of dealing with whatever it is you're doing more complicated, and you're incorporating a technology poorly. I feel that's how companies have been tackling blockchain. However, I do think we're gonna see at least some blockchain applications emerge in twenty twenty-two that will be of actual use to companies. I still remain a bit skeptical over the whole concept of Web three, but I need to do a full episode about that. I will say there are a lot of extremely intelligent people, people smarter than I am, and some of whom might have a personal stake in the concept of Web three, so that creates a bit of bias. But there are a lot of smart people who say that is the way forward, that Web three is the future. They might be right.
I feel this is an area where my own lack of awareness and understanding could be coloring my prediction. I just haven't seen much "there" there, you know? Like, I haven't seen substance to this. It's a lot of notions, like the idea of a decentralized Internet where the people participating have ownership of the actual Internet, and it breaks the Internet free from giant monoliths like Amazon and Google and Facebook. But, like, I hear what it's supposed to do, but I haven't really seen how it's supposed to do it. I have seen a lot of venture capitalists get really excited about it. And like Jack Dorsey, the former CEO of Twitter, said, it may very well be that you're just looking at not a decentralized Internet; it'll just be centralized in different, you know, areas of power, namely the venture capitalists who are funding everything. But anyway, I very much doubt Web three is going to be a thing anytime soon, at least not in twenty twenty-two. But I also have to admit I need to do a ton more research into this topic.
It's very possible I'm just not seeing the obvious here. And so I know that's a long way of saying I might be wrong, but I need to couch all that, because while I feel a lot of skepticism, I also have to admit there's a lack of knowledge there on my part, and, you know, you can't just be skeptical of something because you didn't bother to learn more about it. So I will do another episode, a full episode, about Web three in the future, once I get some time to really dive into it. I suspect we will see China take a harder regulatory stance on the tech industry within China. Xi Jinping has proven to be pretty proactive in reining in companies that get very large, very wealthy, very quickly. Now, whether that is to make sure the companies are not gonna balloon out of control or cause harm, like, there's obviously the fear about collecting data, right? We see this concern around the world, this idea of companies collecting enormous amounts of data about private individuals and how harmful that can be.
Well, that's some of the reasoning behind 338 00:20:54,480 --> 00:20:58,000 Speaker 1: the regulations in China. There are others who say that 339 00:20:58,080 --> 00:21:02,000 Speaker 1: perhaps this is more about a play at keeping any 340 00:21:02,200 --> 00:21:05,560 Speaker 1: entity from rivaling the power of the communist government in China, 341 00:21:06,359 --> 00:21:09,280 Speaker 1: the idea that you don't want to create corporations so 342 00:21:09,359 --> 00:21:13,679 Speaker 1: powerful that they hold sway over the government like you 343 00:21:13,760 --> 00:21:17,320 Speaker 1: might in other countries. But the end result here is 344 00:21:17,359 --> 00:21:19,960 Speaker 1: that a lot of companies in China, and tech companies 345 00:21:20,040 --> 00:21:23,159 Speaker 1: in particular in China, have been hit with some heavy 346 00:21:23,200 --> 00:21:26,800 Speaker 1: regulations this past year, and I expect that's going to continue. 347 00:21:27,640 --> 00:21:30,240 Speaker 1: I also think that's going to convince more companies to 348 00:21:30,400 --> 00:21:33,840 Speaker 1: follow in the footsteps of Yahoo, which pulled up stakes 349 00:21:33,880 --> 00:21:36,600 Speaker 1: as far as doing business in China is concerned. I 350 00:21:36,600 --> 00:21:38,960 Speaker 1: think we're gonna see more companies come to a similar 351 00:21:39,000 --> 00:21:43,200 Speaker 1: conclusion that the revenues generated in China are not worth 352 00:21:43,480 --> 00:21:47,280 Speaker 1: the price of doing business there. Okay, I've got a 353 00:21:47,320 --> 00:21:50,640 Speaker 1: few more predictions to go through, but before we get 354 00:21:50,640 --> 00:22:00,600 Speaker 1: to that, let's take another quick break.
As we see 355 00:22:00,640 --> 00:22:04,399 Speaker 1: countries commit to a move toward being carbon neutral or 356 00:22:04,760 --> 00:22:08,600 Speaker 1: even carbon negative, I think we should also expect to 357 00:22:08,640 --> 00:22:11,520 Speaker 1: see a lot of companies offer up products and services 358 00:22:12,000 --> 00:22:17,000 Speaker 1: targeted at supporting those efforts. So I suspect in twenty twenty two 359 00:22:17,080 --> 00:22:21,400 Speaker 1: we're gonna see lots of pushes to expand renewable energy capabilities, 360 00:22:22,160 --> 00:22:27,560 Speaker 1: energy storage solutions, and similar implementations. I also worry about 361 00:22:27,600 --> 00:22:31,480 Speaker 1: this a lot. Not because I think that renewable energy 362 00:22:31,520 --> 00:22:34,240 Speaker 1: technology is a bad thing. I don't. I think it's 363 00:22:34,240 --> 00:22:38,520 Speaker 1: a good thing. But the reason I worry is, once again, 364 00:22:39,320 --> 00:22:42,000 Speaker 1: too many of us have this tendency to put a 365 00:22:42,040 --> 00:22:45,160 Speaker 1: lot of faith in technology and then just assume that 366 00:22:45,320 --> 00:22:48,680 Speaker 1: the problem has been fixed. I think that's a very 367 00:22:48,760 --> 00:22:52,440 Speaker 1: dangerous mindset, and I worry about it. In fact, as 368 00:22:52,480 --> 00:22:55,560 Speaker 1: a little anecdote, I remember, before I was a podcaster, 369 00:22:56,240 --> 00:22:59,920 Speaker 1: listening to a podcast about skepticism and critical 370 00:23:00,000 --> 00:23:04,879 Speaker 1: thinking in which one of the hosts largely dismissed climate 371 00:23:04,960 --> 00:23:09,040 Speaker 1: change concerns, essentially saying, we would engineer our way out 372 00:23:09,040 --> 00:23:12,440 Speaker 1: of it, we would use technology to solve the problem.
373 00:23:12,480 --> 00:23:17,320 Speaker 1: And I think that is an extremely dangerous response, and 374 00:23:17,400 --> 00:23:22,200 Speaker 1: it ignores issues like holding people and companies accountable for 375 00:23:22,280 --> 00:23:27,040 Speaker 1: how they contribute to climate change and carbon emissions. So 376 00:23:27,600 --> 00:23:29,680 Speaker 1: I guess what I'm saying is that I do want 377 00:23:29,680 --> 00:23:32,399 Speaker 1: to see more renewable energy solutions. I want to see 378 00:23:32,440 --> 00:23:36,160 Speaker 1: a lot of that in twenty twenty two, but I don't want 379 00:23:36,280 --> 00:23:39,199 Speaker 1: us to let that serve as a smoke screen for 380 00:23:39,240 --> 00:23:42,000 Speaker 1: companies and nations that are still dumping tons of carbon 381 00:23:42,040 --> 00:23:46,080 Speaker 1: dioxide into the atmosphere and contributing to climate change. Like, 382 00:23:46,800 --> 00:23:50,760 Speaker 1: we can't just say, oh, we made more solar panels, 383 00:23:50,760 --> 00:23:53,520 Speaker 1: so everything's okay now. We have to take a bigger 384 00:23:53,520 --> 00:23:57,719 Speaker 1: picture view of this, or else we're not really solving 385 00:23:57,760 --> 00:23:59,600 Speaker 1: the problem. It's kind of the issue I have with 386 00:24:00,800 --> 00:24:04,359 Speaker 1: a lot of carbon taxing and things of 387 00:24:04,400 --> 00:24:08,040 Speaker 1: that nature, and carbon offsets. I feel like a lot 388 00:24:08,080 --> 00:24:10,960 Speaker 1: of that ends up just being carte blanche for companies 389 00:24:10,960 --> 00:24:14,159 Speaker 1: to continue to dump enormous amounts of carbon dioxide with 390 00:24:14,359 --> 00:24:18,920 Speaker 1: the intent of capturing it in some way, but that 391 00:24:19,000 --> 00:24:22,879 Speaker 1: falls to other people in other places.
Uh, it may just 392 00:24:22,960 --> 00:24:27,080 Speaker 1: mean that that becomes a bottom line expense in a 393 00:24:27,119 --> 00:24:30,560 Speaker 1: company ledger and nothing is actually done about it. And 394 00:24:31,119 --> 00:24:34,119 Speaker 1: the fact is that, yeah, you might call it carbon 395 00:24:34,240 --> 00:24:36,280 Speaker 1: capture or whatever, but if you're not actually doing it, 396 00:24:36,400 --> 00:24:38,920 Speaker 1: then all you're really doing is shifting some money around 397 00:24:39,600 --> 00:24:43,120 Speaker 1: and the world just keeps getting more polluted. So that's 398 00:24:43,160 --> 00:24:46,640 Speaker 1: why I get on my high horse about that. Uh. 399 00:24:46,800 --> 00:24:48,840 Speaker 1: One thing that we will definitely see more of in 400 00:24:48,880 --> 00:24:52,919 Speaker 1: twenty twenty two, and this is such an obvious prediction that I almost 401 00:24:52,920 --> 00:24:55,320 Speaker 1: didn't even put it in here, is that we're gonna 402 00:24:55,359 --> 00:24:58,800 Speaker 1: see a lot more focus on cybersecurity. We're gonna see 403 00:24:58,800 --> 00:25:03,960 Speaker 1: calls around the world to create regulations to boost cybersecurity 404 00:25:04,000 --> 00:25:09,560 Speaker 1: practices and hold companies and organizations responsible for following cybersecurity 405 00:25:09,560 --> 00:25:13,240 Speaker 1: best practices. Like I would not be surprised to see 406 00:25:13,240 --> 00:25:18,560 Speaker 1: countries pass laws that say when a patch is released, 407 00:25:18,600 --> 00:25:21,520 Speaker 1: like a security patch is released for a product that 408 00:25:21,600 --> 00:25:24,760 Speaker 1: has a known vulnerability in it, the companies will have 409 00:25:26,280 --> 00:25:29,679 Speaker 1: a maximum amount of time they're allowed before they have 410 00:25:29,840 --> 00:25:33,080 Speaker 1: to implement the patch, or else they will be fined.
411 00:25:33,320 --> 00:25:36,800 Speaker 1: I think that's gonna be a thing. I'm less convinced 412 00:25:37,520 --> 00:25:39,760 Speaker 1: that we're gonna see something like that done in the 413 00:25:39,880 --> 00:25:45,680 Speaker 1: United States. Um, even though we saw several high profile 414 00:25:45,800 --> 00:25:50,760 Speaker 1: ransomware cases play out in the United States, I don't 415 00:25:50,800 --> 00:25:52,879 Speaker 1: think we're gonna see a whole lot of movement in 416 00:25:52,920 --> 00:25:56,840 Speaker 1: the US on this from a political standpoint, simply because 417 00:25:56,880 --> 00:25:58,879 Speaker 1: the US is in the middle of a pretty dark 418 00:25:58,960 --> 00:26:03,879 Speaker 1: era and there's increasing polarization between the left side of 419 00:26:03,920 --> 00:26:07,280 Speaker 1: the political spectrum and the right side of the political spectrum, 420 00:26:07,320 --> 00:26:12,080 Speaker 1: and expecting consensus on anything seems optimistic 421 00:26:12,119 --> 00:26:14,560 Speaker 1: in the extreme because a lot of the energy seems 422 00:26:14,600 --> 00:26:19,680 Speaker 1: to be dedicated toward demonizing the other side and less 423 00:26:19,680 --> 00:26:23,560 Speaker 1: toward getting stuff done. However, I really do hope we 424 00:26:23,600 --> 00:26:26,919 Speaker 1: actually see real movement to make cybersecurity a top priority 425 00:26:27,000 --> 00:26:30,080 Speaker 1: in all industries, because it is clear that the cyber 426 00:26:30,119 --> 00:26:33,919 Speaker 1: attacks are not going to just stop, so we have 427 00:26:34,080 --> 00:26:39,199 Speaker 1: to put that work in to fight against it. I 428 00:26:39,240 --> 00:26:42,000 Speaker 1: think we're gonna see companies struggle to figure out what 429 00:26:42,160 --> 00:26:44,480 Speaker 1: the new normal is.
I think for a lot of people, 430 00:26:44,800 --> 00:26:47,520 Speaker 1: there's now an expectation that having to go into the 431 00:26:47,600 --> 00:26:51,679 Speaker 1: office is no longer a given requirement, that it should not 432 00:26:51,880 --> 00:26:54,760 Speaker 1: necessarily be part of the job. For people who want 433 00:26:54,760 --> 00:26:56,840 Speaker 1: to go in, sure, but I think a lot of folks 434 00:26:56,880 --> 00:26:59,600 Speaker 1: say that should not be a requirement for all offices 435 00:26:59,640 --> 00:27:03,639 Speaker 1: everywhere, and companies that push for employees to return 436 00:27:03,680 --> 00:27:06,479 Speaker 1: to the office might find themselves struggling to attract and 437 00:27:06,520 --> 00:27:10,480 Speaker 1: retain talent as folks migrate to other companies that are 438 00:27:10,600 --> 00:27:13,359 Speaker 1: less adamant about office life, or they go and found 439 00:27:13,400 --> 00:27:16,360 Speaker 1: their own businesses. Now, I don't think we're saying goodbye 440 00:27:16,359 --> 00:27:20,320 Speaker 1: to offices everywhere. I don't think office culture is dead 441 00:27:20,359 --> 00:27:23,040 Speaker 1: and gone, but I do think there's gonna be a 442 00:27:23,040 --> 00:27:26,600 Speaker 1: pretty big shift, and obviously that will start to have 443 00:27:26,640 --> 00:27:29,639 Speaker 1: an impact on things like office space in general. We 444 00:27:29,720 --> 00:27:34,640 Speaker 1: haven't yet seen trends of massive office spaces going vacant 445 00:27:34,720 --> 00:27:37,879 Speaker 1: in the sense that companies have actually pulled out of 446 00:27:37,920 --> 00:27:40,439 Speaker 1: their lease and no one else has come in. We 447 00:27:40,520 --> 00:27:43,520 Speaker 1: haven't really seen that trend yet. We've definitely seen the 448 00:27:43,520 --> 00:27:45,960 Speaker 1: trend of empty offices.
I mean, whenever I go to 449 00:27:46,119 --> 00:27:50,240 Speaker 1: the office these days, I might see as many as 450 00:27:50,280 --> 00:27:53,160 Speaker 1: two or three other people. I think on the busiest 451 00:27:53,240 --> 00:27:56,639 Speaker 1: day I saw five. But otherwise, like, there have been 452 00:27:57,200 --> 00:27:59,080 Speaker 1: several times where I've had to go into the office 453 00:27:59,080 --> 00:28:00,840 Speaker 1: and I am the only one there, and I know 454 00:28:00,920 --> 00:28:05,240 Speaker 1: that's not unusual, but again, we have that office space. 455 00:28:05,320 --> 00:28:08,040 Speaker 1: I'm saying that in twenty twenty two we're going to see companies 456 00:28:08,400 --> 00:28:11,359 Speaker 1: start to scale back on the amount of office space 457 00:28:11,359 --> 00:28:14,240 Speaker 1: they actually have, that they won't need as much because 458 00:28:14,240 --> 00:28:18,720 Speaker 1: they'll adopt a different approach to work. I'm not super 459 00:28:18,800 --> 00:28:22,119 Speaker 1: confident about this prediction, however, because there might be a 460 00:28:22,119 --> 00:28:24,720 Speaker 1: strong enough effort from companies to push back on that 461 00:28:25,200 --> 00:28:27,800 Speaker 1: and get back to commuting into work 462 00:28:27,800 --> 00:28:30,120 Speaker 1: every day or at least a few times every week, 463 00:28:30,680 --> 00:28:34,520 Speaker 1: so maybe that won't come to pass. However, another trend 464 00:28:34,640 --> 00:28:37,600 Speaker 1: that I expect we will see continue in twenty twenty two is 465 00:28:37,640 --> 00:28:41,360 Speaker 1: one that's being called the Great Resignation. Uh.
The United 466 00:28:41,400 --> 00:28:44,080 Speaker 1: States Labor Department has been tracking the rate at which 467 00:28:44,160 --> 00:28:47,760 Speaker 1: people have been quitting their jobs. So before the pandemic, 468 00:28:48,240 --> 00:28:51,560 Speaker 1: the number of people who were resigning, that is, quitting 469 00:28:51,640 --> 00:28:55,880 Speaker 1: voluntarily as opposed to being encouraged to depart, which is 470 00:28:55,960 --> 00:28:59,400 Speaker 1: just another way of saying getting canned. Uh, that rate 471 00:28:59,440 --> 00:29:02,120 Speaker 1: was around three and a half million people per month. 472 00:29:03,200 --> 00:29:05,160 Speaker 1: Late this year, those numbers grew to more than four 473 00:29:05,200 --> 00:29:09,240 Speaker 1: million people per month. Employers are posting more job openings, 474 00:29:09,240 --> 00:29:12,200 Speaker 1: but there's a labor shortage at this point. Meanwhile, the 475 00:29:12,280 --> 00:29:15,800 Speaker 1: folks who are left behind when someone quits are frequently 476 00:29:15,840 --> 00:29:20,120 Speaker 1: burdened with extra work that can be demoralizing. That can 477 00:29:20,160 --> 00:29:23,440 Speaker 1: then end up contributing to people who are left behind 478 00:29:23,480 --> 00:29:25,840 Speaker 1: to say, you know what, I'm out of here too, 479 00:29:26,080 --> 00:29:28,120 Speaker 1: and it becomes a vicious cycle and you start to 480 00:29:28,160 --> 00:29:31,520 Speaker 1: see more and more people resign. Many of those resigning 481 00:29:31,520 --> 00:29:34,520 Speaker 1: are older workers, and we're seeing larger numbers of them 482 00:29:35,120 --> 00:29:38,640 Speaker 1: opting for an early retirement. Uh, so they're not going to 483 00:29:38,720 --> 00:29:42,600 Speaker 1: go work somewhere, you know, different. They're just retiring
484 00:29:42,640 --> 00:29:46,160 Speaker 1: early. A third of all people who quit are saying 485 00:29:46,160 --> 00:29:49,240 Speaker 1: they set out to start their own business, which is 486 00:29:49,280 --> 00:29:53,480 Speaker 1: pretty incredible. I mean, thirty-three percent of people resigning are saying 487 00:29:53,520 --> 00:29:57,280 Speaker 1: they want to go into business for themselves. I mean, 488 00:29:57,480 --> 00:29:59,560 Speaker 1: it is a hustle culture out there. It's a lot 489 00:29:59,640 --> 00:30:01,800 Speaker 1: of hustle to have your own business. But then again, 490 00:30:01,920 --> 00:30:04,800 Speaker 1: I mean, we've been training an entire generation of people 491 00:30:04,800 --> 00:30:07,800 Speaker 1: to hustle. That's the way they get by, so 492 00:30:07,960 --> 00:30:10,720 Speaker 1: maybe it's all going to work out. At the moment, 493 00:30:11,200 --> 00:30:14,480 Speaker 1: workers have the leverage in this situation. It's more of 494 00:30:14,520 --> 00:30:18,200 Speaker 1: a worker's market than a company market, and I expect 495 00:30:18,280 --> 00:30:20,520 Speaker 1: we will see that continue at least into the first 496 00:30:20,520 --> 00:30:25,320 Speaker 1: half of twenty twenty two. And those are my predictions. 497 00:30:25,400 --> 00:30:27,240 Speaker 1: Like I said, it's a little bit of a shorter episode. 498 00:30:27,520 --> 00:30:32,200 Speaker 1: Didn't want to get too granular in this because, honestly, 499 00:30:32,480 --> 00:30:35,720 Speaker 1: with COVID making such a huge impact, it's hard to 500 00:30:35,760 --> 00:30:40,040 Speaker 1: make more granular predictions.
Like I thought about predicting which 501 00:30:40,120 --> 00:30:44,240 Speaker 1: video games would not come out in twenty twenty two and would 502 00:30:44,320 --> 00:30:47,960 Speaker 1: get pushed into twenty twenty three, but then that starts to sound 503 00:30:48,040 --> 00:30:50,920 Speaker 1: kind of ugly, and I didn't want to get ugly 504 00:30:51,160 --> 00:30:53,720 Speaker 1: on this episode, not any uglier than I already did. 505 00:30:54,160 --> 00:30:58,160 Speaker 1: I hope all of you out there are having a lovely 506 00:30:58,360 --> 00:31:03,920 Speaker 1: holiday season. I wish you all the best in twenty twenty two. 507 00:31:04,280 --> 00:31:07,520 Speaker 1: I really hope my gloomy predictions are all wrong. I 508 00:31:07,520 --> 00:31:10,640 Speaker 1: would love to come to the end of twenty twenty two and say, 509 00:31:10,680 --> 00:31:14,280 Speaker 1: wow, twenty twenty two rocked it so hard and I was 510 00:31:14,400 --> 00:31:17,920 Speaker 1: so wrong. That would be a fantastic thing to have 511 00:31:18,040 --> 00:31:20,320 Speaker 1: to do at the end of the year. Here's hoping. 512 00:31:20,680 --> 00:31:22,840 Speaker 1: And if you have suggestions for topics I should cover 513 00:31:22,880 --> 00:31:25,480 Speaker 1: in future episodes of tech Stuff, please reach out to 514 00:31:25,520 --> 00:31:27,960 Speaker 1: me on Twitter. The handle for the show is tech 515 00:31:27,960 --> 00:31:31,320 Speaker 1: Stuff H S W, and I'll talk to you again 516 00:31:32,880 --> 00:31:40,520 Speaker 1: in the new year. Tech Stuff is an I Heart 517 00:31:40,600 --> 00:31:44,360 Speaker 1: Radio production. For more podcasts from I Heart Radio, visit 518 00:31:44,400 --> 00:31:47,479 Speaker 1: the I Heart Radio app, Apple Podcasts, or wherever you 519 00:31:47,520 --> 00:31:48,880 Speaker 1: listen to your favorite shows.