Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? It is time for me to pay the piper. By that, I mean it's that special time where I look back at an episode I recorded at the end of last year, in twenty twenty three, in which I made predictions for what would happen in the world of tech this year, in twenty twenty four. Traditionally I'm pretty shaky on these predictions, and it turns out making a really good prediction is hard. I never wanted to do a layup or anything like that, so sometimes I try and stick my neck out. But on top of that, weird stuff just keeps happening in the world, you know, stuff I didn't see coming, and inevitably that has a really big impact on everything else. Like that pandemic thing a few years back. I didn't see that coming, and that sure as heck had a big impact on, well, everything, including the tech sector. Also, sometimes I make predictions not based on what I think is going to happen, but perhaps on what I hope will happen.

Speaker 1: So first up, the first prediction I made last year was that X, formerly known as Twitter, would finally get its act together, that Elon Musk would step back from the service a little bit more, and that Linda Yaccarino, who's the actual CEO of X, would have a chance to define a new era at the company, because the perception, at least, is that anything she does as leader is immediately undermined and reversed by Elon Musk. But I would say that this prediction of mine was a big old swing and a miss. Now, in my own prediction, I admitted that I did not believe X would actually get better. I said that it's quote "just going to be a bigger mess" end quote. Now that is something we can check. Is X worse off at the end of twenty twenty four than it was at the end of twenty twenty three?
Speaker 1: Now, keep in mind, Elon Musk himself is on top of the world. He's the richest person on Earth. He's going to be part of Trump's administration. Once Trump takes office on January twentieth, twenty twenty five, he will be co-leading a new governmental department on spending. I would be shocked if he doesn't try to leverage his wealth and position to get things to turn around for him. For example, I could see him making a real go of punishing all those companies that decided to stop advertising on X, as if somehow there were some sort of mandate in place that requires companies to advertise in specific places. I honestly don't know how you can make the argument that somehow deciding not to advertise on X is illegal. I would imagine it's a First Amendment issue. You know, companies have the right to advertise with whomever they want, or not advertise. But what do I know? I'm not a constitutional expert. We also don't know if he will attempt to push through his controversial compensation package over at Tesla. That's something that's been denied him by a court in the past, but I can imagine him using his connections to kind of, yeah, get that reversed. We'll see.

Speaker 1: But how is X doing right now? Well, since X became a private company again once Elon purchased it, we don't have SEC filings that we can look at, right? They're not obligated to file with the SEC, so analysts have had to make some guesses about this stuff. But they've been saying this year that X is heading toward another yearly loss. Fidelity estimates that the current value of X is somewhere around nine point four billion dollars. Now that's a lot of money, but do keep in mind that Musk paid forty-four billion dollars for Twitter, so yikes. Performance Marketing World predicts that X will bring in around two billion dollars in revenue this year, and that's actually down from two point two billion dollars last year and four point five billion dollars in twenty twenty two.
Speaker 1: So you could argue X is a bigger mess than it was last year, but maybe not that much bigger. It's just still in decline. There were also stories of hundreds of thousands, or even millions, of users ditching X for other platforms like Bluesky and Threads. Plus, X changed how blocking users works late this past year. It used to be that if you were on Twitter and you blocked somebody, that person would no longer be able to see your posts. It would be as if you had disappeared from Twitter, as far as that person was concerned. Today, because of the changes that Musk's X has made, the blocked person can continue to see what you post. They just can't interact with it, at least not with the tools that are built into X. You could still directly do something like copy and paste someone's post, then post your own version of it and respond to it that way if you wanted to. Anyway, I'm sad to report X did not transform into a decent platform this past year. I'm still not really sure what Yaccarino is actually doing at this point. You don't see her mentioned in the news that much. Like, there are articles about her work at X, but Elon Musk just dominates the news when it comes to stories about X and his other companies, which tells you a lot. Like, it may not be that Elon Musk is essentially still controlling the company from behind Yaccarino. It may not be the case, but it appears to be the case because of the way it's reported, and certainly the way that X has performed and the way Elon behaves all seem to be aligned. But again, I don't know the truth of it.

Speaker 1: Now, my next prediction was definitely a gimme. I said that quote "We're going to see a lot more AI integration" end quote, and we sure did see an awful lot of AI integration.
Speaker 1: I mentioned that there would be a lot of pressure on businesses to integrate AI into their operations, even if they didn't have a fully thought-out strategy as to how they should do that, and that the whole fear of missing out would drive a lot of corporate implementations of AI, similar to what we saw when the metaverse was first being talked about, or when NFTs emerged out of obscurity. Keeping in mind, NFTs had been around for a bit before the world went bonkers about them for a little less than a year. I said that I thought there was an AI bubble and that it would likely collapse, possibly before the end of twenty twenty four. That has not happened. We have not had a total bubble collapse, or if it is happening, it's a slow enough collapse that it's not been noteworthy yet. There have been several think pieces about the possibility of AI advancements slowing down a bit, like plateauing, particularly in the field of generative AI; that we might be hitting the limits of what large language models can do; that more folks are seeing AI as something that's interesting and potentially useful, but not quite there yet; that a lot of the hoopla around AI is perhaps ahead of its time; and that we're seeing progress in the field slow down a bit. I'm not sure if that's entirely accurate. That just might be the perception of the industry. Certainly, we're seeing plenty of investors continue to pour truckloads of money into AI research and development. OpenAI received more than six billion dollars in investments late this year. That helps stave off the situation in which the company would spend itself out of business, because AI is also wicked expensive.
Speaker 1: I did warn in the predictions episode that if the bubble burst, that would hurt AI across the board. By that I meant that if there's a collapse, or a market adjustment if you prefer those words, then investors might become a little more conservative with their investments, and the AI field in general would find it harder to get money to do the research needed to advance the field. Because, keep in mind, I've said it over and over again: generative AI is not the only kind of AI out there. It is one kind of AI. But the fear is that if generative AI reaches a point where businesses say, this is interesting, but it's just not where we need it to be, like our expectations were at one level and reality is at a different level, and the rug gets pulled out from under generative AI, figuratively speaking, then investors would shy away from any AI-oriented endeavor, not just generative AI. And we've seen this kind of stuff happen before. I always think back to the first VR bubble in the nineteen nineties. But so far that has not happened, and it may not happen. According to Efosa Udinmwen of TechRadar, while the hype might be dying down a bit around artificial intelligence, investments have not slacked off so far, so that's good news for all the AI companies out there.

Speaker 1: I also mentioned that we would see a lot more legislators and regulators attempt to wrap their heads around AI, but that we would not get very many meaningful laws or regulations out of it by the end of the year. And hey, here in the United States, I was one hundred percent right. There is no comprehensive federal legislation regarding artificial intelligence here in the United States. There are some individual state laws across our union that do handle AI, but that just means there's no uniform set of rules in the country that companies have to abide by, which actually makes it really darn challenging if you are running an AI company.
Speaker 1: I mean, you might end up being perfectly in line with the laws of, say, Massachusetts, but it turns out those laws are not compatible with the laws in, I don't know, Arizona, and so it literally becomes impossible to do business in a meaningful way across the country if that's the case, because you're never going to find a way to thread all of those needles. They're just not going to line up properly. So there is a real need for federal rules and laws around AI. It will be interesting to see where laws and regulations around AI develop in the near future, with President-elect Trump back in office starting January twentieth, twenty twenty five. Oddly enough, I actually briefly worked with his pick for AI policy advisor, Sriram Krishnan. Now, I say briefly because our working relationship only lasted a few months, and our contact with each other was pretty limited in that time as well. However, it's wild to me that my path crossed with someone who ends up being the advisor to the President on an issue. It's also a little strange because, last I heard, he had relocated to London, and from what I understand, that's where he still is. So for the US advisor on AI policy to be living in London, granted, he is an American citizen, but he's living in London, it's just kind of weird. Anyway, I know Sriram is very pro-AI. He's also very pro-Elon Musk. So I suspect for the next few years we're not going to see any really restrictive laws around AI's development or deployment, assuming that his vision for how AI should develop moves forward. So I think it's safe to say that my AI predictions for twenty twenty four turned out just how I anticipated. Whether that changes next year, we'll have to see. I think there's a good chance we will not get a bubble-bursting situation, considering how favorable the US government will likely be to the sector.
Speaker 1: So, in other words, maybe there would be a bubble burst if it weren't for the fact that you're going to have a very pro-AI push from the government, is my guess. I mean, with Elon Musk and with folks like Sriram in charge of advising the president, it would shock me if the government became a little more critical or cautious around AI. But who knows. Certainly not me.

Speaker 1: I also said that there would be a lot of misinformation campaigns this past year, because the United States was in an election cycle, and that a lot of those misinformation campaigns would be augmented or entirely powered by AI. And again, that was a gimme. We've been seeing an increase in misinformation and disinformation campaigns out of places like China and Russia, primarily Russia if we're being honest, but also other regions of the world. They aim to disrupt the normal democratic process and sow seeds of doubt and confusion, and generally either try to discourage people from participating in the democratic process at all, and a lot of that happened this past cycle, or try to sway people to support particular candidates or policies that ultimately align with, say, Russia's own policies and philosophies, or that at the very least do not stand to be an impediment to the disruptor. Now, I mentioned that we'd likely see calls to put pressure on various online platforms out there to identify and remove cases of misinformation, which did happen a lot, but that the real issue, stopping the campaigns in the first place, would largely be ignored, because it turns out it's very hard to enforce those kinds of rules when the people who are breaking them live in other countries. So this comes down to a shoot-the-messenger kind of thing. But then, a lot of these platforms have been fairly lax when it comes to acting quickly with regard to misinformation campaigns. One perspective on disinformation campaigns that you can read up on is Darrell M. West's piece on Brookings.
Speaker 1: It's titled "How Disinformation Defined the 2024 Election Narrative." Check that out if you have a few minutes. It's a great piece. Okay, we're going to take a quick break to thank our sponsors. When we come back, more predictions and how I did.

Speaker 1: We're back, and my next prediction was that we would see even more opposition to TikTok here in the United States, and golly, we sure did. Congress passed a bill this year that says TikTok's parent company, ByteDance, which is located in China, must divest itself of TikTok, or else the app would be banned in the United States. And that bill passed into law. President Biden signed it, and now ByteDance has a deadline of January nineteenth, twenty twenty five, to divest itself of the company or face this ban in the US. Currently, ByteDance and TikTok are challenging the new law, or at least trying to get the deadline extended so that President-elect Trump can weigh in, because again, Trump doesn't come into office until January twentieth, one day after the ban is supposed to take place. So far, there's been very little movement from the Supreme Court that would indicate that they will push back the deadline. But we'll see. Maybe by the time you hear this, that will have changed. I don't think anything is off the table at this point. Anyway, in my prediction, I said what I always say: that I'm not a fan of TikTok, but I think going after TikTok is a myopic move that doesn't really address the underlying problem. If you're concerned about China siphoning data from the US, there are plenty of ways the country is doing that without using an app like TikTok. I mean, China has infiltrated systems here in the United States. A lot of our infrastructure has been infiltrated by Chinese espionage agents over the course of years. They didn't use TikTok to do it. So that problem's there.
Speaker 1: There are data brokers out there that you can buy information from that have nothing to do with TikTok. And as far as propaganda goes, there are lots of other platforms out there that have served very well as a way to get propaganda in front of US viewers. Like, you know, things like Facebook and Instagram have done that, or X for that matter. So what I'm saying is that getting rid of TikTok doesn't actually address the underlying problem. It's like a symptom of a disease. It's not dealing with the actual disease. So I think lawmakers are going to have a lot to answer for if they ban TikTok and China somehow isn't magically cut off from all that information and propaganda stuff. But what do I know? Maybe they won't face any consequences at all, apart from all the young people being really ticked off that the government has banned TikTok. As for Trump, he's the one who got this started. He's the one who argued that we should ban TikTok years ago, back in like twenty twenty. He even signed an executive order to that effect. But in subsequent years he has reversed his position, for reasons I'm not going to go into here. Or at least the reason he has stated is that young people seem to really like the platform, which doesn't seem like an actual reason. Young people like it, so why should we get rid of it? That doesn't make any sense at all. What might make more sense is that someone who has invested literally billions of dollars in companies, including stuff like ByteDance, also happened to be a big campaign contributor to Trump. But hey, maybe I'm just way off base.

Speaker 1: My next prediction was that the Apple Vision Pro, the mixed reality headset from Apple, would be a commercial failure in twenty twenty four.
Speaker 1: It had not yet launched in twenty twenty three when I made these predictions, and I said, I don't think it's going to be the runaway success that it needs to be. And I was pretty much right on about that. The Vision Pro headset cost thirty-five hundred dollars upon launch. Crazy expensive. Moreover, there were a limited number of experiences and applications for the technology, so it was a really huge request to get customers to dole out thirty-five hundred dollars for something that only had limited applicability. In June, Jonathan Reichental of Forbes had an article titled "Apple's Vision Pro Is Amazing, But Nobody Wants One." In that piece, he said that Apple's target for the year was to sell around eight hundred thousand units by the end of twenty twenty four. However, by midyear, Apple had adjusted that to four hundred and fifty thousand units. That's a big oof, you know, going down to a little more than half of what you had originally anticipated. That's tough. As Reichental pointed out, it's a rough comparison to the iPad, which sold seventy-three million units in its first year. What a huge contrast. There are rumors that Apple will stop offering the Vision Pro entirely and instead focus on a slimmed-down model that would cost somewhere in the one-thousand-five-hundred-dollar neighborhood in the new year. That might help the technology gain some traction, though I imagine the people who tried out the Vision Pro model may end up being disappointed that some of their favorite features are not there in the slimmed-down version. Apple CEO Tim Cook has repeatedly stated that the Vision Pro is not, and was never intended to be, a mass market product. It was rather an early adopter product. And I believe him. I think that's true. But I think it's also true that Apple nearly halved its expectations on how many units it would sell, and that's a discouraging message.
Speaker 1: Like, if you are in Apple and you're told, hey, we were hoping to sell eight hundred thousand units, but now we're marking it down to four hundred and fifty thousand, that's not great news. That's telling you, okay, well, the demand for this particular technology is not as high as what we had hoped, and the price tag is probably too high for more people to jump on board. But I don't think that Apple's journey into mixed reality is over just yet.

Speaker 1: Okay, next up, we have predictions about quantum computing. I said quantum computing was going to be much bigger news in twenty twenty four. But considering all that did happen in twenty twenty four, and how much stuff took up space in the news cycle, I think calling it big news, in the sense of a lot of people hearing about quantum computing, was a bit off. That's not to say there weren't really important advancements and big news within the quantum computing world this past year. There certainly were. I just don't think it broke through to the headlines for a lot of people outside the field to really learn about it. So, for example, recently Google announced a new quantum chip called Willow, and the hope is that this will allow for more real-world applications of quantum computing in the near future. Quantum computing has been a revolutionary field of research, but the practical applications have been limited, partly because of the power limitations of quantum computing, because it's just hard to build a really powerful one. Quantum computers are extremely delicate, if you like to think of it that way. Like, the quantum nature can collapse easily if not handled properly. On top of that, the algorithms you need to build, the programs you need to build to run on quantum computers, are really complicated. So we've had a lot of progress on the hardware front, and often the progress on the software side is not mentioned nearly as frequently.
Speaker 1: But the potential for quantum computing is absolutely incredible. Quantum computing has the potential to tackle certain subsets of computational problems in a fraction of a fraction of the amount of time it takes a classical computer to do it. Like, there are some computational problems that are so difficult that a classical computer could take billions or trillions of years to complete them, whereas a quantum computer of sufficient power, and with the right algorithm, could potentially solve them in just a few minutes. I like the analogy that Tim Hollebeek, who is a strategist with the company DigiCert, has used. He says, imagine that you have a typical maze puzzle. Now, your classical computer will try to solve this puzzle by taking each path in turn. Like, all right, well, let me follow this path. Okay, that led to a dead end. So let me now follow this path. That led to a dead end. Let me follow this path, et cetera, et cetera. Now imagine if the computer were able to take every path at the same time and then come back with the answer as to which path was the best one. That speeds up the solution considerably. And that's how quantum computers work, at a very abstract level. So, paired with the right algorithms, and with sufficient qubits, or quantum bits, in your quantum computer, the computer could potentially make previously safe encryption methods a breeze to decrypt. That's because the quantum computer, assuming it's powerful enough and the algorithm works, can solve for all the potential pathways that led to the encryption at the same time and then reverse it. So, assuming you've got the right program to run on the right quantum computer, decrypting stuff would be a breeze. That also means you could potentially tackle the tough challenge of bitcoin mining and break proof-of-work systems, which is what bitcoin is. It's a proof-of-work system. And a proof-of-work cryptocurrency system essentially creates a really tough math problem for computers to solve.
Speaker 1: And this is the mining part of bitcoin mining. So you've got a block of transactions, bitcoin transactions, and they need to be verified. The verification process involves solving a really challenging math problem, or really, guessing at the solution to a math problem. The way this typically works with bitcoin is that miners take the block's data plus a number called a nonce, run it all through a hash function, and check whether the result falls below a target value. That's a slightly different flavor of hard problem from the one underpinning encryption like RSA, where you have a really huge number that was generated by multiplying two other really huge prime numbers together, and in order to solve the problem, you have to come up with what those two prime numbers were. But either way, a classical computer goes about it by brute force. It starts going through every potential answer and saying, hey, is this it? Hey, is this it? Hey, is this it? Until it gets to the right one. Whereas a quantum computer, in theory and with the right algorithm, could work through all those potential answers much faster, which could be an incredible disruptor for bitcoin mining. Right now, you have these massive computer networks running in places like defunct power plants that are just churning and churning and churning through all these numbers, trying to be the first to get to the correct one in order to mine the next block of bitcoin. So if you were able to do this in a fraction of the amount of time, you would completely upset that entire ecosystem. Which would not necessarily be a bad thing, because these computer networks are often drawing huge amounts of electricity from fossil fuel sources, so they can have a big impact on climate and other issues like pollution. Being able to do it so much faster with much less power would be a game changer, but it would totally upset the way bitcoin works right now. However, we are years away from that becoming a thing, because it's all theoretical.
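To make that "hey, is this it?" loop concrete, here is a minimal Python sketch of a toy proof-of-work puzzle. To be clear, this is not real bitcoin mining: the "hash must start with some zeros" rule and the made-up block data are just stand-ins for bitcoin's actual difficulty target, but the brute-force guessing is the same idea.

```python
import hashlib

def mine(block_data: str, difficulty: int = 5) -> int:
    """Brute-force a nonce whose SHA-256 hash starts with
    `difficulty` zero hex digits (a toy stand-in for bitcoin's
    real hash-below-the-target rule)."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce  # found a nonce that satisfies the puzzle
        nonce += 1        # hey, is this it? no, try the next one

# Each extra zero of difficulty multiplies the expected number of
# guesses by sixteen, which is why real mining burns so much power.
print(mine("block of transactions"))
```

The only move a classical computer has here is that increment-and-retry loop; a quantum speedup would mean, in effect, searching the space of nonces without having to check every guess one at a time.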
Speaker 1: You need the quantum computer to be powerful enough, and as I said, creating a quantum computer that has a sufficient number of qubits to do these kinds of things is a non-trivial challenge, and you need the right algorithm to go about this as well. And that's just not something we have access to, and probably won't for several years, maybe many years. It may be quite some time before this is a reality, but it's a possibility. Meanwhile, the cryptography world has been hard at work designing new methods that will confound quantum computers, so that stuff like security and privacy can still be a thing in a post-traditional-encryption world. And that's good, because you would hate to wake up one morning and find out, hey, no more secrets, because, as it turns out, this quantum computer can decrypt anything that's ever been encrypted. That would be terrifying, because think of all the stuff out there. It could be your personal finances, it could be medical records, it could be state secrets, it could be all sorts of stuff, and someone suddenly being able to get access to all of that would be pretty terrifying. But yeah, while there were impressive developments in quantum computing in twenty twenty four, I think those achievements were largely overshadowed by all the other chaotic stuff on fire going on around the world this past year. So I'm going to say I was not right about this one, only because I don't think we saw enough news about quantum computing hit the mainstream to justify a check mark for me.

Speaker 1: Okay, one prediction I did not make, because at the time I thought it was a sure thing, was that Nintendo would announce its next console before the end of the year. I said, I'm not going to predict that; that's definitely going to happen, Nintendo has said so, so there's not even a prediction here. And that was a big old whoopsie, because that did not happen.
Speaker 1: There were a lot of rumors and leaks about the next Nintendo console. Most of those are calling it the Switch 2, and apparently some leaked images back that up, suggesting that's what Nintendo's calling it too, and that it'll essentially be an upgrade to the Switch, but with some serious new features in it, such as potentially the ability to create 4K visuals when docked with the television. That's not announced, so maybe it'll happen. We are still waiting on that official announcement, which Nintendo said is going to happen before the end of its current fiscal year. Nintendo's fiscal years end in March and the new one starts in April, so it's possible we won't hear anything until spring twenty twenty five. Or, who knows, maybe Nintendo has announced something between when I recorded this, on December twenty-third, and when you hear it. That would be fun. You know what's also fun? If we take a quick break to thank our sponsors. We'll be right back.

Speaker 1: We're back. And you know another thing I said I would not bother making a prediction about? That electric vehicles would have a really big year in twenty twenty four. And the reason why I said that was because so many different parts of the world have been passing laws that look at phasing out internal combustion engines over the next few years. But then Trump won the election this year, and he is not that keen on electric vehicles. He seems to be far more interested in fossil fuels, for some reason. Whether Elon Musk's presence on Trump's team will change that, at least for Tesla, I don't know. But it's possible that we will actually see electric vehicle support in the United States lag behind other parts of the world for the next few years.

Speaker 1: Now, I also predicted another chaotic year for streaming services in twenty twenty four.
Speaker 1: I mentioned that the streaming business is a really tough one in which to make a profit, right? You're competing for customers, so you've got all these other competitors on the market. That means you have to market aggressively to your customers, or your potential customers, and you have to pour a lot of money into original content, or purchase lots of exclusive media libraries, or both. That's really expensive, so your operating costs are incredibly high. You can't charge too much for your service or else no one's going to sign up for it. And you need to get to a certain number of subscribers in order for it to be sustainable, and your revenue streams tend to be pretty limited. Usually we're talking about subscriptions and maybe advertising, right? You might get some ad support in there too. And consumers are absolutely flooded with different options. I mean, you've got your Netflixes, your Hulus, your Amazon Primes, your Maxes, your Disney Pluses, your Paramount Pluses, your Peacocks, et cetera, the Apple TV Pluses. I mean, there are so many. There was some consolidation this year, but mostly in the form of companies entering into partnerships to offer bundled services, where for a flat subscription fee you get access to multiple services. So, like with Disney Plus, it's Disney Plus, ESPN Plus, and Hulu. Right? I have the Disney Plus and Hulu one. I didn't bother with ESPN because there are only two sports that I really follow, and I don't even follow them. There are only two sports that I actually like to see, and only then if I'm there in person, and those are baseball and hockey. Otherwise I just don't have any interest, so I don't bother subscribing to ESPN. But yeah, there weren't that many cases of companies going, like, belly up or getting out of streaming entirely this past year. There were a lot of talks of mergers and acquisitions that didn't happen, and lots of movement that mostly involved shuffling things around, so things didn't really get upset too much.
Speaker 1: So I don't think things got as messy as I had anticipated last year. There are still lots of stories about companies struggling to find a path toward profitability, so that's still a thing. But in large part, twenty twenty four, I feel, has been a year of treading water in the streaming media space. Maybe we'll see it change more in the next couple of years, but yeah, I feel like this wasn't as chaotic as I was anticipating.

Speaker 1: All right. Also, last year I predicted that Google would come out of its antitrust case more or less okay. I thought that they wouldn't get away scot-free. I thought they would be found guilty of engaging in anticompetitive behaviors, and in fact they were. A court did find Google guilty of practicing anticompetitive policies. But I thought Google wouldn't be punished too hard. It would kind of get the equivalent of a slap on the wrist. It might get a really big fine, but that's not nearly as intense as, say, being forced to break up. Now, we don't know what's actually going to happen with Google's antitrust case. Like, the company has been found guilty of these anticompetitive behaviors, but a judge next year will actually decide what the punishment is going to be. So the question of what happens to Google technically remains unanswered. The Department of Justice right now has sent a list of suggestions that include Google having to sell off its Chrome business, that's the browser. But you know, Google obviously has protested that, saying no, that doesn't make any sense, it doesn't fix any problems, that's not the issue anyway, and plus you'll ruin the world economy if you do this, among other things. I am paraphrasing, and I'm being a little facetious here, so that's not exactly what Google's saying. It's just kind of how I walk away feeling about it. But we don't know what's going to actually happen with this judge next year.
It could be that the judge 574 00:35:11,160 --> 00:35:15,279 Speaker 1: does order Google to sell off some of its businesses. 575 00:35:15,640 --> 00:35:18,759 Speaker 1: We'll have to see, but we don't know for sure. 576 00:35:18,920 --> 00:35:21,200 Speaker 1: For one thing, the Department of Justice itself is going 577 00:35:21,239 --> 00:35:25,919 Speaker 1: to be different once Trump takes office, and so there's 578 00:35:25,920 --> 00:35:29,359 Speaker 1: some question as to whether or not the DOJ will 579 00:35:29,360 --> 00:35:34,520 Speaker 1: back off a bit once that happens. However, Trump has 580 00:35:34,640 --> 00:35:39,080 Speaker 1: also had a really contentious relationship with big tech companies, 581 00:35:39,120 --> 00:35:45,200 Speaker 1: particularly Google, but also some other ones as well, and 582 00:35:45,239 --> 00:35:48,279 Speaker 1: he seems to feel that these different platforms are biased 583 00:35:48,280 --> 00:35:53,920 Speaker 1: against him, him personally. At 584 00:35:54,000 --> 00:35:57,120 Speaker 1: least that seems to be the case. Again, this is 585 00:35:57,160 --> 00:36:00,440 Speaker 1: all me kind of drawing conclusions based on what I 586 00:36:00,440 --> 00:36:04,080 Speaker 1: have read and what I've seen. But yeah, there seems 587 00:36:04,120 --> 00:36:07,680 Speaker 1: to be at least some speculation that the DOJ will 588 00:36:07,719 --> 00:36:10,960 Speaker 1: still be going after Google, it just may not be 589 00:36:11,560 --> 00:36:15,319 Speaker 1: at the same level. The anticompetitive stuff may 590 00:36:15,360 --> 00:36:19,960 Speaker 1: be one of the arguments the DOJ makes against big tech, 591 00:36:20,120 --> 00:36:26,160 Speaker 1: not because of a renewed interest in trust busting 592 00:36:26,520 --> 00:36:30,000 Speaker 1: in the government, but rather because it's a tactic that might 593 00:36:30,080 --> 00:36:34,680 Speaker 1: work in order to punish companies that have been perceived 594 00:36:34,719 --> 00:36:38,680 Speaker 1: to be anti Trump. We'll see. That might be 595 00:36:38,719 --> 00:36:43,440 Speaker 1: an unfair assessment. I don't want to just be hardheaded 596 00:36:43,480 --> 00:36:45,719 Speaker 1: and say this is definitely the way it is. I 597 00:36:45,760 --> 00:36:50,280 Speaker 1: don't know. But yeah, I don't think 598 00:36:50,520 --> 00:36:53,279 Speaker 1: Google is going to be forced to sell anything off. 599 00:36:53,360 --> 00:36:56,279 Speaker 1: I think that my prediction from twenty twenty three is 600 00:36:56,320 --> 00:36:59,480 Speaker 1: going to hold for twenty twenty five. But we'll see. 601 00:36:59,719 --> 00:37:02,960 Speaker 1: You know, I could be wrong. I also 602 00:37:03,080 --> 00:37:06,280 Speaker 1: said that startups would find it harder to get funding 603 00:37:06,400 --> 00:37:10,440 Speaker 1: in twenty twenty four, which is kind of true. Crunchbase 604 00:37:10,480 --> 00:37:15,960 Speaker 1: found that the big five tech companies, which are Meta, Amazon, Apple, Google, 605 00:37:16,000 --> 00:37:20,480 Speaker 1: and Microsoft, invested at least a million dollars into one 606 00:37:20,560 --> 00:37:22,719 Speaker 1: hundred and forty nine different startups this year. 607 00:37:22,800 --> 00:37:25,680 Speaker 1: That's a million apiece, right? A million 608 00:37:25,719 --> 00:37:29,399 Speaker 1: dollars or more into each of one hundred and forty nine startups.
That's 609 00:37:29,440 --> 00:37:33,359 Speaker 1: actually slightly up from twenty twenty three. That year, 610 00:37:34,000 --> 00:37:36,440 Speaker 1: it was just one hundred and thirty one startups. However, 611 00:37:36,920 --> 00:37:38,759 Speaker 1: even at one hundred and forty nine, that's still the 612 00:37:38,880 --> 00:37:43,200 Speaker 1: second lowest number of startup investments from these 613 00:37:43,239 --> 00:37:46,400 Speaker 1: companies in the last five years. So I would say it wasn't 614 00:37:46,440 --> 00:37:50,719 Speaker 1: harder than twenty twenty three, at least going by 615 00:37:50,840 --> 00:37:53,600 Speaker 1: what the big five were doing, but it wasn't that 616 00:37:53,760 --> 00:37:59,480 Speaker 1: much easier either. Joe Procopio of Inc., as in Incorporated, 617 00:37:59,560 --> 00:38:03,600 Speaker 1: not as in tattoos, wrote a piece titled The 618 00:38:03,640 --> 00:38:07,480 Speaker 1: Future of Starting a Business Looks Drastically Different in twenty 619 00:38:07,520 --> 00:38:10,160 Speaker 1: twenty four. It goes over some of the challenges 620 00:38:10,200 --> 00:38:12,960 Speaker 1: that startups face these days, and it mentions that a 621 00:38:12,960 --> 00:38:17,520 Speaker 1: lot of investors are far more cautious about pouring money 622 00:38:17,600 --> 00:38:20,759 Speaker 1: into companies than they used to be. AI has been 623 00:38:20,800 --> 00:38:25,600 Speaker 1: going fairly strong, though Procopio argues that expectations and reality 624 00:38:26,120 --> 00:38:29,040 Speaker 1: aren't exactly matching up, and that investors may find themselves 625 00:38:29,080 --> 00:38:32,440 Speaker 1: reluctant to risk money in those ventures before too long. 626 00:38:32,920 --> 00:38:36,400 Speaker 1: So I think this is sort of a partially correct 627 00:38:36,440 --> 00:38:40,160 Speaker 1: call for yours truly. It's not a solid win, 628 00:38:40,719 --> 00:38:45,160 Speaker 1: but I feel like I was more right than wrong. Oh, 629 00:38:45,200 --> 00:38:47,640 Speaker 1: I also predicted that we would see a slowdown 630 00:38:47,640 --> 00:38:51,200 Speaker 1: in podcasting, with fewer new shows premiering in twenty twenty four, 631 00:38:51,239 --> 00:38:55,520 Speaker 1: particularly at the network level. I got that wrong. That's 632 00:38:55,560 --> 00:39:00,680 Speaker 1: just flat wrong. Podcasting has seen year over year increases, 633 00:39:00,840 --> 00:39:03,560 Speaker 1: over and over and over again, whether that's in number 634 00:39:03,560 --> 00:39:06,560 Speaker 1: of listeners, number of shows, number of new shows, that 635 00:39:06,640 --> 00:39:11,440 Speaker 1: kind of thing. Video podcasting has also made a huge impact. 636 00:39:11,800 --> 00:39:16,280 Speaker 1: There are more video podcasts out there, and that has changed things, 637 00:39:16,280 --> 00:39:19,560 Speaker 1: not just in how podcasts are produced, but also in how 638 00:39:19,560 --> 00:39:23,480 Speaker 1: they're monetized.
Because if you're hosting your video podcast on 639 00:39:23,520 --> 00:39:26,759 Speaker 1: something like YouTube, well then you're kind of beholden to 640 00:39:26,880 --> 00:39:31,120 Speaker 1: YouTube's monetization approach, unless you're doing sponsored episodes or 641 00:39:31,120 --> 00:39:34,719 Speaker 1: something that's slightly different, right? If you're doing ad 642 00:39:34,800 --> 00:39:38,000 Speaker 1: supported shows, then you don't have nearly the same level 643 00:39:38,040 --> 00:39:41,720 Speaker 1: of control as you would on other platforms. But YouTube 644 00:39:41,719 --> 00:39:44,400 Speaker 1: is kind of the place for online video for the 645 00:39:44,440 --> 00:39:47,920 Speaker 1: most part. So yeah, things did not slow down the 646 00:39:47,920 --> 00:39:51,960 Speaker 1: way I anticipated. I thought we would enter into kind 647 00:39:52,000 --> 00:39:57,160 Speaker 1: of a more conservative year for new shows, simply because 648 00:39:58,000 --> 00:40:02,200 Speaker 1: the shows that already exist are all vying for listeners. And 649 00:40:02,239 --> 00:40:04,520 Speaker 1: it's not a zero sum game, lots of people listen 650 00:40:04,560 --> 00:40:09,040 Speaker 1: to lots of different shows, but it's hard, it's very challenging, 651 00:40:09,120 --> 00:40:11,520 Speaker 1: and you spend a lot of money developing these shows. 652 00:40:11,560 --> 00:40:14,200 Speaker 1: Particularly if it's a prestige show, a 653 00:40:14,239 --> 00:40:18,920 Speaker 1: show that has recognizable names on it, you're probably spending 654 00:40:19,239 --> 00:40:22,120 Speaker 1: a guaranteed amount of money just to make the show, 655 00:40:22,800 --> 00:40:26,120 Speaker 1: and whether that show actually makes its money back or not, 656 00:40:26,239 --> 00:40:29,560 Speaker 1: that's a totally different thing. And since you're competing against 657 00:40:29,640 --> 00:40:33,600 Speaker 1: everybody else, my thought was that we would see companies 658 00:40:33,719 --> 00:40:35,839 Speaker 1: kind of ease off a little bit on that and 659 00:40:35,920 --> 00:40:40,120 Speaker 1: try not to spend quite so much money on development. 660 00:40:40,160 --> 00:40:42,680 Speaker 1: But that's not been the case. It has continued to 661 00:40:42,719 --> 00:40:47,120 Speaker 1: be very competitive, with big companies spending lots of money 662 00:40:47,719 --> 00:40:51,799 Speaker 1: to get different shows on their networks or developed on 663 00:40:51,840 --> 00:40:56,880 Speaker 1: their networks. So I was wrong about that. I will say, however, 664 00:40:57,200 --> 00:41:00,400 Speaker 1: that twenty twenty three and twenty twenty four, and twenty 665 00:41:00,440 --> 00:41:04,080 Speaker 1: twenty two in fact, all saw a huge dip from twenty 666 00:41:04,120 --> 00:41:06,480 Speaker 1: twenty and twenty twenty one. But that shouldn't be a surprise, 667 00:41:06,600 --> 00:41:10,520 Speaker 1: right? In twenty twenty, everybody was stuck at home, so lots 668 00:41:10,520 --> 00:41:13,879 Speaker 1: of people turned to making podcasts because it was one 669 00:41:13,920 --> 00:41:18,440 Speaker 1: way to try and stay sane and stay busy.
For 670 00:41:18,520 --> 00:41:20,920 Speaker 1: a lot of people in the entertainment industry, it was 671 00:41:20,960 --> 00:41:24,960 Speaker 1: a way to stay creative, to stay in front of fans, 672 00:41:25,040 --> 00:41:28,440 Speaker 1: to stay busy when the rest of the industry was 673 00:41:28,440 --> 00:41:31,400 Speaker 1: shut down due to COVID. So, yeah, there was a 674 00:41:31,480 --> 00:41:34,480 Speaker 1: huge dip after that, but that's no surprise, right? That 675 00:41:34,600 --> 00:41:39,200 Speaker 1: was bound to happen, just like all the other boom 676 00:41:39,480 --> 00:41:42,080 Speaker 1: moments we saw out of COVID kind of went bust 677 00:41:42,600 --> 00:41:46,600 Speaker 1: after a couple of years. Thus we have really sad 678 00:41:46,640 --> 00:41:49,640 Speaker 1: stories about massive layoffs in the tech sector and 679 00:41:49,680 --> 00:41:53,520 Speaker 1: in other industries as well. But yeah, that's it. That's 680 00:41:53,520 --> 00:41:58,000 Speaker 1: how I did in my predictions for twenty twenty four. 681 00:41:58,040 --> 00:42:00,000 Speaker 1: I actually think I did better this year than I did 682 00:42:00,200 --> 00:42:03,200 Speaker 1: most years. I think that I was pretty on track 683 00:42:03,280 --> 00:42:07,480 Speaker 1: for a lot of these, with the exception of notable 684 00:42:09,520 --> 00:42:13,279 Speaker 1: outliers like X getting its act together, but again, that 685 00:42:13,400 --> 00:42:18,640 Speaker 1: was more aspirational than based in reality. I hope all 686 00:42:18,680 --> 00:42:22,640 Speaker 1: of you out there have had an amazing twenty twenty four. 687 00:42:23,640 --> 00:42:27,080 Speaker 1: I hope you have an even better twenty twenty five. 688 00:42:27,320 --> 00:42:31,120 Speaker 1: I am not making predictions about twenty twenty five this year because, 689 00:42:31,680 --> 00:42:34,719 Speaker 1: as I'm sure most of you know, I am stepping 690 00:42:34,760 --> 00:42:38,560 Speaker 1: down from Tech Stuff on January tenth. You'll have a 691 00:42:38,560 --> 00:42:41,319 Speaker 1: brand new episode with your brand new hosts, whom you're 692 00:42:41,400 --> 00:42:44,800 Speaker 1: going to meet very soon, and that will be exciting. 693 00:42:44,880 --> 00:42:47,759 Speaker 1: It'll be an all new year of Tech Stuff. I'm 694 00:42:47,880 --> 00:42:52,200 Speaker 1: really eager to listen to that and to hear what happens. 695 00:42:52,520 --> 00:42:55,000 Speaker 1: I think it's going to be an amazing year for 696 00:42:55,080 --> 00:42:57,560 Speaker 1: the show and for all of you listeners out there. 697 00:42:58,360 --> 00:43:01,319 Speaker 1: And I'll miss doing this. I've got a couple more 698 00:43:01,360 --> 00:43:05,080 Speaker 1: in me before I head off into the sunset. But yeah, 699 00:43:05,120 --> 00:43:06,839 Speaker 1: this has been a real treat. I mean, I've been 700 00:43:06,840 --> 00:43:09,600 Speaker 1: doing this for sixteen and a half years. You figure I'm gonna 701 00:43:09,640 --> 00:43:14,480 Speaker 1: miss it. But in the meantime, y'all take care of yourselves, 702 00:43:14,719 --> 00:43:18,400 Speaker 1: hold your loved ones close, have the happiest of New Years, 703 00:43:18,760 --> 00:43:28,000 Speaker 1: and I'll talk to you again really soon. Tech Stuff 704 00:43:28,080 --> 00:43:32,600 Speaker 1: is an iHeartRadio production.
For more podcasts from iHeartRadio, visit 705 00:43:32,640 --> 00:43:36,200 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you listen to 706 00:43:36,239 --> 00:43:40,760 Speaker 1: your favorite shows.