Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? Happy New Year, everybody. Welcome to Tech Stuff Predicts Twenty Twenty Four. Just so y'all know, I'm actually recording this on December twenty-eighth, so I'm building that in just in case anything I predict for twenty twenty four already happened in the last couple of days of twenty twenty three.

Speaker 1: And I thought, you know, that it had been years and years since I did the last one of these, because these episodes are actually really hard to do. Unlike a standard episode, there's not really anything I can research before I start writing and then recording. I can have an idea about something that might happen, and I can do a lot of research to determine whether it's a plausible prediction or whether I'm just shooting in the dark, but that's about it. Plus, I also have to revisit the episode at the end of the year to see how well I made out with my predictions, because I believe in accountability. So these episodes tend to be arduous. They're time consuming, and they're frustrating because I don't have a great track record. And I thought I had not done one in many years because of how hard they are. In my mind, it was like, man, it must have been since maybe twenty seventeen when I did the last one of these. But then I looked back in my archive and I saw I did a prediction episode in twenty twenty one for the year twenty twenty two, which shows you how incredibly unreliable my memory is, which also makes it hard to do a predictions episode. It's why I say maybe I predicted something that already happened. Heck, maybe I'll predict something that happened, like, two years ago, because that's how bad my memory is.
Speaker 1: But anyway, gather round while the old coot tells you about the future, and you wonder if maybe you could just quietly ship him off to a home or something. Let's get started. Now, why don't we just jump on board with a prediction about X, AKA the platform formerly known as Twitter. Now, I'm tempted to say that twenty twenty four will be the year when X folds. The company has obviously been on the decline since Elon Musk bought it, and it actually wasn't in the best of shape before Musk bought it either. But with advertisers becoming increasingly gun-shy around X due to Musk's tendency to not only allow hate speech on the platform but also occasionally promote it himself, it's hard to imagine Elon and company finding a way to course correct.

Speaker 1: On top of that, there are tons of issues mounting for the company. One big one is that X as a whole hasn't been so good about paying a lot of the bills. Last year, we heard about vendors and landlords bringing lawsuits against X because the company failed to pay what it owed. Current and former employees have also brought lawsuits against the company, arguing that it hasn't paid out contractually obligated compensation and severance. So less money is coming into the company due to boycotts from advertisers, and a lot of parties are coming up saying, you need to pay your debts to us. It is not a very good situation, economically speaking. But I am not going to predict that X will die in twenty twenty four, for a couple of reasons. One, while I personally dislike the platform and Elon Musk, I don't want to see any company go under. I don't want all the folks working at X to find themselves without a job. And two, it's too easy of a prediction, right? Plus, Elon Musk has what appears to be kind of an obsession with this concept of turning X into the everything app, and I don't think he's going to be willing to give that up, even if it's costing money.
Speaker 1: So instead, I'm going to predict that X actually gets its act together this year; that Elon Musk perhaps realizes that maybe he should back away from being such an active and visible part of X, and maybe let the actual CEO, Linda Yaccarino, work on repairing the relationship the company has with brands and advertisers. Maybe they will actually determine that X needs to commit to paying the amount it owes to all those parties, and slowly it starts to set itself on the right path. I don't think X is going to be in a remarkably better position by the end of twenty twenty four, but I'm predicting that it'll be pointed the right way, for once. Do I believe in my own prediction? If I'm being honest, I don't. I really think X is just going to be a bigger mess. But I'm going to predict the opposite, because if it pans out, won't it be amazing that I called a shot like this? That'd be phenomenal. And if it doesn't work out, then it essentially just meets everybody's expectations. So here's to looking forward to a new and kinder X by the end of this year. Maybe.

Speaker 1: I suppose the next one should have actually been my first prediction, considering all the fuss around AI in twenty twenty three. I think in twenty twenty four we're going to see a lot more AI integration. There was a lot of talk about it last year, but we didn't really see a lot of actual integrations and implementations of AI. We saw a lot of hype around it. And I think a lot of those integrations we're going to see this year are actually going to turn out to be more trouble than they're worth, because I think a lot of business leaders are going to look at AI as some sort of cure-all for business challenges. Now, that's not to say that every AI implementation will be bad.
Speaker 1: I think we'll hear about some really creative and effective ways to incorporate AI into business, whether it's for customer-facing experiences or behind the scenes, where it's largely gone unnoticed by anyone outside the organization. But with all the hype around artificial intelligence, paired with a gap in understanding as to what it can and cannot do, I suspect we're going to see something similar to when companies first heard about the concept of the metaverse. They're going to rush in because they're going to be afraid of missing out. FOMO will be serious. They're not really going to have a grip on what AI is good for, and we're ultimately going to see a situation similar to what we saw with NFTs a couple of years back. There's going to be a bubble, and that bubble will then eventually collapse, perhaps before the end of twenty twenty four.

Speaker 1: I actually think this is a shame, because I do think there are ways that AI can make a positive change for work, changes that don't necessarily lead to layoffs, or to saddling some poor human employee with the job of double-checking to make sure the AI didn't make some sort of dumb mistake or invent something that doesn't exist that would eventually tank the company. And if there is a bubble, and if it does burst, I think that's going to set back the AI field in general. Not just generative AI, although that's where a lot of the focus has been in the last year, but AI across the board, because we've seen similar things happen with other technologies. We've seen it with blockchain, we've seen it with virtual reality a couple of times, actually, and we'll talk more about mixed reality a little bit later in this predictions episode. And I think there's a good chance we're going to see it with AI.
Speaker 1: And again, it's not that AI is bad or that there's no place for it. It's that there's a large gap between the hype and expectation around it and what an actual good implementation of it looks like. I think we're going to see that particularly in generative AI. In fact, we started seeing it in twenty twenty three, but I think twenty twenty four is the year where we'll really see generative AI put through its paces, and in at least several areas it's going to come up short, or it's going to cause headaches that could have been avoided if the AI had not been implemented in the first place.

Speaker 1: On a related note, I think we're going to see a lot more legislators and regulators all around the world try to tackle AI. However, I don't think we're going to get very many meaningful laws or regulations out of it. Perhaps this is because I have become cynical when it comes to lawmakers wrapping their heads around a technology and then trying to draft legislation. Or it's possible that I feel this way because there are influential people in the AI space, for example OpenAI's on-again, off-again CEO Sam Altman, who are taking an aggressive role in trying to shape these regulations. When the parties that are going to be governed by regulations are taking an active role in forming those regulations, you should expect that they have an agenda and that they're trying to have the regulations meet that agenda. So my worry is that we're going to see regulations that serve the interests of particular parties like OpenAI, while simultaneously holding back competition, like startups. Maybe it's also because I see a lot of politicians use technology more as a way to promote that they have a certain ideology. You know, think of the various politicians who point at efforts to think of the children when it comes to taking an approach to objectionable material online.
Speaker 1: Right, if you look closely at the legislation, it almost never addresses any of the root problems. Instead, it's like patching symptoms. And sometimes they don't even patch the symptoms, so you end up with rules that don't address the actual problem and don't do anything useful, and it just makes things muddier. In some cases, lawmakers go on and on about a problem, but they don't even draft anything. They just use the problem as sort of a political stepping stone to try and signify to potential voters, hey, this is what I stand for. The perceived problem is just a target, right? It's not anything that they're actually going to do anything about. The last few years have perhaps taken a lot of my optimism with them, so maybe I am being incredibly pessimistic and cynical, but this is how it seems to play out to me. Anyway, we may get a few definitive rules in twenty twenty four regarding AI, similar to how a judge in the US deemed that works authored by AI are not eligible for copyright protection. But I think those are going to be very specific to particular use cases. I don't think we're necessarily going to see comprehensive, effective legislation or regulations around AI. Not in twenty twenty four.

Speaker 1: On a related note, because twenty twenty four is an election year here in the United States, and because we've seen in recent years a rise in attempts to influence voters, I suspect we're going to see an awful lot of AI-assisted attempts to mislead, misinform, and deceive US citizens. I think deepfakes will play a big part, whether it's video or audio or whatever it may be, in a way to try and sway voters from one point of view to another or to reinforce certain ideas. I think AI-drafted misinformation campaigns will be a big part of this too. And in turn, this is going to prompt the US government to pressure the social platforms to identify and remove misinformation quickly and accurately.
203 00:11:57,120 --> 00:12:00,600 Speaker 1: But again, this doesn't actually address the underlying problem, right, 204 00:12:00,840 --> 00:12:04,200 Speaker 1: because even if you are putting pressure on the social platforms, 205 00:12:04,760 --> 00:12:07,960 Speaker 1: the parties that are actually generating the misinformation are still 206 00:12:08,000 --> 00:12:10,640 Speaker 1: active and they're still going to be flooding any available 207 00:12:10,720 --> 00:12:15,680 Speaker 1: channel with that misinformation. So again, it's not addressing the 208 00:12:15,800 --> 00:12:18,120 Speaker 1: underlying issue, although I don't even know how you would 209 00:12:18,160 --> 00:12:21,160 Speaker 1: go about doing that, but the point being that it's 210 00:12:21,559 --> 00:12:24,560 Speaker 1: a little reactive, it's not proactive, and it doesn't solve 211 00:12:24,640 --> 00:12:27,360 Speaker 1: the problem. It just shifts the problem to a different party, 212 00:12:27,400 --> 00:12:31,280 Speaker 1: that being the platforms that serve as the launching ground 213 00:12:31,320 --> 00:12:36,000 Speaker 1: for these misinformation campaigns. And I don't think it's going 214 00:12:36,080 --> 00:12:37,760 Speaker 1: to mean that we're going to see less of it. 215 00:12:38,400 --> 00:12:40,440 Speaker 1: I think one consequence of this is we'll see even 216 00:12:40,440 --> 00:12:43,880 Speaker 1: more opposition to TikTok in particular, not that I think 217 00:12:43,880 --> 00:12:47,439 Speaker 1: TikTok necessarily deserves that treatment, but we're already seeing a 218 00:12:47,480 --> 00:12:51,640 Speaker 1: growing belief that the platform purposefully directs users to specific 219 00:12:51,760 --> 00:12:55,440 Speaker 1: videos to push an agenda. Various investigations by journalists have 220 00:12:55,520 --> 00:12:58,160 Speaker 1: shown that there's very little evidence of this, that TikTok 221 00:12:58,280 --> 00:13:01,480 Speaker 1: is more focused on serving up content that will keep 222 00:13:01,520 --> 00:13:04,880 Speaker 1: a user on TikTok longer, and that it doesn't really 223 00:13:04,960 --> 00:13:09,280 Speaker 1: care what the messaging of that content is as long 224 00:13:09,320 --> 00:13:11,720 Speaker 1: as it's not breaking the law, in which case TikTok 225 00:13:11,840 --> 00:13:14,280 Speaker 1: wants to remove it so that they don't get in trouble. 226 00:13:14,520 --> 00:13:18,120 Speaker 1: If someone stays on TikTok longer because of misinformation, and 227 00:13:18,160 --> 00:13:22,160 Speaker 1: that misinformation has a particular like political ideology associated with it, 228 00:13:22,800 --> 00:13:25,880 Speaker 1: TikTok will just keep serving up that content to that 229 00:13:25,920 --> 00:13:29,400 Speaker 1: particular user, not because it's trying to radicalize that person, 230 00:13:30,080 --> 00:13:33,280 Speaker 1: but because TikTok benefits from folks staying on TikTok longer. 231 00:13:33,800 --> 00:13:36,040 Speaker 1: But this is a system that bad actors can gain. 232 00:13:36,160 --> 00:13:39,359 Speaker 1: Right Meanwhile, lawmakers look at TikTok as being the problem 233 00:13:39,640 --> 00:13:42,079 Speaker 1: rather than those who are creating the actual content. So 234 00:13:42,320 --> 00:13:44,120 Speaker 1: I think we're going to see a lot more political 235 00:13:44,160 --> 00:13:47,559 Speaker 1: posturing about how TikTok is destroying the United States and 236 00:13:47,600 --> 00:13:50,080 Speaker 1: should be banned. 
Speaker 1: Meanwhile, lawmakers look at TikTok as being the problem rather than those who are creating the actual content. So I think we're going to see a lot more political posturing about how TikTok is destroying the United States and should be banned. And to be clear, I am not a fan of TikTok, but I really feel like the concern is directed at the wrong place here. I will say that I expect there to be increased rhetoric around banning TikTok here in the United States, but we're not actually going to see that happen, at least not on a national level. I think it'll be used more as a political tool for folks who are running in the election rather than as an actual policy. So I predict that by the end of twenty twenty four, TikTok will still be a thing and folks in the US will still be able to access it legally. Okay, I've got several more predictions to go through. Before we get to those, let's take a quick break to thank our sponsors.

Speaker 1: Okay, next up, I'm going to stick my neck out about an Apple product. Now, historically I have a terrible record for this, because I famously dismissed the iPad back in the day. I thought no one wanted a tablet form factor computer system; tablet computers had been around for years before the iPad came out. Also, I thought the iPad was a really dumb name. It's funny because now it's been around for a while, but when it was first announced, lots of folks said, iPad, are you serious? But then, boom, the product comes out and, like the iPhone, it establishes itself in the hearts of consumers and Apple fanatics everywhere. Heck, it meant other companies could actually produce their own consumer tablets and sell those to the public as well. Previously, you would only really come across tablet computers if you happened to work in certain environments, like, say, a hospital, but now kids are using them as playthings.
Speaker 1: However, I really feel that the Apple Vision Pro headset, which is an upcoming mixed reality device that leans heavily on the augmented reality part of the mixed reality spectrum, is not going to be a big success. And that's partly because no one has managed to make a viable consumer mixed reality product at the high end of the tech spectrum. Obviously, there are VR headsets that are popular among a subset of gamers, but even that is a sliver of a slice of a demographic, right? It's not the general public. The general public has not adopted VR, but some gamers have. Now, I'd say that there's no real precedent here, and that's why I think Apple's going to fail. However, I also have to face the fact that that's the same thing I said about tablet computers years ago, before the iPad had come out. But that's just one reason I think it's going to fail. Another one is the price tag. The Apple Vision Pro is going to retail for three thousand, four hundred and ninety-nine dollars. Thirty-five hundred dollars is incredibly expensive, and it puts the Vision Pro outside the price range for most consumers.

Speaker 1: Now, I have not experienced the Vision Pro personally. I have watched some videos and I've read some reviews that said the experience is really impressive, that Apple delivers a good experience. But a lot of those reviews also raise concerns that there are a limited number of things you can do with the Apple Vision Pro, and that some of those things probably aren't practical, at least not for, say, productivity purposes for a full day of work, because the battery won't last that long. And this also leads us to a chicken and the egg problem that a lot of new technology faces. Maybe instead I should say the hardware and the software problem. Launching any new computer hardware or platform is a tricky thing, whether that's a video game console, a brand new computer system, or an augmented reality headset.
Speaker 1: And that's because the hardware needs good software in order to be useful and to be something that's valuable to consumers. Otherwise, you end up with a very expensive piece of technology, like a pair of electronic goggles that is only able to perform a limited number of tasks, and so consumers don't see the necessity of purchasing such a thing, right? Apart from the desire to have the newest, coolest technology, if the tech only does a limited number of things, then you might say, well, how is this worth the money? Meanwhile, software developers need to feel confident that they're going to see a return on investment for their work developing applications for the hardware platform. If you're a developer and you have the chance to build apps for a platform that's brand new, but you also see that the platform is not likely to receive widespread adoption, you might pull a Fagin and review the situation, because it won't do you any good. If you spend hundreds of developer hours and significant amounts of money to develop something that only a few people will ever purchase or experience, you're not going to get a return on your investment. That time and money could go to something else that would actually reach more customers.

Speaker 1: So software developers really want a large user base. They want to have as many users as possible on the hardware so that they're likely to see success with their app. Meanwhile, the user base, or potential user base, wants a product to have lots of apps on it. They want a real good library of applications, and that needs to be there for the consumers to feel confident enough to purchase the product. So you have potential users waiting for more features and functionality, and you have developers waiting for enough users to justify the work that is needed to build out those features and functionality. And it's a classic chicken and the software problem, or something. I'm bad with metaphors.
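Speaker 1: If you want to see why that loop is so hard to kickstart, here's a little toy model. Every rate and threshold in it is invented; it's just a sketch of the idea that each side only commits once the other side looks big enough.

```python
# Toy two-sided adoption model. Every rate and threshold is invented;
# this is a sketch of the dynamic, not real market data.
def simulate(users, apps, months=24):
    for _ in range(months):
        # Users churn, and new ones only show up once the app library
        # clears a threshold; developers only build for a big user base.
        users = users * 0.95 + (50 * apps if apps >= 500 else 0)
        apps = apps * 0.98 + (users / 10_000 if users >= 100_000 else 0)
    return int(users), int(apps)

print(simulate(users=50_000, apps=200))     # below both thresholds: decay
print(simulate(users=400_000, apps=2_000))  # above both: growth compounds
```

Speaker 1: Seed it below the thresholds and both sides wither; seed it above and growth compounds. That seed is exactly what a brand new headset platform has to buy its way to.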
Speaker 1: So I think that the Apple Vision Pro will impress reviewers, at least as far as the functionality goes, when it launches. But I don't think it's going to become an Apple flagship product like the iPhone. I'm not sure what, if anything, will ever make mixed reality headsets a successful consumer item. I worry that the form factor as it stands now, the bulky headset, is a really big hurdle. Just like people didn't flock to 3D televisions, despite multiple manufacturers really pushing 3D TV like a decade ago, because nobody really wanted to deal with the hassle of having to wear special glasses just to watch TV at home. So I predict that the Apple Vision Pro will be a commercial failure. I look forward to being gobsmacked at the end of twenty twenty four when it becomes one of the top selling items on the market or something, but I just don't think the market is there. I think if Apple had been able to actually create a pair of AR glasses that look like eyeglasses, it would be a totally different story. But it just didn't happen. That form factor is so small that cramming in all the components you would need to have a decent pair of AR glasses, ones that, one, could do all the things you wanted to do, and two, had a battery that would support it for longer than, say, five minutes, just isn't realistic. And I just don't think consumers are eager to strap a big headset to their face for multiple hours a day, even though it honestly could be super cool as far as the experience goes. I just don't think it's going to be there.

Speaker 1: Another thing I think is going to be big in twenty twenty four will be news about quantum computing. That being said, a lot of stuff actually has to happen for quantum computers to transform from being impressive technological and scientific achievements to becoming a practical technology. Quantum computing has the potential to totally change the world.
Speaker 1: It could disrupt fundamental technologies like encryption methodologies, but that's only true if the quantum computer is paired with the proper quantum algorithms and programs. Again, it's a hardware and software issue. It's not just that you have a device that on a technical level could do these things; you have to actually build the programs or algorithms to achieve those things on the hardware, and we haven't heard a whole lot about that second category of developments in quantum computing. We've heard a lot about the incredible advancements companies have made in building larger and larger quantum computers, but not so much on the algorithm development side. So my guess is we're going to see more stories not just about impressive quantum computing hardware, but about the development of the software side of the equation. So again, it's one thing to make a quantum computer work. It's another thing to make it work for you, and I think that's really where we're going to see more of the story shift this year, because otherwise you just have a really powerful machine with nothing to run on it, which is not very useful. And it's not like quantum computers can replace classical computers. They work on entirely different principles, at least from a practical perspective, and quantum computers would be terrible at the things that classical computers are good at, just like classical computers would take ages to complete tasks that quantum computers, when paired with a correct algorithm, could complete in a fraction of the time. So I think we're going to see the narrative shift a bit in twenty twenty four as we approach a future where quantum computers are actively doing incredible stuff.
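Speaker 1: To give a flavor of what a quantum algorithm even looks like, below is a tiny classical simulation of Deutsch's algorithm, one of the simplest textbook quantum algorithms. It decides whether a one-bit function is constant or balanced with a single oracle query, where a classical approach needs two. This is just an illustrative sketch using a hand-rolled statevector, not code for any real quantum machine.

```python
import numpy as np

# Tiny statevector demo of Deutsch's algorithm: decide whether a
# one-bit function f is constant or balanced with ONE oracle query.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)

def oracle(f):
    # U_f |x>|y> = |x>|y XOR f(x)>, written as a 4x4 permutation matrix.
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    return U

def deutsch(f):
    state = np.kron([1.0, 0.0], [0.0, 1.0])   # start in |0>|1>
    state = np.kron(H, H) @ state             # superpose both inputs
    state = oracle(f) @ state                 # one call to f, in superposition
    state = np.kron(H, I) @ state             # interfere the answers
    p_one = state[2] ** 2 + state[3] ** 2     # P(first qubit reads 1)
    return "balanced" if p_one > 0.5 else "constant"

print(deutsch(lambda x: 0))   # constant function -> "constant"
print(deutsch(lambda x: x))   # balanced function -> "balanced"
```

Speaker 1: The hardware headline is the two qubits. The part that actually answers a question is the interference pattern the algorithm sets up, and that's the side of the field I expect to hear more about.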
Speaker 1: Now, some predictions are a bit too easy. For example, I could say we're going to see a lot more development around electric vehicles, but that's a no-brainer. With various regions passing laws that will phase out internal combustion engines over the next decade or so, it's pretty much a guarantee that electric vehicles are going to be big news from this point moving forward. Likewise, I could make a prediction about, say, the next Nintendo video game console. I could say, oh, we're going to get a new Nintendo in twenty twenty four. But there have been leaks of internal Nintendo documentation that already suggest that's going to happen, right? That we're going to get a follow-up to the Switch. No one outside of Nintendo knows what it's going to be called. We're pretty sure the form factor isn't going to be that different from the Switch; it'll still be some sort of handheld console hybrid. Beyond that, you know, we don't have details, but we do know it's coming, and there's not much point predicting something that is already pretty likely to happen due to the bits we know.

Speaker 1: I do think I can make a couple of other interesting predictions, though. I think we're going to see a lot of change in the streaming media space. For the last couple of years, streaming services have encountered challenges when it comes to running a profitable business. It turns out you can't depend entirely on never-ending growth of subscribers, because sooner or later you're going to plateau; adoption will hit a saturation point. There just won't be more people willing or able to subscribe to your service. And if that's how you are looking at growth and revenue, by adding subscribers, you're kind of up the creek.
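Speaker 1: Just to put a rough shape on that plateau, here's a toy subscriber curve. Every number in it is made up; the point is only that when growth depends on the pool of people who haven't subscribed yet, quarterly adds eventually have to shrink.

```python
# Toy logistic subscriber growth. The market size, starting subscribers,
# and growth rate are all invented; only the shape of the curve matters.
MARKET = 10_000_000       # everyone who might plausibly ever subscribe
subs = 500_000
for quarter in range(1, 9):
    new = int(0.8 * subs * (1 - subs / MARKET))  # fewer prospects remain
    subs += new
    print(f"Q{quarter}: +{new:,} new subscribers ({subs:,} total)")
```

Speaker 1: Run that and the quarterly adds climb for a while, then fall off a cliff as the pool empties, which is roughly the story every maturing streaming service ends up telling investors.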
Speaker 1: The cable industry faced the same thing, because you had cable networks that were trying to solve this problem, right? How do we get more subscribers when we're already on pretty much all the cable packages? Well, the solution that the cable industry adopted, for the most part, was to find new regions to launch their networks in and just start the process all over again. When I worked for Discovery Communications, or rather when Discovery Communications owned the company I was working for before they sold us off, I observed this personally. I saw the plans being, well, we can't really expand in the United States anymore, because pretty much every cable package has the Discovery network as part of it, so what we're going to do is focus on other regions like Asia or South America, and we'll just do it there. Which, if you think about it, means you eventually run out of places to expand into, and you're back at the problem you were already facing: how do you grow when you're already everywhere?

Speaker 1: Well, streaming services have kind of run into a similar issue. The snazzy thing to do in twenty twenty four, it looks like, is to bundle streaming services with other streaming services, to kind of spread the risk and cost, and also the profits, around a little bit. So again, the hope is really to reach new customers, new subscribers, and then share the expense of generating content for folks to watch. This is the really complicated thing with streaming services, right? How do you generate enough revenue to support the generation of content that you need to have in order to convince people to subscribe and stay with you? If you stop making content, eventually your subscribers are going to say, I've already watched everything that I'm interested in on here, there's no reason for me to stay, and they cancel their subscription. But if you keep generating content, especially high-end, high-quality content, that's expensive. And if you're no longer adding lots of subscribers each quarter, it looks like your service isn't profitable. It's a tough position to be in. It's a very tricky business, and maybe that's why Amazon Prime Video is introducing advertising to its service. The costs keep coming, but the subscriber numbers aren't increasing at a rate that's fast enough to satisfy leaders and stakeholders.
Speaker 1: So if you can't compensate for the costs by adding more subscribers, you've got to find some other way to generate revenue. With Amazon, they're going to add advertising into the service, which could be really frustrating for those people who are actually spending the money to be subscribed to Amazon Prime and who had previously enjoyed a relatively ad-free experience. Disney Plus and Hulu are merging. In fact, if you're subscribed to both already, you've seen this for yourself, but the rest of us have to wait till later this year to get the truly bundled service. As I record this, Warner Brothers Discovery is reportedly considering an acquisition of Paramount, though various analysts say it's doubtful this will actually happen. But it does show how media companies are looking for solutions to this tricky problem. And it's a problem that has extra weight to it in the wake of the union strikes in the entertainment industry. Artists and craftspeople want a slice of the profits, and that makes sense. They're the ones actually creating the content, so they deserve compensation. They should not be cut out of things like residuals. However, for the media companies, it means profit margins get even trickier.

Speaker 1: So I suspect we'll see a few streaming services fail outright in twenty twenty four, as companies decide they're just not profitable enough and they cease to be. Like, I'm actually surprised that Max hasn't been completely scrubbed at this point. Consumer frustrations are going to continue to grow, because no one wants to subscribe to a dozen different services in order to get the entertainment they want. That's another reason why I think the consolidation is going to happen. Partly it's going to happen because otherwise consumers are going to drop their subscriptions out of frustration and out of a need to prioritize, especially in an economic climate where things are still seen as being uncertain.
Speaker 1: Okay, with that in mind, we're going to take another quick break. When we come back, I've got a few more predictions to close out for twenty twenty four.

Speaker 1: So, we're back. A lot of predictions that I would try to make depend on factors that are outside of the tech world. For example, in recent years the US government has taken a more hardline stance on antitrust issues, right, about anti-competitive practices, companies becoming monopolies, that sort of thing. But that can all change depending on how the twenty twenty four elections go. I'm not sure that we're going to see the same momentum heading into twenty twenty five, depending on how those elections turn out. And I don't know if we'll see dramatic events unfold as a result of, say, the antitrust lawsuits brought against companies like Google, or if those efforts will actually fizzle out. Maybe there'll be a change in administrations and the support for going after companies that are seen as being anti-competitive will just drop out. That's a possibility. I'm tempted to say that even without a change in administrations, we're not likely to see much happen with Google. It's actually been a couple of decades since the US government really pushed back against a giant corporation on antitrust issues, and I suspect there's not enough political momentum to make meaningful change actually happen. So my prediction for the Google antitrust lawsuit is that Google will come out of it just fine. It may have to pay some fines, but ultimately it will remain intact and will continue to consolidate its power. And maybe it'll be freaking out because AI-enabled search is totally disrupting the advertising market. In fact, I'm sure that's happening already. But likewise, besides political factors, you have economic factors that play a huge part in the tech world, especially in the space of startups.
Speaker 1: You know, we've already seen a major Silicon Valley bank, in fact the Silicon Valley Bank, collapse because of factors like high interest rates. With the unpredictability of the global economy, it's hard to say if twenty twenty four will be a rough year for funding startups. You know, angel investors showing their angelic natures can only go so far, because if they can't take out a loan to pour into startup investments, they might not be willing to part with their own cash. So it could be another tough year for folks who dream of starting a company and then getting acquired before, you know, they have to figure out a way to make money themselves. I might also just be really cynical about startup culture.

Speaker 1: In the podcast world, I think we're actually going to see the number of shows decrease overall. We're still going to get new podcasts launching throughout the year, I have no doubt about that, but with companies like Spotify cutting back on how much they spend on podcast production, the money will dwindle and there will be less available to fund the creation of shows. Also, I'm guessing there's going to be a decline in the willingness to support shows that aren't pulling in big numbers. Right? Like, there might be a show that's a prestige piece on a network, but it's doing really poorly in number of listeners. Maybe it's highly regarded among critics, but no one's really subscribing to it. I think we're going to see less support for those kinds of shows, because, I mean, it never looks good to cancel a prestige show, but it also is not great to fund the production of something that's not making its money back. So I think we're going to see a reversal of the podcast trend this year. Over the last couple of years, we've seen so many podcasts launch that we've even joked about everyone and their dog getting a podcast. I think in some cases that's true.
Speaker 1: I think there probably are podcasts that are hosted by a person and their dog. I just don't think we're going to see as much of that this year. Now, for independent podcasters, the people who are doing everything themselves, maybe they're doing it just as a hobby, maybe they're not looking for a way to make a living through their podcasts, I don't think this is applicable. I think we're still going to get those folks who have a story to tell, who are willing to put in the work to tell the story, and who aren't expecting anything back out of it. I don't see that changing necessarily in twenty twenty four. But when we're looking at the big networks, I suspect we're going to see some scaling back. Now, in the interest of full disclosure, I have to say I am an executive producer for iHeart Podcasts, but I don't actually have any insight into what will happen here. It's not like I've been part of deep strategic discussions about whether or not we personally as a company are going to launch more or fewer shows. I do not know. I am not privy to that information. In fact, it may very well be that by the end of twenty twenty four I'll have a ton of new shows added onto my slate, and it may turn out that I'm one hundred percent wrong on this and that we'll actually see another boom period for podcasts on big networks. But I don't feel like that's where things are headed for twenty twenty four, and this is just a gut feeling I'm going with.

Speaker 1: And that is a selection of my predictions for the tech space in twenty twenty four. I didn't go into a lot of specific products or anything like that because, honestly, if you're looking at products, chances are you're looking at things like leaked information about stuff that's already been in development for a couple of years. It would be a real miracle if I just predicted something out of the blue that no one is knowingly working on.
Speaker 1: But I don't have that kind of creativity in me right now. However, at the end of this year, I will revisit this episode and I will see how I did with my predictions. One thing is for certain: twenty twenty four will have a ton of events that no one saw coming, right? Stuff that just isn't predictable. Maybe a massive company will go out of business. Maybe some tech leader will end up behind bars. Maybe NFTs will have a massive comeback. Maybe the metaverse will become more than just a vague collection of ideas fueled by wishful thinking. By the way, for that last one, I do think you're still going to have evangelists talking about the metaverse, but there will be an increasing number of people who will have the Mean Girls reaction of, stop trying to make the metaverse happen. It isn't going to happen. But we'll see. I could be wrong about that too. Maybe it turns out that the metaverse was one of those ideas that was announced a little early, and it just took some more time to solidify and become cohesive, and it will be the next big thing. That is a possibility. I am personally skeptical of it, but as I've said many times before in this episode, I've been wrong before. So we'll see. I'll definitely come back and talk to y'all about these predictions at the end of the year. Until then, I hope you all are having a wonderful new year, and I'll talk to you again really soon.

Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.