Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? Well, it's time for us to look back on some of the big tech stories that shaped the last twelve months. I always like to take some time at the end of a year to reflect back on what has happened, and this year there was an awful lot of stuff going on. So we're just going to look at some of the big stories and headlines in tech in twenty twenty four. This is by no means an exhaustive list, and of course some of these stories continue to play out. Some of them are arguably not as big as ones that I left off this list, but these were ones I thought were kind of interesting, because I feel like they're a commentary not just on where we are in tech, but on where we're going. But that's enough jibber jabber. We've got a lot to get through, so let's get started. And where else to begin but with AI? No one was surprised that artificial intelligence dominated the news cycle in twenty twenty four. Everyone said it was going to happen, because it was clear that's how it was going to turn out, and it did. The gears were already turning in twenty twenty three; in fact, it was in late twenty twenty two that AI really started to capture the public's attention. Generative AI in particular has received a ton of focus, though this is a good time to remind all of y'all that generative AI is just one subset of artificial intelligence disciplines. Sometimes it can be easy to forget that, seeing as how much of our attention goes to generative AI, but there are lots of other types of AI. One company that's always at the forefront of these conversations is OpenAI. That's the company that's perhaps best known for ChatGPT.
Speaker 1: OpenAI began as a nonprofit organization a few years ago, and it was dedicated to the responsible and safe development and deployment of artificial intelligence with kind of an open approach, thus the name: the idea was that they would share their findings and research and knowledge with others in order to ensure that the AI the company developed would ultimately be put to beneficial use with the lowest risk of harm. But that nonprofit would spawn a for-profit arm because, as it turns out, AI is crazy expensive, yo. I mean, R&D costs a lot, but just running the stuff you've already developed also costs a huge amount. And I've talked in the past about how AI applications typically require enormous amounts of compute power, and that stuff isn't free. In fact, in September of this year, New York Times reporters Mike Isaac and Erin Griffith wrote a piece about how much money OpenAI was making versus how much it was spending, and the numbers are jaw-dropping. The article is titled "OpenAI Is Growing Fast and Burning Through Piles of Money," and it published on September twenty-seventh, twenty twenty four. In that piece, the authors revealed that OpenAI's revenue reached three hundred million dollars in August of this past year. Now that's a lot of cash, and the company overall expected three point seven billion dollars in sales by the end of twenty twenty four. That's even more cash, if I'm doing my math right. Let's see, three point seven billion versus three hundred million? Yeah, no, that's a lot. However, this same company was on track to lose five billion dollars this year, even with all that revenue taken into consideration. That's how expensive it is to run a cutting-edge AI company. Some of that money, of course, went to the normal overhead that any regular business would encounter, you know, stuff like salaries and office space. But the bulk of it presumably went to running operations and R&D work.
Speaker 1: Now, I say presumably because, as Isaac and Griffith point out in that article, there were quote "several large expenses not fully explained in the documents" end quote. The cash churn prompted several pieces speculating that, without a truly massive influx of cash from investors, OpenAI could spend itself out of business. But then, like literally the next week, headlines announced that OpenAI had brokered a six point six billion dollar round of new investment. Covering the spread, if you will. This would put the valuation of OpenAI at around one hundred fifty-seven billion dollars, and the company predicted it would hit one hundred billion dollars in revenue by twenty twenty nine. Which might be the case, and investors might be eager enough to see that happen that they'll continue pouring truckloads of cash into the company at a rate that outpaces OpenAI's spending habits in order to actually get to that destination. I always get nervous about stories like these, because it's hard to believe this is all sustainable. But I guess if folks are determined enough, and if they have access to enough resources, anything is possible. Now, earlier this month I talked about another story involving OpenAI, one that has raised many concerns. OpenAI has signed a deal with the defense tech company Anduril, which is a total Tolkien reference. That company was founded by Palmer Luckey, the creator of the Oculus VR headset. Now, again, the original pitch for OpenAI was that it was meant to be an organization dedicated to the safe and responsible development of AI, so this announcement caused a lot of concern among people in the AI space, including people who were formerly part of OpenAI itself. Specifically, these two companies are working to improve US military technology in order to defend against unpiloted aerial attacks. So you can think of it as: they're working to create systems that can identify, target, and bring down unmanned drones.
Speaker 1: Now, considering that the end of this year has also brought with it reports of mysterious drones flying over places like New Jersey and Pennsylvania and other areas, you could argue there's a growing need for this kind of technology. But critics worry that the same tech could be used to identify, target, and bring down normal aircraft, you know, the kind that's actually piloted by a human being in the cockpit. And that definitely does not sound like a safe or responsible application for artificial intelligence, especially when you consider that AI is far from perfect. It can make mistakes. Now, there's a lot more we could say about OpenAI, but let's move on to some other companies that made headlines involving artificial intelligence. One of those was Nvidia, which is perhaps best known as a company that designs graphics processing units, or GPUs. Those are the things that gamers and some crypto miners really want to get hold of. Well, it turns out that parallel processing, which is what GPUs really specialize in, also happens to be really great for at least some AI processes, and so Nvidia has embraced the AI industry. Really, Nvidia now identifies itself as an AI chip maker and has turned out lots of chip designs meant to handle large AI compute loads. This has more than doubled Nvidia's value from the end of twenty twenty three to the end of twenty twenty four, which is huge, because at the end of twenty twenty three Nvidia was already valued at one point two trillion dollars. That was a two hundred forty percent increase over what it had been in twenty twenty two. However, today, at the end of twenty twenty four, Nvidia's value is at more than three trillion dollars. In fact, that puts Nvidia in the top three most valuable publicly traded companies, and depending on when you look at that list, it typically falls right behind Apple and right in front of Microsoft.
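If you're wondering why graphics chips ended up powering AI, here's a minimal sketch of the kind of workload involved. It's my own illustration in Python with NumPy, nothing from the episode: a neural network layer boils down to one big matrix multiply, where every output element is an independent dot product, which is exactly the sort of job a GPU's thousands of small cores can run in parallel.

```python
import numpy as np

# A toy neural-network layer: 512 inputs, each with 1,024 features,
# pushed through a layer that produces 4,096 outputs.
batch = np.random.rand(512, 1024)
weights = np.random.rand(1024, 4096)

# 512 x 4096 = roughly two million independent dot products in one call.
# A CPU works through these a few at a time; a GPU runs huge numbers at once.
activations = np.maximum(batch @ weights, 0.0)  # matrix multiply + ReLU
print(activations.shape)  # (512, 4096)
```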
Speaker 1: That kind of depends on which day you look at that list, because these numbers do shift around a lot. But yeah, Nvidia had a pretty huge year in twenty twenty four. Now, that being said, as I record this episode, which is on December nineteenth, just to be transparent, Nvidia is currently recovering from an overall market dip that saw shares of Nvidia drop by around fourteen percent. And when you're valued in the trillions, any single percent represents a whole lot of money, so that represents a considerable drop. But the estimates I've seen recently seem to say that it's a very temporary situation. One more AI-related story I'd like to touch on, keeping in mind that of course there were countless other noteworthy AI stories that unfolded this past year, is that twenty twenty four is the year we started seeing companies incorporate processors meant to handle AI applications at the device level. So OpenAI's business model largely focuses on companies and other customers using an Internet connection to tap into, like, massive server farms that have the awesome powers of AI processing. But another approach is to dedicate processors at the device level so that you can do AI implementations on your own computer, you know, or whatever other device. Earlier this year, I did an episode about the Yoga laptop from Lenovo that's a Copilot+ PC, and Copilot is Microsoft's AI product. The laptop I talked about features an NPU, that's a neural processing unit. So just as a GPU is optimized to process graphics, an NPU is optimized to handle AI applications. Now, there are several benefits to having AI native on a device, and the big ones really are security and privacy. With cloud-based AI computing, there's always a risk of data breaches. You don't have full control over your information because it's being handled by a third party.
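To make the cloud-versus-local distinction concrete before we get to more of those risks: with on-device AI, the model weights sit on your own disk and the prompt never crosses the network. Here's a minimal sketch using the open-source llama-cpp-python library; that's my choice for illustration, the episode doesn't name any particular tool, and the model file name is an assumption.

```python
# pip install llama-cpp-python; assumes you've downloaded a GGUF model file.
from llama_cpp import Llama

# The weights load from local disk; inference runs on local hardware.
llm = Llama(model_path="./local-model.gguf")  # hypothetical file name

# The prompt, and anything sensitive inside it, never leaves this machine.
out = llm("Summarize my meeting notes: ...", max_tokens=64)
print(out["choices"][0]["text"])
```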
Speaker 1: There's also a risk that the provider you use might train future AI models on your information, and if that data includes proprietary information or trade secrets or something like that, that represents a huge security risk. And I bring it up because we have seen examples of AI agents revealing stuff that they shouldn't, stuff that involved other users of those AI agents. Where, let's say, person one is using the AI agent to, I don't know, plan out their insurance, and then person two, while asking questions, happens to start getting information about person one and the situation that led them to seek these solutions to their insurance problem. That's a terrible, terrible privacy problem, and a security problem for companies particularly. You don't want your trade secrets to become industry-wide knowledge. So moving AI applications to the device level somewhat mollifies those concerns. Now, of course, you still have to practice good security protocols. You also have to maintain control of your device so that someone else doesn't get physical access to it and is able to see what's on it. But that's true no matter what. The fear of some other company profiting off of your information, or potentially leaking your data to others, is greatly reduced, at least. I expect we'll continue to see more on-device implementations of AI, and I'm not the only one making that guess. Gartner predicts that AI PCs will make up like forty-three percent of all PC shipments by twenty twenty five, and you know, by my calendar, we're nearly there, so we'll have to see if that becomes truth. Now, before we go to break: I mentioned earlier that Copilot is one of Microsoft's products. Microsoft was in the news a lot in twenty twenty four for a lot of different reasons, but one story focused on a different feature found in some Copilot+ PCs, and that is Microsoft Recall.
Speaker 1: So the company announced this feature in June of twenty twenty four, and the reaction was rather critical, to put it gently. Recall takes screenshots of your activities on your PC. These screenshots are searchable, and it's intended to make it easier for users to look back on their past activities in order to find, like, files or data or whatever. You know, if you've ever said, gosh, what was that website I visited where I found that awesome recipe, and you didn't bookmark it or whatever, searching the screenshot database on your machine is one way you might try to track that information down. Now, some folks worried right away that Recall would be sending screenshots back to Microsoft, which obviously would represent a huge threat to both privacy and security. So imagine being an executive at a big company and working on an internal report, only to find out that your computer has been screenshotting everything and then sending it off to Microsoft servers. That would surely violate company policies at the very least. But Microsoft was quick to state that the screenshots would not be sent over the Internet to any other location; they would live squarely on the native device. However, this did not stop the criticism. Now, originally Recall was apparently unencrypted, meaning it would store these screenshots essentially in plain text on the user's computer. So if someone gained access to your machine, even if it wasn't administrative access, they would be able to go back and look at all those screenshots, which potentially could include stuff like login information for various accounts and services. So again, the concerns around privacy and safety were flaring up. Ultimately, these and other criticisms prompted Microsoft to delay Recall's rollout, and it ended up turning it from a mandated feature into an opt-in feature, so you can actually elect whether to make use of it.
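If you're curious how a Recall-style feature works under the hood, here's a toy sketch of the general pattern: periodic screenshots, OCR, and a local searchable index. This is my own illustration of the concept, not Microsoft's actual implementation, and it assumes Pillow, pytesseract, and the Tesseract OCR binary are installed.

```python
import sqlite3
import time
from PIL import ImageGrab   # captures the current screen
import pytesseract          # OCR: image in, text out

db = sqlite3.connect("recall_toy.db")
db.execute("CREATE TABLE IF NOT EXISTS shots (ts REAL, text TEXT)")

def snap_once():
    """Screenshot the screen, OCR it, and store the text locally."""
    img = ImageGrab.grab()
    text = pytesseract.image_to_string(img)
    db.execute("INSERT INTO shots VALUES (?, ?)", (time.time(), text))
    db.commit()

def search(term):
    """Find timestamps of every screenshot whose text mentions the term."""
    rows = db.execute("SELECT ts FROM shots WHERE text LIKE ?",
                      (f"%{term}%",))
    return [ts for (ts,) in rows]

snap_once()
print(search("recipe"))  # e.g., find when that recipe site was on screen
```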
Speaker 1: They also went back and changed it so that the screenshots are encrypted, so you should only be able to access them if you have administrator-level access to the machine. However, that also could mean that a company could have access to that sort of stuff, right? Like, if you have an employer-provided computer as part of your work, then presumably your employer, or someone at the company you work for, has that administrator-level access to the machine, which means they could go back and look at all the screenshots. That could potentially become a monitoring issue if some bosses decide, hey, this is a great way for me to see if, you know, Jonathan over there is actually doing his work, or if he's goofing off, you know, playing Galaga or something on his company machine, which he should not do. So there are those concerns still, but again, since it's opt-in, unless your company forces you to activate it, I don't think it's that big of an issue. Microsoft also enabled some security and privacy features that are meant to obscure stuff like your login credentials. You know, in theory, Recall will skip screens where you're filling in that kind of info, so that if someone were somehow to get access to your machine, even if they searched through it for login stuff, they wouldn't find it. However, some folks have found that this is not foolproof. Because sometimes, like if someone's filling out, say, an interactive PDF, and they put login information in that, well, Microsoft doesn't necessarily detect that the PDF is related to logins, and so you might end up with screenshots that show that login credential in plain text, at least once you decrypt the screenshot. So there are still some concerns, but from what I understand, it's miles better than what the company initially announced when they rolled it out. The tool is now out for Windows eleven users.
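The at-rest encryption piece of that fix is a standard pattern. Here's a minimal sketch of the general idea using Python's cryptography library; to be clear, this is my own illustration, not Microsoft's actual scheme, which reportedly ties decryption to Windows Hello and the machine's secure hardware.

```python
# pip install cryptography
from cryptography.fernet import Fernet

# In a real system the key would live in secure hardware (like a TPM),
# not in a variable; this just shows the encrypt/decrypt round trip.
key = Fernet.generate_key()
box = Fernet(key)

screenshot_bytes = b"...pixels..."        # stand-in for a captured image
sealed = box.encrypt(screenshot_bytes)    # this is what lands on disk
restored = box.decrypt(sealed)            # only works if you hold the key
assert restored == screenshot_bytes
```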
Speaker 1: I have not seen any stats on how many people are actually using it. It'll be interesting to hear in twenty twenty five whether it becomes a more widely accepted feature. Okay, we're going to take a quick break for our sponsors. When we come back, more top tech stories of twenty twenty four.

Speaker 1: We're back. So here in the United States, several big tech companies have faced scrutiny from regulatory agencies such as the Federal Trade Commission, or FTC, this year. Microsoft, for example, has recently found itself the subject of an antitrust investigation, not for the first time, but in this case it spans businesses ranging from cloud computing to cybersecurity. Other members of the Big Five tech companies, those being Apple, Amazon, Meta, and Google, have also felt the heat. As Christine Bartholomew of ProMarket wrote, quote, "antitrust enforcers filed new Section Two claims against Amazon and Apple while continuing to prosecute older cases, like the Federal Trade Commission's suit against Meta, originally filed in twenty twenty," end quote. That Section Two Bartholomew mentions refers to a section in a very old piece of legislation, namely the Sherman Act, which is an antitrust piece of legislation. But Google in particular could potentially face government mandates to spin off certain key business units or face massive consequences. So the US Department of Justice alleged that Google has engaged in anticompetitive practices, one of which is the company's practice of paying other platforms so that Google Search remains the default search engine on those platforms. So Google spends billions of dollars, in fact it spends billions of dollars paying Apple alone, in order to be the default search engine of choice, and the DOJ has argued that Google is effectively suppressing competition this way. The case has made its way through court: a US district judge named Amit Mehta ruled against Google, and that brings us up to what happens next.
Speaker 1: So the answer to that is that it's not going to unfold until later in twenty twenty five, and depending on how things go with the courts, maybe even later than that. These things can stretch on for many years, as Bartholomew indicated, you know, like that FTC case that started in twenty twenty. But the DOJ wants to make Google divest itself of several business units, including the Chrome browser, and to stop entering into those exclusionary agreements with companies like Apple and such. The recommendations go much further, and Google, understandably, has protested them all. The company has argued that it didn't really do anything anticompetitive in the first place, no matter what that judge thought, and that anyway, if the DOJ gets its way, it's going to have a huge, harmful effect on the tech sector, nay, the very world. It definitely gets a bit overdramatic. I think both sides get overdramatic, to be fair. Now, a separate judge is going to have to decide which measures should actually be enforced against Google later in twenty twenty five, so the story is far from over. One thing that was a potential concern is that the change in administration could affect the outcome. Typically, when we see a big shift in power dynamics here in the United States, we also see a lot of change in practices and policies. However, in this case, pressure against big tech is something that both sides of the aisle feel is necessary, although often for completely different reasons. But that pressure actually began under Trump's first presidential administration, so there's every reason to believe the same will be true this go-around. Now, while I expect the Trump administration to stick generally to "big businesses are great and they should get away with whatever they want," I think big tech companies are a potential exception to that rule, though a lot of analysts say they'll probably face lighter consequences. They won't be forced to divest themselves of business units.
Speaker 1: Perhaps. Under Trump's DOJ, we'll have to see. Something else in tech that is a looming question mark is the fate of TikTok in the United States. A huge story in twenty twenty four was that the legislative branch of the US government passed a law that requires TikTok's parent company, ByteDance, to divest itself of TikTok or else face a ban on the app throughout the entire United States. Now, ByteDance is a Chinese company, and the lawmakers share a concern that the app could serve as a way to siphon data from the United States and send it on over to China. Plus, it could serve as a delivery platform for Chinese propaganda. Basically, these claims, even if they were to turn out to be one hundred percent true, are kind of moot as far as I'm concerned, because China can both collect data from other sources and deliver propaganda using other platforms without ever needing TikTok. So to me, this is kind of a missing-the-forest-for-the-trees sort of situation. But then, TikTok is also an easier thing to wrap your head around, as opposed to the whole US approach to data privacy and security, which I feel is severely out of touch. Now, President Biden signed that bill into law earlier this year, and ByteDance faces a deadline of January nineteenth to divest itself of TikTok, or else TikTok gets the ban hammer. Now, back in twenty twenty, President Trump at that time was all gung ho on banning TikTok, even signing an executive order ordering as much, but it got overturned. More recently, he's changed his tune for reasons that remain rather opaque, at least if you take his own statements at face value. I won't get into that, but people who have been listening to the show for a while know more about that story. But anyway, Trump doesn't come into office until January twentieth, twenty twenty five, so the deadline is the day before that.
Speaker 1: ByteDance and TikTok are hoping to extend that deadline so that Trump can weigh in on this and potentially reverse course before a ban is actually put in place. But that would require intervention from the US Supreme Court, and right now that intervention is far from certain. It might happen, but it might not. So it could be that, one day before Trump takes office, TikTok gets banned here in the United States. Now, what happens after that? I've got no clue. TikTok still maintains there's no feasible way to separate from its parent company, which I also find a little hard to believe, but it may be true. I am no expert in these matters. Okay, more big stories from twenty twenty four. So early in the year, the US Senate Judiciary Committee called leaders of several social media companies to testify about how social platforms have contributed to harming children. Folks like Mark Zuckerberg, who actually offered an apology, were compelled to listen to numerous cases brought before them by parents who had tragic stories to share about their kids. The overall response from the committee's side was that none of the tech leaders actually acquitted themselves very well or appeared to make any concrete plans to actually address the underlying concerns. There were also a lot of lawmakers questioning Section two thirty. That's a law, or rather it's part of a law, that provides protection to online platforms. Essentially, Section two thirty says online platforms cannot be held legally liable for the stuff that their users post to those platforms. But even those protections do have limits. So, for example, if a platform becomes aware of illegal activity, it is supposed to take action. But the thing that drove me bonkers with this hearing was that some of these lawmakers seemed to behave like Section two thirty was written by the social media companies, and it wasn't.
Speaker 1: It was actually created by Congress as a way to protect companies in the Internet space from being squashed before they could even establish themselves, and to encourage investors to put money into that stuff. If an investor thinks, oh, well, I don't want to invest in that because I could be held legally responsible for something that someone else did, that would have killed the early Internet. So that's why Section two thirty exists. But let's move on and dive into our Muskie segment, shall we? That means it's time to talk about Elon Musk. He continued to alienate some users of X, formerly known as Twitter, and alternative platforms like Threads, Mastodon, and Bluesky benefited from Elon Musk's shenanigans, because users began to bail and go to those alternatives. Bluesky in particular has had a heck of a year. The service emerged from beta and became available to the general public relatively early in twenty twenty four, and it has experienced a couple of bursts of popularity, including a big one following the twenty twenty four elections here in the United States. So right now it boasts more than twenty-five million users, which isn't bad at all for its first year in operation. More than eleven million of those joined following the US election, and you know, that's pretty impressive. It is still miles and miles behind X, which has an estimated six hundred million or so monthly users, so it's not like Bluesky is poised to topple X. This is not even David versus Goliath; this is Goliath and David's very short friend Dwayne or something. Upon Trump's win in the election, Elon Musk would become part of the new administration. He will co-chair a new Department of Government Efficiency, aka, I can't believe I'm saying this, DOGE. That suggests to me that Musk may have had a hand in naming the organization, knowing his past with Dogecoin and other memes.
Speaker 1: Considering that Musk is also in an ongoing battle regarding his compensation package at Tesla, I imagine we're going to see a lot of Musk's fingerprints on government actions in the near future. Also, it's kind of wild that the guy who's been put in a place of authority over government spending is also the guy who happens to be the richest human in the world. It's just weird. Now, speaking of election stuff: as you are probably aware, twenty twenty four was a year of unprecedented misinformation campaigns, some of which were fueled by generative AI, and it continues to be an issue. While elections in the United States are over for the time being, other nations around the world are holding elections of their own, and they're preparing for a flood of misinformation. A lot of the effort in misinformation campaigns appears to be coming out of Russia. Putin has long relied on Internet channels to help push pro-Russian policies and leaders in other nations. Not all of those campaigns have been successful in their goals, but they've definitely been all over the place. So we expect that's going to be a continuing trend moving into twenty twenty five. And we're going to take another quick break right now. When I get back, I'm going to wrap this up with some more tech headlines and stories from the past year. But first, let's thank our sponsors.

Speaker 1: All right, we're in the home stretch, though again, I'm only touching on some of the stories that I thought were particularly interesting. I am leaving an awful lot of stuff untouched, so you know, don't get upset if your favorite tech story doesn't get a mention here; there's too much to go over. But one thing that remains untouched by yours truly is the Apple Vision Pro headset. I just don't have, like, more than three thousand dollars lying around waiting to be spent on an augmented reality headset from Apple that has limited utility in my day-to-day life. But the long-rumored technology finally emerged early in twenty twenty four.
Speaker 1: Now, the Apple Vision Pro was a far cry from, like, the blue-sky early concepts, because rather than a sleek pair of glasses, it was a headset that could project images of the user's eyes on the forward-facing side of the visor. So everyone else could see that you were wearing a big visor, but you had gigantic digital eyes projected on the front of it, apparently staring at folks. It was something that I thought was super creepy. Now, the reviews I read about the device all said it was legitimately impressive, like it's well made and it does a really good job. But those reviews also indicated there were a limited number of use cases for the technology. Now, part of that is a chicken-and-egg problem that frequently happens in tech, and it happens between hardware and software. Software developers have very little incentive to build stuff for a platform that doesn't have a large install base. But customers don't have a lot of incentive to adopt a brand-new technology, particularly a really expensive one like the Vision Pro, if that technology doesn't also have a useful library of applications. And so, while the Vision Pro had a brief moment in the spotlight, over the course of twenty twenty four it became clear it just wasn't going to be the runaway hit that Apple had been hoping for, so Apple rather quietly stopped mentioning it for the most part. The company has ended production on the Vision Pro model, according to reports, and rumor has it that any plans for a second Pro unit have been scrapped. Again, that's a rumor; I haven't seen any official announcement from the company itself, but that seems to be the case. Now, that does not mean that Apple is totally out of the mixed reality game, and we may see a lower-cost version with fewer features in the future. That was the plan, at least for a while.
Speaker 1: Those rumors suggest that this newer version of the headset will be, you know, wicked expensive, just not as wicked expensive as the Vision Pro. I saw one post on XR Today, I guess Mixed Reality Today, that suggested the slimmed-down model might go for something like one thousand five hundred bucks, or about half of what the Vision Pro goes for. That's still awful pricey, and I'm still not convinced that the use case is there for most people. Like, I think people who are curious about it might feel more tempted to get a fifteen-hundred-dollar headset than a three-thousand-dollar headset, but that's still a huge price tag for most people. And unless you're able to demonstrate that this technology is going to have a massive impact on how you get stuff done, I just don't see it really moving that far. Speaking of things not turning out the way you had hoped, twenty twenty four was another year of layoffs in the tech industry. One source I read mentioned that between companies like Tesla, Intel, Cisco, and Microsoft, there were around one hundred fifty thousand jobs cut. It has been a brutal couple of years in the tech sector. I think this is partly because of just overall economic issues. It's also partly because of how a lot of tech companies beefed up during the pandemic, and now, in the post-pandemic years, they're having to look for ways to slim back down again. But here's hoping that twenty twenty five sees those trends change, because it's always rough seeing those stories break. All right, let's talk about crypto, because ye gads, this was a huge year for crypto, and it didn't start off that way, really. But one thing that happened was Sam Bankman-Fried, aka SBF, found himself before a judge and was sentenced to twenty-five years in prison for his role in the FTX debacle.
Speaker 1: Now, in case you missed all that, FTX was a cryptocurrency exchange that apparently took customer money and used it to cover some bad investments made by a sister company called Alameda Research. Now, it gets more complicated than that, but essentially SBF was arrested on charges of fraud, among other things. He was found guilty, and then this year he was sentenced to prison. SBF then filed an appeal to his sentencing, arguing that the judge overseeing the case had a conflict of interest and was biased against him. The very day I'm recording this, the nineteenth of December, twenty twenty four, a US court dismissed that appeal, finding SBF's claims invalid. That doesn't come as a surprise to me. There are allegations that SBF and/or his legal team were, allegedly, you know, engaging in witness tampering, for example. That kind of stuff typically doesn't go over so well when matters come to a court of law. So it sounds like SBF's future will not be spent on a Caribbean island, but instead in a prison. Related to the FTX catastrophe is the failure of a bank called Silvergate Capital. So this technically happened in twenty twenty three. That's when we started to see a few banks fail that were connected to the crypto world. So one called Silicon Valley Bank failed, another one called Signature Bank failed, and Silvergate was the third. The owners at the time said that they were going to wind down operations. However, it wasn't until this past September, in twenty twenty four, that Silvergate Capital Corporation filed for Chapter eleven bankruptcy protection. The filing included passages that accused regulators of interfering too much in the cryptocurrency realm, suggesting that perhaps Silvergate would have been able to right itself if it weren't for, I don't know, imperial entanglements, as Obi-Wan Kenobi might say. Bitcoin itself had a heck of a year. So in January twenty twenty four, bitcoin was trading at around forty thousand dollars per bitcoin.
That's a lot of money, and 567 00:34:43,760 --> 00:34:46,960 Speaker 1: that was already an impressive improvement over the previous January 568 00:34:47,000 --> 00:34:48,960 Speaker 1: back in twenty twenty three, when it was in the 569 00:34:49,080 --> 00:34:53,520 Speaker 1: twenty thousand dollars range, but it was still pretty far 570 00:34:53,560 --> 00:34:56,360 Speaker 1: off from the peak that it had hit previously in 571 00:34:56,400 --> 00:34:58,680 Speaker 1: the spring of twenty twenty one, when it was around sixty 572 00:34:58,800 --> 00:35:02,480 Speaker 1: thousand dollars per bitcoin. All of that was left behind 573 00:35:02,600 --> 00:35:06,360 Speaker 1: in the dust recently when the price of bitcoin skyrocketed to 574 00:35:06,560 --> 00:35:10,719 Speaker 1: more than one hundred thousand dollars per bitcoin. Now, a 575 00:35:10,719 --> 00:35:12,600 Speaker 1: lot of that I attribute to the outcome of the 576 00:35:12,680 --> 00:35:15,919 Speaker 1: US elections, as I believe the crypto community at large 577 00:35:16,000 --> 00:35:18,520 Speaker 1: views President Trump as being much more friendly to the 578 00:35:18,520 --> 00:35:22,080 Speaker 1: crypto industry. So will this mean that cryptocurrency has become 579 00:35:22,120 --> 00:35:25,319 Speaker 1: a viable currency alternative, or will they just remain more 580 00:35:25,480 --> 00:35:30,040 Speaker 1: like investment assets? I would probably guess the latter, 581 00:35:30,239 --> 00:35:33,600 Speaker 1: but I'm no expert. It is wild to me that 582 00:35:33,640 --> 00:35:37,720 Speaker 1: the pizza guy in Florida bought pizzas for ten thousand 583 00:35:37,719 --> 00:35:40,560 Speaker 1: bitcoin way back in twenty ten, and those ten thousand 584 00:35:40,560 --> 00:35:43,799 Speaker 1: bitcoin today are worth a billion dollars, so I sure 585 00:35:43,880 --> 00:35:47,719 Speaker 1: hope that was a good pizza. Also, one other thing 586 00:35:47,719 --> 00:35:50,240 Speaker 1: that happened with bitcoin is that the number of coins awarded 587 00:35:50,320 --> 00:35:55,560 Speaker 1: per block mined on the blockchain was reduced by half this year. Now, 588 00:35:55,600 --> 00:36:00,560 Speaker 1: this is a planned feature that's built into bitcoin. In short, 589 00:36:00,600 --> 00:36:03,720 Speaker 1: there's a finite number of bitcoin that will ever be released, 590 00:36:04,120 --> 00:36:07,400 Speaker 1: and once all of those are mined, no new bitcoin 591 00:36:07,480 --> 00:36:10,399 Speaker 1: will ever emerge. The ones that are in circulation will 592 00:36:10,440 --> 00:36:13,520 Speaker 1: be the only ones to ever exist. To control the 593 00:36:13,600 --> 00:36:17,840 Speaker 1: supply of bitcoin going into circulation, the system actually halves 594 00:36:18,000 --> 00:36:21,720 Speaker 1: the number of bitcoins awarded per block of transactions every 595 00:36:21,800 --> 00:36:26,279 Speaker 1: few years. It's approximately every four years. So it 596 00:36:26,360 --> 00:36:28,600 Speaker 1: used to be that if you were to mine a 597 00:36:28,719 --> 00:36:30,920 Speaker 1: block on the blockchain, you would get six point two 598 00:36:31,080 --> 00:36:35,200 Speaker 1: five bitcoin per block mined; now it's down to three 599 00:36:35,280 --> 00:36:38,879 Speaker 1: point one two five, which could affect the incentives 600 00:36:38,920 --> 00:36:43,680 Speaker 1: for mining bitcoin.
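To make the halving math concrete, here's a minimal sketch in Python. It assumes the standard protocol parameters, which the episode doesn't spell out: a new block roughly every ten minutes and a reward that halves every 210,000 blocks. It also previews the incentive arithmetic the host walks through next, comparing a block's dollar value at the old reward and an assumed sixty-thousand-dollar price against the halved reward at an assumed one-hundred-thousand-dollar price.

```python
# Minimal sketch of bitcoin's block-reward halving and the mining
# incentive math. Parameter values are assumptions taken from the
# protocol's published rules, not from the episode itself.

HALVING_INTERVAL = 210_000   # blocks between halvings
INITIAL_REWARD = 50.0        # BTC per block at the 2009 launch
MINUTES_PER_BLOCK = 10       # average time between blocks

def block_reward(height: int) -> float:
    """BTC awarded for mining the block at the given height."""
    halvings = height // HALVING_INTERVAL
    return INITIAL_REWARD / (2 ** halvings)

# Why the halving lands "approximately every four years":
years_per_epoch = HALVING_INTERVAL * MINUTES_PER_BLOCK / (60 * 24 * 365)
print(f"one halving epoch lasts about {years_per_epoch:.1f} years")  # ~4.0

# The April 2024 halving took the reward from 6.25 to 3.125 BTC:
print(block_reward(839_999))  # 6.25  (last block of the old epoch)
print(block_reward(840_000))  # 3.125 (first block of the new epoch)

# Per-block dollar value: old reward at ~$60k vs. halved reward at ~$100k.
print(6.25 * 60_000)    # 375000.0
print(3.125 * 100_000)  # 312500.0
```

Under those assumed prices, a block that once paid out roughly three hundred seventy-five thousand dollars now pays roughly three hundred twelve thousand five hundred, which is the "almost as much" the host describes next.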
However, now that bitcoin is worth more 601 00:36:43,719 --> 00:36:47,120 Speaker 1: than one hundred thousand dollars per unit, that still means 602 00:36:47,200 --> 00:36:49,880 Speaker 1: the value of mining a block on the blockchain is 603 00:36:50,040 --> 00:36:52,440 Speaker 1: almost as much as it would have been back at 604 00:36:52,480 --> 00:36:57,200 Speaker 1: bitcoin's previous high of around sixty thousand dollars. 'Cause, yeah, 605 00:36:57,320 --> 00:37:00,360 Speaker 1: the number of bitcoins you get is cut in half, 606 00:37:00,400 --> 00:37:05,080 Speaker 1: but the amount that it's worth has almost, not quite, 607 00:37:05,080 --> 00:37:08,120 Speaker 1: but almost doubled. So there's still a huge incentive for 608 00:37:08,239 --> 00:37:11,760 Speaker 1: mining operations to pour tons of assets into mining bitcoin. 609 00:37:12,280 --> 00:37:15,000 Speaker 1: That means consuming a lot of electricity, and I have 610 00:37:16,080 --> 00:37:18,359 Speaker 1: quite a bit of anxiety over how the next US 611 00:37:18,400 --> 00:37:21,920 Speaker 1: administration is going to handle matters like energy consumption and 612 00:37:22,000 --> 00:37:24,920 Speaker 1: where we turn to in order to generate electricity. So 613 00:37:24,960 --> 00:37:26,319 Speaker 1: I think crypto is going to be in a lot 614 00:37:26,400 --> 00:37:29,120 Speaker 1: of conversations about stuff like climate change for the next 615 00:37:29,280 --> 00:37:33,240 Speaker 1: several years, and I'm not looking forward to that. Okay, 616 00:37:33,840 --> 00:37:37,360 Speaker 1: let's finish up with some stuff about space. That's typically 617 00:37:37,400 --> 00:37:40,600 Speaker 1: how I like to end my news items. First up, 618 00:37:40,680 --> 00:37:43,120 Speaker 1: we have the trials and tribulations of Boeing. So the 619 00:37:43,160 --> 00:37:46,200 Speaker 1: aerospace company has long been associated with the space industry, 620 00:37:46,239 --> 00:37:48,400 Speaker 1: but this year Boeing has had a really rough one. 621 00:37:48,840 --> 00:37:51,720 Speaker 1: It actually started quite early, because on January fifth, twenty 622 00:37:51,760 --> 00:37:55,359 Speaker 1: twenty four, an Alaska Airlines Boeing seven thirty seven Max 623 00:37:55,440 --> 00:38:00,360 Speaker 1: aircraft experienced a terrifying failure. At sixteen thousand feet, a 624 00:38:00,440 --> 00:38:03,239 Speaker 1: door plug, meaning a plug that takes the place of 625 00:38:03,280 --> 00:38:07,160 Speaker 1: what otherwise would have been an emergency door, blew out 626 00:38:07,480 --> 00:38:13,319 Speaker 1: of the airplane, which meant the cabin experienced rapid, uncontrolled decompression. Fortunately, 627 00:38:13,480 --> 00:38:16,359 Speaker 1: everyone on board the aircraft survived the incident. There were 628 00:38:16,400 --> 00:38:19,480 Speaker 1: only three people, I think, who reported minor injuries, so 629 00:38:19,600 --> 00:38:21,240 Speaker 1: all of that is good. I mean, it was still 630 00:38:21,840 --> 00:38:25,840 Speaker 1: an incredibly traumatic experience. I can't imagine what I would 631 00:38:25,840 --> 00:38:28,640 Speaker 1: feel like if I had been on that flight. So 632 00:38:28,719 --> 00:38:32,440 Speaker 1: an investigation showed that the door plug perhaps was reinstalled 633 00:38:32,440 --> 00:38:35,279 Speaker 1: improperly and was missing some bolts meant to secure it 634 00:38:35,320 --> 00:38:38,000 Speaker 1: in place.
There's still a question of whether or not 635 00:38:38,000 --> 00:38:40,600 Speaker 1: the bolts had been in place, or perhaps they were 636 00:38:41,120 --> 00:38:45,440 Speaker 1: ripped out during the incident, but either way, it certainly 637 00:38:45,480 --> 00:38:48,680 Speaker 1: seems like a big oversight. So the Federal Aviation Administration, 638 00:38:48,800 --> 00:38:51,960 Speaker 1: or FAA, ordered all seven thirty seven Max nine 639 00:38:52,080 --> 00:38:55,800 Speaker 1: aircraft that had a similar door plug to be grounded 640 00:38:55,880 --> 00:38:59,359 Speaker 1: until each one could undergo a thorough inspection. Boeing would 641 00:38:59,400 --> 00:39:02,000 Speaker 1: end up paying various airlines a whole lot of money 642 00:39:02,080 --> 00:39:04,240 Speaker 1: because those airlines could not operate 643 00:39:04,280 --> 00:39:07,760 Speaker 1: the aircraft they had purchased while the FAA was conducting 644 00:39:07,800 --> 00:39:11,320 Speaker 1: this inspection process. But that wasn't the only headache Boeing 645 00:39:11,400 --> 00:39:14,560 Speaker 1: encountered this year. Another relates to the long-awaited spacecraft, 646 00:39:14,600 --> 00:39:17,680 Speaker 1: the Boeing Starliner. The spacecraft's purpose is to serve as 647 00:39:17,680 --> 00:39:20,279 Speaker 1: a cargo and crew transport for space missions that go 648 00:39:20,360 --> 00:39:24,239 Speaker 1: into low Earth orbit, and the spacecraft itself is reusable. 649 00:39:24,280 --> 00:39:27,760 Speaker 1: It's also a tad roomier than the SpaceX Crew Dragon 650 00:39:27,840 --> 00:39:31,960 Speaker 1: capsule or the old Apollo capsules. The rollout of the 651 00:39:32,000 --> 00:39:35,680 Speaker 1: Starliner faced numerous delays. Originally, the hope was that it 652 00:39:35,719 --> 00:39:38,799 Speaker 1: would enter into service by twenty seventeen, but that did 653 00:39:38,840 --> 00:39:42,680 Speaker 1: not happen. The first successful test flight wouldn't happen until 654 00:39:42,719 --> 00:39:46,440 Speaker 1: twenty twenty two. That was an uncrewed test flight, and 655 00:39:46,520 --> 00:39:49,279 Speaker 1: this was the year when Starliner finally would bring astronauts 656 00:39:49,360 --> 00:39:53,360 Speaker 1: up to the space station. Unfortunately, the spacecraft would not 657 00:39:53,400 --> 00:39:56,520 Speaker 1: bring those astronauts back down again, because on the way 658 00:39:56,560 --> 00:39:59,879 Speaker 1: to the space station, the Starliner experienced a partial thruster failure. 659 00:40:00,120 --> 00:40:04,040 Speaker 1: Now, NASA and the astronauts aboard both the ISS and 660 00:40:04,120 --> 00:40:06,879 Speaker 1: the Starliner were collectively able to compensate for this, 661 00:40:07,400 --> 00:40:11,040 Speaker 1: and the spacecraft did dock safely with the International Space Station. 662 00:40:11,600 --> 00:40:14,480 Speaker 1: But after long consideration, NASA came to the determination that 663 00:40:14,520 --> 00:40:16,680 Speaker 1: it would be too risky to have the two astronauts 664 00:40:16,760 --> 00:40:19,880 Speaker 1: return to Earth aboard the Starliner, and instead they 665 00:40:19,880 --> 00:40:22,280 Speaker 1: would have to wait for another opportunity to come home.
666 00:40:23,120 --> 00:40:25,960 Speaker 1: So what was supposed to be a short mission, lasting 667 00:40:26,000 --> 00:40:28,919 Speaker 1: like less than two weeks, has now stretched into one 668 00:40:28,960 --> 00:40:31,439 Speaker 1: that's going to go all the way into twenty twenty five. 669 00:40:31,560 --> 00:40:34,440 Speaker 1: In fact, right now, NASA says that the astronauts can get 670 00:40:34,440 --> 00:40:37,080 Speaker 1: back no earlier than March twenty twenty five, and it 671 00:40:37,160 --> 00:40:40,200 Speaker 1: might be later than that. The incident, along with the 672 00:40:40,239 --> 00:40:43,560 Speaker 1: revelations regarding how expensive the Starliner missions are compared 673 00:40:43,560 --> 00:40:47,560 Speaker 1: to alternatives like SpaceX, has generated a great deal 674 00:40:47,600 --> 00:40:51,000 Speaker 1: of criticism directed both at Boeing and at NASA. Boeing's 675 00:40:51,040 --> 00:40:53,239 Speaker 1: been rather cagey about the whole thing. The company was 676 00:40:53,239 --> 00:40:57,080 Speaker 1: supposed to hold a press conference upon the uncrewed Starliner's 677 00:40:57,160 --> 00:41:00,880 Speaker 1: return to Earth this past September. But Boeing canceled that 678 00:41:00,960 --> 00:41:03,440 Speaker 1: event at the very last minute and didn't offer a 679 00:41:03,480 --> 00:41:06,640 Speaker 1: reason as to why they canceled it. Starliner's future in 680 00:41:06,680 --> 00:41:09,120 Speaker 1: the American space program remains another one of those big 681 00:41:09,200 --> 00:41:12,880 Speaker 1: question marks that we just don't know enough about yet. Theoretically, 682 00:41:12,960 --> 00:41:16,480 Speaker 1: once Boeing demonstrates to NASA's satisfaction that the Starliner 683 00:41:16,640 --> 00:41:20,279 Speaker 1: is spaceworthy, Boeing will then provide the Starliner for 684 00:41:20,320 --> 00:41:23,879 Speaker 1: three space missions. That's what the contract agreed upon. 685 00:41:24,440 --> 00:41:27,880 Speaker 1: There's also an option for NASA to purchase additional missions, 686 00:41:27,880 --> 00:41:30,960 Speaker 1: I think three more per that contract. But as some 687 00:41:31,080 --> 00:41:33,560 Speaker 1: have pointed out, we're getting closer to the retirement date 688 00:41:33,600 --> 00:41:36,680 Speaker 1: of the International Space Station. We're talking like twenty thirty. 689 00:41:36,920 --> 00:41:39,960 Speaker 1: So by that time it may be that Starliner 690 00:41:40,000 --> 00:41:42,800 Speaker 1: has nowhere to take anybody, because all the low Earth 691 00:41:42,920 --> 00:41:46,760 Speaker 1: orbit space stations we would go to will be gone, 692 00:41:46,880 --> 00:41:49,280 Speaker 1: all the ones that the United States has an interest 693 00:41:49,280 --> 00:41:52,960 Speaker 1: in, anyway. And finally, twenty twenty four was the 694 00:41:53,080 --> 00:41:57,040 Speaker 1: swan song for the Ingenuity helicopter, the little aircraft on 695 00:41:57,080 --> 00:42:00,120 Speaker 1: Mars that flew far more times than was planned. The 696 00:42:00,280 --> 00:42:03,320 Speaker 1: Ingenuity was an experiment: could we design an aircraft that 697 00:42:03,360 --> 00:42:06,680 Speaker 1: could actually fly on Mars?
So the atmosphere on Mars 698 00:42:06,719 --> 00:42:09,439 Speaker 1: is considerably thinner than what we have here on Earth, 699 00:42:09,480 --> 00:42:13,080 Speaker 1: which makes it hard to fly, but Mars's gravity is 700 00:42:13,080 --> 00:42:15,879 Speaker 1: about a third of Earth's, so that helps a little bit. 701 00:42:16,200 --> 00:42:18,200 Speaker 1: So the Ingenuity was a test to see if maybe 702 00:42:18,200 --> 00:42:20,799 Speaker 1: we could create an aircraft that could further explore the 703 00:42:20,800 --> 00:42:24,279 Speaker 1: Red Planet and add to our scientific knowledge. To that end, 704 00:42:24,320 --> 00:42:28,080 Speaker 1: while Ingenuity carried some basic sensors, it wasn't fully kitted out. 705 00:42:28,120 --> 00:42:30,640 Speaker 1: It was more of a proof of concept. The plan 706 00:42:30,800 --> 00:42:33,360 Speaker 1: was for Ingenuity to go on five flights when it 707 00:42:33,440 --> 00:42:35,919 Speaker 1: landed on Mars as part of the Perseverance rover mission way 708 00:42:35,960 --> 00:42:39,560 Speaker 1: back in February twenty twenty one. The aircraft had to 709 00:42:39,560 --> 00:42:42,080 Speaker 1: fly autonomously, because Mars is far enough away from Earth 710 00:42:42,120 --> 00:42:44,360 Speaker 1: that it takes several minutes for radio waves to travel 711 00:42:44,400 --> 00:42:47,439 Speaker 1: between the two, so you can't remote control it from Earth. 712 00:42:48,160 --> 00:42:51,640 Speaker 1: Rather than just complete five flights, however, Ingenuity would ultimately 713 00:42:51,800 --> 00:42:56,360 Speaker 1: take off seventy two times. These flights were relatively modest, 714 00:42:56,400 --> 00:42:59,600 Speaker 1: with altitudes of around ten to sixteen feet and flight 715 00:42:59,640 --> 00:43:01,320 Speaker 1: times limited to around a minute and a half, 716 00:43:01,640 --> 00:43:06,200 Speaker 1: but by golly, Ingenuity outperformed all expectations until finally this year, 717 00:43:07,080 --> 00:43:10,560 Speaker 1: one of Ingenuity's rotors broke off in January when it 718 00:43:10,600 --> 00:43:13,120 Speaker 1: came in for a landing and there was a miscalculation. 719 00:43:13,200 --> 00:43:16,480 Speaker 1: So apparently the Ingenuity made an error because it was 720 00:43:16,520 --> 00:43:19,600 Speaker 1: passing over what NASA referred to as a largely featureless 721 00:43:19,640 --> 00:43:23,000 Speaker 1: region, and it just made a miscalculation of where the 722 00:43:23,040 --> 00:43:26,040 Speaker 1: ground was. This is something the agency said will 723 00:43:26,080 --> 00:43:27,600 Speaker 1: not be a problem in the future when they have 724 00:43:27,680 --> 00:43:33,040 Speaker 1: a better map of Mars's surface. But considering that 725 00:43:33,239 --> 00:43:35,439 Speaker 1: Ingenuity was only supposed to fly five times, it's still 726 00:43:35,440 --> 00:43:39,440 Speaker 1: really impressive. While Ingenuity will go sailing no more, I mean, 727 00:43:39,480 --> 00:43:43,920 Speaker 1: I'm sorry, flying no more, it is still operating even 728 00:43:44,000 --> 00:43:49,480 Speaker 1: as it's out of range of radio communications with Perseverance. Ingenuity 729 00:43:49,520 --> 00:43:52,040 Speaker 1: is serving as a stationary platform that's testing stuff like 730 00:43:52,120 --> 00:43:55,719 Speaker 1: solar panel performance and battery operations.
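To put a rough number on the "several minutes" of radio delay that forced Ingenuity to fly autonomously, here's a back-of-the-envelope sketch. The distances are assumptions, not from the episode: Earth and Mars sit roughly 55 million kilometers apart at closest approach and around 400 million kilometers apart when they're on opposite sides of the Sun.

```python
# Back-of-the-envelope one-way radio delay between Earth and Mars.
# Distance figures are assumed round numbers, not from the episode.

SPEED_OF_LIGHT_KM_PER_S = 299_792.458

def one_way_delay_minutes(distance_km: float) -> float:
    """One-way signal travel time in minutes over the given distance."""
    return distance_km / SPEED_OF_LIGHT_KM_PER_S / 60.0

print(f"~{one_way_delay_minutes(55e6):.1f} min at closest approach")    # ~3.1
print(f"~{one_way_delay_minutes(400e6):.1f} min at maximum separation") # ~22.2
```

Even in the best case, a joystick command would arrive about three minutes late, which is why each of Ingenuity's flights had to be planned on Earth and then executed entirely onboard.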
That solar panel and battery information will 731 00:43:55,719 --> 00:43:57,960 Speaker 1: be invaluable should we ever take the step of actually 732 00:43:57,960 --> 00:44:01,160 Speaker 1: sending humans to Mars, or maybe even the grander dream 733 00:44:01,200 --> 00:44:04,359 Speaker 1: of establishing a semi-permanent colony there. But that's 734 00:44:04,360 --> 00:44:07,680 Speaker 1: putting the space cart before the space horse. And that's it. 735 00:44:07,800 --> 00:44:10,000 Speaker 1: Those are the top stories of twenty twenty four that 736 00:44:10,160 --> 00:44:13,040 Speaker 1: I wanted to focus on. Not saying that's all the 737 00:44:13,080 --> 00:44:15,240 Speaker 1: top stories, just the ones I wanted to talk about. 738 00:44:15,760 --> 00:44:17,840 Speaker 1: Hope you enjoyed this look back on some of the 739 00:44:17,840 --> 00:44:20,879 Speaker 1: big events of twenty twenty four, and I look forward 740 00:44:20,880 --> 00:44:29,279 Speaker 1: to chatting with you again really soon. Tech Stuff is 741 00:44:29,320 --> 00:44:33,880 Speaker 1: an iHeartRadio production. For more podcasts from iHeartRadio, visit the 742 00:44:33,920 --> 00:44:37,560 Speaker 1: iHeartRadio app, Apple Podcasts, or wherever you listen to your 743 00:44:37,600 --> 00:44:41,800 Speaker 1: favorite shows.