Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? It's time for the tech news for the week ending on Friday the thirteenth of December, twenty twenty-four. So let's start off with something kind of spooky for Friday the thirteenth. Apparently, for a few weeks now, folks in New Jersey have been reporting strange aircraft flying overhead, particularly along coastal regions. Most of the reports have referred to the aircraft as drones, though various authorities have said that, at least in some of these cases, that's a misidentification. They say that many of these reports are actually about your run-of-the-mill manned aircraft, just plain planes, so to speak. But numerous citizens have said there have been many instances of drones flying overhead, and we're talking big ones, not your little bitty quadcopters, but much larger drones. And no one has really come forward with an official explanation as to what's going on or who's behind it. And when there's a lack of information, there will always be a surplus of speculation. We humans don't like to have big, unanswered questions, right? So if we don't have an answer, sometimes we just invent one. And that includes one claim that the drones launch from an Iranian mothership. That's a story US Representative Jeff Van Drew told Fox News, though later he clarified that he's actually seen no concrete evidence tying any drone activity in New Jersey to Iran. So maybe it's not an Iranian mothership after all. According to various news outlets, most of the reports, like I said, have come along the coastal areas of New Jersey, though there have been reports of them coming as far inland as parts of Pennsylvania.
Speaker 1: The FBI is asking people to send in videos and photos of these drones in action in an attempt to suss out what might be going on, if there is something going on, because it's always possible that there's not. Like, it's possible that this is all about misidentification and a few, you know, completely innocent cases of people just operating drones that aren't at all malicious or malevolent. Officials have said repeatedly that there's no evidence that the drones are any kind of credible threat, and there's no evidence tying them to any foreign powers, so they are not suspected to be under foreign control. That hasn't stopped people from saying exactly the opposite, given this ongoing lack of information. And I honestly don't know what's going on. My guess, knowing how humans behave, is that the explanations, or lack thereof, that the officials have given so far are not going to satisfy people, and speculation will just continue to grow over the coming days. Generally, when these sorts of things happen, I typically see them culminating in one of two ways: either we eventually do figure out what's happening, whether that's some sort of concentrated effort to have drones fly over New Jersey for one reason or another, or it eventually will just kind of fade away and people will forget about it. That's it. Like, I don't think there's anything else that comes of this. Also, just to remind folks, the way the US regulates drone use here is typically tied to drone weight. So if a drone weighs less than fifty-five pounds, then you can operate it with fewer restrictions and regulations, although you are supposed to have an identifier tag, a digital tag that links the drone to you, as in, you are the authorized operator of that drone. When you get above fifty-five pounds, a lot more strict regulations come into play.
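To make that weight threshold concrete, here's a minimal sketch of the rule as described above. It's purely illustrative: the fifty-five-pound cutoff is the real figure for small drones under US rules, but the function name and tier labels are invented for this example, and the actual regulations involve far more nuance (recreational versus commercial use, Remote ID exemptions, and so on).

```python
# Illustrative only: a toy encoding of the weight threshold described above.
# The 55-pound cutoff matches the US small-drone rule; the function name and
# tier labels are hypothetical and greatly simplified.

SMALL_DRONE_LIMIT_LBS = 55.0

def drone_rule_tier(weight_lbs: float) -> str:
    """Return a rough regulatory tier for a drone of the given weight."""
    if weight_lbs < SMALL_DRONE_LIMIT_LBS:
        # Fewer restrictions, but the operator is still expected to broadcast
        # a digital identifier (Remote ID) linking the drone to them.
        return "small: fewer restrictions, digital ID still required"
    # At or above the limit, much stricter rules kick in.
    return "large: far stricter regulations apply"

print(drone_rule_tier(4.5))   # a little bitty quadcopter
print(drone_rule_tier(80.0))  # the kind of larger craft people report seeing
```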
Speaker 1: But that being said, while larger drones do weigh more than smaller ones, there have also been a lot of advancements in lightweight materials like carbon fiber that have allowed for larger drones that surprisingly don't weigh that much. So I don't know what's going on here, but I do know that it's been going on long enough that it's become national news. For a while, apparently, it was just news in New Jersey and hadn't really gone much further than that. This is the first I've heard of it, so we'll see where it goes from here, if it goes anywhere.

Speaker 1: Matt Novak of Gizmodo has a piece titled "Trump's Pick for FTC Chair Is Ready to Undo the Good Things From Lina Khan," and I think it's well worth a read. A lot of what is said in the article isn't exactly a surprise if you know how US politics works. Every time there's a change in administrations and someone's tenure is up, departments like the FTC are affected. Even if the new president is from the exact same party as the outgoing president, you can expect things to shake up a bit, but obviously it's more dramatic when there's a shift in power dynamics between Democratic and Republican leaders. And this is an overwhelmingly Republican administration that's coming in next year. In this case, Trump's pick is a guy named Andrew Ferguson, who is already on the FTC. He's part of the commission already; he was appointed under Joe Biden. The way the FTC works is that there are five members of the commission, and at most three of those members can come from a single political party. And since in the United States we effectively have just two political parties (I mean, there are others, but they represent such a small slice that you really just talk about the two, Republican and Democratic), it means in effect you get three from one and two from the other. Novak says that Ferguson is poised to roll back some of the moves the FTC has made over the last few years.
Speaker 1: So, for example, one of the changes that Lina Khan's FTC made was to ban non-compete clauses. Now, if you've never had one of these hanging over you, let me kind of clue you in. These clauses are measures that companies put into employee agreements, and they state that you're not allowed to work for a competing company for a certain length of time once your employment with your current employer ends. Typically, the non-compete might be something like six months. So if I were under a non-compete like that (I'm not saying I am, and I'm not not saying that I'm not, or whatever; that was so many negatives I don't even know where I am now), anyway, if I had that kind of non-compete clause, it would mean that should my employment with iHeart end tomorrow for whatever reason, whether I quit or they lay me off or whatever it might be, I would not be allowed to seek employment with a company like Spotify for half a year or so, or else risk legal action, because I would have violated the non-compete clause. Khan and her fellow Democratic commissioners argued that these clauses are hurtful to employees, which I agree with (they are), and that they're mostly used by companies as a way to intimidate their employees into staying on board rather than jumping ship to a better opportunity. Let's say that I had risen to a certain level within iHeart and had deep knowledge of proprietary information in the company, the kind of thing that's part of the trade secrets that let iHeart do what it does. Obviously, there'd be a vested interest in making sure I don't take that information with me when I leave and then hand it over to the benefit of some competitor. That would be bad, so you can see it from that level.
Speaker 1: But more often than not, these non-competes get thrown in on positions that just don't have that big of an impact on the overall organization, while having an enormous impact on the individual. Right? Like, if iHeart got rid of me, I'm sure they would miss me, but they'd be fine. I, on the other hand, would be struggling to figure out what the heck I'm going to do if I can't get another job in this field within six months. That would be really tough on me. So the FTC made this big change, and the concern that Novak raises is that the new FTC under Trump's administration is going to start rolling back the protections that the previous administration put in place to benefit employees versus businesses. The thought is that the FTC moving forward is going to be much more friendly with the folks who own and operate businesses and much less helpful to those who work for them. We'll have to see.

Speaker 1: Meanwhile, big tech companies are lining up in what looks like an attempt to get on Trump's good side. Both Meta and Amazon have contributed the equivalent of a million dollars each to Trump's inaugural fund, as in the money going to his inauguration ceremony. Dana Mattioli and Rebecca Ballhaus of The Wall Street Journal wrote that Zuckerberg appears to be trying to mend fences. You know, Trump has long held a rather acrimonious stance against Meta. So why is that? A couple of reasons. One is that he got banned from Meta's Facebook platform after the whole January sixth debacle several years back. Another is that he has long maintained that various online platforms, including the likes of Google and Facebook, hold an anti-conservative bias and actively suppress free speech. And seeing how folks like Zuckerberg and Bezos are accustomed to accumulating vast amounts of wealth, they appear to be taking steps to keep that trend going strong and to avoid inviting Trump to make things harder for them.
Speaker 1: Meanwhile, Amazon's going to carry the inauguration ceremonies on Amazon Prime Video, and I expect we're going to see a lot more moves like this from leaders at big tech companies in an effort to avoid Trump's rather mercurial wrath.

Speaker 1: Speaking of that, TikTok's clock is running mighty low here in the United States. Now, you might recall that earlier this year, Congress passed a law that would require TikTok's parent company, ByteDance, to divest itself of TikTok or else face a nationwide ban on the app. ByteDance being a Chinese company, the worry here is that TikTok could just be a siphon for information that could go straight to China. So anyway, TikTok has been fighting this law ever since, and it challenged the law in court by arguing that it goes against the First Amendment, the right of free speech. But a court found otherwise, and an appeals panel has agreed, which is not a good sign for TikTok. The agreement comes down to a few things. One, the judicial system in the United States traditionally is reluctant to contradict Congress, particularly in matters that involve national security, and that, of course, is how the TikTok ban has been framed, so that's one potential reason why the court was reluctant to overturn this. Secondly, the court found that TikTok could operate in the United States without issue if ByteDance actually divested itself of the company. So, in other words, the court has essentially said there's no freedom-of-speech issue here because TikTok has a solution to this problem: just separate from the Chinese parent company and you can continue as per usual. TikTok has appealed to the Supreme Court, but there are doubts that the highest court in the United States will hear the case, which means TikTok would just have to abide by the rulings of the lower courts.
Speaker 1: TikTok has also filed an emergency request to delay the law's effects until the Supreme Court can weigh in and until the Trump administration can say something about it, and the US government opposes this move, so we haven't seen how this is going to turn out yet. The ban is scheduled to begin on January nineteenth, one day before Trump's inauguration, so time is running out. Interestingly, during Trump's first term as president, he led the charge on banning TikTok. He's the one who brought it up. He even signed an executive order demanding TikTok separate from ByteDance, but that order was never enforced. During his campaign, however, Trump appeared to have reversed his decision because, well, he said it's because lots of people like to use the app. Which, I mean, that's true, but I don't see how that actually addresses the supposed risk to national security that he was concerned about a few years earlier. All I'm saying is that Trump changes his mind a lot, and if he had said something like "upon further reflection, I determined that there's no risk to national security," at least I think his decision to reverse his stance would make more sense. But honestly, I think the real reason he reversed his stance, at least during the campaign, is that one of his big campaign donors also happened to be someone who is heavily invested in ByteDance and TikTok. So, in order to keep those dollars coming into his campaign (this is my opinion), Trump said whatever he thought would get him that money. And now that he's gotten what he wanted, in that he got elected president, I don't know that he's so concerned about it. We'll see. I don't know if he's going to follow through. All right, we've got a lot more news to cover. Before we get to that, though, let's take a quick break to thank our sponsors.

Speaker 1: Okay, we're back, and we're switching gears away from politics, for which I am thankful. So let's talk about Google's latest push into the world of mixed reality.
Speaker 1: Kevin Purdy of Ars Technica has a piece about this. It's titled "Google Steps Into Extended Reality Again With Android XR." I had not heard the term extended reality before. I've heard mixed reality, I've heard augmented, I've heard virtual; extended is a new one for me, or if it's not new, I didn't remember it. But some of y'all might recall that years ago, Google really got things moving in the AR space, at least as far as the general tech consumer is concerned, when it launched Google Glass. Now, to be clear, Google Glass was not the first example of augmented reality, not by far. It was just an early example of a technology that was somewhat consumer facing. I say somewhat because Google Glass was really limited upon release. It was very expensive (I think it was like fifteen hundred dollars for one of those sets), and it never really evolved beyond that very limited run for the average consumer. It did become an enterprise tool for some companies, and it kind of found a place there, at least, I think, until twenty twenty-three, but it hasn't really had a larger impact. Now Google is promoting a mixed reality platform built upon Android, the operating system best known for powering mobile devices. Purdy writes that Samsung will be coming out with a mixed reality headset built on Android XR at some point next year. Currently, it's known as Project Moohan, though I'm sure it's going to have a very different name by the time it's an actual product. Google's video promoting the Android XR project is a little light on details, but it indicates that users should expect to be able to tap into features found in other existing Google products, such as Google Lens or, you know, auto-translation tools and maps and that kind of thing. I honestly don't know if there really is a market for XR headsets. You know, I know that Meta has been offering AR-light glasses for quite some time now.
Speaker 1: You can get, you know, those Meta Ray-Bans or whatever, and those are popular with some folks. I don't know, you know, how large that customer base is. And of course, Apple came out with the Vision Pro early this year, but the lack of support for that platform suggests to me that Apple, at the very least, is backing off on pushing AR headsets for a while. Maybe they won't totally end development, but I imagine it's no longer a priority, because while everyone who tried the Apple Vision Pro said they were pretty impressed by it, it just didn't make the splash needed to establish a foundation for Apple to really build upon, and the lack of ongoing interest from the general public, I think, then translated into Apple's own lack of interest in supporting it. So yeah, I don't know if Google's going to crack the nut on consumer mixed reality. Maybe. We'll see. But I mean, Google has been instrumental in releasing tons and tons and tons of different products that met with middling-to-low success, and then those elements would get engulfed into other Google efforts, and whatever it was that launched would just disappear. So I guess what I'm saying is that when these mixed reality headsets hit the market, be cautious before adopting one, because Google has a very long history of abandoning stuff not long after launch. That's not necessarily going to be the case here. I'm just saying, if it's going to be really expensive and you don't necessarily have the money to spare, maybe hold back for generation two or three, just to make sure it's going to stick around.

Speaker 1: Emanuel Maiberg of 404 Media has an article titled "YouTube Enhances Comment Section With AI-Generated Nonsense," and y'all, I just can't. I mean, YouTube's comment section is already a breeding ground for nonsense. Now, don't get me wrong, there are actually some great community members out there in various communities who post positive and supportive and helpful comments.
Speaker 1: There are others who might be a little more abrasive, but they clearly mean well in their criticism. Then there are the ones who don't mean well in their criticisms, and then the outright jokesters and trolls. The jokesters just want an opportunity to crack a punchline and work on their material, and the trolls just want to upset whomever they can. And then you've got, like, stalkers who are fostering unhealthy parasocial relationships online. And then you've got bots that are just trying to trick people into going to malicious or shady websites. Do we really need to add more AI to this mix? Well, apparently, YouTube's answer is yes. Now, in this case, the AI is actually intended to give content creators the ability to interact with, or at least to appear to interact with, their communities without as much effort. It's meant to let creators reply to comments faster and easier. But, as Maiberg writes, the results aren't always helpful. In fact, in his words, they sometimes can end up being quote "misleading, nonsensical, or weirdly intimate" end quote. Now, this tool is not fully automated. It's not like it just automatically goes and starts answering every comment left on a creator's channel. Instead, it sort of auto-suggests a reply, and if the reply is one that the content creator likes, they can just post it. So that's pretty good, right? It gives the creator the ultimate editorial control as to whether, you know, they post it or just write their own response. It's kind of like the auto-replies that are generated by a lot of different texting or messaging apps these days. I suggest reading Maiberg's piece for a full rundown on the topic, as it includes some amusing examples of how the AI suggested replies that maybe just aren't the best fit for the comments that were left behind.
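For readers who think in code, here's a minimal sketch of that suggest-then-approve pattern. To be clear, this is not YouTube's implementation (which isn't public); the `suggest_reply` function is a hypothetical stand-in for whatever model drafts the text, and the point is simply that nothing posts without the creator's explicit sign-off.

```python
# A toy suggest-then-approve reply flow. Not YouTube's code: suggest_reply()
# is a hypothetical stand-in for the model that drafts a response.

def suggest_reply(comment: str) -> str:
    """Stand-in for an AI model that drafts a reply to a viewer comment."""
    return f"Thanks for watching! Interesting point about: {comment[:40]}"

def handle_comment(comment: str) -> str | None:
    draft = suggest_reply(comment)
    print(f"Comment: {comment}\nSuggested reply: {draft}")
    # Nothing is posted automatically; the creator accepts, rewrites, or discards.
    choice = input("Post this reply? [y]es / [e]dit / [n]o: ").strip().lower()
    if choice == "y":
        return draft
    if choice == "e":
        return input("Your reply: ")
    return None  # discarded: no reply gets posted

if __name__ == "__main__":
    reply = handle_comment("Loved the segment on drones!")
    print("Posted:" if reply else "No reply posted.", reply or "")
```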
Speaker 1: Brandon Vigliarolo of The Register has a headline that definitely caught my attention this week. The headline is "American cops are using AI to draft police reports, and the ACLU isn't happy." So police officers, cops, they have to write up reports detailing their work. They have to document everything they're doing when it relates to their interactions with the public and pursuing suspects and all that stuff. They have to report everything, because all of those details matter, particularly once an issue reaches the court system, where prosecutors rely upon these reports in order to make their case. Now, once upon a time, many, many, many years ago, one of my first jobs was to transcribe similar reports. I had to take handwritten reports, and then I would have to type those reports up. This was for a security firm; it was not a police station, it was a private security force. But several members of the security team were either active cops or former cops who were moonlighting as security officers. And y'all, to say those reports were often incomprehensible is being too generous. Not all police officers are gifted with the ability to write a clear, concise report. Some of those reports might as well have been gibberish. Now, granted, I was working, again, for a security firm, not for a police station, so it could have been different because of that. But anyway, I can see why police forces might wish to experiment with a tool that makes writing reports a lot easier to do and easier to understand and reference, you know, easier to follow. Like, with some of those reports, I would type literally what had been written, but I couldn't tell you what the report was actually saying, because it wasn't written in a way that followed a clear line of thought, at least not from my perspective. But there's an understandable concern that an automated approach to generating reports could result in documentation that's treated as official but is fundamentally incorrect, misleading, or malicious, even if it's not intentionally malicious. The technology that Vigliarolo alludes to is called Draft One.
Speaker 1: It's from a company called Axon. This technology takes footage captured by body cameras worn by police officers, and it then generates reports based on that footage. Cops then have the ability to edit these reports, so they can go in and correct any mistakes that might be in the generated report, and they can elaborate on points that weren't captured by the camera or the AI tool.
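To picture the workflow being described, here's a rough sketch of the general shape of such a pipeline. None of this is Axon's actual software; every function here is a hypothetical stand-in, and the point is where the officer's review-and-edit step sits before anything becomes an official record.

```python
# A hypothetical body-cam-to-report pipeline, sketched from the description
# above. These are stand-in functions, not Axon's Draft One internals.

from dataclasses import dataclass

@dataclass
class DraftReport:
    text: str
    approved: bool = False

def transcribe(footage_path: str) -> str:
    """Stand-in for speech-to-text over the body camera's audio track."""
    return f"[transcript of {footage_path}]"

def generate_draft(transcript: str) -> DraftReport:
    """Stand-in for the model that turns a transcript into report prose."""
    return DraftReport(text=f"Incident narrative based on: {transcript}")

def officer_review(draft: DraftReport, edits: str | None = None) -> DraftReport:
    # The civil-liberties concern lives right here: if drafts get
    # rubber-stamped rather than genuinely corrected, any model error
    # becomes part of the official record.
    if edits:
        draft.text = edits
    draft.approved = True
    return draft

report = officer_review(generate_draft(transcribe("bodycam_2024-12-13.mp4")))
print(report)
```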
Speaker 1: But the ACLU has pointed out that offloading reports to AI, reports that, again, prosecutors rely upon during criminal trials, is inevitably going to lead to civil rights violations and related issues. And I think that's a pretty safe bet, that there will be these negative consequences. I mean, we've already talked multiple times on this show about how certain technologies, such as facial recognition algorithms, frequently have massive blind spots that disproportionately affect already vulnerable populations. In the case of facial recognition tools, if you're not a white male, the success rate for those tools begins to fall, and, you know, depending upon what demographic you fall into, you could end up being the victim of a false positive or a false negative. I mean, it's proven to be an unreliable technology because of the inherent bias that underlies the tech. I'm not saying the tech can't get to a point where it could reliably identify people, but as it has been trained and designed, it doesn't for certain populations. And often these are the same populations that are disproportionately criminalized. And getting into that gets into a whole sociology thing that goes so far beyond tech, so we're not going to do that. I'm just saying tools like this can add to those problems and make them worse. So the ACLU is urging that police not use this tool, and in fact that municipalities ban the use of AI-generated police reports, because of the potential harm they can have on innocent citizens, or just citizens in general. Like, they can violate your rights, and that's something that is not supposed to happen and yet does, a lot. So let's not make the problem worse; that's what the ACLU is saying. As Vigliarolo points out, the company behind this, Axon (I mentioned them once before), already has a pretty rough reputation. A couple of years ago, Axon proposed the brilliant idea of attaching tasers to remote-controlled drones for the purposes of law enforcement. So you would have, like, taser-armed flying drones that some police officer using a remote control could operate and then zap people with. Not very long after the company made that announcement, their entire, or nearly their entire, ethics board resigned in protest. Like, when you've got an ethics board and they're like, "oh, for heaven's sake, this is so clearly not ethical, what are we even doing here?" and then they just leave, and don't even bother to shut the door on the way out, that is sending a pretty strong message. Now, in that case, Axon did back away from that particular project, so we don't have those taser-armed drones flying around, for which I am thankful. I highly recommend the article in The Register if you want to learn more about this. Again, that's by Brandon Vigliarolo, and the article title is "American cops are using AI to draft police reports, and the ACLU isn't happy." All right, let's take another quick break. When we come back, I've got a couple more news items I want to talk about. We'll be right back.

Speaker 1: Okay, next up, we have a story from Avram Piltch of Tom's Hardware. Tom's Hardware is a great resource, by the way; I don't know if you've ever used it, but I highly recommend it. I mean, I used to use Tom's Hardware just to literally get my hands on certain electronic pieces and that kind of stuff.
Speaker 1: But there's a ton of great articles on there, too. Anyway, Piltch's article on Tom's Hardware is titled "Microsoft Recall screenshots credit cards and Social Security numbers, even with the 'sensitive information' filter enabled." Okay, that's a little hard to parse the way I said it. Essentially, what Piltch is saying is that Microsoft Recall, which is a feature in certain new Windows eleven laptops, the Copilot Plus PCs, has some blind spots, some issues that probably need to be addressed. If you don't recall Microsoft Recall (I love how I change my pronunciation just depending on how I feel), anyway, it's a feature in recent PCs in which the computer takes screenshots of whatever it is you're doing on the device, and it just keeps a record of these screenshots. Then you can actually search those screenshots to find information. And you might say, well, what good does that do? Well, let's say you are doing something that you very rarely have to do, like it's a task you have to do once in a blue moon, so every time you do it, you have to relearn how to do it, because you're not doing it every day. Well, this could help you quickly go back and retrace your steps and say, ah, here, that's what I need to do in order to complete this task. I just, you know, search my screenshots and I find out what I need to do, and I'm good to go. Microsoft had introduced this, or announced this, earlier this year, but that initial announcement led a lot of people to voice concerns about the feature, specifically concerns about security and privacy. People didn't like the idea of their activities on the computer becoming a record, a searchable record, for lots of reasons. I think you can probably imagine there are a lot of legitimate reasons why you wouldn't want to have a searchable record of your activities on your computer.
Speaker 1: For example, let's say that you were researching something like divorce, like maybe you're in an unhappy relationship; you wouldn't necessarily want that to be a searchable record on your computer. Or maybe you're researching something about a health issue, and that's not anyone else's business, so you don't want that to be searchable. There are a lot of legitimate and innocent reasons why you wouldn't want to have a full record of your activities just sitting there on your computer. But let's say that you did come around to saying, okay, well, this is more helpful than harmful. Well, Microsoft assured people that the Recall feature exists solely at the device level. In other words, it's not sending information over the cloud to Microsoft or to other partners; it all lives on the machine. So that way, as long as the machine is in your possession, you're good. But that's still kind of a concern, because if someone were to get access to your machine, then they would be able to see a full history of your activities, and that might include stuff that gives them information about logins and all that other kind of information. So then Microsoft said, okay, we're going to work on this. They pushed back the debut of Recall, and then they incorporated a couple of new features. One is that the screenshots are all encrypted, which means that if you don't have administrative or user access to the computer, like you don't have the password to get into someone's machine, you can't do anything with those files unless you first decrypt them, and presumably that would take a very long time to do and wouldn't be worth your effort. But as Piltch notes, sometimes Recall will capture sensitive information that it should not capture in a screenshot. And although the screenshot file itself is encrypted, if you were to look at the screenshot, you would see information that should be protected, sitting right there in plain view. That's a problem.
Speaker 1: So, for example, credit card information. By default, screenshots are not supposed to be taken of that kind of info, and if you're using, like, a shopping site, Recall's pretty good about not doing that. But Piltch found other cases where it slips. Let's say you've got a PDF file, you open it up, and you're filling out the fields on this PDF (they've made it, you know, editable), so you might have a file that has government information on it, like if you were filing for a passport or something. He found that in those cases, Recall would take screenshots, and the sensitive information would be clearly viewable in the screenshots. And that's an issue. Again, the screenshot itself is encrypted and it lives just on the device, but it's still a concern; there's this record there that doesn't necessarily need to be there. Microsoft says that it's continually working on its features, so maybe this will be something that gets patched out later on, but I think it's good to be alert and aware about these sorts of things.
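As a hint at why this kind of filtering is hard, here's a toy sketch of the sort of pattern matching a sensitive-information filter might run over the text on screen. This is my own illustration, not Microsoft's filter: real screens present card numbers and IDs in endless layouts and formats, which is exactly how things like editable PDF forms can slip through.

```python
# A toy sensitive-data detector, NOT Microsoft's actual Recall filter.
# It flags likely credit card numbers and US Social Security numbers in
# on-screen text; the brittleness of these patterns hints at why real
# filters miss cases like PDF form fields.

import re

CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def luhn_ok(digits: str) -> bool:
    """Luhn checksum, the standard validity check for card numbers."""
    total, double = 0, False
    for ch in reversed(digits):
        d = int(ch)
        if double:
            d *= 2
            if d > 9:
                d -= 9
        total += d
        double = not double
    return total % 10 == 0

def looks_sensitive(text: str) -> bool:
    """True if the text appears to contain a card number or an SSN."""
    for match in CARD_RE.finditer(text):
        digits = re.sub(r"\D", "", match.group())
        if 13 <= len(digits) <= 16 and luhn_ok(digits):
            return True  # probable card number: skip this screenshot
    return bool(SSN_RE.search(text))

print(looks_sensitive("Card: 4111 1111 1111 1111"))  # True (a test number)
print(looks_sensitive("Invoice total: 12345"))       # False
```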
534 00:32:21,400 --> 00:32:24,640 Speaker 1: But that just means that the implementation of the technology 535 00:32:24,720 --> 00:32:28,760 Speaker 1: should guide your choice as to which technology, which wireless 536 00:32:28,760 --> 00:32:34,000 Speaker 1: technology you incorporate into your invention. So if whatever application 537 00:32:34,080 --> 00:32:37,160 Speaker 1: you have in mind just requires small amounts of data 538 00:32:37,200 --> 00:32:40,400 Speaker 1: transfers like it's just little packets of data. Bluetooth would 539 00:32:40,400 --> 00:32:43,280 Speaker 1: be a great solution for that. It's it does that 540 00:32:43,440 --> 00:32:45,720 Speaker 1: job great, But if you need something that has a 541 00:32:45,760 --> 00:32:51,480 Speaker 1: bit more oomph in the data transmission context, well then 542 00:32:51,800 --> 00:32:55,480 Speaker 1: you would probably want to use this band of frequencies. 543 00:32:56,000 --> 00:32:59,520 Speaker 1: One other issue with this is that you could run 544 00:32:59,520 --> 00:33:03,200 Speaker 1: into interfer appearence. That's why the FCC has only allowed 545 00:33:03,200 --> 00:33:07,040 Speaker 1: for very low power applications using this band because if 546 00:33:07,040 --> 00:33:10,000 Speaker 1: it's very low power, the range of transmission is not 547 00:33:10,040 --> 00:33:13,000 Speaker 1: going to be very far, so the chance for interference 548 00:33:13,160 --> 00:33:17,480 Speaker 1: is lower. If you have higher power outputs, then you're 549 00:33:17,480 --> 00:33:20,920 Speaker 1: going to have more opportunities for interference, which means more 550 00:33:21,280 --> 00:33:24,400 Speaker 1: error rates and things of that nature. So if you 551 00:33:24,560 --> 00:33:28,160 Speaker 1: just restrict how much power the device is able to use, 552 00:33:28,200 --> 00:33:31,320 Speaker 1: you really sidestep that problem just because they don't transmit 553 00:33:31,400 --> 00:33:35,760 Speaker 1: far enough for interference to typically be an issue. Okay, 554 00:33:35,800 --> 00:33:39,960 Speaker 1: our last story before we get to recommended reading is 555 00:33:40,000 --> 00:33:43,200 Speaker 1: about the game awards that happened this week. Now I'm 556 00:33:43,200 --> 00:33:44,760 Speaker 1: going to cut right to the chase. The Game of 557 00:33:44,800 --> 00:33:49,360 Speaker 1: the Year went to Astrobot. I have not played Astrobot, 558 00:33:49,400 --> 00:33:52,760 Speaker 1: but everything I've heard suggests that it is worthy of 559 00:33:52,800 --> 00:33:54,960 Speaker 1: the title of Game of the Year, even though I'm 560 00:33:55,000 --> 00:33:57,760 Speaker 1: a little salty that like a dragon Infinite Wealth didn't 561 00:33:57,800 --> 00:33:59,680 Speaker 1: make it to the nominee list for a Game of 562 00:33:59,680 --> 00:34:02,280 Speaker 1: the Year. But then I play maybe three or four 563 00:34:02,320 --> 00:34:06,120 Speaker 1: games in a twelve month period, so I don't have 564 00:34:06,200 --> 00:34:08,439 Speaker 1: the experience to make a good call on these kinds 565 00:34:08,440 --> 00:34:11,319 Speaker 1: of things, Like I am certainly not an expert because 566 00:34:11,320 --> 00:34:14,520 Speaker 1: I don't play enough. 
But anyway, the Game Awards are 567 00:34:14,560 --> 00:34:17,480 Speaker 1: really known for two things, you know, giving out trophies 568 00:34:17,480 --> 00:34:21,040 Speaker 1: to games and creators and also showing a buttload of 569 00:34:21,120 --> 00:34:24,920 Speaker 1: trailers for upcoming video game titles. So some of the 570 00:34:25,000 --> 00:34:29,560 Speaker 1: stuff shown off during the awards included a teaser for 571 00:34:29,640 --> 00:34:33,200 Speaker 1: a game from Naughty Dog. They make the Uncharted games. 572 00:34:33,480 --> 00:34:38,040 Speaker 1: It's titled Intergalactic: The Heretic Prophet. They showed a trailer 573 00:34:38,080 --> 00:34:41,799 Speaker 1: for an Elden Ring multiplayer spinoff game called Nightreign. 574 00:34:42,400 --> 00:34:45,160 Speaker 1: They showed off a trailer for a new entry in 575 00:34:45,239 --> 00:34:48,279 Speaker 1: the venerable Witcher series of games, so this would be 576 00:34:48,360 --> 00:34:50,960 Speaker 1: The Witcher four. They gave a first look at the 577 00:34:51,040 --> 00:34:56,560 Speaker 1: upcoming Borderlands four game. They announced a new Turok game, 578 00:34:56,640 --> 00:35:00,560 Speaker 1: of all things, which was a shock, and lots more 579 00:35:00,600 --> 00:35:03,640 Speaker 1: stuff too. So if you are a gamer and somehow 580 00:35:03,640 --> 00:35:06,000 Speaker 1: you missed out on all the hype and all the announcements, 581 00:35:06,200 --> 00:35:08,400 Speaker 1: I recommend you take some time today to watch a 582 00:35:08,480 --> 00:35:12,360 Speaker 1: truckload of trailers and catch up and get excited for 583 00:35:12,400 --> 00:35:14,640 Speaker 1: what's coming next year. I have to say, like, the 584 00:35:14,680 --> 00:35:18,319 Speaker 1: last couple of years have been pretty darn phenomenal on 585 00:35:18,360 --> 00:35:20,719 Speaker 1: the video game front. There have been some really incredible 586 00:35:20,760 --> 00:35:22,880 Speaker 1: titles to come out, and while I've only had a 587 00:35:22,960 --> 00:35:25,400 Speaker 1: chance to play a few of them, I'm just amazed 588 00:35:25,760 --> 00:35:28,360 Speaker 1: at the creativity and the innovation that's going on in 589 00:35:28,400 --> 00:35:31,799 Speaker 1: the video game space, mostly in the independent space. Like, 590 00:35:32,080 --> 00:35:35,160 Speaker 1: the triple-A titles and stuff, those are always, you know, 591 00:35:35,320 --> 00:35:38,160 Speaker 1: impressive and everything, and I'd be lying if I said 592 00:35:38,160 --> 00:35:41,479 Speaker 1: I wasn't looking forward to Grand Theft Auto six. I am. 593 00:35:41,920 --> 00:35:46,480 Speaker 1: But honestly, I think some of the most exciting developments 594 00:35:46,520 --> 00:35:50,600 Speaker 1: in the video game space have been in the independent area. 595 00:35:51,000 --> 00:35:54,759 Speaker 1: There've been some great games. Now, independent itself is a 596 00:35:54,760 --> 00:35:58,520 Speaker 1: broad spectrum. Some independent studios are teeny tiny and consist 597 00:35:58,600 --> 00:36:01,400 Speaker 1: of maybe one or two people, and some independent studios 598 00:36:01,560 --> 00:36:05,400 Speaker 1: are actually quite large and have support from much bigger 599 00:36:05,480 --> 00:36:09,160 Speaker 1: companies somewhere along the line. So you know, that's a 600 00:36:09,239 --> 00:36:12,680 Speaker 1: huge range in itself.
Okay, I do have some recommended 601 00:36:12,719 --> 00:36:16,040 Speaker 1: reading for y'all, and this stuff is pretty deep and 602 00:36:16,080 --> 00:36:19,120 Speaker 1: some of it's super heavy, so I didn't tackle it 603 00:36:19,120 --> 00:36:23,640 Speaker 1: in today's episode, because typically the entries here are fairly short, 604 00:36:23,680 --> 00:36:26,440 Speaker 1: and I feel like these are stories that require a 605 00:36:26,480 --> 00:36:30,839 Speaker 1: lot more reading and consideration, and you can't easily summarize them. 606 00:36:30,880 --> 00:36:34,080 Speaker 1: So up first is Ian Sample's piece in The Guardian 607 00:36:34,280 --> 00:36:39,400 Speaker 1: titled Unprecedented risk to life on Earth: scientists call for 608 00:36:39,560 --> 00:36:43,719 Speaker 1: halt on mirror life microbe research, you know, in case 609 00:36:43,760 --> 00:36:46,520 Speaker 1: those drones over New Jersey aren't filling you with enough 610 00:36:46,680 --> 00:36:50,600 Speaker 1: existential dread. It's a very important piece, just scary. Then 611 00:36:50,680 --> 00:36:54,560 Speaker 1: there's Dan Goodin's piece in Ars Technica titled Russia takes 612 00:36:54,680 --> 00:36:59,200 Speaker 1: unusual route to hack Starlink-connected devices in Ukraine, and 613 00:36:59,280 --> 00:37:02,520 Speaker 1: it details the rather circuitous route that Russia has created 614 00:37:02,600 --> 00:37:05,799 Speaker 1: in an effort to spy on Ukrainian forces in that war. 615 00:37:06,200 --> 00:37:10,560 Speaker 1: And finally there's Ashley Belanger's piece, also in Ars Technica, 616 00:37:10,640 --> 00:37:15,319 Speaker 1: that's titled Character dot AI steps up teen safety after 617 00:37:15,400 --> 00:37:20,520 Speaker 1: bots allegedly caused suicide, self-harm. So yeah, that last 618 00:37:20,600 --> 00:37:24,520 Speaker 1: article in particular is very hard to read. It does 619 00:37:24,600 --> 00:37:28,200 Speaker 1: deal with kids and mental health issues and suicide, but 620 00:37:28,280 --> 00:37:32,160 Speaker 1: I think it's important to stay informed about how interactions 621 00:37:32,160 --> 00:37:36,400 Speaker 1: with AI can have very real and sometimes extremely tragic 622 00:37:36,600 --> 00:37:40,440 Speaker 1: consequences in our world, and to learn more about what 623 00:37:40,520 --> 00:37:44,480 Speaker 1: companies are doing or not doing to address those issues. 624 00:37:44,800 --> 00:37:48,319 Speaker 1: That's it for this episode. It was a long news episode, 625 00:37:48,360 --> 00:37:50,080 Speaker 1: but you know, we're wrapping up the year. There's a 626 00:37:50,120 --> 00:37:52,480 Speaker 1: lot to talk about, and besides which, I won't be 627 00:37:52,520 --> 00:37:55,440 Speaker 1: doing this much longer, so it's good to get it 628 00:37:55,440 --> 00:37:58,160 Speaker 1: all out now. I hope all of you out there 629 00:37:58,360 --> 00:38:02,279 Speaker 1: are doing really well. I'll talk to you again really soon. 630 00:38:08,920 --> 00:38:13,560 Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, 631 00:38:13,880 --> 00:38:17,600 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you listen 632 00:38:17,640 --> 00:38:18,680 Speaker 1: to your favorite shows.