Welcome to Tech Stuff, a production from iHeartRadio. Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio. And how the tech are you? It's time for the tech news for Tuesday, July eighteenth, twenty twenty three, and let's get to it.

CNBC reports that Emad Mostaque, who is the CEO of Stability AI, has predicted that artificial intelligence will eliminate coder jobs that previously had been outsourced to India. That AI is going to require a smaller human staff, and artificial intelligence will do a lot of work at a pace that's far faster than humans typically can match, and thus will be more economically viable for companies. Moreover, Mostaque's point relates to the level of protection workers can expect based on where they live. In India, such protections are scarce, so it isn't a stretch to imagine a future where tech companies downsize to cut costs related to staffing and start to lean heavily on AI.
But Mostaque says other places, such as France, will make efforts to protect human employees, and so tech companies in France won't have the same options as those in India. Mostaque also clarified his prediction. He said that coders will be an obsolete job in a few years, but that programmers will not be. He says the programmers of the future will rely heavily on AI to actually write the code, but things like ideation or testing for bugs, that kind of stuff is still going to need human counterparts. So while certain aspects of programming could be eliminated, AI wouldn't be able to take on all the roles that a programmer traditionally performs. Still, I imagine we're going to see a lot more disruption across multiple industries as companies experiment with AI. I imagine more than a few companies will give in to the temptation to eliminate a large number of jobs in favor of artificial intelligence. And once again I sigh, because this kind of strategy is ultimately self-defeating in the long run, in my opinion.
By that, I mean, if AI ends up displacing lots of people, and those people now find it difficult to land a job that pays a living wage, then you begin to drain the consumer base. And once the consumer base is gone, well, now there's no one left to buy products and services. And once that happens, the businesses that were so good at cutting costs and maximizing profits while revenue perhaps remained steady, they're gonna see revenue drop, because no one will be able to buy their stuff, because no one has a job anymore, because the robots took them all. And before you know it, the robots are just making stuff for other robots, and the rest of us are going, what the heck do we do now? Meanwhile, advocates for universal basic income will go bonkers. What a world.

Reuters reports that Microsoft is working to extend its acquisition agreement with Activision Blizzard. So the original agreement between these two companies set today as the deadline for completing the deal. But we have seen opposition from regulatory authorities that have introduced lots of hiccups along the way.
And last week a court dismissed the US Federal Trade Commission's request for an injunction to at least temporarily block this deal. That request was denied, and that removed one impediment. But there's still a roadblock in place, and that's the UK's Competition and Markets Authority, or CMA, which previously voted to block this acquisition. So that block is still in effect, and while Microsoft and Activision Blizzard attempt to persuade UK regulators that the deal will not decrease competition in the video game space, and the cloud gaming space in particular, the deadline for the deal is still in effect. And so, according to Reuters, Microsoft is trying to work out a deal with Activision Blizzard to extend this contract to give more time to clear those regulatory hurdles in the UK. So what are the possible outcomes here? Well, the one I think is most likely is that Activision Blizzard will agree to an extension. Maybe the company will negotiate some additional terms in the process, but I think that's what's going to happen. Another possibility is that Activision Blizzard walks away and just remains its own company.
A third possibility is that Activision Blizzard, sensing that it has increased importance and value in the market, looks for a different suitor, one that will spend even more money to acquire it than Microsoft would. I think that last possibility is the least likely, but I'm also no expert on corporate maneuvers, so it's entirely possible I'm missing something obvious. Anyway, the agreement to extend the contract will need to happen, or not, by the end of today.

A new law in Norway will take aim at Meta's main revenue strategy when it goes active in August, and that is personalized advertising. Meta famously leverages the gargantuan amount of information it collects from users as they interact on platforms like Instagram and Facebook and beyond. To learn more about that, read up on Meta Pixel. That'll tell you about how Meta has a tool that can be embedded in someone else's web page that delivers information to them, but also to Meta itself. Anyway, all this information is gold to advertisers.
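To make that mechanism concrete, here's a minimal sketch in Python of how a tracking pixel works in general. This is not Meta's actual implementation; the endpoint name, parameter names, and hosts below are all hypothetical. The core idea is just that a page embeds a tiny third-party image, so every load of that page sends the third party a request carrying the page URL, cookies, and any extra parameters.

```python
# Illustrative sketch of a generic tracking pixel -- NOT Meta's actual code.
# A site embeds a 1x1 image served by the tracker; each page load then sends
# the tracker a request whose headers reveal where and who the visitor is.

import base64
from urllib.parse import urlencode

# A valid 1x1 transparent GIF: the classic "tracking pixel" payload.
PIXEL_GIF = base64.b64decode(
    "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7"
)

def pixel_tag(tracker_host, event, **params):
    """Return the <img> snippet a site would embed (hypothetical endpoint)."""
    query = urlencode({"ev": event, **params})
    return f'<img src="https://{tracker_host}/px.gif?{query}" width="1" height="1">'

def log_hit(query_params, headers):
    """What the tracker's server learns from a single pixel request."""
    return {
        "event": query_params.get("ev"),
        "page": headers.get("Referer"),      # which page the user was on
        "user": headers.get("Cookie"),       # ties the hit to a known browser
        "agent": headers.get("User-Agent"),  # device/browser fingerprint
    }
```

So a pixel embedded on, say, a store's checkout page effectively tells the tracker which browser just bought something, even though the page belongs to a different company entirely.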
They want to deliver just the right message to just the right audience to maximize effectiveness, and so for years now, Meta has been in the business of selling premier ad services to companies with the implication that the advertising campaigns are going to be far more effective than if you just put out messaging in front of a randomized crowd of people. If you want to sell backpacks and tents, well, Meta knows which people are keen to go outdoors and rough it. Do you want to sell baby accessories and supplies? Well, Meta knows which users are new parents. But Norway has now passed a law making it illegal for Meta to practice targeted advertising, and should it utilize that strategy in Norway, it could face a daily fine of one hundred thousand dollars. So one hundred grand every day unless Meta changes its approach, at least its approach in Norway. Now, some people are already estimating that while one hundred thousand dollars is really nothing to Meta, a multi-billion-dollar company, the amount that it makes while operating in Norway is likely less than one hundred thousand dollars a day.
So that means that to operate in Norway, Meta would be doing so at a loss. And what's more, Norway is urging other countries to adopt similar legislation. Now, this really kind of dates back to twenty twenty two, when the Irish Data Protection Commission, which is a major regulator in the European Union, decided that Meta's approach to quote unquote behavioral advertising breaks EU law. Meta subsequently made some adjustments to the way it does business, but according to various people, including the Irish regulators, those changes failed to fix the problem and did not bring Meta into compliance with the law. So Norwegian legislators, seeing a need for action, created this approach. I should also add Norway is not part of the European Union. They are part of the European Single Market, but they're not a member of the EU. It will be interesting to see how Meta responds to this and if other countries in Europe follow Norway's lead.

Tesla's board of directors has decided to settle a lawsuit brought against it by a retirement fund that represents police and firefighters.
The group accused Tesla of paying themselves too much in Tesla stock, and by Tesla, I mean specifically its board of directors. So as a consequence, the directors are collectively returning seven hundred and thirty five million dollars' worth of stock options. The lawsuit had argued that these directors had voted to award themselves far more stock options than a board typically would receive. And on that board are, well, Elon Musk is on it, his brother Kimbal is on it. James Murdoch, the son of Rupert Murdoch, the media mogul, he's on it. Joe Gebbia, one of the co-founders of Airbnb, he's on it. And finally, JB Straubel, who was previously Tesla's chief technology officer, is on the board. So Tesla reps said that the idea was to align the board of directors with the needs and successes of the company. Right, if your compensation is a stock option, then your compensation increases in value if the company's stock price increases. So that was the logic, they said. Also, I should mention that Elon Musk is facing a separate legal challenge relating to his own compensation package at Tesla.
That's a package that awards Musk fifty-six billion, with a B, dollars, and I can't even begin to comprehend that amount of money. No wonder Musk doesn't seem too fussed with the continuing descent of Twitter's value, when he's got Tesla to fall back on.

But Tesla also faces other challenges. This week, the US National Highway Traffic Safety Administration, or NHTSA, announced it was launching a new investigation into an old crash, a crash that happened in twenty eighteen that involved a Tesla Model 3 that may have been in advanced driver-assist mode at the time of the accident. The NHTSA typically will engage in around one hundred crash investigations in a year for cases where emerging technology had played a part in the crash. While this accident happened five years ago, there has sadly been no shortage of accidents involving Tesla vehicles in some form of self-driving mode. Now, according to the Washington Post, since twenty nineteen there have been seven hundred and thirty six accidents, with seventeen fatalities, that involved a Tesla either in Autopilot or Full Self-Driving mode.
The NHTSA numbers are actually lower than that. They identify two hundred and seventy three crashes, and they launched investigations into more than forty of those. Fourteen of those crashes resulted in fatalities. Okay, we're going to take a quick break. When we come back, we've got some more tech news to talk about.

We're back. Reddit is still feeling the consequences of the platform's change in its API policy. So, quick reminder, Reddit put in a new fee structure for developers who use the platform's API to develop third-party apps, so developers have to pay Reddit based upon the number of times their apps reference Reddit, which can include stuff like pulling information or posting information to Reddit. The decision pushed a lot of developers to shut down their apps. They explained they just couldn't afford to pay those fees because of the number of times their apps would reference Reddit and the popularity of those apps. So a ton of Reddit users got really mad, because they depended on those apps to access Reddit. The platform's own mobile app doesn't have the best reputation among a lot of redditors anyway.
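To give a sense of why developers said the fees were unaffordable, here's a back-of-the-envelope sketch in Python. The per-call rate was reportedly around twenty-four cents per thousand API calls; the usage figures for the hypothetical app below are made up purely to show the scale.

```python
# Rough cost sketch for a third-party Reddit app under per-call API pricing.
# The rate is the reported figure; the app's usage numbers are hypothetical.

RATE_PER_CALL = 0.24 / 1000  # reportedly about $0.24 per 1,000 API calls

def monthly_api_bill(daily_users, calls_per_user_per_day, days=30):
    """Estimated monthly API fee for an app with the given usage."""
    calls = daily_users * calls_per_user_per_day * days
    return calls * RATE_PER_CALL

# A hypothetical mid-sized client: 500,000 daily users, each triggering
# 200 API calls a day (every feed refresh, vote, and comment is a call).
bill = monthly_api_bill(500_000, 200)
print(f"${bill:,.0f} per month")  # prints: $720,000 per month
```

Even with much smaller made-up numbers, the bill lands in territory that a free or one-time-purchase app has no way to cover, which is the math the shuttered apps pointed to.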
The latest brouhaha on the site is that Reddit has changed its chat system, and as a consequence of that, users can now only look at their chat histories from January first, twenty twenty three, up to the present day. Anything that was older, anything that was posted before January first of this year, is gonzo. Understandably, this has upset a lot of people for various reasons. You know, some folks were holding on to older messages as a touchstone of a friend or loved one that they were no longer in touch with for whatever reason. Some were keeping information for the purposes of future work, and now all of that information is unavailable when you access your chat history. Now, this change didn't happen out of the blue. Reddit actually communicated it late last month in a changelog, but that didn't get much attention or coverage at the time, and Reddit didn't exactly take steps to reach out to users and let them know it was coming.
So in a way, this is a bit like that Hitchhiker's Guide to the Galaxy segment where Arthur Dent explains that the plans for demolishing his house were technically on quote unquote public display, but they were stored in a disused lavatory that had a sign on it reading Beware of the Leopard. Anyway, the good news is that, according to AndroidPolice dot com, redditors can get their chat history by sending a data request to Reddit for a download of all of their messages, which should include the ones that are no longer accessible in chat. So there is a way to still get that information, but you do have to reach out and submit a data request to Reddit in order to get it.

The New York Police Department announced it will test out a new emergency announcement system courtesy of drones. So if you had flying robots warning of imminent danger on your dystopian bingo card, congratulations, you can mark that square off. The plan, the police say, is to deploy the drones to high-risk areas that emergency situations like a severe weather event could impact.
So recently, New York has had problems with flooding as heavy rains have moved through the city, and more flooding is expected this weekend, so that prompted the NYPD to announce the test of the system. While alerting locals to potential threats is important, there's another concern: that these drones could pull double duty and serve as surveillance devices, and that's not nearly as cool. And in fact, there are laws meant to protect citizens in New York from such surveillance, and that includes the requirement that the NYPD issue an impact statement on its website and open up the policy to public comment, and that has to happen at least ninety days before the NYPD actually deploys the technology. Now, there are questions as to whether this particular use of drones would trigger that policy. It's possible that the NYPD doesn't need to go through those steps, because if this is just a case of the organization using existing tech in a new way, as opposed to rolling out an entirely new system, it doesn't have to follow the same requirements anyway.
Critics say there are plenty of other ways that law enforcement can reach out to citizens without throwing flying robots into the mix, and that the use of such tech seems like it's trying to fix a problem with the wrong kind of tool.

Now, in the department of driving Jonathan insane, we have our penultimate story, about Chuck Schumer, the leader of the political majority in the US Senate, and Senator Mike Rounds, who have jointly proposed an amendment to the National Defense Authorization Act. This amendment would require the executive branch to share records relating to unidentified anomalous phenomena, or UAPs, with Congress. So essentially, the way the story is being reported, at least in some circles, is that Chucky boy here wants to know about the aliens, or at least stuff made by non-human intelligence, which could include aliens, but I guess it could also include stuff like, I don't know, robots or alternate-dimension Nikola Teslas or something.
Anyway, the amendment argues that the executive branch is quote hiding that information from both Congress and the public at large end quote, and Schumer says that the public has quote a right to learn about technologies of unknown origins, non-human intelligence, and unexplainable phenomena end quote. Now, y'all, you don't know this because I edited it out, but I followed that last sentence with a really, really long sigh, because I think this is just gonna fuel countless fringe theories about alien civilizations and the like, when the truth of the matter is that UAP is really just a catch-all category to describe situations in which someone saw something and they were unable to identify what it was, and that's pretty much it. So that something could be incredibly mundane.
And I'm not just talking about, like, the old weather balloons and swamp gas stuff that you hear, but even things like, I don't know, the reflection of the moon on the water. So a UAP doesn't even necessarily imply any sort of vehicle or technology. And in the cases where we are talking about technology, that can include, you know, prototypes of things like spy planes or military drones, and that's the sort of stuff that countries don't like to publicize, because, you know, they plan on using that on other countries, so they tend to keep quiet about it. But that doesn't mean that other people don't occasionally see that stuff, and of course they can't identify it, because up to that point it had been a secret vehicle. So it's really not necessary that any of this stuff was of non-human origin. I also think if we're talking non-terrestrial, like if we're saying alien, I still think that that is incredibly unlikely, almost to the point of being impossible, because you have to keep in mind the vast distances in space and how long it takes even light to travel those vast distances.
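That point about distance can be put in rough numbers. Here's a quick Python sketch using the inverse-square law; the transmitter power and distance are illustrative choices of mine, not figures from the episode.

```python
# Sketch of why old broadcasts fade to nothing: signal strength falls with
# the square of distance. Transmitter power and distance are illustrative.

import math

LIGHT_YEAR_M = 9.461e15  # metres in one light-year

def flux_w_per_m2(tx_power_watts, distance_ly):
    """Power per square metre from an isotropic transmitter (inverse-square law)."""
    d = distance_ly * LIGHT_YEAR_M
    return tx_power_watts / (4 * math.pi * d**2)

# A powerful 1-megawatt broadcast, received 100 light-years away:
flux = flux_w_per_m2(1e6, 100)
print(f"{flux:.1e} W/m^2")  # on the order of 1e-31 W/m^2
```

That works out to roughly ten to the minus thirty-one watts per square metre, which is far fainter than anything a receiver could plausibly pick out of the background, which is the host's point about nobody noticing we're here.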
That's a lot. And we as a species have only been able to make radio waves for a little more than a century, and most of those radio waves wouldn't have reached very far out into space before they got to a point where they weren't even really detectable. It's just not realistic that anyone would have even noticed we were here, let alone made the trip out here to see what was up. Maybe it would involve technologies that we just can't even imagine yet, that would break the laws of physics as we understand them, but that would probably require so much energy as to raise the question of why you would go to that kind of effort to see the chuckleheads who occupy this planet. It just doesn't add up to me. Like, it doesn't make sense as a possibility. So I think this is really just gonna create more suspicion and conspiracy theories and fringe theories and that kind of stuff, and it's not gonna really amount to anything substantial. But I guess we'll see.

Okay, the last bit I have for you is an article recommendation.
So Motherboard over at Vice has posted an article titled Amazon Told Drivers Not to Worry About In-Van Surveillance Cameras. Now Footage Is Leaking Online. It was written by Jules Roscoe. It's a great article that reinforces something I mentioned in yesterday's Tech Stuff episode: that even if the technology itself is benign, and I would argue surveillance equipment does not fall into that category, there are still people who will behave as people, and they will do the wrong thing on occasion. So in this case, it involves watching, and at least in some cases leaking or sharing, in-vehicle surveillance footage, which Amazon claimed was only meant to monitor drivers for the purposes of safety. So there you go. That's your article recommendation. And as always, I have no connection with Motherboard or Vice or Jules Roscoe. I just thought it was a good article and it's worth your attention.

That's it for the news for Tuesday, July eighteenth, twenty twenty three. I hope you are all well, and I'll talk to you again really soon. Tech Stuff is an iHeartRadio production.
For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.