Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland, an executive producer with iHeartRadio, and how the tech are ya? It's time for the tech news for Thursday, January twenty twenty-two. Let's get to it.

Speaker 1: Our first story is that we appear to be drawing closer to the conclusion of a very long legal battle between Apple and some of its employees, more specifically employees who worked at Apple stores in California between July two thousand nine and August two thousand fifteen. So what's at the heart of all this? Well, Apple had this policy in place that every single employee was to submit their bags and personal iPhones for a search before they could leave the store. The company said this was an effort to cut back on employees stealing products from the store or potentially making off with trade secrets. A group of employees sued the company, and it became a class action suit with more than fourteen thousand employees involved in it. But the core of the lawsuit wasn't so much about the fact that the employees had to submit to a search.
Speaker 1: It was more about Apple refusing to count that time as time worked by the employee. In other words, Apple was treating these searches as if they were off the clock, in the free time of the employees, which the employees were arguing was a violation of California state law. So the legal representation for the employees argued, successfully, that the searches occurred on company time and that employees therefore should have been compensated for that time. Now, you might wonder how much time did these searches take. Well, Engadget reports that the average wait was between five and twenty minutes. And that doesn't sound like very much time, right? I mean, it's just five to twenty minutes. It's not a whole lot of time. But if you've ever driven in California, you know that that state's rush hour ain't no joke. And every minute you spend being forced to stay at work when you're not on the clock? Not fun. And I mean, that's the point, right? It's still your time, and you would not be spending it sitting at work.
Speaker 1: You would only do that if you were on the clock. So employees should still have been compensated for their time, even if that time was only a short amount each day. Plus, over, you know, the length of your term of employment, that five to twenty minutes really adds up. So Apple has agreed to pay out a settlement to claimants. The full settlement is a hair under thirty million dollars. But again, we're talking about more than fourteen thousand, five hundred employees who are covered in this class action suit. So each person should expect to receive a payout of around one thousand, two hundred eighty-six bucks. That's significantly more than what I typically see when I look at class action suits. You often see ones that have hundreds of thousands of claimants, and so everyone ends up getting like a fraction of a dollar. And typically the lawyers make out like bandits, the individual claimants get very little, and the company gets, you know, a fine that they must pay. But anyway, this whole agreement is not quite done and dusted just yet.
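To put rough numbers on how that wait time and payout math shakes out, here's a back-of-the-envelope sketch. The ten-minute daily wait, work schedule, and the note about fees are my own illustrative assumptions, not figures from the court filings:

```python
# Back-of-the-envelope math for the Apple bag-search settlement.
# Assumptions (illustrative, not from the case): a 10-minute average
# daily wait, a 5-day work week, 50 work weeks per year.

def uncompensated_hours(minutes_per_day, days_per_week=5,
                        weeks_per_year=50, years=1):
    """Total unpaid hours that accumulate from a small daily wait."""
    return minutes_per_day * days_per_week * weeks_per_year * years / 60

def payout_per_claimant(settlement_total, num_claimants):
    """Naive even split of the settlement across all claimants."""
    return settlement_total / num_claimants

# A 10-minute daily wait over one year is already ~41.7 hours unpaid.
hours_lost = uncompensated_hours(10)

# ~$30M split across ~14,500 claimants is roughly $2,069 before fees;
# the reported ~$1,286 figure suggests attorneys' fees and costs
# come off the top first.
gross_share = payout_per_claimant(30_000_000, 14_500)
```

The gap between the naive even split and the reported per-person figure is the usual class-action overhead.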
Speaker 1: In fact, the final approval for this settlement amount won't happen until this summer. But then, assuming everything goes to, you know, the court's approval, this story will finally have an ending. A federal judge has ruled that the US Federal Trade Commission, or FTC, can move forward with an antitrust case against Facebook slash Meta. Now, this comes after lawyers for Meta filed a motion to dismiss the case, and it looks like it's going to be able to go on. This was not always a sure bet. The FTC had previously filed with the federal courts and received a rejection from the judge. The judge said that the FTC had failed to provide facts to support the case being made against Meta. But second time's the charm in this case, I guess, as the judge now says that the revised filing contains sufficient support to merit a case. The argument the FTC is making is that Facebook slash Meta had a pretty simple and straightforward strategy when it came to competitors or even potential competitors, and it boils down to: if you can't beat them, buy them.
Speaker 1: So, in other words, executives at Facebook would look at services like, say, Instagram. They saw that the photo-sharing platform could potentially rival Facebook itself and pull people's time and focus away from Facebook. Now, I've talked about how Facebook's business model depends on keeping people tied to the site for as long as possible. That applies to all of Facebook's platforms, whether it's Facebook, Instagram, WhatsApp, whatever it may be. So obviously, anything that could threaten that, anything that could pull people's attention away from those properties, is seen as a problem by Facebook. So the FTC is arguing that Facebook tries to sidestep this challenge by just buying up companies that could stand in the way whenever the opportunity arises, and the FTC provided enough information to support this argument to convince the judge that a lawsuit, an antitrust suit, can in fact move forward. I would also like to add that Facebook frequently tries to imitate other platforms. Like, uh, TikTok is a big example right now.
Speaker 1: Clearly Facebook has not purchased TikTok, but I bet you dollars to donuts they've talked about it, at least at Facebook. But Facebook did introduce Reels to Instagram, which is a product that looks an awful lot like the video platform on TikTok. And Reels just seems to be their approach to trying to keep people on Instagram rather than having them go off the platform and check out videos and doomscroll on TikTok instead of doomscrolling on Instagram. So, I mean, if you can't beat them, you buy them, and if you can't buy them, I guess you imitate them. Anyway, we still have to go through the actual antitrust suit. Like, this was just the preamble to see if a suit was, you know, merited. So this matter is far from resolved. It could potentially lead to a situation where the US government forces Meta to spin off some of its properties. Maybe. But a lot of other things have to happen before that comes to pass.
Speaker 1: An international group of fact-checkers has sent an open letter to YouTube's CEO saying that the platform is quote "one of the major conduits of online disinformation and misinformation worldwide" end quote. This according to the Associated Press. The fact-checkers assert that bad actors on the platform are utilizing it to spread falsehoods and dangerous information, particularly in non-English-speaking countries. They say that YouTube appears to be doing very little to address this problem, again in non-English-speaking countries, and that the company's efforts have mostly been focused on English-speaking regions. In the meanwhile, there is real damage being done around the world, and it's facilitated by YouTube. That's essentially the argument. They also called for YouTube to superimpose images or messages that debunk misinformation when it's identified. So, in other words, when you find a video that is sending out falsehoods, there should be a visual cue that says, hey, this is misinformation. Because the group says that displaying fact-checking information is much more effective than just deleting videos.
Speaker 1: And of course, the big issue with deleting videos is that frequently bad actors can just re-upload them on another channel they own. But by creating a process to label videos as containing misinformation, YouTube can take away the power of those videos. I think that's a pretty great idea in general, but I do wonder how you would go about implementing it, particularly on a scale as large as YouTube. We know that hundreds of hours of video get uploaded to YouTube every single minute, and I'm curious as to how you would design a system to catch stuff quickly and effectively. If you relied solely on human beings to moderate content, you would have a situation like that old I Love Lucy episode in which Lucy is at the conveyor line as bonbons are zooming past her, and she's just shoving them in her shirt and in her mouth and everything because she's overwhelmed. Well, imagine that scenario and multiply it by, I don't know, a bazillion, because as you would be reviewing one video, more than a hundred hours of other video would be joining the network every single minute.
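The Lucy-at-the-conveyor problem can be sketched with simple rates. The upload rate below is the often-cited hundreds-of-hours-per-minute figure; the reviewer count and review speed are illustrative assumptions of mine, not anything YouTube has published:

```python
# Toy model of why purely human moderation can't keep up with uploads.
# 500 hours/minute is the commonly cited YouTube upload rate; the
# reviewer numbers are illustrative assumptions.

def backlog_after(minutes, upload_hours_per_min=500, reviewers=1000,
                  review_speed_hours_per_min=1 / 60):
    """Hours of unreviewed video that pile up over a wall-clock span.

    Each reviewer watches in real time (1 minute of video per minute),
    i.e. 1/60 hour of footage per wall-clock minute.
    """
    uploaded = upload_hours_per_min * minutes
    reviewed = reviewers * review_speed_hours_per_min * minutes
    return max(uploaded - reviewed, 0)

# Even with 1,000 full-time reviewers watching nonstop, a single hour
# of uploads leaves roughly 29,000 hours of unwatched video behind.
backlog = backlog_after(60)
```

The point of the sketch is just that the backlog grows linearly no matter how many human reviewers you hire at realistic numbers, which is why the conversation always turns to automation.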
Speaker 1: That's a daunting challenge, and I don't have an answer for it. I assume you would have to go toward some sort of automated solution, but then you have to make sure that whatever automated solution you design is effective and isn't prone to bias or false negatives. There are a host of other issues that come up. So I definitely feel that this is a real-world problem that we need to address. I am at a loss as to what that solution could be, but then, you know, there are people far more intelligent than I am who are working on this problem. I have an update for the ongoing story about the US Federal Aviation Administration, or FAA, and its, uh, kind of tense conversation with the telecommunications industry regarding 5G, namely AT&T and Verizon. Those are the two companies that are specifically kind of at odds with the FAA.
Speaker 1: So the story so far has been some tension between them and the FAA over the rollout of 5G technology, and the concern is that 5G frequencies, specifically those in the C-band spectrum of 5G, could potentially interfere with the operation of sensitive equipment on board airplanes and jets, like radio altimeters. So last year AT&T and Verizon purchased pretty much all of the C-band spectrum of frequencies in an auction. That's why those two companies have been singled out. Now, I point that out because until I read this update in Reuters, I actually didn't know why it was just AT&T and Verizon that were getting called out. That's just bad form on my part for not diving in deeply enough to find out the reason why. Anyway, we're starting to see some progress being made. For example, AT&T has agreed to create a sort of buffer zone around fifty airports so that jets flying in and out of those won't be bombarded with frequencies that could potentially cause interference.
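For a rough sense of why the C band worries aviation folks, you can look at how close the two bands sit. The band edges below are commonly cited approximate figures, not official specs: the auctioned US C-band 5G block runs roughly 3.7 to 3.98 GHz, while radio altimeters operate around 4.2 to 4.4 GHz.

```python
# Rough spectral-separation check between US C-band 5G and radio
# altimeters. Band edges are commonly cited approximations:
#   C-band 5G (auctioned block): ~3.70-3.98 GHz
#   Radio altimeters:            ~4.20-4.40 GHz

C_BAND_TOP_GHZ = 3.98
ALTIMETER_BOTTOM_GHZ = 4.20

def guard_band_mhz(tx_top_ghz, rx_bottom_ghz):
    """Width of the gap between the 5G band and the altimeter band, in MHz."""
    return (rx_bottom_ghz - tx_top_ghz) * 1000

gap = guard_band_mhz(C_BAND_TOP_GHZ, ALTIMETER_BOTTOM_GHZ)
# Only about 220 MHz of guard band separates the two, which is why
# older altimeters with less selective filtering are under scrutiny.
```

That relatively narrow gap is the whole dispute in one number: newer altimeters filter out neighboring frequencies well, older ones may not.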
Speaker 1: The FAA has issued new, sort of, directives for aircraft that have untested altimeters, you know, altimeters where we don't know if 5G would interfere with their operation or not. And the FAA says that those aircraft will not be allowed to land at airstrips or airports in low-visibility settings if 5G happens to be deployed within that region. It sounds like there's a big move to replace older radio altimeters with newer technologies that have more effective shielding against these radio frequencies. So there's overall this kind of push to adopt technologies that should, at least theoretically, be resistant to this kind of interference. You might remember that for ages, the rules on airplanes forbade people from even having a cell phone on at all, out of concern that RF signals from the phones could create interference for various airplane systems. Eventually we got away from that, obviously. Now you can have your phone on, although you're supposed to have it in airplane mode when you're on an airplane.
202 00:12:36,120 --> 00:12:38,200 Speaker 1: But you know, before you couldn't even do that, Like 203 00:12:38,240 --> 00:12:42,160 Speaker 1: you couldn't even have like an e book reader on 204 00:12:42,160 --> 00:12:44,560 Speaker 1: on an airplane. Uh. And then they said, well, okay, 205 00:12:44,760 --> 00:12:46,599 Speaker 1: you can have it on, but not during takeoff and 206 00:12:46,679 --> 00:12:48,840 Speaker 1: landing and all that kind of stuff. This is similar 207 00:12:48,880 --> 00:12:52,360 Speaker 1: to that, but we are seeing some progress. All right, 208 00:12:52,440 --> 00:12:54,559 Speaker 1: We've got some more stories to get to, but before 209 00:12:54,600 --> 00:13:04,679 Speaker 1: we get to them, let's take a quick break. We're back. 210 00:13:05,360 --> 00:13:09,400 Speaker 1: Business Insider reports that hackers have launched a malware campaign 211 00:13:09,520 --> 00:13:13,240 Speaker 1: that is likely using email as the delivery mechanism. This 212 00:13:13,280 --> 00:13:17,160 Speaker 1: is kind of old school to to blast out emails 213 00:13:17,160 --> 00:13:20,760 Speaker 1: containing malware. The malwaring question is wrapped up in a 214 00:13:20,800 --> 00:13:24,319 Speaker 1: file that has been named a Macron stats dot e 215 00:13:24,679 --> 00:13:27,679 Speaker 1: x c. That's an executable file. So it sounds like 216 00:13:28,080 --> 00:13:30,800 Speaker 1: this is a campaign that is leveraging the a Macron 217 00:13:30,880 --> 00:13:33,480 Speaker 1: outbreak in an effort to convince people to click on 218 00:13:33,520 --> 00:13:36,600 Speaker 1: the file. You know, if you scare people and say 219 00:13:36,640 --> 00:13:39,199 Speaker 1: you need to know this information, click on this, then 220 00:13:39,320 --> 00:13:43,280 Speaker 1: maybe it's effective, and then at that point malware installs 221 00:13:43,320 --> 00:13:48,160 Speaker 1: on the target machine. 
Speaker 1: The actual malware is called RedLine Stealer, and it typically steals login information like usernames and passwords from the infected computer and then sends them off to the hacker, who then sells that information on the dark web. The security firm FortiGuard has identified twelve countries where this attack has appeared so far, and it sounds like this is not a targeted, focused campaign. Rather, this is like a classic malware campaign where hackers are just casting as wide a net as they possibly can and hoping to get some hits. So this is just yet another reminder that it is a bad idea to click on files that are sent through email, particularly if they're coming from someone you don't know. And if they are coming from someone you know, it's best to touch base with that person and make sure that they, you know, legitimately sent you a file before you do anything, because this is just an easy vector for hackers to take. It's also an easy one for us to avoid if we're just a little careful.
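Part of that caution can even be automated. Here's a minimal sketch of the idea of flagging attachments with executable extensions before anyone clicks; the extension list and the double-extension check are my own illustrative choices, not anything from the FortiGuard report, and a real mail filter does far more than this:

```python
# Minimal sketch: flag email attachments that look executable.
# The extension list is illustrative, not exhaustive.

RISKY_EXTENSIONS = {".exe", ".scr", ".bat", ".cmd", ".js", ".vbs", ".msi"}

def is_risky_attachment(filename):
    """Return True if the attachment's final extension looks executable.

    This also catches the double-extension trick ("stats.pdf.exe"),
    since only the final extension determines how Windows runs the file.
    """
    name = filename.lower().strip()
    return any(name.endswith(ext) for ext in RISKY_EXTENSIONS)

# "Omicron Stats.exe" would be flagged; an ordinary PDF would not.
```

A filter like this is trivially bypassable (archives, macros, links), which is why the touch-base-with-the-sender advice still matters even with scanning in place.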
Speaker 1: CNBC reports that there's a land rush going on in the virtual space, with folks spending hundreds of thousands of dollars to secure virtual real estate. And yes, this relates back to the concept of the metaverse. This story actually feels like déjà vu to me, because I remember hearing similar things way back when Second Life was buzzworthy. But yeah, folks are shelling out big bucks to secure specific space in various virtual environments, including ones with names like Decentraland and Sandbox. And we're starting to hear a lot of people say, pump the brakes a bit before you jump on this speculation bandwagon. And I would like to add my voice to the folks who are saying, hey, be careful. Don't put any money that you aren't willing to just say goodbye to into virtual real estate right now. Right? If you can't afford to lose that money, I mean, just assume that money is lost, because that way, if it's not, if things turn around and things are great, that's fantastic. If you're counting on it and it doesn't, that's bad.
259 00:16:03,840 --> 00:16:09,080 Speaker 1: So just say, all right, well goodbye three hundred thousand dollars. 260 00:16:09,120 --> 00:16:12,440 Speaker 1: I don't know who can afford to do that, but 261 00:16:13,080 --> 00:16:16,520 Speaker 1: that that's kind of the scale we're talking about. So yeah, 262 00:16:16,640 --> 00:16:20,160 Speaker 1: be careful and don't do that. Just just invest money 263 00:16:20,200 --> 00:16:23,400 Speaker 1: where you can afford to lose that money. And I 264 00:16:23,440 --> 00:16:30,640 Speaker 1: want to mention again that you know, we're so far 265 00:16:30,680 --> 00:16:36,320 Speaker 1: away from a real metaverse, a compelling metaverse, one that 266 00:16:36,680 --> 00:16:39,080 Speaker 1: can allow more than say, a couple of dozen people 267 00:16:39,120 --> 00:16:41,640 Speaker 1: to gather in the same virtual space at the same time, 268 00:16:42,200 --> 00:16:44,280 Speaker 1: um and do things that are on a really kind 269 00:16:44,280 --> 00:16:47,080 Speaker 1: of deep level, Like you can have a little virtual 270 00:16:47,160 --> 00:16:49,960 Speaker 1: environment that people can log into. You know, we've got 271 00:16:50,120 --> 00:16:53,520 Speaker 1: video games that allow for the simultaneous play of more 272 00:16:53,560 --> 00:16:58,640 Speaker 1: than a hundred people. But that's a very narrow, narrowly 273 00:16:58,760 --> 00:17:02,160 Speaker 1: defined environment it right, where you have a limited number 274 00:17:02,160 --> 00:17:04,880 Speaker 1: of interactions you can take within that environment. Because it's 275 00:17:04,920 --> 00:17:07,120 Speaker 1: a game. You know, you're not allowed to do everything. 276 00:17:07,640 --> 00:17:10,159 Speaker 1: You're allowed to do whatever the game is meant for 277 00:17:10,240 --> 00:17:14,200 Speaker 1: you to do. 
Speaker 1: While the metaverse, theoretically, is a place where you would have far fewer constraints on what you could do. You would have an enormous variety of things you could do, perhaps even things that you could never do in the real world. Like, I think of The Matrix, where, you know, you see Neo fly at the end of it. Well, spoiler alert for a movie that came out in the late nineties. But, you know, I think of something like that with the metaverse. Well, you know, it's gonna take a while for us to have the technology to support that kind of thing. In fact, I mentioned last year that a VP at Intel said our technology will need to be a thousand times more powerful to support a really meaningful metaverse. Meanwhile, if you go and buy virtual real estate in one of these virtual environments, who's to say that that virtual environment is still going to be around whenever the metaverse actually does become a legit thing, if it ever does? I'm skeptical that it ever will.
Speaker 1: But even if it does, there's the very real possibility that the place where you bought your, you know, virtual acreage isn't really a thing anymore, and is the digital equivalent of a ghost town, and isn't really connected to a larger metaverse that's meaningful. That would just mean that you wasted all that money. So that would be a bad investment. Right now, the folks who are running these virtual environments, they're making a killing, particularly as virtual real estate prices balloon beyond belief. But this is probably not sustainable. And I agree with folks like Mark Stapp, who is a professor at Arizona State University, and he says, quote, "if it continues the way it's going, it is most likely going to be a bubble," end quote. And like Professor Stapp, I would never say definitively that this is a bubble, but I'm gonna say that it sure does walk and quack like a bubble. Besides which, my guess is that, again, we're several years away from having a really compelling metaverse experience, if such a thing is possible. It probably will be possible. I'm just grouchy. I'm determined to be a techno hermit.
Speaker 1: I don't think you're ever gonna come across me in the metaverse. I'm more likely to be sitting quietly in a chair, reading a book, or perhaps playing 7 Days to Die, because I've gotten addicted to that game. The state of Massachusetts here in the United States held a referendum on a right-to-repair law for cars, and the voting populace of Massachusetts overwhelmingly supported the law, with far more voting in favor than against. That is an overwhelming win. When you're looking at a lot of votes that come down to like fifty-two versus forty-eight, that kind of margin is a huge win. Well, the car companies were not thrilled with this, and they quickly filed lawsuits to reverse the law, or at the very least delay when it would go into effect. See, it's supposed to take effect this year, but the car companies say that's too soon and it should really be later, or really never.
Years ago, Massachusetts actually voted that 331 00:20:26,960 --> 00:20:31,000 Speaker 1: car companies would have to make wired-connection telematics data 332 00:20:31,040 --> 00:20:34,719 Speaker 1: accessible in Massachusetts, and that would mean that, say, an 333 00:20:34,720 --> 00:20:38,840 Speaker 1: independent repair shop or potentially even just a private car 334 00:20:38,920 --> 00:20:43,080 Speaker 1: owner would be able to diagnose problems with their car 335 00:20:43,119 --> 00:20:47,000 Speaker 1: and complete repairs on that car, rather than being forced 336 00:20:47,080 --> 00:20:50,800 Speaker 1: to take the vehicle to, say, an official dealership to 337 00:20:50,880 --> 00:20:54,159 Speaker 1: get the work done. We've been seeing this trend of 338 00:20:54,240 --> 00:20:58,159 Speaker 1: turning tech into closed-off ecosystems so that one party 339 00:20:58,240 --> 00:21:01,800 Speaker 1: has essentially economic control over the whole thing. In fact, 340 00:21:01,800 --> 00:21:04,239 Speaker 1: we talked about this earlier this week with Canon and 341 00:21:04,320 --> 00:21:09,600 Speaker 1: its printer and toner cartridges. But anyway, that earlier 342 00:21:09,720 --> 00:21:13,199 Speaker 1: law, which did pass, had a technical gap in it 343 00:21:13,280 --> 00:21:17,920 Speaker 1: because it did not anticipate wireless systems, and so the 344 00:21:18,040 --> 00:21:22,560 Speaker 1: law only required automakers to comply with it using wired systems.
345 00:21:22,680 --> 00:21:25,879 Speaker 1: So then you saw car companies migrate over to wireless 346 00:21:26,520 --> 00:21:29,840 Speaker 1: approaches, and they didn't have to comply because wireless 347 00:21:29,920 --> 00:21:33,879 Speaker 1: wasn't covered. Well, the new law shut that gap down, and 348 00:21:33,920 --> 00:21:37,320 Speaker 1: according to Motherboard, car companies have brought a lawsuit against 349 00:21:37,320 --> 00:21:41,360 Speaker 1: the state, claiming the law to be unconstitutional. I am 350 00:21:41,400 --> 00:21:45,000 Speaker 1: not sure on what grounds that claim is based. 351 00:21:45,440 --> 00:21:48,159 Speaker 1: It sounds like a Hail Mary play to me, but I 352 00:21:48,200 --> 00:21:51,359 Speaker 1: could very well be missing something obvious. I am not 353 00:21:51,400 --> 00:21:54,320 Speaker 1: a legal expert by any stretch of the imagination, but 354 00:21:54,400 --> 00:21:59,280 Speaker 1: my guess is that with seventy percent support in the state, 355 00:21:59,760 --> 00:22:01,920 Speaker 1: the car industry is going to have a real tough 356 00:22:01,960 --> 00:22:06,119 Speaker 1: time convincing lawmakers in Massachusetts to overturn this law or 357 00:22:06,160 --> 00:22:09,400 Speaker 1: to amend it in a way that takes the teeth out. 358 00:22:09,720 --> 00:22:12,159 Speaker 1: I think most politicians in the state are going to 359 00:22:12,280 --> 00:22:14,520 Speaker 1: feel a lot of pressure to stick to what the 360 00:22:14,600 --> 00:22:18,200 Speaker 1: voters were asking for, because doing otherwise is a great 361 00:22:18,200 --> 00:22:23,080 Speaker 1: way to find your exit out of local politics. Okay, 362 00:22:23,119 --> 00:22:25,639 Speaker 1: we've got a couple more stories to cover, but before 363 00:22:25,640 --> 00:22:35,480 Speaker 1: we get to that, let's take another quick break.
Okay, 364 00:22:35,680 --> 00:22:38,840 Speaker 1: if you happen to be a famous person and you 365 00:22:38,880 --> 00:22:43,080 Speaker 1: are approached by someone who's asking you to endorse a cryptocurrency, 366 00:22:43,760 --> 00:22:48,080 Speaker 1: you might want to do your homework first. Kim Kardashian 367 00:22:48,240 --> 00:22:51,760 Speaker 1: and Floyd Mayweather are learning that firsthand, as they face 368 00:22:51,840 --> 00:22:55,919 Speaker 1: a lawsuit from investors who lost money in a cryptocurrency 369 00:22:55,960 --> 00:23:00,119 Speaker 1: called Ethereum Max, which I should add is not 370 00:23:00,320 --> 00:23:04,919 Speaker 1: actually related to the famous cryptocurrency ether that's on the 371 00:23:05,000 --> 00:23:08,960 Speaker 1: Ethereum blockchain. It has a very similar name, which is 372 00:23:09,000 --> 00:23:12,000 Speaker 1: a red flag right there, but it's not 373 00:23:12,119 --> 00:23:17,040 Speaker 1: related to it. Both Kardashian and Mayweather promoted this crypto 374 00:23:17,119 --> 00:23:21,240 Speaker 1: token over the last year or two, and now a group 375 00:23:21,240 --> 00:23:26,200 Speaker 1: of investors claim that their promotion was spreading misinformation about 376 00:23:26,200 --> 00:23:30,520 Speaker 1: the tokens, and subsequently it drove up the cost of 377 00:23:30,560 --> 00:23:34,359 Speaker 1: the tokens temporarily before the bottom dropped out and the 378 00:23:34,400 --> 00:23:37,400 Speaker 1: investors then lost a ton of money. So what they're 379 00:23:37,400 --> 00:23:41,120 Speaker 1: saying is that Ethereum Max was a pump and dump scheme.
Now, 380 00:23:41,160 --> 00:23:44,439 Speaker 1: in case you're not familiar with that term, the idea 381 00:23:44,480 --> 00:23:47,040 Speaker 1: is that, let's say that you want to run this scheme, 382 00:23:47,160 --> 00:23:50,320 Speaker 1: so you either establish or you buy up a ton 383 00:23:50,359 --> 00:23:54,960 Speaker 1: of something that's super cheap and has low value, right, 384 00:23:55,040 --> 00:23:57,960 Speaker 1: low market value. This could be anything. It doesn't 385 00:23:58,000 --> 00:24:00,560 Speaker 1: have to be cryptocurrency. It can literally be anything, and 386 00:24:00,600 --> 00:24:03,440 Speaker 1: you just buy it up, and then you use various 387 00:24:03,440 --> 00:24:08,000 Speaker 1: means to drive up excitement and speculation about that thing 388 00:24:08,560 --> 00:24:12,280 Speaker 1: so that more folks jump on board and buy it 389 00:24:12,359 --> 00:24:15,919 Speaker 1: all up. This always makes me think of the 390 00:24:15,960 --> 00:24:20,560 Speaker 1: film Trading Places. But you get things moving 391 00:24:20,640 --> 00:24:24,240 Speaker 1: so that investors start thinking, oh, this is the next, 392 00:24:24,800 --> 00:24:27,040 Speaker 1: you know, big ticket, this is the next to the 393 00:24:27,080 --> 00:24:29,600 Speaker 1: moon thing, I want to get in on 394 00:24:29,640 --> 00:24:32,439 Speaker 1: this early so I can make a mint. And 395 00:24:32,480 --> 00:24:35,679 Speaker 1: a bunch of people start buying it. Well, increased demand 396 00:24:35,720 --> 00:24:38,600 Speaker 1: typically prompts an increase in price, right? If more people 397 00:24:38,600 --> 00:24:42,280 Speaker 1: are demanding it, the value goes up; you can demand 398 00:24:42,280 --> 00:24:44,160 Speaker 1: a higher price for the thing.
So there you are, 399 00:24:44,680 --> 00:24:47,440 Speaker 1: sitting on a mountain of whatever it is we're talking about here, 400 00:24:47,440 --> 00:24:51,480 Speaker 1: whether it's cryptocurrency or something else that you bought at 401 00:24:52,040 --> 00:24:54,639 Speaker 1: pennies on the dollar, and now it's worth a ton 402 00:24:54,720 --> 00:24:58,080 Speaker 1: of money. So you quickly sell it all off and 403 00:24:58,160 --> 00:25:00,520 Speaker 1: you dump it before folks figure out it's just 404 00:25:00,560 --> 00:25:04,600 Speaker 1: smoke and mirrors, and you make a huge profit. Then you're 405 00:25:04,640 --> 00:25:08,520 Speaker 1: off to Rio to join The Producers. Meanwhile, all the 406 00:25:08,560 --> 00:25:11,720 Speaker 1: rest of the folks who joined in after you, they're 407 00:25:11,800 --> 00:25:14,720 Speaker 1: left holding the bag, and they're trying to sell off 408 00:25:14,800 --> 00:25:18,000 Speaker 1: their investment as the prices start to come down. And 409 00:25:18,040 --> 00:25:21,480 Speaker 1: if they're lucky, they might break even. If they're not lucky, 410 00:25:21,600 --> 00:25:23,800 Speaker 1: they can lose a ton of money in the process. 411 00:25:24,320 --> 00:25:27,000 Speaker 1: That's what the lawsuit is alleging happened in this case, 412 00:25:27,040 --> 00:25:29,760 Speaker 1: that Ethereum Max was a pump and dump scheme from 413 00:25:29,800 --> 00:25:34,240 Speaker 1: the beginning, and that Kardashian and Mayweather were part of 414 00:25:34,240 --> 00:25:38,960 Speaker 1: the mechanism to drive up the cost of the token 415 00:25:39,040 --> 00:25:42,600 Speaker 1: before the scammers, you know, pulled the plug. I think 416 00:25:42,600 --> 00:25:45,280 Speaker 1: the thing to take home here is that crypto remains 417 00:25:45,359 --> 00:25:49,399 Speaker 1: a scammer's paradise, so don't let yourself get caught up 418 00:25:49,400 --> 00:25:54,520 Speaker 1: in the excitement without doing your homework. Now,
I'm infamously 419 00:25:54,720 --> 00:25:58,000 Speaker 1: negative about crypto in general, for lots of reasons that 420 00:25:58,080 --> 00:26:01,560 Speaker 1: I won't get into here, and I mostly feel like 421 00:26:01,680 --> 00:26:04,800 Speaker 1: crypto in general is kind of a scam in a 422 00:26:04,880 --> 00:26:08,399 Speaker 1: really big-picture way. It's sort of the thing 423 00:26:08,440 --> 00:26:10,800 Speaker 1: where you could actually have a fake it till you 424 00:26:10,840 --> 00:26:14,520 Speaker 1: make it approach, but I'm not convinced that the make 425 00:26:14,560 --> 00:26:18,560 Speaker 1: it has actually happened. However, you've all heard me go 426 00:26:18,640 --> 00:26:21,400 Speaker 1: on enough about that kind of stuff already, so let's 427 00:26:21,440 --> 00:26:25,679 Speaker 1: move on. But let's talk about other scams that you 428 00:26:25,680 --> 00:26:27,800 Speaker 1: should be on the lookout for. We've talked about email, 429 00:26:28,040 --> 00:26:31,640 Speaker 1: we've talked about crypto. Ars Technica ran a piece recently 430 00:26:31,680 --> 00:26:35,760 Speaker 1: titled, quote, scammers put fake QR codes on parking meters 431 00:26:35,800 --> 00:26:39,720 Speaker 1: to intercept parkers' payments, end quote. I love that headline. 432 00:26:39,760 --> 00:26:42,439 Speaker 1: It gives you the gist right away. So kudos to 433 00:26:42,560 --> 00:26:45,800 Speaker 1: Jon Brodkin and the team at Ars Technica for putting 434 00:26:45,800 --> 00:26:49,520 Speaker 1: this together. The article, by the way, is also fantastic. So yeah, 435 00:26:49,560 --> 00:26:52,520 Speaker 1: the scam here is that thieves put up QR code 436 00:26:52,600 --> 00:26:56,200 Speaker 1: stickers on parking meters around several cities in the state 437 00:26:56,280 --> 00:27:00,800 Speaker 1: of Texas.
So if you were to go park somewhere 438 00:27:00,800 --> 00:27:02,639 Speaker 1: in Texas, and you saw the parking meter, and you 439 00:27:02,640 --> 00:27:04,920 Speaker 1: walked up and it had this sticker with a QR 440 00:27:05,000 --> 00:27:07,680 Speaker 1: code on it, and you used your smartphone to scan 441 00:27:07,800 --> 00:27:10,920 Speaker 1: the code and then visited the associated website it was 442 00:27:10,960 --> 00:27:13,879 Speaker 1: sending you to, you would see what appeared to be 443 00:27:13,920 --> 00:27:18,080 Speaker 1: a pretty simple payment site, presumably for that parking meter. 444 00:27:19,000 --> 00:27:21,280 Speaker 1: But in fact, you would just be sending money straight 445 00:27:21,280 --> 00:27:24,280 Speaker 1: to the thieves, because it wasn't connected to the payment 446 00:27:24,359 --> 00:27:28,040 Speaker 1: system for the cities at all. The cities in question 447 00:27:28,119 --> 00:27:31,119 Speaker 1: released statements saying that they don't use QR codes for 448 00:27:31,119 --> 00:27:34,240 Speaker 1: this very reason: scammers can use fake QR codes 449 00:27:34,280 --> 00:27:38,399 Speaker 1: to divert traffic to their own sites. Instead, the cities 450 00:27:38,520 --> 00:27:42,280 Speaker 1: use cash and cards at the meters themselves, and I 451 00:27:42,320 --> 00:27:44,920 Speaker 1: assume all of them, I know Austin for sure, have 452 00:27:45,000 --> 00:27:49,240 Speaker 1: an official smartphone app on that side. So as long 453 00:27:49,280 --> 00:27:53,000 Speaker 1: as scammers can't make a fake parking payment app and 454 00:27:53,080 --> 00:27:56,119 Speaker 1: pass it off as real and legit in various cities, 455 00:27:56,600 --> 00:28:00,080 Speaker 1: you can be reasonably sure that if you're using a 456 00:28:00,200 --> 00:28:03,080 Speaker 1: payment app for the city you are in, you're 457 00:28:03,080 --> 00:28:05,199 Speaker 1: actually paying the meter.
That way, you don't have to 458 00:28:05,200 --> 00:28:08,119 Speaker 1: worry about thieves taking your money or the city taking 459 00:28:08,119 --> 00:28:11,200 Speaker 1: your car. So yeah, just a reminder that QR codes 460 00:28:11,520 --> 00:28:14,560 Speaker 1: might lead you to places you really don't want to go. 461 00:28:15,200 --> 00:28:18,919 Speaker 1: But maybe you do want to go to Disney World. 462 00:28:19,560 --> 00:28:23,240 Speaker 1: I did last year, and it was weird. I was 463 00:28:23,280 --> 00:28:26,240 Speaker 1: being super careful, my whole family was being super careful 464 00:28:26,280 --> 00:28:30,119 Speaker 1: while we were there, but being around that many people, 465 00:28:30,400 --> 00:28:33,560 Speaker 1: a lot of whom were not being particularly careful, 466 00:28:33,640 --> 00:28:37,840 Speaker 1: gave me tons of anxiety. I actually kind of regret 467 00:28:38,160 --> 00:28:41,600 Speaker 1: having gone, but it was a big family trip and 468 00:28:41,640 --> 00:28:45,360 Speaker 1: a surprise for my nieces, so I weighed it and decided 469 00:28:45,400 --> 00:28:49,560 Speaker 1: to go, and yeah, I still feel weird about that. Luckily, 470 00:28:49,600 --> 00:28:53,400 Speaker 1: everyone was safe and everyone was healthy before, during, and after. 471 00:28:53,800 --> 00:28:55,840 Speaker 1: But anyway, that's not the point of this segment. No, 472 00:28:56,080 --> 00:28:59,760 Speaker 1: Disney filed a patent that potentially gives us a glimpse of 473 00:29:00,160 --> 00:29:03,680 Speaker 1: what we could encounter on future trips to their theme 474 00:29:03,720 --> 00:29:09,000 Speaker 1: parks and other Disney-related places. The patent, which was 475 00:29:09,040 --> 00:29:11,840 Speaker 1: approved right at the very tail end of last year, 476 00:29:12,400 --> 00:29:16,720 Speaker 1: is for what they're calling a virtual world simulator.
Now, 477 00:29:16,720 --> 00:29:19,080 Speaker 1: it sounds to me like the patent is describing a 478 00:29:19,240 --> 00:29:24,360 Speaker 1: 3D projection-mapping technology paired with cameras and motion 479 00:29:24,440 --> 00:29:28,240 Speaker 1: sensors and facial recognition and an AI system to determine 480 00:29:28,280 --> 00:29:32,720 Speaker 1: where someone is looking at any given time. So this 481 00:29:32,800 --> 00:29:35,320 Speaker 1: is how I imagine it would play out. Let's say 482 00:29:35,320 --> 00:29:39,160 Speaker 1: that you're on Main Street, USA at Disney World's Magic Kingdom, 483 00:29:39,400 --> 00:29:42,400 Speaker 1: and you've got a ton of cameras trained on you. 484 00:29:42,480 --> 00:29:44,720 Speaker 1: They're out of sight, but they're all looking at you, 485 00:29:45,320 --> 00:29:49,880 Speaker 1: and they're paired with tracking software to look at your 486 00:29:49,920 --> 00:29:53,600 Speaker 1: head position, where your eyes are pointed, how quickly you're moving, 487 00:29:53,800 --> 00:29:57,720 Speaker 1: all that kind of stuff. And then projectors connected to 488 00:29:57,720 --> 00:30:01,280 Speaker 1: this system create an image of, I don't know, let's 489 00:30:01,320 --> 00:30:03,800 Speaker 1: say it's Captain Hook, because he happens to be your 490 00:30:03,840 --> 00:30:07,560 Speaker 1: favorite Disney character. Turns out the 3D projectors 491 00:30:07,600 --> 00:30:12,240 Speaker 1: are not the only thing projecting in this particular scenario. Anyway, 492 00:30:12,520 --> 00:30:15,760 Speaker 1: the patent seems to describe a tech that would allow 493 00:30:15,800 --> 00:30:18,760 Speaker 1: Disney to create these sorts of projections that 494 00:30:18,800 --> 00:30:21,600 Speaker 1: can move along with you and interact with you in 495 00:30:21,680 --> 00:30:25,760 Speaker 1: some way.
The patent specifically dismisses the idea of an 496 00:30:25,760 --> 00:30:30,960 Speaker 1: associated headset, so this would not incorporate something like augmented 497 00:30:31,000 --> 00:30:34,920 Speaker 1: reality glasses. It sounds kind of cool, but it also 498 00:30:34,960 --> 00:30:37,440 Speaker 1: makes me a bit anxious, because, I mean, the level 499 00:30:37,480 --> 00:30:39,880 Speaker 1: of tracking a system would need to create this kind 500 00:30:39,880 --> 00:30:43,200 Speaker 1: of interaction on a level that would be really compelling, 501 00:30:43,640 --> 00:30:47,640 Speaker 1: that also means that Disney would be, like, seriously watching 502 00:30:48,120 --> 00:30:52,600 Speaker 1: your every move, and not just your move, but everyone's. 503 00:30:53,600 --> 00:30:55,640 Speaker 1: And I'm not sure how the system would determine which 504 00:30:55,720 --> 00:31:00,200 Speaker 1: guests get to have this kind of experience. If you've 505 00:31:00,200 --> 00:31:02,880 Speaker 1: been to a Disney park, you know how crowded they 506 00:31:02,880 --> 00:31:05,920 Speaker 1: can be. And I imagine you could not have this 507 00:31:06,000 --> 00:31:08,640 Speaker 1: system cater to everyone all the time, because then the 508 00:31:08,680 --> 00:31:11,000 Speaker 1: park would be packed to the gills, both with real 509 00:31:11,040 --> 00:31:14,520 Speaker 1: world guests and all the projected characters interacting with them. 510 00:31:14,600 --> 00:31:16,480 Speaker 1: And if you're not wearing a headset or anything, those 511 00:31:16,520 --> 00:31:19,560 Speaker 1: interactions could be seen by everybody, right? It's not 512 00:31:19,640 --> 00:31:23,080 Speaker 1: just something that you would experience. Everyone would be 513 00:31:23,120 --> 00:31:26,840 Speaker 1: able to see it. So it's an interesting patent. I 514 00:31:26,920 --> 00:31:31,200 Speaker 1: don't know if or how it would be implemented.
We 515 00:31:31,240 --> 00:31:35,120 Speaker 1: may never actually see this tech materialize. Sometimes that happens 516 00:31:35,120 --> 00:31:38,200 Speaker 1: with patented technology. And in fact, it could just mean 517 00:31:38,240 --> 00:31:40,320 Speaker 1: that Disney sits on this to make sure no one 518 00:31:40,360 --> 00:31:44,440 Speaker 1: else does it, right? Not that Disney is working to 519 00:31:44,680 --> 00:31:47,960 Speaker 1: do it themselves, but rather, let's make sure that nobody 520 00:31:48,000 --> 00:31:51,080 Speaker 1: beats us to this, because that would be a real 521 00:31:51,640 --> 00:31:55,200 Speaker 1: competitive edge for someone else. So let's patent it, 522 00:31:55,400 --> 00:31:58,240 Speaker 1: and then they can't do it without our permission. That's 523 00:31:58,240 --> 00:32:01,040 Speaker 1: also a possibility. I think the idea is kind 524 00:32:01,040 --> 00:32:02,840 Speaker 1: of neat. It's just hard for me to 525 00:32:02,880 --> 00:32:05,320 Speaker 1: imagine how it would actually play out at Disney. For 526 00:32:05,400 --> 00:32:10,560 Speaker 1: one thing, with the projection-mapping stuff, I'm curious what you 527 00:32:10,560 --> 00:32:15,360 Speaker 1: would project it onto, whether it would just be walls 528 00:32:15,520 --> 00:32:19,480 Speaker 1: or something, or if there would maybe be statues around 529 00:32:19,480 --> 00:32:22,040 Speaker 1: the park where you would do 3D projection mapping. 530 00:32:22,080 --> 00:32:23,840 Speaker 1: But even then, if it's a statue, then 531 00:32:23,840 --> 00:32:25,840 Speaker 1: you're limited in the kinds of animations you can do. 532 00:32:25,880 --> 00:32:27,560 Speaker 1: You can do a lot of facial animations, but you 533 00:32:27,600 --> 00:32:29,360 Speaker 1: wouldn't be able to do a lot of body stuff. 534 00:32:30,360 --> 00:32:34,640 Speaker 1: So I don't know.
The patent is interesting; the 535 00:32:34,680 --> 00:32:39,640 Speaker 1: implementation remains a mystery. Okay, that wraps up this news 536 00:32:39,680 --> 00:32:43,719 Speaker 1: episode for Thursday, January twenty, twenty twenty-two. Hope all of you 537 00:32:43,840 --> 00:32:46,600 Speaker 1: are well. If you have suggestions for topics I should 538 00:32:46,640 --> 00:32:49,960 Speaker 1: cover in future episodes of Tech Stuff, please reach out 539 00:32:50,000 --> 00:32:52,680 Speaker 1: to me. The best way to do so is over 540 00:32:52,800 --> 00:32:55,680 Speaker 1: Twitter, and the handle for the show is Tech Stuff 541 00:32:56,000 --> 00:33:00,320 Speaker 1: H S W, and I'll talk to you again really soon. 542 00:33:04,560 --> 00:33:07,600 Speaker 1: Tech Stuff is an I Heart Radio production. For more 543 00:33:07,680 --> 00:33:11,080 Speaker 1: podcasts from I Heart Radio, visit the I Heart Radio app, 544 00:33:11,200 --> 00:33:14,360 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.