Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and a lover of all things tech, and this is the tech news for Thursday, July first, twenty twenty one. We're heading into the Fourth of July weekend. Around the world that doesn't mean much, but here in the US it's a holiday, and so it seems fitting that a lot of my topics today have to do with tech and things like liberty and freedom.

Speaker 1: So on Wednesday of this week, the state of Maine passed into law, with unanimous support, a law that places tight restrictions on how state agencies can use facial recognition technology. In fact, the restrictions are so tight that most state agencies are just flat out not allowed to use facial recognition technology at all. The state government also banned its use for the purposes of surveillance. Now, there are some exceptions that allow certain agencies to use facial recognition technology. Law enforcement is allowed to use it if they are investigating a serious crime. The state also might use it in order to identify someone who is deceased, missing, or endangered, so in cases where you're talking about someone's life or safety, that is an exception as well. However, for the most part, the state is not to use facial recognition technology starting in October of this year. That's when the law takes effect. The law also requires police to go through proper channels and to obtain access to facial recognition technology on a case by case basis. Civil liberties organizations have expressed approval of these new measures. They point to Maine as being on the leading edge of protecting citizen privacy and shielding citizens from surveillance. The essential argument is that this is technology that has a disproportionate ability to do harm to someone, and that the benefits do not outweigh that disproportionate harm. Maybe this will spur similar state laws across the country.
Speaker 1: We have seen a law in Washington that didn't go nearly as far as the one in Maine, so maybe we'll see other states follow suit. Perhaps we'll even eventually see a federal law that covers this issue here in the US.

Speaker 1: Sticking with tech and civil liberties, during a government hearing, attorneys for cloud service companies, plus one executive from Microsoft, spoke to lawmakers about the government's practice of seeking personal information of company customers. In addition, the discussion included the practice of having such requests hidden from public view. They're filed under secret court orders, with the demand that whatever the cloud service company is, you know, like Microsoft, it isn't supposed to acknowledge that any investigation is ongoing. Very cloak and dagger kind of approach, and a lot of people for years have been raising concerns about this kind of thing. This actually follows in the wake of a recent revelation that the Department of Justice under former President Trump sought phone records of Democratic representatives as well as certain reporters, and the DOJ used secret subpoenas to seek out that kind of information. The cloud service companies, or rather the lawyers representing those companies, are arguing that the government should have to follow the exact same sort of processes it has to follow if it wants to access, you know, physical files stored in filing cabinets. If you have to have a warrant to search that, then you need to have the equivalent in order to search digital data stored in the cloud.

Speaker 1: The executive from Microsoft, Tom Burt, said that over the last five years Microsoft has received between two thousand four hundred and three thousand five hundred secrecy orders each year, so at minimum twenty four hundred and at most around thirty five hundred, and that these orders appeared to be quote, unsupported by any meaningful legal or factual analysis, end quote.
Speaker 1: So, in other words, there didn't appear to be any reasonable purpose for the search orders, and yet these companies were being compelled to comply with them by the government. So, like the widespread use of facial recognition technology, this practice seems to be in violation of the Fourth Amendment to the United States Constitution. The Fourth Amendment protects citizens against unreasonable searches and seizures, and that amendment states that no warrants shall be issued but with probable cause. According to Microsoft, it doesn't seem like that was the case in the majority of these secrecy orders that the company received. And you know, you could argue that if everything is valid and lawful, as in, if the government is following a valid and lawful process, why would the orders need to be secret? Why would these companies not be allowed to even acknowledge that they had received the requests? Currently, it sounds as if lawmakers on both sides of the aisle are taking these proceedings to heart, though obviously we'll have to wait and see if any actual legislation follows, if we see the laws change as a result of this. But for the time being, it sounds like everyone's kind of on the same page, saying, yo, this is not in line with the basic tenets of US government.

Speaker 1: Now, we ain't done with tech in US politics yet, which seems fitting because we're heading into that Fourth of July weekend. A federal judge on Wednesday slapped down a Florida state law that was aimed at social networking sites like Facebook. This Florida law would have given that state the authority to place enormous fines on social networking sites should they ever ban political candidates. Now, as you might imagine, this too appears to stem from former President Trump, who received bans from Facebook and Twitter, among other platforms, for, you know, violating platform policies, like spreading misinformation and slinging dangerous rhetoric that was unsupported by, you know, reality.
Speaker 1: The federal judge in this case said that the Florida law, ironically, was violating the First Amendment. That's the amendment that guarantees the freedom of speech here in the United States. See, the Florida legislators were arguing that social networks were preventing the free speech of candidates. But here's the problem with that: that amendment to the Constitution only guarantees protection from the government infringing upon free speech. It's saying that the government is not allowed to do that. Facebook, while it is bigger than a lot of governments around the world, is not the government. It's not the US government. So the company has the right to decide who can and cannot publish on that platform. The federal judge argued that restricting Facebook from being able to make that determination would in effect violate the free speech of tech companies like Facebook.

Speaker 1: The Florida law would have allowed for fines of up to two hundred fifty thousand dollars per day for any ban of a statewide candidate that went beyond fourteen days. You know, for candidates in smaller, local elections it would have been a smaller fine, but a quarter million dollars per day for statewide candidates, according to that law. The federal judge also pointed out that the Florida law was clearly unbalanced. I mean, beyond the fact that it had this enormous scope, it also had allowances for certain companies, as in certain companies would not be held to this standard, namely companies that operate theme parks in the state of Florida. You know, companies like Disney and NBCUniversal. So why is that? Well, it's because those are huge money makers. They bring in lots of cash to the state of Florida. So the sacred right of candidates being able to spread misinformation and violate social platform policies does not extend to big companies that make Florida lots of money. Now, you might sense that I'm a bit fed up with Florida, and you're right.
Speaker 1: This is honestly just a thinly veiled attempt to create a state run media agency, and that would not serve the citizens of Florida well at all.

Speaker 1: All right, more laws and tech, and this is also in the United States. The National Highway Traffic Safety Administration, or NHTSA, has passed a law, or rather a rule. It's not a law, but it's a new rule that will require all companies that offer semi-autonomous driving assist features within vehicles to report all crashes in which an autonomous or semi-autonomous feature was involved. So, in other words, if someone is in a Tesla vehicle and they get into a wreck, and it's determined that the vehicle was in Autopilot mode, Tesla would be legally required to report that crash to the NHTSA. Not just Tesla, mind you. This rule applies to all companies operating vehicles that have autonomous features. Tesla tends to be the one that a lot of people single out, simply because it's the one that's had some of the most high profile accidents while in this Autopilot mode. The NHTSA aims to have a better understanding of how safe or unsafe these autonomous systems are, and it will then be able to identify any that are particularly unsafe. So, statistically speaking, if it were the case that Tesla actually is more unsafe, we would be able to know that because of this information. Right now, it's anecdotal, right? I mean, the Tesla accidents are very high profile, so at least on a surface level it seems like you could argue that those are potentially the most unsafe, but without data across the industry, that's impossible to say for sure. It may just be that those happen to be very high profile accidents. It could turn out that, you know, Waymo vehicles get into more traffic accidents than Teslas do, and that would be, you know, something we would be unaware of without this kind of rule. Now, there is a threshold for accidents. The NHTSA is not asking for every single accident to get reported.
Speaker 1: It's really interested in tracking the more serious cases, including those with quote, a hospital-treated injury, a fatality, a vehicle tow-away, an air bag deployment, or a vulnerable road user such as a pedestrian or bicyclist, end quote. So it's not about, like, minor fender benders or paint scrapes or something. It's about the ones that represent a potential threat to life and safety, or serious property damage. Companies are also required to submit monthly reports that detail any and all incidents with self-driving vehicles, in that self-driving mode, that include injury or damage to property. If they do not submit a monthly report, they will be fined nearly twenty three thousand dollars per day that they fail to respond, up to a maximum of one hundred million dollars. Ideally, this sort of reporting will lead to federal and state governments having a better understanding of the performance of autonomous and semi-autonomous vehicles, and whether it's wise to allow them on public roads, or if new legislation should apply to the companies that are operating or offering these kinds of vehicles. It will also give a more accurate view of how effective or ineffective these technologies actually are, something that right now is impossible to do, because for the most part these companies keep all of those sorts of records internally, and obviously the companies have a vested interest in promoting the technologies as being safe, because that's, you know, where they're going to get their revenues. So hopefully this new rule adds some accountability to the picture and ultimately leads to better, safer, and more reliable vehicles. Now, I fully believe that the future of cars is autonomous, but I also believe that we're still a pretty good ways away from having vehicles that can safely operate in all the scenarios that a human driver might find themselves in.
Speaker 1: So this is a good step toward getting more information so that we can start to make decisions about when it is appropriate to kind of transition to an autonomous vehicle reality.

Speaker 1: All right, let's keep this train going. The FCC, the Federal Communications Commission here in the United States, passed rules to require carriers, that is, like cell phone carriers in the US, to combat spoofing. Now, that's when someone uses a system to mask a real phone number, and typically spoofing involves presenting a number to your caller ID that's pretty close to your number. That way, when you get a call and you see it's from a phone number that's similar to yours, it's in your local area code, psychologically you might be more likely to answer that call. You might think, oh, this is probably someone I know, I just don't recognize the number. It's a pretty scummy way for various entities to try and get you on the phone. The FCC said, we've had enough of this, and it gave a mandate to the major carriers to implement systems that would prevent people from spoofing numbers. And now all three major carriers in the US, those being T-Mobile, Verizon, and AT&T, have systems in place to do this. The carriers will verify that a number calling in to you is in fact the actual number of the originating call before allowing it to go through. The protocol that the companies are using is called STIR/SHAKEN, which feels very James Bond-y to me. Maybe James Bondage? No, wait, never mind, scratch that. The deadline for implementing the protocols was yesterday, June thirtieth, for all the major carriers. Smaller regional carriers will actually still have two years to implement the protocols. So it's a good thing that the major carriers are compliant, or else they would be held liable for it. And this is all part of an ongoing battle against robocalls and spam calls and stuff like that.
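Speaker 1: Just to make that idea concrete, here's a toy sketch of the sign-and-verify handshake in Python. Real STIR/SHAKEN uses carrier certificates and signed tokens called PASSporTs, not a shared secret key, so treat the key, the function names, and the phone numbers below as made-up stand-ins for illustration.

```python
import hashlib
import hmac
import json
import time

# Stand-in for a carrier credential. Real STIR/SHAKEN uses X.509
# certificates and public-key signatures, not a shared HMAC key.
CARRIER_KEY = b"demo-key-not-real"

def sign_call(orig: str, dest: str, attestation: str) -> dict:
    # The originating carrier asserts who is really calling.
    # Attestation levels run from A (full confidence) down to C.
    claims = {"orig": orig, "dest": dest,
              "attest": attestation, "iat": int(time.time())}
    payload = json.dumps(claims, sort_keys=True).encode()
    sig = hmac.new(CARRIER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "sig": sig}

def verify_call(token: dict) -> bool:
    # The terminating carrier re-derives the signature before
    # letting the call through to your caller ID.
    payload = json.dumps(token["claims"], sort_keys=True).encode()
    expected = hmac.new(CARRIER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["sig"])

token = sign_call("+12025550143", "+14045550122", attestation="A")
print(verify_call(token))                 # True: the number checks out

token["claims"]["orig"] = "+14045550199"  # a spoofer rewrites the number
print(verify_call(token))                 # False: signature no longer matches
```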
Speaker 1: And I don't know about you, but I almost never answer my phone. I definitely don't answer it if I don't recognize the number; I just let it go to voicemail. And it's pretty rare that I get a voicemail. Most people just give up, or rather most robots give up. I don't think it's even people most of the time. And that says to me that whoever was trying to call me, it wasn't about something important, because they didn't bother to leave a message. Now, all these protocols won't mean the end of nuisance calls, but I think every little bit helps. We've got a few more news items that I want to cover today, but we're running a bit long, so let's take a quick break.

Speaker 1: Okay, we're back. And back in April, hackers managed to gain access to a folder in Sony's network that contained the serial ID numbers for PlayStation 3 consoles. Like, all of them, every number relating to every gamer with a PS3 that was registered with Sony's network. And subsequently, some gamers have noticed that their PlayStation accounts appear to be inaccessible. They receive an error message, and that error message has a number: eight zero seven one zero zero six. Some players, after going to re-register and activate two-factor authentication, managed to regain control of their accounts, but for others it's a bit more complicated. It looks like hackers have started to use some of those exposed accounts in order to act as proxies for malicious activity, essentially stuff that violates Sony's policies. So Sony sees that these particular IDs, connected to these particular consoles, are doing shady stuff, and it has gone and started to ban those accounts.
Speaker 1: Unfortunately, that means that the legitimate players who had their information exposed are now not able to get access to their accounts, even though they didn't do anything wrong. And considering that some of those accounts likely belonged to players who are minors, that's another really thorny element to the story, because the information is out there now, and knowing that credentials are making the rounds on the dark web isn't great. It appears as though Sony took no real steps to secure this folder, which made it easy pickings for the hackers once they actually gained access to Sony's systems, and that too has raised some pretty tough questions. I have not yet seen a company response from Sony on this issue, but I will continue to update the story if we learn more.

Speaker 1: Speaking of data breaches, the social network LinkedIn has experienced something similar to one, though technically it's not actually a breach. The company saw a massive data scrape; that is, someone collected a ton of data off of LinkedIn profiles, and that happened back in May, when hackers scraped data from half a billion LinkedIn user profiles. But now a VPN site called Privacy Sharks says that a user with the handle TomLiner showed that they have a document that includes seven hundred million LinkedIn user records in it. That represents the vast majority of all LinkedIn users. So, in other words, if you have a LinkedIn account, the odds are very good that your information is in that big file. TomLiner, by the way, is planning to sell that information off, because data has a price. It's just that those of us who are actually creating the data rarely get to reap the benefits of that data, apart from, you know, having the privilege of using these various online platforms. That is, the same ones that are unfortunately allowing that data to roam free. The information contained within TomLiner's collection includes stuff like full names, phone numbers, and addresses, that kind of thing.
Speaker 1: It does seem that this is the result of entities just scraping public data off of LinkedIn profiles, which is similar to something I talked about recently, when LinkedIn went after another company that essentially did the same sort of thing. So in a way, you could say that the people who are selling this information are able to do so simply because we as users have allowed this information to be public. It's just that the people who are collecting this information were able to do it at scale, and they likely exploited LinkedIn's application programming interface in order to create an app that automatically ended up collecting all this data. Since it's not a breach, this gets a little complicated. I mean, LinkedIn didn't make our data vulnerable so much as it had an API that facilitated the mass collection of data that was publicly available on the platform. What does this mean for you, if your data was in there? It probably means you're going to end up on a lot of other lists, for stuff like spam and robocalls, possibly with phishing or maybe even spear phishing attacks. Spear phishing is when you've actually got some data about your target to work with, which means you can craft your attack to be potentially more effective and increase its likelihood of success. It's a bummer.

Speaker 1: Next, the investing platform Robinhood is going to have to pay almost seventy million dollars in fines to the Financial Industry Regulatory Authority, or FINRA, in order to settle allegations that Robinhood caused quote, widespread and significant harm, end quote, to customers, mostly through misinformation. That misinformation includes charges that the company misled customers on how much money they had in their accounts, and on whether or not they would be allowed to place trades on margin. FINRA found that the misinformation meant customers collectively lost around seven million dollars, and honestly, I think that's supposed to be the opposite of what investment firms tend to promise their customers.
Speaker 1: It's a rare thing to see an investment platform say, use us, we can lose your money more effectively than anyone else can. Robinhood was in the news earlier this year when independent investors were jumping on the GameStop stock roller coaster. Robinhood ended up placing tight restrictions on GameStop stock trades, meaning if you were an independent investor and you wanted to get in on the GameStop action, you hit a major roadblock. The company's official statement at that time was that it was trying to protect the company and investors in a situation that had volatile stock trades and also some regulatory issues. But some skeptical investors argued that major parties that own a stake in Robinhood had a vested interest in GameStop stock going down in value, and thus the company was acting in a conflict of interest with its customers.

Speaker 1: Google is rolling out a new feature for Android users, starting in the United States, that will let them store a digital vaccine card called a COVID Card. It will also support storing COVID test results. The card will include information such as where you received your vaccination, when that vaccination happened, and possibly some other information, like which vaccine you received. Google designed the system so that the information remains on your device. In other words, there's no copy of this information sent to the cloud that could represent a possible privacy risk. Google will collect some meta information about the app, such as how frequently you actually use the card.
Speaker 1: This could come in handy should vaccine cards become a requirement for certain things, you know, like travel to specific countries or attending specific events. Though here in the US we're seeing an awful lot of resistance to those sorts of things, going so far as to have some state governors declare that a vaccination requirement will not be tolerated for big events, because I guess it's just more important for potentially sick people to be able to go to these public events than it is to ensure the safety of those who are already in attendance. I don't know; I can't see the logic in it.

Speaker 1: Next up: half a century ago, Stephen Hawking proposed a theorem about a black hole's event horizon, that is, the boundary around the black hole that marks the point of no escape. If you cross the event horizon, you will be pulled into that black hole; you cannot escape. That theorem says that the event horizon would never shrink. The math worked out that way, but that was all you could say about it, that mathematically it makes sense. And for fifty years that's all we had to go on, because we had no real means of making observations that could either confirm or deny that theorem. But that has changed. Scientists at MIT and a few other research facilities announced, in a paper published in Physical Review Letters, that they have observed gravitational waves that confirm Hawking's theorem for the first time. Now, I wish I could explain the process of how they did this, but I read over it about three or four times and it's well beyond my ability to really understand it. So rather than attempt to explain something and then get it all wrong, I'm instead just going to report that the scientists say they have high confidence that their observations mean that Hawking's theorem is correct.
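Speaker 1: The theorem itself is easier to state than the observation. For a simple non-rotating black hole, the horizon area grows with the square of the mass, and Hawking's area theorem says the total horizon area can never go down. So when two black holes merge, the final horizon has to be at least as big as the two original horizons combined, and that inequality is the kind of thing the gravitational wave measurements were checking.

```latex
% Horizon radius and area of a Schwarzschild (non-rotating) black hole,
% with M the mass, G the gravitational constant, c the speed of light:
r_s = \frac{2GM}{c^2}, \qquad A = 4\pi r_s^2 = \frac{16\pi G^2 M^2}{c^4}
% Hawking's area theorem: total horizon area never decreases, so for a
% merger of two black holes with horizon areas A_1 and A_2:
\frac{dA_{\mathrm{total}}}{dt} \ge 0
\quad\Longrightarrow\quad
A_{\mathrm{final}} \ge A_1 + A_2
```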
Speaker 1: This is another example of how super duper smart people have looked at the world and the universe, used mathematics to describe it, and then predicted something that was unobservable, or at least unobservable at the time, and then later we found out it proved to be true. Einstein is famous for this kind of stuff, and it really floors me. And I suppose it shows that there is a logical order when you get down to the nuts and bolts of the universe, and if there weren't, then stuff like math just wouldn't work. Really, nothing would work the way we understand it; reality would be fundamentally different. I may see if I can get an expert on TechStuff at some point to talk about this in greater detail, to kind of answer questions about it and the actual process of validating or confirming Hawking's theorem, because I would love to learn more myself, and I just know that I lack the understanding necessary to be able to explain it properly.

Speaker 1: More than a decade ago, I wrote an article for HowStuffWorks about quantum computers, talking about stuff that's difficult to understand. At the time when I wrote the article, there were only a few very rudimentary quantum computers even in development. A quantum computer's basic unit of information is the qubit, or quantum bit. A normal bit is either a zero or a one; it has to be one or the other. Qubits can be different. They can technically inhabit both values simultaneously. In fact, they can technically inhabit all values in between zero and one. That means that with the right kind of computational problems, a quantum computer could arrive at solutions far faster and more reliably than a classical, you know, computer could. It wouldn't be great for everything, right? A quantum computer would not be better than a classical computer in every application. You wouldn't want to use a quantum computer to run the latest version of Call of Duty, for example. But for the right subset of computational problems, it could change the world.
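Speaker 1: To make that "all values at once" idea a little more concrete, here's a tiny NumPy sketch of the linear algebra underneath a single qubit. To be clear, this is just a simulation running on an ordinary computer, not how you'd program real quantum hardware, and the fifty-fifty example is my own illustration.

```python
import numpy as np

# A qubit is a 2-element vector of complex amplitudes: [amp_0, amp_1].
ket0 = np.array([1, 0], dtype=complex)        # the state |0>

# The Hadamard gate puts |0> into an equal superposition of 0 and 1.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0              # amplitudes are now (1/sqrt2, 1/sqrt2)
probs = np.abs(state) ** 2    # Born rule: squared amplitudes give the
                              # probabilities of measuring 0 or 1

# Measuring collapses the qubit to a plain 0 or 1; repeat 1000 times.
rng = np.random.default_rng(seed=1)
samples = rng.choice([0, 1], size=1000, p=probs)
print(probs)                  # [0.5 0.5]
print(np.bincount(samples))   # roughly 500 zeros and 500 ones
```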
Speaker 1: Now, IBM has become the first company to make practical use of a quantum computer in a way that proves it is advantageous over classical computing. The researchers at IBM published their work in Nature Physics, in an article titled "Quantum advantage for computations with limited space," which sounds like they were trying to use a computer in my old dorm room at college. But no, they performed an experiment in which they pitted a classical computer against a quantum computer. They tried to control for as many variables as possible, and they found that for a specific subset of computational problems, the quantum computer could perform flawlessly, while the classical computer would have at best an error rate of around twelve percent. Quantum computers have the potential to disrupt fields like encryption, because a sufficiently powerful quantum computer could be able to break encryption, essentially through brute force, in a fraction of the time it would take a classical computer. For classical computers, you might be looking at, you know, tens of thousands of years to do this, whereas a quantum computer might do it in a few minutes. Now, if that's the case, if that in fact does come to pass, then it will necessitate a massive change in how we encrypt digital information, because it's effectively the same thing as handing out skeleton keys for all the locks that are out there. Which is both scary and exciting stuff.

Speaker 1: Finally, Tim Berners-Lee recently auctioned off an NFT, or non-fungible token, representing the original source code for the World Wide Web. This NFT sold for five point four million dollars at auction. And a quick reminder: an NFT is kind of like a receipt or a certificate of ownership. It doesn't actually give anyone access to the thing it represents, necessarily, nor does it prevent anyone from making copies of that thing. It's more like the representation of ownership. And you might ask the question, well, what practical good is that? My response is, well, NFTs have become a bit of a commodity, though I would say they aren't quite in the same space as they were a couple of months ago, when people were really going gaga for them; that's when the popularity really spiked. They are built on top of a blockchain, and that means it is easy to trace and authenticate the ownership of an NFT. So while you might not be able to do anything practical with the thing that the NFT represents, you could at least point to the blockchain and say, but I for reals own it, and you can see right here, because it's part of the blockchain of transactions. So I don't know. Humans are weird.
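Speaker 1: And since that ownership trail is really the whole point, here's a toy Python sketch of why a blockchain-style record is easy to audit. Each record commits to the previous one through a hash, so rewriting history anywhere breaks every later link. This shows only the general chaining idea, not a real NFT standard like Ethereum's ERC-721, and every name and value below is made up.

```python
import hashlib
import json

def record(prev_hash: str, owner: str, item: str) -> dict:
    # Each record commits to its contents plus the previous record's hash.
    body = {"prev": prev_hash, "owner": owner, "item": item}
    digest = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

def chain_is_valid(chain: list) -> bool:
    for i, rec in enumerate(chain):
        body = {"prev": rec["prev"], "owner": rec["owner"],
                "item": rec["item"]}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["hash"] != expected:
            return False                     # a record was tampered with
        if i > 0 and rec["prev"] != chain[i - 1]["hash"]:
            return False                     # the links don't line up
    return True

genesis = record("0" * 64, "creator", "www-source-code-nft")
sale = record(genesis["hash"], "auction-winner", "www-source-code-nft")
chain = [genesis, sale]
print(chain_is_valid(chain))    # True: anyone can audit the ownership trail

chain[0]["owner"] = "impostor"  # try to rewrite history...
print(chain_is_valid(chain))    # False: the hashes give the game away
```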
Speaker 1: And that wraps up the news for Thursday, July first, twenty twenty one. I hope all of you out there in the United States have a safe Fourth of July weekend as you celebrate. Do be careful if you are celebrating. For those of you with dogs, I wish you a quiet night, or a quiet weekend, because I know how my dog is with fireworks going off nearby, so my heart goes out to you. For everyone everywhere else, obviously, have a safe weekend and a happy one; you just probably won't be celebrating the Fourth of July. And if you have any suggestions for future topics on TechStuff, reach out to me. Do so on Twitter; the handle is TechStuff HSW, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.