Speaker 1 [00:00:04]: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1 [00:00:11]: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are ya? It's time for the tech news for Tuesday, August 23, 2022.

Speaker 1 [00:00:27]: Apple's semiconductor chip manufacturing company of choice, which is TSMC (and to be fair, that's kind of the choice for pretty much every company that's making advanced chips, which is a problem in its own right, but that's a matter for a different podcast episode), is reportedly working on a three-nanometer chip for upcoming MacBook computers. Now, the reason I decided to include this is that I wanted to do a little deconstruction of the nomenclature we use for chips, because it is wildly misleading. For ages, the semiconductor industry has differentiated chips by using the size of nodes as the naming convention, and by nodes, we're really talking about the length of transistor gates. So the length of the transistor gate, in whatever unit, was an indicator of the chip's sophistication.
Speaker 1 [00:01:26]: Generally speaking, you know, the more you can cram onto a chip, the more powerful the chip can be. That's not always the case, but that was kind of the rule of thumb. Now, does this mean that a three-nanometer chip has a transistor gate that is three nanometers long? No, it does not, because for more than a decade, companies have shifted away from focusing almost exclusively on reducing component size and have looked more at stuff like chip architecture and increasing the density of transistors and that sort of thing. So chips are still getting more powerful and more sophisticated, but the transistor gates aren't shrinking at the same crazy rate they were before. However, the naming convention, where we use that transistor gate size as the name for the next generation of manufacturing processes, has stuck around. So if your transistor gates aren't getting that much smaller, but you're still dependent upon that naming convention, then the names rapidly stop matching the measurements. That means a ten-nanometer chip doesn't necessarily have transistor gates that are ten nanometers long.
Speaker 1 [00:02:43]: In fact, some of them have transistor gates that are nearly twice as long as that, so it's really just a naming convention. But a lot of folks think this naming convention is dumb because, for one thing, it's not accurate. For another, since we keep reducing the size, and now we're talking about a three-nanometer process, we're running out of nanometers. We're about to get down to the atomic scale, y'all, because a nanometer is one billionth of a meter. And it also means that consumers have been really confused for a while and often draw the wrong conclusions, because you can have a so-called ten-nanometer chip from Company A and a seven-nanometer chip from Company B, and because there's this implication that the smaller number means more powerful chips, you would naturally think the seven-nanometer chip is superior. But that's not necessarily true, because we're really talking about things like architecture and power efficiency, and even the size of the components on the ten-nanometer chip could be smaller than those on the seven-nanometer chip.
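To make that concrete, here's a small sketch in Python. The node labels match the marketing names from the story, but the gate lengths are hypothetical placeholder numbers, not real measurements from any company; the point is only that ranking chips by node name tells you nothing about ranking them by physical feature size:

```python
# Hypothetical spec sheets for two chips. "node" is a marketing label;
# gate_length_nm is an invented physical measurement for illustration.
chips = [
    {"maker": "Company A", "node": "10nm", "gate_length_nm": 14},
    {"maker": "Company B", "node": "7nm", "gate_length_nm": 16},
]

# Sorting by the marketed node number...
by_node = sorted(chips, key=lambda c: int(c["node"].rstrip("nm")))
# ...does not match sorting by the actual gate length.
by_gate = sorted(chips, key=lambda c: c["gate_length_nm"])

print([c["maker"] for c in by_node])  # smallest node *label* first
print([c["maker"] for c in by_gate])  # smallest physical gate first
```

With these made-up numbers, the "7nm" part sorts first by name while the "10nm" part actually has the smaller gates, which is exactly the consumer confusion described above.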
Speaker 1 [00:03:53]: Because you're talking about two different companies, and you're just talking about them using a naming convention to market a new generation of semiconductor chip, it isn't actually measuring anything. So a ten-nanometer chip and a seven-nanometer chip from two different companies could be made in such a way that the seven-nanometer isn't necessarily superior to the ten-nanometer. That's why it can get confusing. It's this marketing strategy that creates confusion, and it perpetuates confusion. So yes, I just used a news story to give a quick lesson on why the semiconductor industry is using misleading marketing material, and why you should do research before you choose a processor rather than just going off the supposed node size.

Speaker 1 [00:04:40]: Former Apple employee Xiaolang Zhang has pled guilty to charges of stealing proprietary information from Apple for the purposes of sharing it with another company, one in China for that matter. Now, this story actually started back when Zhang was first arrested.
Speaker 1 [00:05:00]: Zhang had returned to Apple after taking a trip to China, and then, not long after his return, he resigned from Apple. He had also started sending corporate documents to his wife's computer, including documents that, as far as I can tell, focused exclusively on Apple's worst-kept secret: the autonomous electric vehicle project that everyone knows about but Apple has never publicly acknowledged. That included a twenty-five-page document with detailed schematics of a circuit board that Apple was designing for the vehicle. Moreover, Zhang had told Apple that he was going to return to China and that he was going to work for a company called XPeng Motors, an electric vehicle manufacturer that's also developing an autonomous vehicle. Zhang had previously pled not guilty to the charges after being arrested, but now he has changed his plea to guilty, and he faces up to ten years in prison and a fine of up to $250,000.
Speaker 1 [00:06:02]: And you might remember the story of former Google employee Anthony Levandowski, whom Google accused of stealing documents from its autonomous car project, the one that would become Waymo. Levandowski subsequently worked for Uber, and that led to a nasty court battle between Google and Uber, plus Uber unceremoniously ending its relationship with Levandowski. Levandowski was subsequently tried and convicted of stealing documents, but then former President Trump pardoned Levandowski on the last day of his presidency. Anyway, it looks like autonomous vehicle research is the hottest target for industrial espionage in tech right now, so I guess it's fashionable.

Speaker 1 [00:06:48]: A senior fellow for the Irish Council for Civil Liberties named Johnny Ryan has spearheaded a class action lawsuit in the United States targeting the computer technology company Oracle. Now, in case you're not familiar with Oracle, it is primarily a B2B kind of company, business to business, meaning its clients are other businesses, and it works in software and database management and cloud services as well as hardware.
Speaker 1 [00:07:13]: Ryan's lawsuit alleges that Oracle has illegally been collecting the information of around five billion people; essentially, that Oracle is assembling dossiers on folks. Those dossiers can contain information like names, physical addresses, email addresses, political views, purchase history, and geolocation data, meaning that Oracle has been tracking people, or at least has access to tracking information, so they know where people have been, as well as records of online activity. Essentially, all the personal data stuff we talk about in other news stories. So Ryan is claiming Oracle is collecting all of that and organizing it into what he calls dossiers. Ryan has brought this lawsuit to California, probably because that is the US state with the strictest privacy laws, and this is a massive endeavor. It's too early to say how it's all going to turn out, but some folks at least suspect this is a push to encourage the United States to adopt stronger privacy laws more in line with what we see over in the European Union.

Speaker 1 [00:08:31]: Joshua Benton at NiemanLab.org has a great article.
Speaker 1 [00:08:34]: It's titled "Are you legally liable for the contents of every web page you link to? Australia finally gets sensible." All right, some backstory on this. Back in the first decade of this millennium (I just hate saying "the early two thousands"), an Australian lawyer named George Defteros was arrested and charged with conspiring to commit murder. Defteros was known as a lawyer who represented people accused of belonging to organized crime gangs. Anyway, an Australian newspaper published an article about Defteros alleging that he was in fact part of this conspiracy and such, and Google ended up linking to that article in its search results, because Google indexes the web, and when people do searches for things, you get the links. Well, flash forward many, many years. Defteros, by the way, had all of the charges against him dropped, so he never stood trial on them.
Speaker 1 [00:09:42]: His lawyers were seeking to have this article removed from the Internet. They went to the newspaper and demanded that it take the article down, and the newspaper said no. So when that proved fruitless, they went after Google, and their argument was that Google, by publishing a link to this article, was kind of endorsing the article; that Google itself was acting as a publisher; and that it was almost as if the offending piece had come from Google because it was linking to it. That's kind of wild, right? That a link can somehow imply you're responsible for the material the link goes to. Initially, a court in Australia ruled that Google was in fact responsible, but then the case was appealed. It went to Australia's High Court, and the High Court reversed that ruling and essentially said this is ludicrous: if we follow this logic, anyone who links to anything that is later claimed (not even proven, just claimed) to be defamatory shares responsibility and therefore could be sued for libel.
Speaker 1 [00:10:55]: That seems pretty extreme, doesn't it? That a link alone could make you responsible for libel. So what if you were to come across a link to a story and you shared it on your social media platforms, like on Facebook or on Twitter? Maybe you saw the story, thought it was interesting, and wanted to share it. Well, if this earlier court ruling had been upheld, it would have set a precedent suggesting you could be found guilty of libel yourself just by sharing the link, and that you could potentially face charges for it even though you didn't write the supposedly defamatory material. By the way, a big part of this story is that while the lawyers were claiming the article was found to be defamatory, that question never actually went to court; it was settled out of court. So the claims were unproven, and yet they still went ahead and got this initial decision from the court, which was then overturned by the High Court. So it's good that the High Court saw this for what it was, or at least five of the seven judges saw it for what it was.
Speaker 1 [00:12:07]: Two of them dissented and argued that Google was in fact responsible. Not sure what they were thinking. Okay, we've got a few more news stories to go, but before we get to that, let's take a quick break.

Speaker 1 [00:12:29]: We're back. US automaker Ford announced it is laying off three thousand employees, which includes around two thousand salaried positions and one thousand contractors. The company says this is all part of its strategy to pivot from focusing primarily on internal combustion engine vehicles and to put more emphasis on electric vehicle production. Ford CEO Jim Farley denies that the cuts are a cost-saving move; rather, he says they indicate how serious Ford is about fundamentally changing course by committing to the future of electric vehicles. My heart goes out to all the folks who got their walking papers. It is an increasingly tough job market, particularly when other auto manufacturers like Tesla have also been laying off employees or making other kinds of cost-saving cuts.
Speaker 1 [00:13:17]: The CEO of the cryptocurrency exchange company Binance says that LinkedIn is absolutely swarming with people falsely claiming to be Binance employees. And I'm not joking about swarming: he says there are about fifty real profiles belonging to Binance employees on LinkedIn, but in total there are closer to seven thousand claimed Binance employees, which is a big old yowza. So why would people be lying about working for Binance? Well, it's probably part of crypto scams. The scammers are likely listing Binance on LinkedIn to give themselves a sense of legitimacy when they're talking to their marks, their targets, tricking people into pouring money into various schemes, usually types of Ponzi schemes. If you don't know what a Ponzi scheme is, it's a subset of pyramid schemes: a scammer convinces a group of investors to pour money into, you know, whatever it is; in this case, a cryptocurrency scheme.
Speaker 1 [00:14:18]: Then the scammer convinces a second round of investors to do the same, pays a percentage out to the first round of investors to keep them happy while pocketing the rest of the money, and then keeps going, and so on and so forth. Effective scammers can often convince investors to reinvest in the scheme, so investors take the money they're supposedly being paid out as the scheme "pays off" and put it back in, which just gives more money to the scammers. Ultimately, these schemes all collapse in on themselves; they cannot sustain themselves forever. And so the Binance CEO is warning followers not to assume someone really is a Binance employee just because it might say so on a LinkedIn account, particularly if that supposed employee is trying to coerce people into pouring money into a crypto investment scheme. This is pretty tricky, because LinkedIn doesn't verify work or education history. LinkedIn does claim to respond to reports of false accounts, and says it looks for false accounts, but yeah, if there are seven thousand fake ones out there, that's a pretty big problem.
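The Ponzi dynamic described above lends itself to a toy simulation. This is a sketch with made-up numbers, assuming a scheme where payouts to earlier investors are funded entirely by new deposits; it collapses in the first round where new money can't cover the promised returns:

```python
# Toy Ponzi scheme: nothing is invested; "returns" to earlier investors
# come solely out of new deposits. All figures here are invented.
def run_ponzi(deposits_per_round, payout_rate=0.10):
    """Return (scammer_take, collapse_round). collapse_round is None if
    the scheme survives every round in deposits_per_round."""
    total_invested = 0.0
    scammer_take = 0.0
    for rnd, new_money in enumerate(deposits_per_round):
        owed = total_invested * payout_rate  # payouts promised to earlier rounds
        if owed > new_money:                 # new deposits can't cover payouts
            return scammer_take, rnd
        scammer_take += new_money - owed     # scammer pockets the difference
        total_invested += new_money
    return scammer_take, None

# Growing recruitment keeps it afloat; once recruitment dries up, it fails.
take, collapsed = run_ponzi([100, 200, 400, 800, 20, 20, 20])
print(take, collapsed)
```

With these numbers the scheme survives while deposits double each round and collapses the moment recruitment dries up, which is the "cannot sustain themselves forever" point from the story.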
Speaker 1 [00:15:34]: And you know, folks fib on resumes all the time, I get it. But this goes well beyond that.

Speaker 1 [00:15:40]: Japanese company Fujitsu has partnered with RIKEN, a research institute, with the intent of developing and selling a quantum computer boasting 64 qubits starting next year. Now, to brush up on quantum computers: the fundamental unit of classical computing is the bit, and a bit can be either a zero or a one. The fundamental unit of information in a quantum computer is the qubit, which, thanks to quantum effects, can essentially be a zero and a one at the same time under specific circumstances. And I'm being very high level with this, but when paired with the right algorithms, that kind of computer, a quantum computer, can potentially solve a subset of computer problems far faster than a classical computer can. It's essentially evaluating all potential solutions at the same time and then presenting the one that is most likely to be the best. It deals with probabilities, not certainties. It gets very wibbly wobbly.
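That zero-and-one-at-once idea can be sketched with a little amplitude arithmetic. This is a deliberately simplified single-qubit model in plain Python, not how you would program real quantum hardware: a qubit's state is a pair of amplitudes, and the measurement probabilities come from squaring their magnitudes.

```python
import math

# A single-qubit state is two amplitudes (a, b) for the |0> and |1>
# outcomes, with |a|^2 + |b|^2 == 1. Measuring yields 0 with
# probability |a|^2 and 1 with probability |b|^2.
def probabilities(a, b):
    return abs(a) ** 2, abs(b) ** 2

# An equal superposition (what a Hadamard gate produces from |0>):
h = 1 / math.sqrt(2)
p0, p1 = probabilities(h, h)
print(round(p0, 3), round(p1, 3))  # 0.5 0.5 -- "both at once" until measured
```

The "zero and one at the same time" talk is really this: until you measure, the state carries both amplitudes, and the outcome is probabilistic, which is the probabilities-not-certainties point above.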
Speaker 1 [00:16:45]: It's also important to remember that quantum computers are no better than classical computers for other types of applications, other types of computer problems. You would not be using a quantum computer as a gaming rig, for example. But they do potentially have the ability to change really important things that we depend upon, like encryption, in the near future.

Speaker 1 [00:17:09]: NASA has narrowed down potential future lunar landing sites to thirteen regions, all of which are not too far from the Moon's south pole. I like to think they were on the lunar equivalent of Zillow at the time. Scientists believe the region is perfect for future Moon missions because the deep craters in the area could potentially hold hydrogen and water ice. That kind of stuff would be useful if you wanted to make your own rocket fuel, for example, or if you wanted to process water ice to create not just water but maybe oxygen. This falls in line with the goals of the Artemis campaign, which has some really ambitious targets, including creating a base of operations suitable for long-term stays on the Moon.
Speaker 1 [00:17:49]: NASA has been planning this out for years, and in fact the launch of Artemis I, which will be an uncrewed Orion vessel on top of the Space Launch System, a super heavy-lift launch vehicle, is scheduled for Monday of next week, if everything goes to plan. The actual return mission to the Moon, in which humans will head back up there, is designated Artemis III, and it's not expected to launch until 2025 at the earliest.

Speaker 1 [00:18:21]: Sony has announced, via Instagram of all things, that its new generation of VR hardware for the PlayStation console is likely to launch in early 2023. This generation of hardware is going to work with the PlayStation 5. It's reportedly softer, with a better ergonomic design than the earlier generation of Sony's VR peripherals, and Sony says the headset will display graphics at a 4000 by 2040 resolution, which breaks down to 2000 by 2040 per eye, and it will have a refresh rate of 90 or 120 hertz.
Speaker 1 [00:18:54]: It's also going to have a see-through mode, so if you get too close to the wall, it'll show you, so you don't bump your nose in there. We don't have any information yet on how much it will cost; my guess is it will be a few hundred dollars, so here's hoping we find out soon. We know that it's coming in early 2023, and that's about it.

Speaker 1 [00:19:13]: Well, that is the news for Tuesday, August 23, 2022. Hope you are all well. Make sure you reach out to me with any suggestions you have for future episodes of TechStuff. You can do that on the iHeartRadio app or on Twitter at TechStuff HSW, and I'll talk to you again really soon.

Speaker 1 [00:19:39]: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.