Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. And this is the tech news for Thursday, August fifth, twenty twenty-one. We've got a lot of it, including some really crazy quantum physics news that I am going to do my best to try and impart unto you, dear listener. But let's get started with some Facebook stories, because I just can't seem to go an episode without talking about Facebook.

So one of these stories is likely not going to be a surprise to any of you listeners at all, and that's the fact that many law enforcement agencies are crawling social networking sites to collect images of people and use those images for the purposes of facial recognition systems. The MLive site, which covers news in Michigan here in the United States, published an article this morning about this, quoting Eric Goldman of the High Tech Law Institute at Santa Clara University. Goldman said that if you have a public social networking account, such as a Facebook account that posts publicly, it stands to reason that a lot of different people and organizations have grabbed your photos at some point, because they are all publicly available. Publicly available information is, you know, public, so anyone can see it. And we've seen numerous groups use software to crawl through enormous platforms to pull down mountains of data for the purposes of identification and analysis, and those are acts that platforms frequently resist for a variety of reasons. There's an AI platform that has done this and has gotten in trouble for it in the past because of that.
I should also add that while I say "if you post publicly," that's not even the full picture, obviously, right? Let's say you don't even have a Facebook profile. You could be in photos that your friends take, and those photos might be on their profiles, and they might be public, which means that those photos can be captured and used for facial recognition purposes. So you don't even have to be involved in this. You have no say at that point if someone is posting your photo publicly, and it's fair game for a lot of these organizations, at least as far as they're concerned.

Now, we've also seen that many facial recognition systems have an element of bias in them, with a tendency to be less accurate for people of color. And that might not sound, you know, terrible, like, oh, it's less accurate. But then you figure that a less accurate system means that sometimes a person could be inaccurately identified as, say, a person of interest in a legal case, and then you can really see how it does become a problem. Such a misidentification can lead, in a best case scenario, to an inconvenience. That's the best case scenario. The worst case scenario ends up being incredibly harmful and disruptive. I mean, that could even include wrongful arrests, which have happened as a result of facial recognition software mistakenly identifying someone as someone else. And the fact that this disproportionately harms specific communities, like communities of color, that's a real issue. Now, as the piece on MLive points out, there have been documented cases of police action based off incorrect facial recognition analysis already in Michigan. There's a growing resistance to the use of this technology.
There's been a lot more pressure on local and federal governments to outlaw its use. There's also been pressure on companies to not make that technology available to law enforcement agencies. And this ties in with several other stories that are unfolding around the world, many of which tap into this issue of wanting to lean on technology in order to solve a particularly difficult social problem. But the issue is that in some cases the technology isn't necessarily reliable enough to depend upon. So I may have to do a full episode dedicated to that concept, because there are a lot of different factors at play here. There's a genuine desire to right a social wrong or to prevent a particular social problem from happening. There's a perception that the technology could be the solution to this. And then there's the reality of technology that may not be reliable enough for us to really leverage it that way, or else we have these exceptions pop up that then affect innocent people in harmful ways.

Moving on, The Guardian published an article about how the fossil fuel industry was able to leverage Facebook to push climate misinformation. The article cites a group called InfluenceMap, which is based in London and which uncovered this issue. InfluenceMap said that there was an increase in advertising by fossil fuel companies like ExxonMobil during the election season, with the campaigns specifically creating a misinformation narrative to shape the conversation around climate change and the fossil fuel policies that were meant to address that problem. Further, InfluenceMap alleges that this campaign violated Facebook's own policies with regard to false advertising, and yet Facebook did not remove or label these posts as being misleading, which allowed those views to really take hold within certain communities on Facebook.
The organization also found something that a lot of people have observed over the years: that the messaging around conservation and, you know, having an impact on climate change was really shifting the focus to the individual rather than to companies and industries. In other words, putting the burden of addressing climate change solely on individual people rather than both on people and on corporations. So this is kind of the "turn off the lights after you leave a room" or "only you can prevent forest fires" kind of approach, rather than placing regulations on, or otherwise taxing, companies that are consuming vast amounts of electricity. That would be the other message. We frequently see this kind of personalized approach to messaging, and yeah, we all do have a part to play here. But when that messaging completely ignores how industry contributes, to a large extent, to climate change issues, then it becomes misinformation, because it's purposefully omitting important details. If you don't say, hey, the transportation industry ends up being a massive contributor to carbon emissions, and you're instead saying turn off the lights and don't run stuff when you're not using it, then you allow the major contributing factor to go on unhindered, and the problem worsens as a result.

So Facebook has received some pretty critical opposition to this trend already and will likely face even more scrutiny in the future. And it should. I mean, it's cashing those checks, right? The advertising checks are coming in and Facebook is cashing them, so it does definitely play a part. Oh, and the CNN Business story about InfluenceMap's look into Facebook reveals that these fossil fuel ads showed up more than four hundred thirty-one million times for US users, which is not a bad return on investment for the nine point six million dollars that was the given amount for those collective ad campaigns. Now, think about this.
A commercial during the Super Bowl costs around five million dollars for a thirty-second spot, and the last Super Bowl had around ninety-six point four million people watching it. So for a little less than twice the cash, you can get roughly four times the number of views by going through Facebook, which is not a bad deal, right? Meanwhile, the petroleum industry companies have spoken out against the InfluenceMap study, claiming that these companies are spending a great amount of money on limiting carbon emissions and using carbon sequestration strategies and such, which, you know, could be totally true, but it doesn't change the fact that these companies actively try to undermine climate action. Heck, an Exxon lobbyist essentially spilled the beans on how they do that on video, which then prompted the CEO of Exxon to apologize for it. I'm sure they were sorry that they were caught.

And we are not yet done with Facebook. We've got one more Facebook story here. Bloomberg reported yesterday that Facebook has disabled the personal Facebook accounts of some researchers from New York University who have been studying political ads on that platform. Facebook says this group has been using automated tools to scrape Facebook of data, which goes against Facebook's policy. This is kind of alluding to what I was mentioning earlier about facial recognition technologies; you know, Facebook isn't supposed to allow that to happen. Now, the research team saw their project pretty much hamstrung by Facebook. The company deactivated various apps and pages that were associated with this research project, in addition to deactivating the personal pages of the actual team involved. Now, on the face of it, my dander starts to get up on behalf of the research team. However, I have to admit this story is actually more complicated than that.
See, back in two thousand nineteen, the Federal Trade Commission, or FTC, here in the United States hit Facebook with a fine of five billion dollars, as in billion with a B, which is a princely sum indeed. And the reason for that fine was that the FTC found that Facebook had not done enough to protect users from developers who were relying on Facebook to collect personal information on a large scale. Think of things like Cambridge Analytica. So Facebook has kind of an obligation to push back against groups that are using automated means to collect data off the platform, even if that group has a seemingly good reason for doing so. Facebook requires groups to seek out permission from the company before using those kinds of tools, and this group, you could argue, failed to secure permission. And in fact, Facebook had reached out and given the group a couple of warnings in the past. Now, that being said, Facebook is also under a great deal of scrutiny for how it serves up political ads, how it targets people, and how it chooses who gets to see what. So, as I said, this is a really complicated issue. The study could call more attention to problems with Facebook's methods of accepting and serving ads to users, which the company would likely want to suppress; it doesn't want those problems to be brought into, you know, harsher light. But then Facebook also has this obligation under the FTC, and that at least gives the company a fairly compelling reason to push back. Though one could imagine that this issue could be solved if Facebook and the research group could just get that permission sorted out, but I really doubt that Facebook is eager to do that in this particular case.

We have more stories to cover, but before we do that, let's take a quick break.

Reuters reports that a ransomware attack targeting more than one thousand organizations last month might be just the beginning of a trend of attacks on service providers.
The attack leveraged vulnerabilities in a management software product from a company called Kaseya, which affected more than fifty managed service provider organizations, or MSPs. Now, an MSP is a company that other companies rely upon to outsource services of some sort; they could be human resources services or IT services. And that means that the MSP represents a really juicy target for hackers. Sure, you could attack a specific company. Let's say that you wanted to target, I don't know, Intel. You want to target Intel and you want to compromise one of Intel's systems. Well, that's a single target. But let's say you go after an MSP, which could service dozens or hundreds of clients, and Intel happens to be one of them, but it also services all these others. Well, you can disrupt all of those companies by targeting the MSP specifically. If they are providing a service to all these companies and you interrupt it, you've affected not just one organization but potentially hundreds of them. This puts immense pressure on the MSP companies to solve the problem quickly, because if all of their customers are depending upon those services, and you've got like a hundred important clients, you might be answering calls all day long because they can't get their widgets to work because of a problem with your product. You have a high motivation to fix the issue, and that might include paying off an exorbitant ransom. Or worse, the hackers might use the MSPs as a launch point and find a way to use an MSP product to mount a supply chain attack and thus infect the MSP's customers as well. So the MSP becomes the entry point, and then all of its customers become targets. That multiplies the number of targets that the hackers are able to actually compromise, and potentially multiplies the number of ransoms that they could receive as a result.
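Just to put rough numbers on that fan-out, here's a minimal back-of-the-envelope sketch in Python. The only figures taken from the reporting are the "more than fifty" MSPs and "more than one thousand" downstream organizations; the average number of clients per MSP is a made-up placeholder, purely to show how quickly the multiplication adds up.

```python
# Toy illustration of MSP supply chain fan-out, not data from the Reuters report.
msps_compromised = 50   # "more than fifty" MSPs hit through the vendor's product
clients_per_msp = 20    # hypothetical average, purely for illustration

downstream_orgs = msps_compromised * clients_per_msp
print(f"1 software vendor -> {msps_compromised} MSPs -> "
      f"~{downstream_orgs} customer organizations exposed")
# With ~20 clients per MSP, the total already lands in the ballpark of the
# "more than one thousand organizations" figure from the reporting.
```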
Reuters states that hacker groups are feverishly researching ways to target MSPs for just that reason, which suggests we should be on the lookout for similar attacks in the future, and it should serve as a warning to all organizations, MSP and otherwise, to be extra cautious with security.

Moving on, the state of Massachusetts is seeing a group called the Massachusetts Coalition for Independent Work lobby to have a special ballot measure introduced that would create some special exemptions for companies like Uber and Lyft, largely excluding them from the restrictions and regulations of the state's labor laws. And you will be shocked, shocked, I tell you, to learn that Uber and Lyft run this special interest group, whose name makes it sound like it's a worker advocacy organization. The companies have actually done this before. In fact, they did so successfully in California by introducing a measure called Proposition twenty-two, which effectively excluded them and several other gig economy companies from having to abide by the laws that govern other businesses in the state of California. And it looks like the companies are going to spend a lot of money in Massachusetts to fight the same sort of battle there. Whether that works or not remains to be seen, because, generally speaking, there's a growing awareness regarding the companies' efforts to avoid things like acknowledging that drivers are in fact employees as opposed to, you know, contracted freelancers and so on. But we'll have to watch to see how this plays out, because we have to keep in mind these companies already have a track record for selling a political message effectively to the public, and they might do it again.

And now for the latest from the "if misogyny in the workplace is such a problem, why don't more women come forward?" department. Well, to you I submit the story of Ashley Gjøvik, a program manager in Apple's engineering department. According to The Verge, Gjøvik found herself put on administrative leave indefinitely.
In fact, she actually requested that, but we'll get to that. This came after she tweeted about how she found Apple to be a hostile work environment and one that tolerates sexism within the workplace. She had already gone through internal systems numerous times in order to address this, so tweeting this out was kind of a response to the fact that she wasn't seeing progress being made within the company. And when you see a company essentially sideline someone after they come forward with, you know, these issues, that's really one of the reasons why more women don't come forward: because they've been intimidated. They've seen time and time again that the people who speak out end up getting sidelined. Gjøvik tweeted, quote, "So following raising concerns to hashtag Apple about hashtag sexism, hashtag hostile work environment, and hashtag unsafe work conditions, I'm now on indefinite paid administrative leave per hashtag Apple employee relations while they investigate my concerns. This seems to include me not using Apple's internal back end," end quote. Gjøvik also said that the company offered to provide her medical leave rather than address the underlying issues, which is kind of like saying, hey, I know Pete in accounting keeps stabbing you, but we've got these band-aids so you can patch yourself up every time it happens, rather than, you know, going to arrest Pete from accounting. Apple is, of course, just one of dozens of companies currently being called to reckon with its culture, and once again, it's stories like these that feed into that feeling that human resources isn't there to protect the humans who work for the company, but rather to protect the company.

Let us now segue over to a moment I'd like to call "we live in the future," which can sometimes be awe-inspiring and sometimes terrifying. We'll start with something terrifying.
Apparently the Pentagon is making use of an AI application to analyze real-time information in an effort to predict what the enemy will do next, with the Pentagon claiming that the AI can let them see, quote, "days in advance," end quote, as a result. It makes me think of the precogs in Minority Report, except instead of being put to use in law enforcement to predict crime, they are being put to military use to predict what the enemy is going to do in any given situation. And it also makes me think of Grand Admiral Thrawn in the Star Wars Extended Universe. He could look at a culture's works of art and figure out how that culture would react, you know, to any given situation, which is pretty far-fetched. But using AI to analyze what's going on within a specific region and then make a strategic decision seems like it's a little bit less fantastical, right? I mean, you could argue that this is just a much more complicated version of a war game, like a simulated war game. The AI is part of the Global Information Dominance Experiment, or GIDE, and boy, that name is kind of scary. Obviously, the goal is to predict what an enemy might do so that we could prepare against that very move, potentially protecting the lives of soldiers and civilians in the process and maximizing the effectiveness of an armed presence in any given region. We might dissuade someone from doing something. For example, if the analysis says, hey, based upon historical data, such and such country is likely to launch a submarine from this port in the next two days, and you build up a show of force around that area, then you might dissuade them from doing that. That kind of thing. Well, the Pentagon points out that this is work that people have actually been doing for years by analyzing data in search of patterns and trying to make predictions.
But obviously that took a long time in the past. We are now in an era where we can collect and, more importantly, analyze mountains of information rapidly, so we can find those patterns much more quickly and then act on that information. Of course, this requires a very careful design of the algorithm, and it's always possible for an algorithm to mistake noise for a meaningful pattern. But that's why this initiative is in a testing phase. And it concerns some critics of artificial intelligence in use with, you know, military applications, because they see it as a potential starting point for a slippery slope in which AI ends up not just pointing out possible enemy decisions as a result of what's going on in the world, but then going on to potentially making life-or-death decisions of its own, and then potentially even getting to the point where it enacts those decisions without human oversight, because obviously human oversight slows things down. So if you have really powerful AI making all the decisions, and you have at least assumed that the AI is infallible, you rely on the AI. We're a ways off from that sort of dystopian future, but being wary of that eventuality is probably not a bad idea. So while you could argue that it's premature to worry about that kind of stuff, the counter to that is, well, if we don't bring it up now, we could see this evolve slowly enough that it happens before we're even aware of it.

I've got a couple more stories to tell, but before we get to that, let's take another break.

Okay, it is time, my friends, and by time, I mean it's time to talk about time crystals, which, full disclosure, I did not even know were a thing before this morning.
Apparently, time crystals are a phase of matter, kind of like, you know, how gas is a phase of matter, or solids, liquids, and plasmas. And the most likely context in which we would observe such a thing as a time crystal is on the quantum level, so, you know, they're very small, like smaller than Ant-Man small. Some Google scientists have written a paper that still has to go through the peer review process, so we can't just assume that they got everything right. But that paper claims that they have found a way to use quantum computers to study time crystals, which is pretty much the most science fiction sentence I think I could say today. But let's break this down a bit, because that's just word salad unless we understand what we're talking about. Quantum computers are very complicated machines that only somewhat resemble classic computers, and they are very delicate as well. They have this tendency toward decoherence, which means that your quantum computer kind of breaks down and becomes a very low-powered classical computer as a result. So, with a traditional computer, we talk about information in terms of the basic unit of a bit, or binary digit, which can either be a zero or a one. Quantum computers are different. They have a qubit as the basic unit of information, and a qubit, or quantum bit, can kind of sort of be both a zero and a one at the same time, and every value in between, technically. This is a thing called superposition. There's also the issue of entanglement, but we're not going to get into all that; that would require a full episode by itself. But the important thing with quantum computers is that, with the proper design of algorithms, that is, processes that are specifically engineered to take advantage of these quantum properties, computer scientists could use these quantum computers to solve particularly difficult problems.
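As a rough sketch of that bit-versus-qubit distinction, and nothing more than the standard textbook picture, here's a tiny Python toy: a qubit's state is just a pair of amplitudes, and measuring it gives a zero or a one with probabilities set by those amplitudes squared.

```python
import random

# Toy single-qubit model: |psi> = a|0> + b|1>, with |a|^2 + |b|^2 = 1.
a, b = 2 ** -0.5, 2 ** -0.5   # equal superposition of zero and one

def measure(amp0, amp1):
    """Collapse the toy qubit: return 0 with probability |amp0|^2, otherwise 1."""
    return 0 if random.random() < abs(amp0) ** 2 else 1

# Identically prepared qubits measured over and over land on zero about half the time.
samples = [measure(a, b) for _ in range(10_000)]
print(sum(samples) / len(samples))   # roughly 0.5
```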
Now, a quantum computer would not be much good for your classic computer processes. In other words, you wouldn't want one of these as a gaming rig so you can play Overwatch on it or something. But quantum computers would be really good at tackling certain subsets of computational problems that would tie up a classical computer for thousands of years or more. So that's quantum computers from a very, very high level.

All right, what about time crystals? Well, here we've got to talk about entropy. You can kind of think of entropy as the natural tendency for stuff in the universe to kind of fall apart, or, more accurately, to go from more ordered to more disordered. So let's say you've got a big room and you've subdivided this big room with a removable panel that completely bisects the room in half, and the panel has, like, heavy thermal insulation. And you make one side of that room really hot, and you make the other side of the room, on the opposite side of the panel, really cold. So you've effectively got two rooms here: one that's really hot, one that's really cold. Then you remove the panel. Well, what happens? We would actually see these temperatures mix. The molecules from the cold side and the molecules from the hot side would intermingle, and eventually the room would reach an equilibrium temperature. You would not have one side of the room just stay hot and the other side just stay cold. In other words, those ordered systems would break down and you would have this entropy take effect. Unless the room itself was a time crystal. In that case, a time crystal can get stuck in two different high-energy configurations, which in our example would mean hot and cold, and the time crystal would switch back and forth between them, never actually moving towards equilibrium.
So we would have a room that at times would be very hot and at other times would be very cold, or would have hot and cold on specific sides and then hot and cold on the opposite sides. Now, I can't pretend to really understand this, because it flies in the face of the second law of thermodynamics, but apparently it's a thing, and Google says it can use its quantum computing systems to study time crystals further. As it stands, there's no real application for time crystals that anyone has proposed, but it would mean learning more about how our universe works and whether there really is a big exception to an otherwise extremely well-established rule, that being the second law of thermodynamics, which, again, blows my mind.

Finally, something that doesn't blow my mind nearly as much but is still really cool. Sony introduced a new VR technology for the PS5 to developers this week. It's kind of a pitch to developers to say, we've got this hardware coming out, we want you to make stuff for it. The headsets will have OLED screens with two thousand by twenty forty resolution per eye, which is pretty darn incredible. They will also have a larger field of view than the current PSVR system that's on the market. These systems might also support HDR, that's high dynamic range, a technology that can enable more vibrant colors to be displayed through greater dynamics between those colors and brightness, or luminosity. The headset is said to also have some haptic feedback built in, that is, systems that provide a tactile, or touch-based, feedback system. So, for example, you might feel a slight vibration through the headset as you use a controller to move around a virtual environment.
The idea being that you're kind of tricking your brain into thinking you're actually walking around this virtual space while you are, in reality, physically standing still inside your living room, for example. And hopefully this would cut down on the motion sickness that some folks, like me, experience when they are using VR headsets. There's also talk of technology that will detect where a user is looking while they are wearing the headset, which can be useful because developers could build beefier VR experiences and really lean on this. Anything you're looking at directly would need to be distinct and in high resolution, so that's where it would need to really look good, but the stuff at the periphery of your vision could be a bit more hazy and you wouldn't notice, because that's not the way humans focus. And that could cut down on the work that the PlayStation 5 would have to do in order to provide whatever this experience is, which really just means that developers have a bit more elbow room to push the hardware to its limit. And Sony is reaching out to some big triple-A studios, essentially saying, hey, why don't you, you know, make something really cool for this? We've seen time and again that hardware can be really exciting, but unless there's a strong software library to go with that hardware, it doesn't tend to have very much staying power; it tends to fade into obscurity. And VR has always had an issue when it comes to building out a really diverse, compelling library of experiences. And of course, there's a pretty good reason for that. VR tends to be pretty expensive, and my guess is this new headset is going to be no different. I'm guessing it's going to be a pretty expensive piece of equipment. And when something is expensive, that means, by nature, fewer people can afford to buy the product, which means that developers are looking at a smaller pool of potential customers.
And that means that whatever return they're going to get from selling something that runs on that platform is going to be relatively smaller than it would be for, you know, a more widely adopted platform. So if you're a game studio, you might find yourself saying, do we pour a hundred million bucks into developing this game that is going to target ten percent of the PS5 owners? That's a number I'm just kind of pulling out of the air, by the way. Or do we instead pour a hundred million bucks into developing a more traditional style console game and target all of the PS5 owners? And you can see how this situation becomes a vicious cycle. Some folks will resist buying into VR because they see there's a lack of content, and then developers are reluctant to make content because there's a lack of VR owners out there. But maybe this will be the device to really break that open for Sony. Rumor is that the hardware will get a reveal sometime early next year, so we won't have to wait too long to learn more.

And that's it. That's all the news that I have for you guys today, Thursday, August fifth, twenty twenty-one. I hope you are all well. I'm on the mend myself, so each day is going to get better; that's my hope, anyway. If you have suggestions for topics I should cover in future episodes of TechStuff, please reach out to me. The best way to do that is on Twitter. The handle for the show is TechStuff HSW, and I'll talk to you again really soon.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.