Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech, and it is time for the tech news for Thursday, August twelfth, twenty twenty-one. Let's get into it. Now, you've probably heard me talk about VPNs, or virtual private networks, before. This is a service in which you use a piece of software to log into a remote server, and then that server acts as a kind of proxy to access content on the Internet on your behalf. That means all the sites you're visiting are going through the server before coming to you. So anyone who might be, say, spying on your side of the connection, all they would see is that you are in communication with this VPN server, with ideally all of that communication encrypted, so that whoever is snooping on you has no idea what it is you're looking at, because they can only see that it's coming from this VPN server.
They cannot see beyond that point, so they wouldn't have any clue what was going on. And anyone who was snooping on the opposite side of this, all they would see is that this server is communicating with other web servers, without knowing who on the other end is actually responsible for that. Let's say that you were visiting your bank. Well, they would be able to see that there was communication between this VPN server and the bank server, but they wouldn't know who was ultimately trying to access that. And also, that communication would be encrypted. So there are a lot of legit reasons to use this, like I was just mentioning. If you wanted to take extra precautions when accessing certain services, like banking services, while you're on, say, a public WiFi hotspot, so you're at a coffee shop, you might want to use a VPN to access various things so that someone snooping on that public WiFi hotspot wouldn't know your business.
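To make the "what each observer sees" point above concrete, here is a minimal Python sketch. It is a toy model, not any real VPN software: all the names (`observer_view`, the example hostnames) are illustrative.

```python
# Toy model of what an on-path snooper can learn about a connection.
def observer_view(dst, encrypted):
    """What someone watching your side of the link sees: the peer you
    talk to is always visible; the payload is opaque if encrypted."""
    return {"peer": dst, "content": "<encrypted>" if encrypted else "visible"}

# Without a VPN: even with TLS, the snooper on the coffee-shop WiFi
# learns which site you're talking to.
direct = observer_view("bank.example.com", encrypted=True)

# With a VPN: the snooper only ever sees the VPN server's address;
# the real destination is hidden inside the encrypted tunnel.
via_vpn = observer_view("vpn.example.net", encrypted=True)

print(direct["peer"])   # bank.example.com -- destination leaks even when content doesn't
print(via_vpn["peer"])  # vpn.example.net  -- real destination stays hidden
```

The asymmetry the host describes falls out of this: the observer near you sees only the VPN endpoint, while an observer near the destination sees only the VPN server, never you.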
But one other popular use of VPNs is to access region-locked material that is otherwise unavailable to you, and recently Netflix has been on the warpath to fight against that. Netflix has licensing deals with various studios that allow Netflix to stream certain content to certain regions or audiences. For example, there are programs that are available on Netflix in the UK, but I can't watch them because I'm in the United States and Netflix does not have a license to stream that specific content to people here in the US. But through a VPN, I could potentially circumvent those rules and just log into a VPN server in the UK and try to get around it that way. This isn't exactly legit, but a lot of folks use VPNs for this specific reason. Now, Netflix is blocking certain IP addresses that the company associates with VPN services, because what some of these services are doing is using IP addresses that look like residential IP addresses.
So Netflix is starting to block some IP addresses that appear to be residential ones, and some of them turn out to actually be residential IP addresses. Now, the whole goal is to prevent VPN users from relying on this workaround, right? But in the process, that means some innocent folks are getting misidentified as VPN services, and they find that their access to Netflix ends up being limited. So clearly there's some collateral damage going on here as the company tries to crack down on VPN users attempting to skirt region locking, but it is impossible to say how common this is, like, how frequently is Netflix affecting legitimate users? Those getting hit with the ban are still able to access Netflix original content, but they might not be able to see stuff produced or distributed outside of Netflix's own properties. Once again we see how a media company's response to people breaking the rules can cause further harm. See also digital rights management and the various massive lawsuits that media companies leveled against people who were identified as being pirates.
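The range-based blocking described above, and why it sweeps up innocent neighbors, can be sketched in a few lines of Python using the standard `ipaddress` module. The ranges below are reserved documentation ranges standing in for real VPN ranges; nothing here reflects Netflix's actual blocklist.

```python
# Sketch of range-based IP blocking: services typically block whole
# address ranges associated with a VPN provider, so a residential user
# whose address falls inside a flagged range gets blocked too.
import ipaddress

SUSPECTED_VPN_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),    # placeholder "VPN" range
    ipaddress.ip_network("198.51.100.0/25"),   # placeholder "VPN" range
]

def is_blocked(addr: str) -> bool:
    """True if the address falls inside any flagged range."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in SUSPECTED_VPN_RANGES)

print(is_blocked("203.0.113.42"))    # True  -- inside a flagged /24
print(is_blocked("198.51.100.200"))  # False -- same /24, outside the flagged /25
```

The collateral damage problem is visible in the first check: everyone in `203.0.113.0/24` is blocked, whether or not they are actually a VPN exit.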
I do not have a solution to this problem, by the way, and I'm not saying Netflix is necessarily in the wrong, as far as it goes, in trying to uphold its end of these licensing deals. My guess is that the licensing agreements mean that Netflix is obligated to pursue solutions to this problem or else face some pretty tough consequences, such as not being able to secure future licensing agreements with, you know, various TV and movie studios. It's just unfortunate that the current solution can have a negative impact on people who are not part of the problem. That is really unfortunate. It really says that solution is a bad solution. I just don't have a better one to offer, so, bummer. Now, when it comes to picking the top stories here in the United States, there is a lot to choose from. I mean, coronavirus is an obvious candidate, as was the election and all the trappings around that, from misinformation campaigns to the fights around voter suppression and more. There's the big story of the Black Lives Matter movement, which I support and which needs ongoing support.
And there's a related issue, the defund-the-police movement. Now, that last one is extremely controversial, and politically a lot of leaders don't want to have anything to do with it; they don't even want to address it. But then we have a story like this one that brings into sharp relief the concerns of those who want to see an overhaul in how the United States handles law enforcement. So, according to documents obtained by the Legal Aid Society and the Surveillance Technology Oversight Project, the New York City Police Department has spent around a hundred sixty million dollars on surveillance systems since two thousand seven. And here's the big part: without any public oversight. These technologies range from systems like X-ray scanners for police vans that could reportedly detect weapons in vehicles, you know, hundreds of feet away, to facial recognition technologies, which, as I've mentioned several times in the past, frequently prove to be unreliable and subject to algorithmic bias that can disproportionately harm non-white populations, and also the use of Stingray devices. A Stingray is a cell phone catcher.
It spoofs, or imitates, a cell phone tower, and nearby phones will connect to it as if it were a legitimate cell tower, and then the operator of the Stingray can pull information down from it, such as device identifications, device locations, and other data. The department acquired all these things by using a Special Expenses Fund line item in the budget. The NYPD could make use of those funds without first seeking approval from any other New York officials. So you can think of it kind of like a petty cash fund, except the amount of cash in it is definitely not petty, nor were the uses of that cash. I should also add that after the city passed a law called the Public Oversight of Surveillance Technology Act last year, the NYPD shut down the Special Expenses Fund. Now, whatever your opinion is about police funding, I think it is important to remember that these technologies tend to range from "they can be effective if they are used correctly, but they are difficult to use correctly" all the way to "this technology is unproven and in fact is known to be inaccurate."
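The tower-spoofing trick described above works because a phone generally attaches to whatever nearby "tower" advertises the strongest usable signal. Here is a deliberately simplified Python model of that selection step; the tower names and signal numbers are invented for illustration, and real cell attachment involves authentication and much more protocol machinery than this.

```python
# Toy model of why a Stingray attracts phones: the phone picks the
# strongest advertised signal and cannot tell a spoofed tower from
# a legitimate one at this stage.
def pick_tower(towers):
    """Return the id of the tower with the strongest signal (dBm closer to 0)."""
    return max(towers, key=lambda t: t["signal_dbm"])["id"]

towers = [
    {"id": "carrier-tower-1", "signal_dbm": -95},
    {"id": "carrier-tower-2", "signal_dbm": -88},
    # The Stingray parks nearby and broadcasts an unusually strong signal.
    {"id": "stingray",        "signal_dbm": -60},
]

chosen = pick_tower(towers)
print(chosen)  # stingray
```

Once the phone attaches, the operator sits in the middle of the connection, which is how device identifiers and location data get collected.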
So even in a best-case scenario, this is an example of funds being poorly spent, and in a worst-case scenario, it's an example of an organization of authority that is abusing that authority and using technology to enable further abuse of authority. This ties back in with something that I said earlier this week: that we have a tendency to lean on technology in order to solve social problems, even if the tech is unproven or unsuitable for solving a social problem, or we just don't fully understand the technology. And this, I should point out, is a bad thing. Now, let us return to the issue of Facebook and Australian media companies. You might remember that there was a big standoff between the Australian government and Facebook a while back, stemming from how Facebook offers previews of articles and other media content without directing people to the sites that originate that content. Now, this is important because those media companies, you know, like news organizations and such, depend upon advertising revenue, which depends upon traffic to their websites.
If people are getting the content without actually visiting the site, then the company is not generating any ad revenue from that transaction, and ultimately you'll see these media companies and news organizations struggle to remain solvent. They won't make enough money to stay in business. The Australian government issued a mandate that companies like Facebook and Google work out deals with Australian news outlets, which Facebook said it would do after some back and forth, including moments in which Facebook implied it would just plain shut down in Australia. Well, now Facebook is under fire again after three publishers in Australia alleged that Facebook included their content on its new news service platform in Australia without first negotiating licensing deals for that content, which you could argue would be in violation of this new law. Now, the companies say that the law might protect the larger publishing houses pretty well, but for smaller content creators it's a different story. So, according to these publishers,
they approached Facebook in order to negotiate licensing deals for their content to be featured on this news platform, and Facebook essentially dismissed them, saying, no, you guys are small potatoes; you're not suitable for our news platform because you're not notable enough to be included on it, so we're not signing a deal with you. But then, according to these allegations, content from these publishers showed up on that news platform anyway. As I record this, there's no word as to whether the publishers are going to pursue any sort of government intervention in response to this problem, but they appear to at least be open to the idea. Complicating matters, only one of the three publishers, one called Urban List, is actually registered with the Australian Communications and Media Authority, which is a prerequisite in order to be covered by this media law. The other two have not yet applied for registration with that organization. Well, we've got a lot more news to cover, but before I get into any of that, let's take a quick break. Let's go back over to Facebook.
And, you know, we were just with Facebook and Australia; now we're going to go to Facebook and the UK, because over there, over the pond, the Competition and Markets Authority, a government agency in the UK, has been investigating Facebook's acquisition of Giphy, or, if you prefer, Jiffy. I'm sure they pronounce it Jiffy. I still maintain that the correct pronunciation of, you know, GIF is "gif." This is the company that's famous for animated GIFs. Actually, most of those, in fact, aren't animated GIFs at all; they're in other formats. They're short looping video clips that mimic an animated GIF, but, you know, that's beside the point. And this government group says that Facebook's acquisition presents competition concerns. At least, this is according to CNBC, where I read about this, and the complaint states that this acquisition decreases competition in the digital advertising market, which is a market that is already dominated by goliaths like Facebook and Google.
And one of the concerns is that Facebook will limit the use of Giphy GIFs to Facebook platforms like the actual Facebook social network and Instagram. But there are lots of other platforms that do use Giphy, and the concern is that Facebook could cut that access off, and thus that would be an anti-competitive move. The government office warns Facebook that it may require the company to sell off Giphy, which would reverse a multimillion-dollar deal. The office has yet to file its full review, and under further investigation it is possible that all complaints could be dropped. Facebook naturally says that the government office's concerns are completely off base and that the acquisition doesn't represent a decrease in competition. Now, let's look at the world of antivirus software, because that world is about to get a little smaller. NortonLifeLock is on track to merge with former competitor Avast in a deal that's valued at eight billion dollars. Norton's experience with antivirus software dates back decades.
Avast was founded even earlier, and the plan is to merge these companies, combine the capabilities of the two, and offer more robust solutions for cybersecurity, including additional ways to detect intrusions, protect privacy, and disable malware. And as we see a rise in innovative attacks, like the supply chain attacks that we saw with the SolarWinds attacks, cybersecurity companies are rushing to keep pace with the bad guys, and that's pretty much the normal day-to-day in the world of cybersecurity, because each side is constantly struggling to try and get ahead of the other. There are still a bunch of other antivirus providers out there; there's, like, AVG and McAfee and Kaspersky and lots more. And I recommend that you use antivirus, but I also recommend you research antivirus software packages before you install one on your machine, because some packages contain a lot more bloat than others and will put a larger demand on your computer's resources, which could mean that you see performance issues elsewhere.
Also, be sure you don't already have antivirus software on your machine, because having more than one is a pretty good way to make your computer run so slowly that you can't do anything with it. A few news episodes back, I reported on how employees at Activision Blizzard brought forth allegations of sexual misconduct, harassment, a hostile work environment, and more, including a lawsuit against the company. Since then there's been an employee walkout, which happened in July. There have been various calls online for game boycotts, and also people who have said that, unfortunately, those kinds of boycotts often cause more harm to lower-level employees than to the leaders who allow these toxic environments to take hold. There have also been responses from the company itself claiming that the allegations in the lawsuit were in reference to a company of the past and not reflective of how Blizzard operates today. Well, now we've got some more updates, because three important staff members recently left the studio. They include the former director of Diablo four, which is not yet released. That would be Luis Barriga.
Also a designer who worked on Diablo four named Jesse McCree, and a World of Warcraft designer named Jonathan LeCraft. The company's statement is simply that these three are no longer working for the company, so there's no clarification there about how their terms of employment came to an end. We don't know if they were fired, or if they resigned, or what. Kotaku previously published photos of LeCraft and McCree in relation to what was called the Cosby Suite, which was a suite connected to a fan event called BlizzCon back in two thousand thirteen. And obviously it was in reference to the comedian Bill Cosby, which, if you know anything about, you know, the story of Bill Cosby's legal troubles, that implication is beyond horrifying, because that line of logic means the group was at least on some level condoning the idea of harassing and abusing women. So it may well be that this is Blizzard trying to clean house after insisting that its house was already clean. I don't know.
I have no contacts within Blizzard, and so I have no clue if these employees were actually terminated or if they left of their own accord and chose to resign. Nor do I know what part, if any, they played in the alleged hostile work environment. But it appears that the fallout from that issue is continuing. Back to tech and politics, which honestly I did not anticipate being something I would have to cover so frequently on this show. The US Senate has introduced a piece of legislation aimed at taking some power away from platforms like the Google and Apple app stores. This comes out of the antitrust subcommittee in the Senate, which is a body that is dedicated to fighting anti-competitive and monopolistic practices, and among the provisions in the legislation is a bit that says platforms should not be able to force app developers to use a platform-provided payment system, and that app developers should be able to use their own systems instead.
So currently, companies like Apple and Google require apps that are available within their respective stores to conform to the platform payment systems, and the platforms take a cut, typically around thirty percent, of all in-app purchases, assuming the purchase isn't for some real-world item or service. So, for example, if I were playing a game and I wanted to make an in-app purchase for game items within that game, then, because I use Android, Google would take a thirty percent cut of that transaction. But if I use an app to order a pizza, or use a ride-hailing app like Uber or Lyft, Google would not take a cut of that; it's under a different set of rules. Anyway, this provision would allow pretty much everyone to use their own payment systems in lieu of, you know, the official platform one. Another element of the bill would actually be specific to Apple. It would force Apple to allow users to sideload apps onto their iPhones, which means they would be able to bypass the Apple App Store entirely. They could load in apps that Apple did not allow in the App Store.
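The thirty percent cut mentioned above is easy to see in numbers. A quick Python sketch of the split on a digital in-app purchase; the thirty percent figure is the commonly cited headline rate, and actual rates vary by program and revenue tier.

```python
# Split an in-app purchase between the platform and the developer
# under a flat ~30% commission. Amounts are in cents to avoid
# floating-point money bugs.
PLATFORM_CUT = 0.30

def split_purchase(price_cents: int) -> tuple:
    """Return (platform_share, developer_share) in cents."""
    platform = round(price_cents * PLATFORM_CUT)
    return platform, price_cents - platform

platform, developer = split_purchase(999)  # a $9.99 in-game purchase
print(platform, developer)  # 300 699
```

So on a $9.99 purchase, roughly $3.00 goes to the store and $6.99 to the developer, which is exactly the economics the payment-system provision of the bill is about.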
Now, Apple maintains that its policy of only allowing users to download apps from the official Apple App Store protects the users, because each app has to pass approval before being allowed in the store. This, according to Apple, cuts back on malware and other issues, although those things still seem to creep through on occasion, but I would argue probably not nearly to the level that they would if that system weren't in place. But that's kind of Apple's policy. However, it also means that Apple is the sole arbiter of which apps iPhone users are allowed to load onto their phones, assuming those users haven't gone through the trouble of jailbreaking their phones, which is a whole other thing. And we'll have to see how this progresses, because we're still a long way from this legislation actually becoming law. This is still in the formation stages. I suspect we're going to see a lot of changes to the language of that legislation introduced as Google and Apple weigh in through various lobbying efforts.
We have a couple 327 00:21:14,400 --> 00:21:17,160 Speaker 1: more stories, but before I get to that, let's take 328 00:21:17,200 --> 00:21:27,240 Speaker 1: another quick break. Before the break, I was talking about Apple. 329 00:21:27,440 --> 00:21:29,280 Speaker 1: It's time to talk about Apple a little bit more. 330 00:21:29,840 --> 00:21:34,080 Speaker 1: The company's image scanning technology has recently come under fire 331 00:21:34,160 --> 00:21:37,760 Speaker 1: from the Electronic Frontier Foundation, or EFF. So, this 332 00:21:37,840 --> 00:21:42,040 Speaker 1: technology scans images sent through iMessage in an effort 333 00:21:42,080 --> 00:21:46,280 Speaker 1: to detect and thus report on child exploitation and abuse. 334 00:21:46,800 --> 00:21:52,040 Speaker 1: In other words, this background technology is scanning stuff 335 00:21:52,080 --> 00:21:55,440 Speaker 1: that otherwise is being sent privately between individuals to make 336 00:21:55,480 --> 00:21:58,320 Speaker 1: sure that it doesn't depict 337 00:21:58,480 --> 00:22:03,200 Speaker 1: child abuse or exploitation. The EFF 338 00:22:03,359 --> 00:22:06,840 Speaker 1: maintains that, you know, fighting those things is noble, right? 339 00:22:07,000 --> 00:22:10,959 Speaker 1: We want to protect children. However, the method that Apple 340 00:22:11,119 --> 00:22:16,359 Speaker 1: is using is also open to abuse, namely that, you know, 341 00:22:16,400 --> 00:22:20,560 Speaker 1: Apple has now created a method to intercept and analyze 342 00:22:20,600 --> 00:22:23,960 Speaker 1: material sent through iMessage, and that's meant to be 343 00:22:24,160 --> 00:22:28,600 Speaker 1: an end-to-end encrypted communications channel that should provide 344 00:22:28,640 --> 00:22:32,520 Speaker 1: secure and private communications between two people.
That means that 345 00:22:32,680 --> 00:22:36,600 Speaker 1: the whole secure and private thing, all of that, 346 00:22:36,640 --> 00:22:39,120 Speaker 1: gets thrown out the window if you have a system 347 00:22:39,240 --> 00:22:43,320 Speaker 1: that's actually analyzing the material being sent back and forth. Now, 348 00:22:43,359 --> 00:22:46,960 Speaker 1: the EFF points out that governments, including the 349 00:22:47,080 --> 00:22:51,399 Speaker 1: United States government, have frequently sought ways to get around 350 00:22:51,600 --> 00:22:57,040 Speaker 1: encrypted communications channels, up to and including demanding that companies 351 00:22:57,720 --> 00:23:03,119 Speaker 1: include backdoor access so that these various government 352 00:23:03,160 --> 00:23:06,919 Speaker 1: agencies can actually see what's being sent back and forth. 353 00:23:07,080 --> 00:23:12,399 Speaker 1: Companies like Apple traditionally have really resisted that. And this 354 00:23:12,520 --> 00:23:16,720 Speaker 1: approach of creating ways to get around encryption invariably 355 00:23:17,200 --> 00:23:20,359 Speaker 1: creates a surveillance state that can grant more power to 356 00:23:20,600 --> 00:23:25,639 Speaker 1: authoritarian leaders, which they can then abuse, and the victims 357 00:23:25,680 --> 00:23:30,360 Speaker 1: are the citizens and others subject to that government. 358 00:23:31,080 --> 00:23:34,720 Speaker 1: This is one of those situations that's actually really, really 359 00:23:34,800 --> 00:23:37,480 Speaker 1: hard for me to grapple with, because on the one hand, 360 00:23:37,520 --> 00:23:40,240 Speaker 1: I absolutely want there to be more systems in place 361 00:23:40,280 --> 00:23:44,119 Speaker 1: to protect children. I really want that.
But on the 362 00:23:44,160 --> 00:23:49,080 Speaker 1: other hand, I don't want that pursuit to then enable 363 00:23:49,200 --> 00:23:53,800 Speaker 1: systems that could harm entire populations through misuse, where the 364 00:23:53,960 --> 00:23:57,639 Speaker 1: cure ends up having its own kind of disease that 365 00:23:57,720 --> 00:24:00,880 Speaker 1: affects society. I won't say it's worse than the disease, 366 00:24:01,200 --> 00:24:02,760 Speaker 1: because it's hard for me to think of things that 367 00:24:02,800 --> 00:24:06,840 Speaker 1: are worse than child abuse, but it could be another 368 00:24:07,119 --> 00:24:10,679 Speaker 1: bad consequence of that solution. And the EFF 369 00:24:10,800 --> 00:24:14,280 Speaker 1: warns that various governments and regimes around the world already 370 00:24:14,440 --> 00:24:18,200 Speaker 1: are lining up to take advantage of Apple's decision and 371 00:24:18,280 --> 00:24:20,919 Speaker 1: find ways to turn it to their own purposes for, 372 00:24:21,000 --> 00:24:24,159 Speaker 1: you know, the aim of surveillance. And the 373 00:24:24,320 --> 00:24:28,920 Speaker 1: EFF is adamant that Apple's choice ultimately weakens security 374 00:24:28,920 --> 00:24:33,000 Speaker 1: and privacy, even if the intention behind the decision was 375 00:24:33,040 --> 00:24:36,440 Speaker 1: a good one. This just points out that the world 376 00:24:36,520 --> 00:24:42,640 Speaker 1: is an incredibly complicated place, and again, while technology can 377 00:24:42,680 --> 00:24:47,280 Speaker 1: provide some tools for us to do things, sometimes the 378 00:24:47,359 --> 00:24:52,439 Speaker 1: consequences of using those tools create problems that are, you know, 379 00:24:53,000 --> 00:24:54,960 Speaker 1: as difficult as the one you were trying to solve 380 00:24:54,960 --> 00:24:57,359 Speaker 1: in the first place.
I think this might be the 381 00:24:57,440 --> 00:25:00,199 Speaker 1: case here, and it really pains me to say that, 382 00:25:00,400 --> 00:25:02,480 Speaker 1: because again, I want there to be systems in place 383 00:25:02,520 --> 00:25:06,840 Speaker 1: to protect kids. But yeah, I don't know what the 384 00:25:06,920 --> 00:25:11,840 Speaker 1: right answer is to this one either. I wish I did. Finally, 385 00:25:12,200 --> 00:25:16,160 Speaker 1: and honestly, I apologize for not making this the top 386 00:25:16,280 --> 00:25:21,320 Speaker 1: story for today: Twitter (hold onto your butts) is rolling 387 00:25:21,320 --> 00:25:25,359 Speaker 1: out a change in the font it uses to display messages. Yes, 388 00:25:25,960 --> 00:25:30,000 Speaker 1: you heard me correctly. Twitter is creating and rolling out 389 00:25:30,080 --> 00:25:33,879 Speaker 1: a new font called Chirp into its apps and on 390 00:25:33,920 --> 00:25:37,320 Speaker 1: the web-based feeds. Now, the rollout is in stages, 391 00:25:37,480 --> 00:25:39,600 Speaker 1: so you may or may not be able to see 392 00:25:39,600 --> 00:25:43,320 Speaker 1: it right now. According to Twitter's blog, the font, quote, 393 00:25:43,560 --> 00:25:47,280 Speaker 1: strikes the balance between messy and sharp to amplify the 394 00:25:47,320 --> 00:25:50,560 Speaker 1: fun and irreverence of a tweet, but can also carry 395 00:25:50,560 --> 00:25:55,159 Speaker 1: the weight of seriousness when needed, end quote.
Which, you know, 396 00:25:55,440 --> 00:25:58,159 Speaker 1: that's a huge relief to me, because I'm glad that 397 00:25:58,200 --> 00:26:01,119 Speaker 1: my dumb dad joke tweets can enjoy a boost of 398 00:26:01,200 --> 00:26:05,000 Speaker 1: jocularity from this font, while at the same time, those 399 00:26:05,000 --> 00:26:08,080 Speaker 1: people who are using Twitter to, you know, advance real 400 00:26:08,119 --> 00:26:13,400 Speaker 1: world important social movements, or, if you prefer, those who 401 00:26:13,440 --> 00:26:17,640 Speaker 1: are spreading crazy misinformation, can lean more heavily on how 402 00:26:17,640 --> 00:26:20,800 Speaker 1: this font also supports the serious side. I mean, who 403 00:26:20,880 --> 00:26:24,320 Speaker 1: knew that fonts were so fundamental to conveying the intent 404 00:26:24,480 --> 00:26:29,760 Speaker 1: of a message? Also, full disclosure, I totally didn't notice 405 00:26:30,520 --> 00:26:33,960 Speaker 1: that the font changed at all, 406 00:26:34,119 --> 00:26:36,440 Speaker 1: although that could say a lot more about my lack 407 00:26:36,480 --> 00:26:41,359 Speaker 1: of skill in the observation department than anything else. 408 00:26:41,440 --> 00:26:43,600 Speaker 1: I don't know that it's rolled out to me. Even 409 00:26:43,640 --> 00:26:45,800 Speaker 1: when I was looking at a news article that was 410 00:26:45,840 --> 00:26:48,760 Speaker 1: talking about this and had pictures of examples of the font, 411 00:26:48,800 --> 00:26:53,159 Speaker 1: I thought, huh, I guess that's different. I would need 412 00:26:53,200 --> 00:26:54,920 Speaker 1: to have a side by side, and then 413 00:26:55,080 --> 00:26:58,920 Speaker 1: I'm sure I would notice it, but I guess I 414 00:26:59,000 --> 00:27:01,000 Speaker 1: just never really put much stock in the font. So 415 00:27:02,400 --> 00:27:05,800 Speaker 1: I missed the boat.
That's it for the news 416 00:27:05,960 --> 00:27:10,600 Speaker 1: for today, Thursday, August twelve, twenty twenty one. If you have suggestions for 417 00:27:10,640 --> 00:27:13,560 Speaker 1: topics I should cover in tech Stuff, please reach out 418 00:27:13,600 --> 00:27:16,520 Speaker 1: to me on Twitter. The handle is tech Stuff H 419 00:27:16,760 --> 00:27:20,680 Speaker 1: S W and I'll talk to you again really soon. 420 00:27:25,920 --> 00:27:28,960 Speaker 1: Tech Stuff is an I Heart Radio production. For more 421 00:27:29,040 --> 00:27:32,440 Speaker 1: podcasts from I Heart Radio, visit the I Heart Radio app, 422 00:27:32,560 --> 00:27:35,720 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.