Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and a love of all things tech. It is time for the tech news for Thursday, July 8, 2021. Let's get to it. Yesterday, former President Donald Trump announced he was filing class action complaints against YouTube, Twitter, and Facebook in a Florida court, claiming that the companies have violated his First Amendment rights by banning his accounts in the wake of the January 6th Capitol riot. Numerous experts in law have pointed out that this kind of sounds like a lost cause. The First Amendment to the US Constitution guarantees that the government shall not infringe upon the freedom of expression, the freedom of speech. But last I checked, Twitter, YouTube, and even Facebook are not the US government. You might argue that these entities are, at least in some ways, on a level with some parts of the government, but that does not make them government entities. They are instead businesses, and, well, as such, they're allowed to do stuff like say, here are the rules for being part of this platform, and if you violate the rules, you get banned. Kind of like, if you have someone come into your house and they are bad-mouthing you and your loved ones, you can tell them to leave and not allow them back in your house. That's not violating the First Amendment. Now, what this does seem to be to me, and I stress this is just my opinion and a total armchair observation, is an effort to raise money from supporters, and ostensibly that money is supposed to go to this legal fund. And maybe it's also an attempt to stay in the news cycle, which, I mean, you know it worked, because here I am covering it.
Speaker 1: But if these complaints actually make their way through the system and ultimately go to court, like a trial, and the court finds in favor of the former president, it would mean a monumentally huge shift in how the US handles the First Amendment, and it would no doubt immediately get moved to a court of appeals and be pushed up the court system from there. But most of the legal experts I've read seem to indicate that this is at most a frivolous lawsuit, and that, in fact, the legal team supporting Trump could find themselves reprimanded for it. Anyway, I thought I had to reference it because it is something that's happening within the world of tech. Let's move on. Mozilla, the company behind Firefox, created a browser extension called RegretsReporter, and this was all to be, you know, part of an informal study. So Firefox users could install this extension onto their browser, and the extension would log that user's viewing activity on YouTube. This was all on the up and up; that was the stated intent and purpose of this extension. So it was essentially Mozilla looking for volunteers. These volunteers could then flag clips of stuff that they wished they had not seen. That's why it's called RegretsReporter. So the idea was to see how frequently YouTube's recommendation algorithm might suggest videos that people just plain wish they had never encountered. And it turned out to be, you know, a fairly decent number. Users would flag videos that they found upsetting or nonsensical or just plain unappealing, and some of them were videos that were dedicated to spreading misinformation. Some were the bizarre and offensive parodies or mash-ups that started making headlines a few years ago, when they began to pop up on YouTube frequently.
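To make that concrete, here is a minimal sketch of the kind of record such an extension might keep when a viewer flags a video. The field names and the flag_video helper are hypothetical illustrations, not Mozilla's actual RegretsReporter schema or API.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical report structure for a "regret reporting" extension.
# Everything here is an illustrative guess, not Mozilla's real schema.

@dataclass
class RegretReport:
    video_id: str                  # the video the viewer regretted watching
    was_recommended: bool          # did a recommendation chain lead them here?
    category: str                  # e.g. "misinformation", "offensive parody"
    recommendation_trail: List[str] = field(default_factory=list)

reports: List[RegretReport] = []

def flag_video(video_id: str, category: str, trail: List[str]) -> None:
    """Log a report for a video the viewer wishes they had never seen."""
    reports.append(RegretReport(video_id, bool(trail), category, trail))

flag_video("abc123", "misinformation", ["home_feed", "up_next", "up_next"])
print(f"{len(reports)} report(s) collected")
```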
Speaker 1: Nearly forty thousand people volunteered to take part in this project across ninety-one countries, and over the course of a little less than a year, the extension collected three thousand, three hundred sixty-two reports about recommended videos that people regretted watching. Which, I mean, three thousand, three hundred sixty-two, that's not an enormous number for a full year of this project going on, or almost a full year of it going on. Generally, areas in non-English-speaking countries had more incidents than English-speaking countries, which suggests that perhaps YouTube's algorithms work less well in those countries. And some of the recommended videos, videos that YouTube's own algorithm was pushing, seemed to violate YouTube's own content policies, which is not great. YouTube is pretty guarded about how its algorithm actually works, and in a way that makes sense, because if word got out how the algorithm worked, then people could figure out ways to potentially game the system even more than they already do. But at the end of the day, the purpose of the algorithm isn't really to serve up the perfect video to you. The purpose of the algorithm is to keep you on YouTube for as long as possible. It's an engagement algorithm, designed to have you stick to the platform for longer, because that means you watch more videos and you see more ads, and that makes YouTube more money. So if you think about it like that, the algorithm doesn't, quote unquote, care about the content or quality of any given video. The algorithm doesn't care if the video even violates YouTube's policies. The algorithm's job is to suggest videos that are likely to keep each individual on YouTube longer. That's it. While YouTube says that the company has refined its algorithms so that, this time, it's working perfectly well and it's not going to push any sort of borderline content your way, the fact that the company remains so guarded about that algorithm means there's not really any way for someone to verify that claim.
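To illustrate that engagement-first point, here is a toy sketch in Python. None of this is YouTube's actual system; the candidate list, the predicted_watch_minutes field, and the scoring function are invented purely to show how a recommender that optimizes only for time-on-platform can surface borderline content.

```python
# Toy engagement-first recommender. Invented data, not YouTube's system.
candidates = [
    {"id": "calm_tutorial",   "predicted_watch_minutes": 4.0,  "borderline": False},
    {"id": "outrage_bait",    "predicted_watch_minutes": 11.5, "borderline": True},
    {"id": "cat_compilation", "predicted_watch_minutes": 7.2,  "borderline": False},
]

def rank_by_engagement(videos):
    # Note what is missing: no check of quality, accuracy, or policy.
    # The only signal is how long the viewer is expected to stay.
    return sorted(videos, key=lambda v: v["predicted_watch_minutes"], reverse=True)

for video in rank_by_engagement(candidates):
    print(video["id"], video["predicted_watch_minutes"])
# The "borderline" video ranks first purely because it holds attention longest.
```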
Speaker 1: Now, I have no idea how effective the algorithm really is at serving up content without it being borderline or in violation of YouTube's policies, but I do know that the recommendation algorithms on various social platforms can easily exacerbate problems. Sticking with Google: a collection of thirty-seven states in the United States have brought an antitrust lawsuit against the company, claiming it has exercised, quote, monopolistic leverage, end quote, with the Google Play Store. Now, that's the Android app store, in case you weren't familiar. So Google imposes a commission on purchases made within the Play Store, and it takes up to thirty percent of that money. And that's pretty standard in the industry. So in other words, this is the same sort of practice that's used by Apple, by Amazon, and by Microsoft. In addition, this one is a bit odd because Google is actually fairly lax when it comes to how you get apps on your Android devices. You can actually download apps directly from developer websites and not go through the Play Store at all, so you can bypass that whole system. Plus, if you are an Android user, you can elect to allow sideloading. That means you can load apps that developers never even submit to Google. So, you know, it doesn't have to be in the Google Play Store for you to be able to use it; you can just sideload it from wherever. This is a little risky because it could involve installing malware onto your Android device, so it's not recommended for everyone, but it is something you can do. Apple famously does not allow for sideloading. So it seems to me like Google has a fairly decent argument to make here about how the company allows for alternatives to going through the Google Play Store. But let's say this does go to trial and Google loses. That would be some pretty scary news for the other companies that I've mentioned, like Apple and Amazon.
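As a quick back-of-the-envelope on that thirty percent commission from the Play Store story, here is the arithmetic in Python. The thirty percent figure is the headline rate mentioned above; actual Play Store tiers vary by revenue and program, and the function name and price are hypothetical, so treat this as an illustrative sketch only.

```python
def developer_payout(purchase_price: float, commission_rate: float = 0.30) -> float:
    """Return what the developer keeps after the store takes its cut."""
    return purchase_price * (1.0 - commission_rate)

price = 9.99  # a hypothetical in-app purchase
store_cut = price - developer_payout(price)
print(f"Store keeps ${store_cut:.2f}, developer keeps ${developer_payout(price):.2f}")
# Store keeps $3.00, developer keeps $6.99
```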
Speaker 1: We've heard a lot about app stores and the cut that companies like Google and Apple take. I mean, that's at the heart of the ongoing battle between Apple and Epic Games. So we will keep an eye on this particular story. Speaking of Apple, Steve Wozniak, who co-founded Apple with Steve Jobs back in the nineteen seventies, gave what amounted to a nearly ten-minute speech in support of the right to repair. This was in response to a Cameo request. Cameo, just in case you're not familiar, is a service that allows celebrities to set a price for personalized video messages that fans can then buy, usually for someone else. I tried to set up a Cameo for myself, but it turns out I can't afford to pay people to have me send them videos, so I'm out. Anyway, Louis Rossmann, who is a supporter of the right to repair movement, purchased a Cameo request from Wozniak and asked him to speak about the idea of right to repair, and Wozniak did not disappoint. You can actually watch this video online. And the right to repair is a general movement that pushes for legislation that would require companies to stop making it difficult or even impossible for someone to make alterations to, or repairs on, the technology that they purchase. So one example I always like to give with this is John Deere equipment, as in, like, farming equipment like tractors. John Deere restricts the ability to repair those products. So, you know, what you're supposed to do is take your John Deere equipment and go to an authorized repair shop that has a license with John Deere, and they have the equipment that is necessary to diagnose and repair it. If you try to do it yourself, you run into roadblocks, like, figurative roadblocks.
Speaker 1: So for the DIY types out there, this is a slap in the face, and particularly for people like farmers, who often need to be very self-reliant, this isn't just inconvenient; it can be an enormous hassle. Wozniak spoke about how in the not-too-distant past you could buy whatever electronics you wanted, it could be a TV or a radio or whatever, and you could make repairs on that technology yourself. As long as you had the know-how and you had the components, you could just go out and buy something like, you know, a vacuum tube and replace it yourself. He even said that Apple started off with that kind of philosophy, which is a far cry from how the company operates now. Wozniak asked the question: is it your computer, or is it some company's computer? As in, is it a company's computer that you get to use, but you don't actually own? We've seen the right to repair movement pick up support both here in the United States and overseas in Europe, and I am curious to see where it goes and whether or not it actually leads to any meaningful change. Next up, security firm Trustwave SpiderLabs published a report that says the malicious code in ransomware used by hacker groups like REvil has a feature that looks for Russian or Russian-related languages on computer systems, and if it detects that language is present, like if it detects that most of the programming and code and documents are in Russian, the malware does not activate itself on that system. It doesn't infect those systems.
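A heavily simplified sketch of that kind of language gate, in Python, is below. Real ransomware of this sort reportedly checks things like Windows keyboard layouts and locale identifiers; this stand-in only illustrates the gating logic, and the list of locale codes is an illustrative assumption, not the exact list from the Trustwave report.

```python
import locale

# Locales the (hypothetical) payload refuses to run on. Illustrative list
# of Russian and related language codes, not Trustwave's exact findings.
AVOID_LANGUAGES = {"ru", "uk", "be", "kk"}

def system_language() -> str:
    """Best-effort guess at the system's language code, e.g. 'en' or 'ru'."""
    lang = locale.getlocale()[0] or "en_US"
    return lang.split("_")[0].lower()

def should_activate() -> bool:
    # The whole trick: if the machine looks Russian-language, do nothing.
    return system_language() not in AVOID_LANGUAGES

if should_activate():
    print("payload would run here (non-Russian-language system)")
else:
    print("exiting quietly without touching this system")
```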
Speaker 1: So why is that? Well, the going theory is that these criminal hacker groups are largely operating out of Russia or former parts of the USSR, and that the Russian government is at the very least ignoring these criminal groups as they target various computer systems in other countries, particularly in the West. And the hacker groups wish to keep it that way, and so they make sure that the code they unleash isn't going to affect Russian systems. In other words, you don't poop where you eat, as they say. Russian authorities have so far proven to be uninterested in cracking down on hacker groups, which has led to suspicions that perhaps the Russian government might even go so far as to encourage, or maybe even subcontract, these hacker groups. Now, those allegations are difficult to prove, but the discovery that the malware being used is specifically avoiding Russian systems suggests that the hackers at least know that they are more likely to be able to operate freely as long as they don't step on Russian toes. We recently had Pride Month here in the United States, and while people within the LGBT communities still face struggles here, significant ones, in other countries the threats can be even more overt. Take China, for example. WeChat, the most popular social networking platform within China, recently shut down multiple accounts relating to LGBT topics, and allegedly WeChat sent out messages to the account administrators claiming that these channels that were, you know, being hosted by these admins were violating WeChat's policies, but gave no details as to how they were doing that. And the lack of information has led to speculation, and some of that speculation questions if perhaps the communist government in China is cracking down on these groups, potentially because the government might view these groups as questioning or challenging its authority in some way.
Speaker 1: The speculation goes that the government then puts pressure on companies like WeChat, because in China, the Communist government is heavily involved in every major company within the country. China classified homosexuality as criminal until 1997, but while homosexuality is now decriminalized, at least on paper, people in LGBT communities still must deal with a great deal of discrimination in China. Now, to be fair, that's also the case in many other parts of the world, including the United States. It's not like China is an outlier in that regard. It's just that this is one of those cases where it looks particularly overt. Now, for my last story today, I want to preface this by saying I'm not a car guy, and also I find hyper-masculine marketing to be absolutely ridiculous. I think it's laughable. So perhaps it comes as no surprise that I was rolling my eyes so hard that you could probably hear it at a recent video revealing that Dodge will release an all-electric muscle car in 2024. Now, to be fair, I think muscle cars are kind of cool, and I think an all-electric muscle car is a really neat idea. It's kind of, in a way, antithetical to the way muscle cars typically get positioned in the market. But the way the video unfolded really kind of, I don't know, I guess it pushed some buttons with me. It would have to; it's an electric vehicle. Anyway, the video includes a narrator who at one point incredulously says, wait a minute, did we hear that right? Dodge, as in Dodge, making an electric muscle car? And then there's a shot of a dude who's behind the wheel of a Dodge muscle car saying, you mean hypothetically, right? Like, they wouldn't dare make an electric muscle car, because internal combustion engines are manly and whatnot, and going electric is antithetical to the muscle car aesthetic. I suspect this is really just Dodge getting ahead of the inevitable reaction to the thought of an all-electric muscle car.
Speaker 1: But seeing as how the automotive industry in general is having to switch to EVs, or at least alternatives to internal combustion engines, it was kind of bound to happen. I mean, it either had to happen this way, or muscle car variants of vehicles would just have to go away. Anyway, I really do think an electric muscle car is a cool idea. I just find the marketing approach a bit laughable. But I feel that way, like I said, about all hyper-masculine advertising. I mean, if it's an ad for steak or beer or something like that, typically it just makes me kind of exasperated at the way that masculinity is depicted. I feel like it tends to reinforce perhaps not the healthiest of stereotypes. But hey, I'll get off my soapbox. We'll stick with tech. That's it for the tech news for today, Thursday, July 8, 2021. We'll be back next week with some more news. If you have suggestions for topics I should cover in future episodes of TechStuff, let me know. The best way to do that is send me a message on Twitter. The handle for the show is TechStuffHSW, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.