1 00:00:04,400 --> 00:00:07,800 Speaker 1: Welcome to Tech Stuff, a production from I Heart Radio. 2 00:00:12,080 --> 00:00:14,840 Speaker 1: Hey there, and welcome to Tech Stuff. I'm your host, 3 00:00:14,960 --> 00:00:17,880 Speaker 1: Jonathan Strickland. I'm an executive producer with I Heart Radio 4 00:00:17,960 --> 00:00:20,440 Speaker 1: and a love of all things tech. It is time 5 00:00:20,920 --> 00:00:26,960 Speaker 1: for the tech news for Thursday, December twenty one. And 6 00:00:27,080 --> 00:00:30,400 Speaker 1: before I start this episode, I want to give some 7 00:00:30,440 --> 00:00:33,599 Speaker 1: trigger warnings. A few of the news stories I'll be 8 00:00:33,640 --> 00:00:38,360 Speaker 1: covering today center around stuff like sexual harassment and abuse, 9 00:00:39,200 --> 00:00:43,360 Speaker 1: and these are important stories, but obviously they're also really upsetting. 10 00:00:43,520 --> 00:00:46,120 Speaker 1: I mean, I was upset by them, and I'm pretty 11 00:00:46,120 --> 00:00:49,559 Speaker 1: sure that will come through while I talk about them 12 00:00:49,560 --> 00:00:51,839 Speaker 1: in this episode, but I wanted to let all of 13 00:00:51,880 --> 00:00:54,560 Speaker 1: you know about that before we get there. The first 14 00:00:54,560 --> 00:00:57,760 Speaker 1: couple of stories do not cover that, but I didn't 15 00:00:57,800 --> 00:00:59,920 Speaker 1: want it to come out of the blue and surprise you. 16 00:01:00,360 --> 00:01:03,360 Speaker 1: But before any of that, you might remember that earlier 17 00:01:03,440 --> 00:01:07,119 Speaker 1: this year, Apple CEO Tim Cook was pushing to get 18 00:01:07,640 --> 00:01:12,040 Speaker 1: Apple corporate employees back into the office, citing a need 19 00:01:12,160 --> 00:01:14,160 Speaker 1: for the energy that comes with getting a bunch of 20 00:01:14,160 --> 00:01:17,720 Speaker 1: people together in the same space at the same time.
Also, 21 00:01:17,920 --> 00:01:22,480 Speaker 1: I'm guessing probably trying to justify that incredibly expensive and 22 00:01:22,640 --> 00:01:26,760 Speaker 1: expansive Apple HQ that they built not too long ago. 23 00:01:27,319 --> 00:01:31,560 Speaker 1: Like, that thing came together just as the pandemic was starting, 24 00:01:31,640 --> 00:01:34,600 Speaker 1: and so Apple has not had a chance to get 25 00:01:34,680 --> 00:01:36,759 Speaker 1: as much use out of that as I'm sure they 26 00:01:36,760 --> 00:01:39,920 Speaker 1: had anticipated. But you know, there is something to 27 00:01:39,920 --> 00:01:42,920 Speaker 1: be said about in-person collaboration. There is value to 28 00:01:43,440 --> 00:01:45,760 Speaker 1: being in the same space with people in order to 29 00:01:45,760 --> 00:01:49,480 Speaker 1: collaborate and innovate. However, there's also something to be said 30 00:01:49,520 --> 00:01:52,080 Speaker 1: for how having a bunch of people who are together 31 00:01:52,120 --> 00:01:54,480 Speaker 1: in the same place at the same time can create 32 00:01:54,520 --> 00:01:58,960 Speaker 1: an ideal environment for COVID to spread. Now Cook, the CEO, 33 00:01:59,480 --> 00:02:04,840 Speaker 1: he eventually acknowledged this issue, particularly when the delta variant 34 00:02:04,920 --> 00:02:09,800 Speaker 1: first emerged and became very concerning a few months ago, 35 00:02:10,520 --> 00:02:14,480 Speaker 1: and so he pushed back Apple's return to office.
In fact, 36 00:02:14,560 --> 00:02:17,400 Speaker 1: that got pushed back a few times, right? We 37 00:02:17,480 --> 00:02:22,080 Speaker 1: had it established and then pushed back, and most recently 38 00:02:22,919 --> 00:02:27,400 Speaker 1: the return date was listed as February first. However, 39 00:02:27,919 --> 00:02:31,120 Speaker 1: yesterday Cook sent out a memo saying that the return to the 40 00:02:31,160 --> 00:02:34,200 Speaker 1: office has been pushed back and there is no 41 00:02:34,520 --> 00:02:38,359 Speaker 1: return date yet; that's yet to be determined. So it's indefinite, 42 00:02:38,880 --> 00:02:43,720 Speaker 1: but not indefinite necessarily in the sense of, you know, perpetual, 43 00:02:43,800 --> 00:02:48,720 Speaker 1: but rather we just don't want to give a date yet. Uh. 44 00:02:48,720 --> 00:02:53,760 Speaker 1: With the omicron variant currently spreading like crazy, this is 45 00:02:53,800 --> 00:02:57,600 Speaker 1: not a surprise. Cook also announced that every Apple employee, 46 00:02:57,800 --> 00:03:00,519 Speaker 1: not just the corporate ones but retail as well, 47 00:03:01,000 --> 00:03:04,080 Speaker 1: will receive a one-thousand-dollar bonus that can be 48 00:03:04,200 --> 00:03:07,960 Speaker 1: used for work-from-home needs. Uh. This isn't just 49 00:03:08,160 --> 00:03:10,320 Speaker 1: the right thing to do for employees. It is, but 50 00:03:10,400 --> 00:03:12,720 Speaker 1: it's not just that.
It's also a really good PR 51 00:03:12,840 --> 00:03:16,320 Speaker 1: move considering how Apple is currently in the news regarding 52 00:03:16,360 --> 00:03:20,080 Speaker 1: the #AppleToo movement and lawsuits alleging that the company 53 00:03:20,200 --> 00:03:24,000 Speaker 1: established and facilitated a toxic work environment and, according to 54 00:03:24,080 --> 00:03:29,720 Speaker 1: one former employee, engaged in retaliatory behavior after she brought 55 00:03:29,760 --> 00:03:33,320 Speaker 1: her concerns forward to management. So I guess what I'm 56 00:03:33,360 --> 00:03:35,960 Speaker 1: saying is that this was the right call, but it 57 00:03:36,080 --> 00:03:39,840 Speaker 1: does not get the company off the hook for those other, 58 00:03:40,160 --> 00:03:45,320 Speaker 1: you know, alleged transgressions. Pat Gelsinger, the CEO of Intel, 59 00:03:45,520 --> 00:03:50,040 Speaker 1: says he anticipates the global chip shortage will stretch at 60 00:03:50,120 --> 00:03:54,360 Speaker 1: least as far as twenty twenty-three. The demand for 61 00:03:54,480 --> 00:03:57,520 Speaker 1: chips is still very high and the supply side just 62 00:03:57,760 --> 00:04:00,800 Speaker 1: simply can't keep up. That comes not just from the 63 00:04:00,840 --> 00:04:04,880 Speaker 1: manufacturing facilities that are hit by issues related to COVID 64 00:04:05,840 --> 00:04:08,840 Speaker 1: or the finishing facilities in places like China, which are 65 00:04:08,880 --> 00:04:12,400 Speaker 1: also hit by COVID. It's really the entire supply chain.
66 00:04:13,160 --> 00:04:17,120 Speaker 1: Shipping has taken a huge hit over the pandemic, with 67 00:04:17,240 --> 00:04:20,719 Speaker 1: ports getting backed up and personnel stretched to the limit, 68 00:04:20,800 --> 00:04:25,520 Speaker 1: like, in order to catch up, we have these other bottlenecks, right? 69 00:04:25,600 --> 00:04:28,800 Speaker 1: So the problem is that even if you 70 00:04:29,160 --> 00:04:31,839 Speaker 1: solve one part of the issue, there's still the other 71 00:04:31,920 --> 00:04:34,800 Speaker 1: parts that are affected and you have to figure out 72 00:04:34,839 --> 00:04:37,560 Speaker 1: how to work around those. So while companies are rushing 73 00:04:37,560 --> 00:04:41,279 Speaker 1: to boost their production operations, there are other challenges that 74 00:04:41,279 --> 00:04:43,480 Speaker 1: we have to solve before chips can make it from 75 00:04:43,560 --> 00:04:46,600 Speaker 1: the factory to a final product, whether that product is 76 00:04:46,640 --> 00:04:50,880 Speaker 1: a computer or a video game console, 77 00:04:51,000 --> 00:04:53,760 Speaker 1: or a smartphone, or a car, or whatever it 78 00:04:53,839 --> 00:04:56,960 Speaker 1: might be. So we should expect to see this have 79 00:04:57,000 --> 00:04:59,839 Speaker 1: a ripple effect through the rest of the tech industry 80 00:04:59,880 --> 00:05:02,400 Speaker 1: and to continue to do so, really, because we've already seen this. 81 00:05:03,040 --> 00:05:04,920 Speaker 1: Uh, that is probably gonna make it a rough year 82 00:05:04,960 --> 00:05:07,839 Speaker 1: for a lot of companies and of course millions of people, 83 00:05:07,880 --> 00:05:11,360 Speaker 1: whether they're employees of those companies or they're dependent upon 84 00:05:11,400 --> 00:05:15,080 Speaker 1: those products. It's gonna be tough. Of course, this is 85 00:05:15,120 --> 00:05:19,880 Speaker 1: also assuming that, you know, his expectations become reality.
Now 86 00:05:19,920 --> 00:05:22,800 Speaker 1: I have no reason to doubt his prediction. In fact, 87 00:05:22,880 --> 00:05:26,240 Speaker 1: I worry that we're gonna see this issue extend beyond 88 00:05:26,560 --> 00:05:30,000 Speaker 1: twenty twenty-three unless we come up with some really 89 00:05:30,000 --> 00:05:33,280 Speaker 1: creative solutions to those bottleneck issues that I was kind 90 00:05:33,279 --> 00:05:36,880 Speaker 1: of alluding to. Another piece of Intel news ties into 91 00:05:36,920 --> 00:05:39,520 Speaker 1: a topic that has been picking up momentum this year, 92 00:05:39,920 --> 00:05:43,000 Speaker 1: that of the metaverse. Now once again, the metaverse is 93 00:05:43,000 --> 00:05:46,680 Speaker 1: this vague notion of a digital world. It might be 94 00:05:46,800 --> 00:05:50,520 Speaker 1: one that could overlay or otherwise interface with our real world, 95 00:05:50,800 --> 00:05:53,360 Speaker 1: where people will be able to engage in activities that 96 00:05:53,520 --> 00:05:57,280 Speaker 1: normally we would associate with, you know, an actual physical space. 97 00:05:57,720 --> 00:06:01,040 Speaker 1: So that could include everything from attending a concert 98 00:06:01,640 --> 00:06:04,960 Speaker 1: virtually to meeting in a conference room with a bunch 99 00:06:04,960 --> 00:06:09,440 Speaker 1: of other people's avatars, to touring a foreign city or 100 00:06:09,440 --> 00:06:13,080 Speaker 1: perhaps touring a fictional city in a virtual environment. There's 101 00:06:13,120 --> 00:06:16,600 Speaker 1: no single implementation of the metaverse as of yet. It's 102 00:06:16,640 --> 00:06:20,160 Speaker 1: just kind of this notion of the future of computing 103 00:06:20,200 --> 00:06:24,839 Speaker 1: and online interaction.
In some ideations, it's a virtual 104 00:06:24,880 --> 00:06:27,560 Speaker 1: world that people visit using some sort of tech like 105 00:06:27,600 --> 00:06:31,480 Speaker 1: a computer or a smartphone, or most often a VR 106 00:06:31,560 --> 00:06:35,520 Speaker 1: setup of some kind. In other ideations, it involves not 107 00:06:35,839 --> 00:06:39,240 Speaker 1: virtual reality, but AR, or augmented reality, in which 108 00:06:39,240 --> 00:06:41,839 Speaker 1: we see a digital overlay on top of the world 109 00:06:41,920 --> 00:06:44,880 Speaker 1: around us that enhances our experience in some way, or 110 00:06:44,920 --> 00:06:48,920 Speaker 1: at least it's meant to enhance our experience anyway. It's 111 00:06:48,920 --> 00:06:51,839 Speaker 1: hard to talk about the metaverse in firm terms because 112 00:06:51,839 --> 00:06:56,279 Speaker 1: we don't really have it yet. However, Intel's Raja Koduri, 113 00:06:56,520 --> 00:06:59,800 Speaker 1: who is a senior VP of the Accelerated Computing Systems 114 00:06:59,800 --> 00:07:04,160 Speaker 1: and Graphics Group, posted an editorial suggesting that any realization 115 00:07:04,200 --> 00:07:07,520 Speaker 1: of an actual metaverse will require a whole lot more 116 00:07:07,800 --> 00:07:10,680 Speaker 1: than what we have at our disposal right now. I'm 117 00:07:10,720 --> 00:07:15,120 Speaker 1: reminded of that classic sequence from the acclaimed film Back 118 00:07:15,120 --> 00:07:18,000 Speaker 1: to the Future Part II, in which we hear Data 119 00:07:18,120 --> 00:07:21,960 Speaker 1: say to Marty McFly, those boards don't work on water, 120 00:07:22,440 --> 00:07:28,280 Speaker 1: and then Whitey says, unless you got power. Well, 121 00:07:28,320 --> 00:07:30,400 Speaker 1: it's the power that we need for the metaverse 122 00:07:30,440 --> 00:07:33,720 Speaker 1: to work.
According to Koduri, a metaverse 123 00:07:33,800 --> 00:07:36,840 Speaker 1: as we generally imagine it to be would require us 124 00:07:36,840 --> 00:07:39,800 Speaker 1: to have capabilities that are around a thousand times more 125 00:07:39,840 --> 00:07:43,400 Speaker 1: powerful than what we have at our disposal today. 126 00:07:43,760 --> 00:07:46,840 Speaker 1: And that's not just computing power, it's also storage and 127 00:07:46,920 --> 00:07:50,360 Speaker 1: networking infrastructure and things like that in order to support 128 00:07:50,400 --> 00:07:55,000 Speaker 1: a full metaverse. As Chaim Gartenberg of The Verge points out, 129 00:07:55,760 --> 00:07:59,480 Speaker 1: that seems to track, because Gartenberg mentions that Horizon Worlds, 130 00:07:59,640 --> 00:08:04,800 Speaker 1: which is Facebook's, um, no, I'm sorry, Meta's, because that's the 131 00:08:04,840 --> 00:08:09,120 Speaker 1: new company name. Meta's VR environment is called Horizon Worlds, 132 00:08:09,160 --> 00:08:13,360 Speaker 1: and it can only hold, at max, twenty people in 133 00:08:13,520 --> 00:08:18,240 Speaker 1: a virtual environment for a pretty rudimentary interactive experience. And 134 00:08:18,400 --> 00:08:20,800 Speaker 1: I mean in a pandemic world, being around twenty people 135 00:08:20,880 --> 00:08:23,200 Speaker 1: might feel like a lot. But that is not a 136 00:08:23,240 --> 00:08:26,440 Speaker 1: full metaverse, not by a long stretch. So I think 137 00:08:26,720 --> 00:08:29,680 Speaker 1: Koduri's post is a great example of critical thinking.
It's 138 00:08:29,680 --> 00:08:32,319 Speaker 1: an illustration that just because we seem to have some 139 00:08:32,440 --> 00:08:35,320 Speaker 1: pieces of the metaverse puzzle in a pretty good place, 140 00:08:36,160 --> 00:08:38,800 Speaker 1: it doesn't mean that all of the pieces are in 141 00:08:38,840 --> 00:08:40,760 Speaker 1: that same place, and we won't be able to complete 142 00:08:40,800 --> 00:08:43,360 Speaker 1: the picture until they're all ready to go. And that 143 00:08:43,440 --> 00:08:45,840 Speaker 1: might be a while. I guess you could argue that 144 00:08:45,880 --> 00:08:48,880 Speaker 1: it's good for companies like Meta slash Facebook to get 145 00:08:48,920 --> 00:08:51,240 Speaker 1: ahead of all that. But then I'm still skeptical that 146 00:08:51,280 --> 00:08:54,520 Speaker 1: the metaverse will ever be more than a curiosity for 147 00:08:54,559 --> 00:08:56,920 Speaker 1: the small percentage of people who can both afford the 148 00:08:56,920 --> 00:08:59,920 Speaker 1: equipment to participate in it and who have the interest 149 00:09:00,040 --> 00:09:03,200 Speaker 1: to actually do it. The rest of us, I don't know. 150 00:09:03,960 --> 00:09:06,840 Speaker 1: I'm not sure we're going to go all Ready Player 151 00:09:06,920 --> 00:09:09,360 Speaker 1: One or Snow Crash or anything like that in the 152 00:09:09,400 --> 00:09:14,400 Speaker 1: near future. All right, well, we're gonna have a whole 153 00:09:14,440 --> 00:09:17,360 Speaker 1: bunch more stories after we take this break, and that's 154 00:09:17,360 --> 00:09:20,160 Speaker 1: when we're gonna get into the trigger warning section. But 155 00:09:20,280 --> 00:09:31,320 Speaker 1: before we get to that, let's hear these messages.
Okay, So, 156 00:09:31,400 --> 00:09:33,920 Speaker 1: just before the break, I was talking about the metaverse 157 00:09:33,960 --> 00:09:37,400 Speaker 1: and Meta, and while we're on that subject, I want 158 00:09:37,400 --> 00:09:40,040 Speaker 1: to talk about an article written by Tanya Basu and 159 00:09:40,080 --> 00:09:42,920 Speaker 1: published in the MIT Technology Review. It has 160 00:09:42,960 --> 00:09:48,439 Speaker 1: the title, the metaverse has a groping problem already. That's 161 00:09:48,559 --> 00:09:52,440 Speaker 1: simultaneously a sad thing to see, and arguably this makes 162 00:09:52,440 --> 00:09:56,079 Speaker 1: it even more sad: it's not at all surprising. Now 163 00:09:56,240 --> 00:09:58,439 Speaker 1: for all the women who are listening to this podcast, 164 00:09:58,720 --> 00:10:01,679 Speaker 1: I'm gonna say some stuff that's so obvious that I'm 165 00:10:01,720 --> 00:10:03,959 Speaker 1: sure you could just skip ahead because you all know this, 166 00:10:04,280 --> 00:10:08,280 Speaker 1: you experience it. But for everyone else who is unaware, 167 00:10:08,760 --> 00:10:14,120 Speaker 1: women have been disproportionately targeted with abuse since, okay, well, 168 00:10:14,120 --> 00:10:17,200 Speaker 1: if we're talking big picture, pretty much ever. But it's 169 00:10:17,480 --> 00:10:23,040 Speaker 1: definitely been that way online since the beginning of online interactions, 170 00:10:23,640 --> 00:10:27,120 Speaker 1: whether it's harassment on Twitter or something like a group 171 00:10:27,160 --> 00:10:30,520 Speaker 1: of abusers who are trying to dox women or something 172 00:10:30,559 --> 00:10:34,240 Speaker 1: in between. Women face this kind of abuse pretty 173 00:10:34,320 --> 00:10:37,760 Speaker 1: much every day that they're online, to some extent. Now, honestly, 174 00:10:38,240 --> 00:10:41,240 Speaker 1: I don't know how they manage, but anyway.
Basu writes 175 00:10:41,280 --> 00:10:44,880 Speaker 1: that on November twenty-six, one beta tester reported that 176 00:10:44,960 --> 00:10:49,320 Speaker 1: she had been groped virtually in Meta's Horizon Worlds, and 177 00:10:49,360 --> 00:10:54,280 Speaker 1: she also says that Meta's internal review found quote that 178 00:10:54,320 --> 00:10:57,560 Speaker 1: the beta tester should have used a tool called Safe 179 00:10:57,720 --> 00:11:01,760 Speaker 1: Zone end quote, and Safe Zone is quote a protective 180 00:11:01,800 --> 00:11:05,880 Speaker 1: bubble that users can activate when feeling threatened end quote. Now, 181 00:11:06,320 --> 00:11:10,080 Speaker 1: assuming this is not a mischaracterization of what that internal 182 00:11:10,120 --> 00:11:14,160 Speaker 1: review specifically said, I just want to say that is 183 00:11:14,200 --> 00:11:17,520 Speaker 1: really gross, Meta. I mean, really gross, because to me, 184 00:11:17,679 --> 00:11:20,360 Speaker 1: that sounds like the internal review is saying, oh, hey, 185 00:11:20,440 --> 00:11:22,960 Speaker 1: you got groped? Well, that's your fault because you should 186 00:11:23,000 --> 00:11:27,520 Speaker 1: have activated this feature. And that's not how it is, y'all. 187 00:11:28,040 --> 00:11:31,000 Speaker 1: It's not the user's fault. It's the fault 188 00:11:31,000 --> 00:11:34,160 Speaker 1: of the person who did the groping. Right, they're the 189 00:11:34,200 --> 00:11:37,800 Speaker 1: ones who violated someone else. It's not the fault of 190 00:11:37,800 --> 00:11:40,240 Speaker 1: the person who didn't activate a feature they may or 191 00:11:40,280 --> 00:11:42,120 Speaker 1: may not have known about. It's the fault of the 192 00:11:42,120 --> 00:11:45,840 Speaker 1: person who actually did the terrible thing.
So again, assuming 193 00:11:45,960 --> 00:11:50,080 Speaker 1: that Basu wasn't misrepresenting the internal report, uh, and I 194 00:11:50,120 --> 00:11:52,679 Speaker 1: have no reason to believe that she would, I just 195 00:11:52,760 --> 00:11:55,320 Speaker 1: want to say, shame on you, Meta, for making such 196 00:11:55,360 --> 00:11:59,600 Speaker 1: a truly stupid and ugly statement. You can do better 197 00:11:59,640 --> 00:12:03,560 Speaker 1: than that, at least I hope you can. Anyway, Basu 198 00:12:03,720 --> 00:12:06,640 Speaker 1: goes on to assert that until companies that are building 199 00:12:06,640 --> 00:12:09,520 Speaker 1: out VR environments can find ways to protect people within 200 00:12:09,640 --> 00:12:13,120 Speaker 1: those environments, the metaverse will never be a safe place. 201 00:12:13,800 --> 00:12:17,080 Speaker 1: I imagine it would be a very unwelcome place for women, 202 00:12:17,120 --> 00:12:20,520 Speaker 1: in particular, unless those women for some reason want to 203 00:12:20,559 --> 00:12:23,480 Speaker 1: be harassed and hounded the entire time they're online. And 204 00:12:23,960 --> 00:12:28,440 Speaker 1: news flash, they don't. Maybe it's dangerous 205 00:12:28,440 --> 00:12:31,199 Speaker 1: to paint them all with a blanket statement like that, 206 00:12:31,480 --> 00:12:34,920 Speaker 1: or to mix metaphors, but I feel pretty good about it. 207 00:12:35,200 --> 00:12:37,920 Speaker 1: So if you've ever played in any, like, online game 208 00:12:38,480 --> 00:12:41,040 Speaker 1: and a player revealed that they happen to be female, 209 00:12:41,559 --> 00:12:44,079 Speaker 1: you've likely experienced at least one person being a total 210 00:12:44,200 --> 00:12:47,800 Speaker 1: jackass about that. Now, I can only imagine that gets 211 00:12:47,880 --> 00:12:51,440 Speaker 1: way worse in virtual environments where you have avatars and stuff.
212 00:12:51,840 --> 00:12:53,720 Speaker 1: And before anyone reaches out to me and says, hey, 213 00:12:53,800 --> 00:12:56,520 Speaker 1: chill out, it's not real, I just wanna say to you, 214 00:12:56,920 --> 00:13:01,040 Speaker 1: sit down and shut up, because anyone who has even 215 00:13:01,160 --> 00:13:05,480 Speaker 1: done the tiniest bit of research into virtual reality can 216 00:13:05,520 --> 00:13:08,960 Speaker 1: tell you that the immersion of that experience is convincing 217 00:13:09,080 --> 00:13:12,719 Speaker 1: enough to get our brains and bodies to behave as 218 00:13:12,760 --> 00:13:16,080 Speaker 1: if it were a real experience. There are doctors who 219 00:13:16,160 --> 00:13:19,600 Speaker 1: use virtual reality to treat people who have phobias, because 220 00:13:19,600 --> 00:13:23,040 Speaker 1: a virtual representation of a phobia can be an effective 221 00:13:23,040 --> 00:13:26,960 Speaker 1: form of exposure therapy for patients. So if VR 222 00:13:27,040 --> 00:13:31,560 Speaker 1: can serve as an effective psychological treatment, it stands to reason 223 00:13:31,640 --> 00:13:37,280 Speaker 1: it can also contribute to legit psychological trauma. Anyway, to 224 00:13:37,320 --> 00:13:41,199 Speaker 1: all my listeners out there, be good people, whether it's 225 00:13:41,200 --> 00:13:43,960 Speaker 1: in the real world or a virtual one. And to 226 00:13:44,040 --> 00:13:47,160 Speaker 1: all the companies out there working on metaverse solutions, for 227 00:13:47,400 --> 00:13:50,960 Speaker 1: goodness' sake, operate under the assumption that you're gonna have 228 00:13:51,040 --> 00:13:54,040 Speaker 1: some awful people log into your systems and create ways 229 00:13:54,040 --> 00:13:58,040 Speaker 1: to protect the rest of us from those jerks.
Staying 230 00:13:58,080 --> 00:14:00,599 Speaker 1: on the topic of abuse, and to be clear, I 231 00:14:00,640 --> 00:14:04,080 Speaker 1: would much rather cover happier tech news, but here's what 232 00:14:04,200 --> 00:14:09,319 Speaker 1: I have. Let's talk about Tesla. Last month, Tesla employee 233 00:14:09,440 --> 00:14:13,600 Speaker 1: Jessica Barraza filed a lawsuit against the company, claiming that 234 00:14:13,640 --> 00:14:17,040 Speaker 1: she and other women at the Fremont Tesla factory had 235 00:14:17,160 --> 00:14:23,160 Speaker 1: experienced quote nightmarish conditions of rampant sexual harassment end quote, 236 00:14:23,400 --> 00:14:29,240 Speaker 1: including being physically assaulted, and that Barraza had filed formal 237 00:14:29,320 --> 00:14:32,680 Speaker 1: complaints and yet received no protection from the company, meaning 238 00:14:32,720 --> 00:14:36,800 Speaker 1: Tesla was essentially complicit. And now six more women have 239 00:14:37,000 --> 00:14:41,680 Speaker 1: sued Tesla with similar allegations. Those allegations definitely make the 240 00:14:41,720 --> 00:14:46,000 Speaker 1: manufacturing facility sound like an absolutely horrifying place for women. 241 00:14:46,560 --> 00:14:50,160 Speaker 1: David Lowe, who is serving as the attorney for these employees, 242 00:14:50,520 --> 00:14:53,600 Speaker 1: says that Tesla's corporate culture shouldn't be a surprise because 243 00:14:53,640 --> 00:14:57,360 Speaker 1: of how Elon Musk, the company's CEO, has engaged in 244 00:14:57,440 --> 00:15:01,640 Speaker 1: misogynistic conduct and language.
Lowe wrote in a press release 245 00:15:01,680 --> 00:15:06,280 Speaker 1: that quote Elon Musk tweeting a lewd comment about women's bodies 246 00:15:06,600 --> 00:15:10,400 Speaker 1: or a taunt toward employees who report misconduct reflects an 247 00:15:10,400 --> 00:15:13,600 Speaker 1: attitude at the top that enables the pattern of pervasive 248 00:15:13,760 --> 00:15:18,200 Speaker 1: sexual harassment and retaliation at the heart of these cases. 249 00:15:18,360 --> 00:15:21,840 Speaker 1: End quote. Now you might also remember that SpaceX, another 250 00:15:21,960 --> 00:15:26,760 Speaker 1: Musk enterprise, faces similar allegations from a former employee, an engineer, 251 00:15:26,760 --> 00:15:29,840 Speaker 1: who says that the company is hostile toward women employees. 252 00:15:30,120 --> 00:15:32,560 Speaker 1: So I recommend reading up on these stories because they 253 00:15:32,560 --> 00:15:36,760 Speaker 1: include testimonies from the women involved, and their stories are heartbreaking, 254 00:15:37,320 --> 00:15:41,760 Speaker 1: but I think are really important for people to know. Okay, 255 00:15:42,000 --> 00:15:45,200 Speaker 1: we will now switch gears away from that really awful 256 00:15:45,240 --> 00:15:48,920 Speaker 1: but important news. The Ford Motor Company announced a project 257 00:15:48,960 --> 00:15:51,960 Speaker 1: called Ford Pro Charging that will help support a fleet 258 00:15:52,040 --> 00:15:56,480 Speaker 1: of commercial electric vehicles. The commercial EVs will include software 259 00:15:56,480 --> 00:15:59,280 Speaker 1: that provides telematics to companies so that they can keep 260 00:15:59,320 --> 00:16:02,800 Speaker 1: track of where their vehicles are, uh, the charging status, 261 00:16:02,800 --> 00:16:05,000 Speaker 1: so they'll know how charged the batteries are at any 262 00:16:05,000 --> 00:16:08,480 Speaker 1: given time, and also maintenance indicators.
So the idea is 263 00:16:08,520 --> 00:16:12,479 Speaker 1: that the system will empower companies to operate electric vehicles effectively, 264 00:16:12,800 --> 00:16:15,360 Speaker 1: which is a good move considering the general trend towards 265 00:16:15,440 --> 00:16:19,320 Speaker 1: migrating to EVs rather than 266 00:16:19,440 --> 00:16:22,840 Speaker 1: internal combustion engine vehicles, and having the ability to track 267 00:16:22,880 --> 00:16:26,560 Speaker 1: individual vehicle performance as well as overall fleet performance is 268 00:16:26,600 --> 00:16:31,040 Speaker 1: likely to create opportunities for innovation. For example, tracking charging 269 00:16:31,080 --> 00:16:33,160 Speaker 1: trends can help a company come up with strategies that 270 00:16:33,240 --> 00:16:36,720 Speaker 1: save on charging costs, which affects the bottom line, or 271 00:16:36,920 --> 00:16:40,880 Speaker 1: tracking maintenance trends might alert a company if a particular 272 00:16:40,960 --> 00:16:44,720 Speaker 1: type of electric vehicle is more or less reliable than 273 00:16:44,720 --> 00:16:48,480 Speaker 1: other ones just based on the trends. These are things 274 00:16:48,520 --> 00:16:51,560 Speaker 1: that ultimately, down the line, can save an 275 00:16:51,640 --> 00:16:56,640 Speaker 1: enormous amount of money. So pretty cool news in that regard. Yesterday, 276 00:16:56,680 --> 00:16:58,760 Speaker 1: Reddit announced it was making a move toward taking 277 00:16:58,800 --> 00:17:01,520 Speaker 1: the company public, filing an S-1 with the United 278 00:17:01,520 --> 00:17:05,720 Speaker 1: States Securities and Exchange Commission, or SEC. So this is 279 00:17:05,720 --> 00:17:08,360 Speaker 1: when a private company moves to become a publicly traded 280 00:17:08,400 --> 00:17:11,119 Speaker 1: company on the stock market.
It's just one step of 281 00:17:11,119 --> 00:17:13,439 Speaker 1: many that a company has to go through. It's not 282 00:17:13,520 --> 00:17:15,920 Speaker 1: exactly an unexpected move. A lot of people have been 283 00:17:15,920 --> 00:17:19,000 Speaker 1: predicting this, but I'm sure it is sending shock waves 284 00:17:19,000 --> 00:17:22,040 Speaker 1: through Reddit itself because I'm guessing there are redditors who 285 00:17:22,040 --> 00:17:25,680 Speaker 1: are worrying that a publicly traded version of Reddit will 286 00:17:25,720 --> 00:17:30,600 Speaker 1: mean a more locked-down, regulated Reddit. I don't necessarily think 287 00:17:30,640 --> 00:17:33,240 Speaker 1: that's a bad thing, at least in many cases, because 288 00:17:33,240 --> 00:17:37,960 Speaker 1: Reddit can also become a pretty brutal and abusive place. Um. 289 00:17:38,000 --> 00:17:40,439 Speaker 1: But in other cases you might say, well, this is 290 00:17:40,760 --> 00:17:45,159 Speaker 1: potentially a restriction on users being able to speak freely, 291 00:17:45,680 --> 00:17:49,640 Speaker 1: which, you know, that has its own problems too, 292 00:17:49,720 --> 00:17:54,280 Speaker 1: so I get it. But considering some of the conversations 293 00:17:54,280 --> 00:18:00,000 Speaker 1: I've seen on Reddit and their tone and the language used, uh, 294 00:18:00,920 --> 00:18:03,680 Speaker 1: it's an uphill battle, I think, for some of those 295 00:18:03,760 --> 00:18:06,320 Speaker 1: users to argue that it's a bad thing. But I 296 00:18:06,400 --> 00:18:08,680 Speaker 1: get it, and we don't even know yet, right? It's 297 00:18:08,720 --> 00:18:13,200 Speaker 1: just a general fear, I think. Roku received some very 298 00:18:13,240 --> 00:18:16,200 Speaker 1: bad news in the United States. The US International Trade 299 00:18:16,200 --> 00:18:19,520 Speaker 1: Commission, or ITC, has ruled against Roku in 300 00:18:19,560 --> 00:18:24,200 Speaker 1: a patent infringement case.
A company called Universal Electronics Incorporated 301 00:18:24,240 --> 00:18:26,639 Speaker 1: brought the case against Roku, saying that the company was 302 00:18:26,680 --> 00:18:30,159 Speaker 1: infringing on, I think, six different patents, and the 303 00:18:30,280 --> 00:18:35,159 Speaker 1: ITC agreed with UEI. That's Universal Electronics Incorporated. 304 00:18:35,680 --> 00:18:39,359 Speaker 1: And now the ITC has barred the importation and 305 00:18:39,480 --> 00:18:43,080 Speaker 1: sale of those Roku products. That ban will take effect 306 00:18:43,080 --> 00:18:46,200 Speaker 1: on January ninth, twenty twenty-two, at which point merchants in 307 00:18:46,240 --> 00:18:49,320 Speaker 1: the US will no longer be allowed to import and 308 00:18:49,400 --> 00:18:53,720 Speaker 1: sell those Roku devices to American citizens, and Roku's stock 309 00:18:53,760 --> 00:18:58,320 Speaker 1: price dropped twelve percent following the news. Finally, the streaming 310 00:18:58,359 --> 00:19:00,840 Speaker 1: video game market is getting a bit more crowded. Twitch 311 00:19:01,240 --> 00:19:03,440 Speaker 1: is the dominant player in that space, with YouTube and 312 00:19:03,480 --> 00:19:06,560 Speaker 1: Facebook also vying to carve out massive slices of the market. 313 00:19:06,840 --> 00:19:09,520 Speaker 1: But now TikTok is wading into the field, launching a 314 00:19:09,600 --> 00:19:13,560 Speaker 1: desktop streaming platform called TikTok Live Studio. It allows users 315 00:19:13,600 --> 00:19:17,320 Speaker 1: to stream out video gameplay and other desktop applications across TikTok, 316 00:19:17,720 --> 00:19:19,439 Speaker 1: though for the moment, it appears to be in a 317 00:19:19,480 --> 00:19:22,600 Speaker 1: limited rollout for testing and it's not always accessible on 318 00:19:22,600 --> 00:19:25,480 Speaker 1: TikTok itself.
Not sure if this will graduate to a 319 00:19:25,520 --> 00:19:27,920 Speaker 1: full feature for all TikTok users, but I could see 320 00:19:27,960 --> 00:19:30,359 Speaker 1: the company wanting to tap into its young user base 321 00:19:30,600 --> 00:19:33,600 Speaker 1: to pull some focus away from Twitch in particular, and 322 00:19:33,680 --> 00:19:36,359 Speaker 1: that's it. If you have suggestions for topics I should 323 00:19:36,359 --> 00:19:38,600 Speaker 1: cover in future episodes of Tech Stuff, please reach out 324 00:19:38,640 --> 00:19:40,919 Speaker 1: to me. The handle for the show is TechStuff 325 00:19:41,320 --> 00:19:45,440 Speaker 1: HSW, and I'll talk to you again really soon. 326 00:19:50,240 --> 00:19:53,240 Speaker 1: Tech Stuff is an I Heart Radio production. For more 327 00:19:53,320 --> 00:19:56,720 Speaker 1: podcasts from I Heart Radio, visit the I Heart Radio app, 328 00:19:56,840 --> 00:20:00,000 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.