Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Thursday, October twenty-seventh, twenty twenty-two, and we've got a ton to get through, so let's start off with some Apple news.

Last week, I talked about how lawmakers in the EU had approved legislation that will require all smartphone manufacturers that use a physical charger to adopt the USB-C standard for sales in the European Union, and that this news particularly affects Apple, as the company has relied upon its own proprietary technology, called Lightning, up until now. Apple's Senior VP of Worldwide Marketing, Greg Joswiak, has confirmed that Apple will comply with the new laws, though he did not comment on whether or not Apple will make a global switch to the USB-C standard. So it is possible that the European market will get a USB-C version of the iPhone starting in late twenty twenty-four, and that the rest of the world will stick with iPhones that have a Lightning port.
We don't know yet. Joswiak also argued that the Lightning technology actually cuts down on e-waste, which is a claim I find dubious at best. So the claim says that charging bricks, you know, the actual things that you plug into a wall outlet, they frequently have ports for both USB and Lightning cables, and so by switching to USB-C, folks will have no choice but to toss all their old Lightning cables, because those won't go to anything anymore. Which, you know, sure. But I don't know about you; I know that I have to replace my cables fairly regularly. Either they get wear and tear on them (I'm particularly bad about rolling my office chair over cables that honestly are just too long; I need to get shorter cables), or, more likely, I misplace them during travel. And then I find that I need a cable to connect my thing to my other thing, like my phone to a charging brick or whatever. By consolidating all that into a single standard, a standard that works pretty darn well for that matter, I'm more likely to have a backup cable than if I need to keep separate types for all my devices.
Yes, there are different levels of USB-C. There are, like, three-amp and five-amp versions, so there are differences. But for the most part you can just swap cables out, unless you're doing something like trying to power a computer or a display or something, in which case you need to make sure you have the five-amp version. But otherwise, it really makes it simple to swap your cables in and out, and I'm all for that kind of consolidation.

Anyway, Joswiak, you know, Apple's rep, kind of made it clear that Apple is complying but is totes not happy about doing it. At least it's going to comply in Europe. Here in the United States, there are a few lawmakers who are considering similar legislation that would standardize stuff like charging cables, but I am not particularly optimistic that such legislation will ultimately become law here. I don't... I don't know. I just don't think it's likely. Maybe we'll see. Swapping over to Alphabet, which of course is the parent company of stuff like Google, YouTube, et cetera.
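To put rough numbers on the three-amp versus five-amp cable distinction mentioned a moment ago: power is just voltage times current. A minimal sketch, where the 5 V / 20 V and 3 A / 5 A figures are the standard USB-C / USB Power Delivery values, added here for illustration rather than taken from the episode:

```python
# Power (watts) = voltage (volts) x current (amps).
# Voltage/current figures below are standard USB-C / USB Power Delivery
# values added for illustration -- they are not from the episode.

def max_watts(volts: float, amps: float) -> float:
    return volts * amps

basic_charging = max_watts(5, 3)   # 15 W: default USB-C charging, fine for phones
three_amp_pd = max_watts(20, 3)    # 60 W: the ceiling for an ordinary 3 A cable
five_amp_pd = max_watts(20, 5)     # 100 W: a 5 A cable, needed for laptops/displays
```

Which is why an everyday three-amp cable charges a phone without complaint but may leave a laptop starved.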
It's starting to really tighten its belt across all of its subsidiaries. Over the course of the last year, Alphabet had added more than thirty-six thousand new employees to the company, which is a big old yowza. That's a lot of hires. But those days of heavy hiring appear to be at an end, or at least to be put on pause. In an investor call this week, Alphabet's CEO Sundar Pichai assured investors that the company is taking a much more critical look at projects to determine which ones are really important, and to direct resources to those, as well as to make quote-unquote course corrections. And I think this is important for Google. The company has frequently launched projects that saw a lackluster response, sometimes because, you know, a lot of people feel like it's a boy-who-cried-wolf situation: Google has so frequently pulled the plug on products that there's a reluctance to get invested in a new one, because you're worried that the company will stop supporting it within a year or two. It's kind of earned that reputation.
So there really is a need to focus on specific projects to make sure that those are adding value to the company. I get that. But the company has also had to cut way back on a lot of employee travel and related expenses. At the same time, the advertising business is taking a pretty big hit. That is not unusual in times of economic crisis, or economic distress, or recession, whatever you want to call it. I'll spare you an actual rant, but just in your head, insert at this point in the episode a rant where I yell about the fact that avoiding naming something, like the reluctance to name whatever economic situation we're in, does not at all change the nature of that situation. It's still bad even if we refuse to name it something specific. Anyway, all this means that Alphabet's chief source of revenue, the advertising business (that's where I remind you Google is not really a search company, it's an advertising company), has taken a big hit. It also means that other entities that depend upon ad revenue are hurting, so that would include folks like YouTube creators, for example.
So this is a pretty big ripple effect. Pichai's call with investors does not signal, like, a massive catastrophe or anything like that, but you could say that this is another kind of red flag that we're in a period where tech companies in particular, and the people looking for work within the tech industry, are encountering some pretty tough challenges.

Okay, let's switch over to our favorite punching bag, that is, Meta. Yesterday, yesterday being Wednesday, October twenty-sixth, twenty twenty-two, for any of y'all from the future who are listening back on old tech news episodes for some reason. Anyway, yesterday Mark Zuckerberg and his team held an earnings call that delivered some bad news to investors, who have more than responded in kind, as we will soon learn. So in that call, we learned that Meta's revenue dropped four percent compared to this time last year, that net income was down a whopping fifty-two percent from this time last year, and that spending is up by nineteen percent. So Meta is bringing in less money overall (that would be the revenue bit), and it's bringing in way less of what we will generously refer to as profit.
That would be the income bit. That's the drop-off from a year ago. And this is all following on the heels of other bad news, like the fact that UK regulators have forced Meta to divest itself of Giphy, the animated GIF platform it had purchased in twenty twenty. So that's something that the company is going to have to do in the near future. And considering that there's also this growing skepticism around the concept of the metaverse, and that Mark Zuckerberg appears fully dedicated to pursuing his version, his vision, I guess I could say, of the metaverse, it has some folks, like some investors, extremely displeased with the company's direction. This lack of confidence in Meta and Meta's strategy is reflected in the company's stock price, which dropped nearly twenty percent in after-hours trading following this earnings call. That drop in stock price meant that Meta saw a rapid loss of around sixty-five billion dollars in its market capitalization.
Market cap is essentially what you get when you take the value of a share of stock in a company multiplied by the number of shares of that stock, and it gives a kind of general indication of the company's value. Right? Like, if you've got a ten-dollar stock and there are ten shares out there, you multiply ten by ten, you get a hundred. That's how much market cap your little operation has. So Meta saw a drop of around sixty-five billion dollars in its market cap because of that stock price drop. By the way, that doesn't have any real direct impact on how much cash the company may or may not have on hand at that moment. It can have an impact on a company, though: if it wants to, you know, borrow money or whatever for an acquisition, then the market cap change can make a big difference. But it's really just to show that there's this drop in confidence in Meta in general. Zuckerberg said on the call that if Meta wasn't pursuing the development of a metaverse, it might be the case that no one else would have stepped up and no work would be done on it.
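The market-cap arithmetic above can be sketched directly. The ten-by-ten toy example is the host's; the larger figures below are hypothetical round numbers for illustration, not Meta's actual share price or share count:

```python
def market_cap(share_price: float, shares_outstanding: int) -> float:
    # Market capitalization = price per share x number of shares outstanding.
    return share_price * shares_outstanding

# The host's toy example: a ten-dollar stock with ten shares out there.
toy = market_cap(10, 10)  # 100

# Hypothetical company (made-up numbers): a ~20 percent share-price drop
# erases ~20 percent of the market cap, even though the company's cash
# on hand is untouched.
before = market_cap(100.0, 1_000_000)
after = market_cap(80.0, 1_000_000)
lost = before - after
```

The point being that the wiped-out value is paper value: it changes what the market thinks the company is worth, not what's in its bank account.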
And to that I say, okay, so what? But you know, that's because I remain really skeptical that a metaverse approach is really the future of connectivity, commerce, entertainment, et cetera. I mean, maybe it is. Maybe that is the future and I'm just incapable of seeing it. Maybe I am being obstinate in my reluctance to buy into the metaverse vision, but it just seems unrealistic to me because of a lot of different factors. And I think a lot of investors feel a similar way, right? They also feel uncertain. They certainly see how the metaverse project is requiring enormous costs at Meta, and that this is in fact impacting the company's performance. Maybe we're all wrong. Maybe Mark Zuckerberg is right on the money. That is possible. But whether we're right, or Zuckerberg's right, or there's no one who's right, Zuckerberg has indicated that we're likely to see future quarters with similar tough results moving forward, that Meta remains committed to this metaverse pursuit and will continue to spend money, perhaps in increasing quantities, in an effort to see it to fruition.
Meanwhile, companies like TikTok continue to attract the younger users that Meta desperately wants to hook into its own ecosystem. So it may be that Meta's future is really just meant for a group of folks who are steadily aging out of the platform, with no replenishment in sight. Okay, we've got a lot more tech news to go through, including some more from Meta, but first let's take this quick break.

We're back. A journal of the Association for Computing Machinery published a study that has some disturbing findings, namely that Facebook ads appear to target people not just on their interests, and their likes and their dislikes, and their browsing activity, and sometimes their app activity (unless it's an Apple iPhone, in which case that got kind of eliminated once Apple gave users the option to opt out of that tracking), but also on things like their race, their gender, and their age, even if the user isn't sharing that info with the platform itself. The study says that Facebook is using image recognition software to draw conclusions about users and then serve up ads based on those conclusions.
For example, the study found that white users were far less likely to encounter ads that feature black people in them. The researchers actually created ads for job listings to post on Facebook, and these job listings featured AI-generated images of people. Some of the ads had white people in them, some had black people in them, and by tracking the ads, the research group saw that black users made up a disproportionately large share of the audience that saw ads that had black people in them; with ads that had white people in them, black users made up fifty percent of that audience. Ads with teenage girls featured in them went on to an audience that was fifty-seven percent male, and many of them over the age of fifty-five. Which, creepy. I mean, that's not a good look for a platform that's often associated with an aging user base. If it was an ad that featured an older woman inside the image, well, the audience for those ads ended up being mostly women. So the researchers indicate that for some uses, this kind of targeting might feel like it's a little suss, but it's not necessarily a bad thing.
I mean, let's face it, you are more likely to respond to an ad if the person or persons appearing in the ad kind of look like you do, right? There's just this tendency. You know, we want to see ourselves reflected in the things that we see. But when it comes to stuff like job listings and housing and education, the targeting can reinforce social problems. In fact, Facebook has been in trouble for that in the past; there was a massive lawsuit that focused on this. Also, you know, we're looking at a system that's using machine learning and AI, and machine learning, by relying upon strategies that worked in the past, could end up perpetuating discriminatory practices that disproportionately hurt certain populations, namely people of color. The study also indicates that Facebook's approach could be antithetical to the desires of its clients, like the companies that are actually paying for the ads, because a lot of these companies want to project an image that values diversity.
But if the diversity reflected in the ads means that those ads aren't being shown to all populations, that might mean that the ad isn't getting the effect that the advertiser wanted in the first place. Now, Meta reps say that Meta is dedicated to preventing discrimination on its platforms and that the company continues to develop its technologies with that goal in mind. Further, we should be hearing more about Meta's pushes to fix these kinds of problems in the months ahead. But this is a good example of how machine learning and AI can have a bias built into them, and how that bias can have a negative impact. Now, that's not to say that all bias is necessarily bad, or that all bias has to be avoided, but there are definite areas where you could say, yeah, this is a problem, and this kind of constitutes that.

And some more Meta bad news: in Washington State, here in the United States, a judge has issued a twenty-four-point-seven-million-dollar fine against Facebook for failing to comply with a state campaign finance disclosure law.
So the court found Facebook guilty of violating the state's Fair Campaign Practices Act more than eight hundred times (eight hundred twenty-two times, in fact), and this is not the first time this has happened. The company came up against the same sort of problem back in two thousand eighteen. So that law says that any platform that airs or displays political advertising has to maintain a publicly accessible database of who purchased the ads, including their names and addresses. Plus, the information has to include whom the ads were targeting, how many views the ads received, how the ads were paid for, and that kind of stuff. So anyone who asks for this information has the right to it, and the platforms are compelled by law to comply and hand over that precious information. But Facebook has declined to acquiesce to that request for quite some time, has not followed the rules according to the case, and it has argued that the law quote "burdens political speech" end quote.
Though that's kind of a tough thing to argue, considering that, you know, platforms like television, radio, and newspapers have all been complying with this law since it was passed in nineteen seventy-two. So I'm not sure that that's a really valid argument. There's no doubt that Facebook has access to the information that's required; the company has just repeatedly failed to hand that information over. The law allows the judge to fine an entity up to ten thousand dollars per violation, and as I said, there were eight hundred twenty-two violations. And you might say, huh, eight hundred twenty-two times ten thousand does not equal twenty-four point seven million dollars; that's way more than what you should expect. Well, that same law also allows a judge to triple the penalty per violation if the judge determines that the violations were intentional in nature. And since Facebook went through this same process back in two thousand eighteen, it's kind of hard to argue that the company wasn't intentionally violating that law.
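The penalty arithmetic described above checks out; a quick sketch of the math, using the counts and dollar amounts from the episode (822 violations at the ten-thousand-dollar statutory maximum, trebled for intentional violations, lands at about the reported twenty-four point seven million):

```python
# Penalty math under Washington's Fair Campaign Practices Act,
# as described in the episode.
violations = 822
max_penalty = 10_000       # dollars per violation, the statutory maximum
treble_factor = 3          # a judge may triple the penalty for intentional violations

base_fine = violations * max_penalty       # 8,220,000
trebled_fine = base_fine * treble_factor   # 24,660,000 -- roughly the $24.7M reported
```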
Thus we get the twenty-four-point-seven-million-dollar fine, which might be the largest campaign finance penalty ever issued here in the United States. Of course, compared to Facebook's revenues, which even in the downturn it's experiencing right now are, you know, measured in the billions of dollars, this is small change. But then no company really wants to just hand over twenty-five million bucks, so it's not exactly a slap on the wrist either. It's, you know, literally the largest penalty that the judge was allowed to pass by law.

Now let's hop on over to Twitter to find out what's going on with Elon Musk's on-again, off-again, on-again-again acquisition deal. So as it stands, Musk has until the close of business tomorrow, Friday, October twenty-eighth, to finalize his acquisition of Twitter. If you recall, Musk initially agreed to buy Twitter at fifty-four dollars and twenty cents per share back in the spring of this year. Right now, that's actually just a hair over what Twitter is currently trading at.
Like, when I went to record, Twitter was trading at just under fifty-four dollars per share, so it's really close to what that deal was proposed at. I suspect that the current share price reflects people anticipating that this deal is going to go through by the end of tomorrow; even a small gain is a gain. So I think that has driven up interest in the stock, and thus we see it really close to what Musk was agreeing to buy it for. Of course, Musk famously attempted to back out of the deal, which then prompted a court case to force Musk to go through with the deal; Twitter brought that against Musk. That court case is currently on hold unless the deal does not complete by the end of day tomorrow, in which case the case is back on. So Musk himself arrived at Twitter HQ yesterday, on Wednesday. He carried a bathroom sink as a kind of publicity stunt. I'm not sure what the message was, because typically we talk about kitchen sink deals, not bathroom sink ones. According to Gizmodo, Musk said it was a visual pun on "let that sink in." But you know, I don't know.
Maybe he's 320 00:19:54,880 --> 00:19:57,320 Speaker 1: just very particular about where he washes his hands. I 321 00:19:57,400 --> 00:20:00,960 Speaker 1: don't know. Anyway. Musk also published an open letter to 322 00:20:01,000 --> 00:20:04,760 Speaker 1: Twitter employees to address some fears and concerns people have 323 00:20:04,960 --> 00:20:08,960 Speaker 1: had about this acquisition. For example, he denied that he 324 00:20:09,040 --> 00:20:12,400 Speaker 1: plans to eliminate up to seventy five percent of the workforce at the company. 325 00:20:12,520 --> 00:20:14,560 Speaker 1: That was something that had been reported in the past 326 00:20:14,560 --> 00:20:18,399 Speaker 1: by The Washington Post. There is an indication that 327 00:20:18,640 --> 00:20:22,160 Speaker 1: he expects there to be some downsizing, and in fact 328 00:20:22,200 --> 00:20:28,720 Speaker 1: had received previous advice from Jason Calacanis to 329 00:20:28,760 --> 00:20:32,320 Speaker 1: require people to come into the office, because 330 00:20:32,320 --> 00:20:34,399 Speaker 1: that's going to weed people out, like people will self 331 00:20:34,520 --> 00:20:38,840 Speaker 1: select for leaving the company. But then that tends to 332 00:20:38,880 --> 00:20:42,840 Speaker 1: be like your best people too, so that's not the 333 00:20:42,880 --> 00:20:46,200 Speaker 1: best advice I've ever heard. But you know, it's 334 00:20:46,280 --> 00:20:50,240 Speaker 1: very possible that Musk will attempt to downsize Twitter simply 335 00:20:50,280 --> 00:20:54,680 Speaker 1: by being unpleasant, something that I am told he has 336 00:20:54,720 --> 00:21:00,240 Speaker 1: a modicum of experience at doing.
Anyway, Musk also said he has 337 00:21:00,280 --> 00:21:03,119 Speaker 1: no intention to allow Twitter to become a quote free 338 00:21:03,160 --> 00:21:05,800 Speaker 1: for all hellscape where anything can be said with 339 00:21:05,960 --> 00:21:10,160 Speaker 1: no consequences end quote. That also conflicts a little bit 340 00:21:10,160 --> 00:21:13,360 Speaker 1: with earlier reports that Musk believes Twitter should be kind 341 00:21:13,359 --> 00:21:16,600 Speaker 1: of an unfettered platform for free speech. But to be fair, 342 00:21:16,920 --> 00:21:20,159 Speaker 1: Musk has pretty much always maintained that this should actually 343 00:21:20,160 --> 00:21:23,400 Speaker 1: fall within the legal parameters of the various countries within 344 00:21:23,480 --> 00:21:26,720 Speaker 1: which Twitter operates. So, in other words, you can't say 345 00:21:26,760 --> 00:21:31,480 Speaker 1: absolutely anything if the country where you are operating has 346 00:21:32,200 --> 00:21:34,879 Speaker 1: limits on free speech, like you have 347 00:21:34,960 --> 00:21:38,320 Speaker 1: to operate within the boundaries of the law. He 348 00:21:38,359 --> 00:21:41,639 Speaker 1: has at least made that concession. He has also indicated 349 00:21:41,680 --> 00:21:45,000 Speaker 1: that he intends for Twitter to ease off on content moderation, 350 00:21:45,080 --> 00:21:48,159 Speaker 1: which could allow for even more misinformation to proliferate across 351 00:21:48,160 --> 00:21:51,760 Speaker 1: the platform, and that he would reverse the permanent bans 352 00:21:51,800 --> 00:21:55,679 Speaker 1: of several prominent accounts, most notably that of Donald Trump, 353 00:21:56,000 --> 00:21:58,840 Speaker 1: who has seen his own Truth Social platform struggle to 354 00:21:59,000 --> 00:22:03,240 Speaker 1: find significant traction.
Anyway, we'll have to wait until tomorrow 355 00:22:03,240 --> 00:22:05,840 Speaker 1: to see if the deal actually does go through for real, 356 00:22:06,280 --> 00:22:10,080 Speaker 1: which, I mean, there's like a seventy percent chance in 357 00:22:10,119 --> 00:22:13,280 Speaker 1: my mind that's going to happen, or if Musk will 358 00:22:13,320 --> 00:22:15,840 Speaker 1: pull some other maneuver in an attempt to get out 359 00:22:15,840 --> 00:22:18,200 Speaker 1: of the deal. I'm not sure that there is an 360 00:22:18,240 --> 00:22:21,160 Speaker 1: exit strategy that wouldn't also put the court case back 361 00:22:21,200 --> 00:22:25,280 Speaker 1: on track to continue. So I think there's a more 362 00:22:25,359 --> 00:22:27,919 Speaker 1: than decent chance that by the end of tomorrow, Twitter 363 00:22:27,960 --> 00:22:31,359 Speaker 1: will be a privately held company owned by Elon Musk. 364 00:22:32,080 --> 00:22:37,320 Speaker 1: CNBC and other outlets report that Tesla, another Elon Musk company, 365 00:22:37,359 --> 00:22:40,920 Speaker 1: is currently under investigation by the U.S. Department of Justice. 366 00:22:41,359 --> 00:22:44,679 Speaker 1: This is with regard to Tesla's driver assist systems, and 367 00:22:44,680 --> 00:22:48,880 Speaker 1: whether or not the company misled consumers with exaggerated claims 368 00:22:48,960 --> 00:22:54,360 Speaker 1: about those systems and their capabilities, namely that they essentially 369 00:22:54,480 --> 00:22:59,360 Speaker 1: constituted self driving. Now, at the very least, there appear 370 00:22:59,400 --> 00:23:03,400 Speaker 1: to be two distinct storylines coming out of Tesla. So 371 00:23:03,440 --> 00:23:07,760 Speaker 1: on the marketing side, the company seems to indicate that 372 00:23:07,800 --> 00:23:11,080 Speaker 1: Tesla vehicles, when they're in full self driving mode, are, 373 00:23:12,080 --> 00:23:16,639 Speaker 1: you know, to any practical consideration, autonomous vehicles.
They 374 00:23:16,680 --> 00:23:19,720 Speaker 1: don't go quite so far as to say it, but one 375 00:23:19,840 --> 00:23:24,000 Speaker 1: video on Tesla's site showing a man inside a 376 00:23:24,040 --> 00:23:27,800 Speaker 1: Tesla vehicle goes on to say, quote, the person in 377 00:23:27,800 --> 00:23:30,760 Speaker 1: the driver's seat is only there for legal reasons. He 378 00:23:30,960 --> 00:23:35,159 Speaker 1: is not doing anything. The car is driving itself end quote. 379 00:23:35,560 --> 00:23:37,360 Speaker 1: That is not the same thing as saying this car 380 00:23:37,440 --> 00:23:41,080 Speaker 1: is autonomous, but it does seem to imply, hey, the 381 00:23:41,160 --> 00:23:44,280 Speaker 1: system can take full control of your vehicle safely and 382 00:23:44,359 --> 00:23:49,000 Speaker 1: you can just sit back and relax. However, during actual operation, 383 00:23:49,440 --> 00:23:52,399 Speaker 1: Tesla has messages that tell drivers they are required to 384 00:23:52,480 --> 00:23:55,240 Speaker 1: keep their hands on the wheel even when using the 385 00:23:55,320 --> 00:24:00,639 Speaker 1: driver assist features. And further, on Tesla's web page, 386 00:24:00,640 --> 00:24:04,280 Speaker 1: it actually says the systems quote do not make the 387 00:24:04,400 --> 00:24:08,280 Speaker 1: vehicle autonomous end quote. So it does say on the 388 00:24:08,320 --> 00:24:11,720 Speaker 1: web page, this doesn't make this an autonomous vehicle, even 389 00:24:11,760 --> 00:24:14,600 Speaker 1: while they also show videos where they say the only 390 00:24:14,640 --> 00:24:17,879 Speaker 1: reason we have a driver in the driver's seat 391 00:24:18,160 --> 00:24:20,680 Speaker 1: is for legal reasons. So it does sound a lot 392 00:24:20,720 --> 00:24:24,320 Speaker 1: like doublespeak, right, like the cars aren't autonomous, but 393 00:24:24,400 --> 00:24:27,800 Speaker 1: you know they can drive themselves anyway.
Teslas have been 394 00:24:27,800 --> 00:24:32,040 Speaker 1: involved in numerous high profile accidents, some of them involving fatalities, 395 00:24:32,600 --> 00:24:36,320 Speaker 1: so the DOJ is investigating the company, presumably 396 00:24:36,359 --> 00:24:39,640 Speaker 1: to see if there are any criminal implications here. It's 397 00:24:39,680 --> 00:24:44,840 Speaker 1: possible that Tesla's seemingly contradictory messages may keep the company 398 00:24:44,920 --> 00:24:48,680 Speaker 1: legally safe in that Tesla's lawyers can truthfully point out 399 00:24:48,720 --> 00:24:52,400 Speaker 1: that Tesla has denied that its vehicles are autonomous. This 400 00:24:52,480 --> 00:24:57,040 Speaker 1: is not the only legal investigation into Tesla by any means, 401 00:24:57,119 --> 00:25:00,480 Speaker 1: and it might be a while before we hear any 402 00:25:00,520 --> 00:25:04,640 Speaker 1: potential judicial action against the company, if in fact 403 00:25:04,760 --> 00:25:09,159 Speaker 1: any are pending. Okay, we've got some more news stories 404 00:25:09,200 --> 00:25:11,040 Speaker 1: to get through. Before we get to those, let's take 405 00:25:11,160 --> 00:25:22,600 Speaker 1: another quick break. We're back from break. We still have 406 00:25:22,720 --> 00:25:26,359 Speaker 1: one more, you know, tangentially Elon Musk related story, because 407 00:25:26,400 --> 00:25:30,720 Speaker 1: we're gonna talk about SpaceX and specifically Starlink. So earlier 408 00:25:30,760 --> 00:25:34,200 Speaker 1: this year, Starlink, which is the satellite Internet service provider 409 00:25:34,440 --> 00:25:38,280 Speaker 1: arm of SpaceX, offered up a service for RV owners, 410 00:25:38,800 --> 00:25:41,120 Speaker 1: and RV owners would pay a hundred thirty five 411 00:25:41,160 --> 00:25:45,320 Speaker 1: bucks per month for Internet access through Starlink.
However, it 412 00:25:45,359 --> 00:25:48,600 Speaker 1: would only work for RVs that were stationary, that 413 00:25:48,680 --> 00:25:52,680 Speaker 1: were parked, in other words. However, later this year, in December, 414 00:25:53,000 --> 00:25:55,240 Speaker 1: Starlink is going to offer a plan that will allow 415 00:25:55,400 --> 00:25:58,919 Speaker 1: RV owners to access the Internet even while driving the 416 00:25:59,080 --> 00:26:01,959 Speaker 1: RV. Now, to do so will require the installation 417 00:26:02,280 --> 00:26:05,440 Speaker 1: of a new kind of satellite dish, one that comes 418 00:26:05,480 --> 00:26:08,879 Speaker 1: with a hefty two thousand five hundred dollar fee. If you were 419 00:26:08,920 --> 00:26:12,480 Speaker 1: just getting the standard stationary access system in your RV, 420 00:26:12,960 --> 00:26:18,520 Speaker 1: that one costs five hundred ninety nine dollars to install. So it's a pretty hefty upgrade. 421 00:26:18,560 --> 00:26:21,879 Speaker 1: You know, it's almost two thousand dollars more expensive. The 422 00:26:22,040 --> 00:26:25,399 Speaker 1: monthly cost for access will still be a hundred thirty 423 00:26:25,400 --> 00:26:29,560 Speaker 1: five dollar subscription fee. Starlink has recently been targeting use 424 00:26:29,640 --> 00:26:34,080 Speaker 1: cases for moving vehicles, from private planes to ships at sea, 425 00:26:34,160 --> 00:26:36,680 Speaker 1: and we can now add RVs on the road 426 00:26:37,040 --> 00:26:41,439 Speaker 1: to that list. Earlier today, hackers got access to the 427 00:26:41,520 --> 00:26:45,320 Speaker 1: New York Post's website and Twitter feed and used that 428 00:26:45,400 --> 00:26:51,160 Speaker 1: access to publish some really awful headlines, mostly targeting specific politicians. 429 00:26:51,600 --> 00:26:56,760 Speaker 1: Those headlines included racist, misogynistic, and other disgusting language.
The 430 00:26:56,840 --> 00:26:59,920 Speaker 1: Post regained control of its accounts not too long after 431 00:27:00,119 --> 00:27:03,000 Speaker 1: they had been seized, and was able to remove the 432 00:27:03,040 --> 00:27:07,639 Speaker 1: offending material. This marks the second time during the current 433 00:27:07,680 --> 00:27:10,840 Speaker 1: election season here in the United States that a publication 434 00:27:10,920 --> 00:27:15,080 Speaker 1: found itself hacked. Fast Company, the target of such 435 00:27:15,080 --> 00:27:17,840 Speaker 1: a hack in late September, actually took its website down 436 00:27:17,920 --> 00:27:20,879 Speaker 1: for a full week to deal with that. Both Fast 437 00:27:20,880 --> 00:27:24,159 Speaker 1: Company and The New York Post rely upon WordPress as 438 00:27:24,200 --> 00:27:28,040 Speaker 1: a content management system, but as of this recording, there's 439 00:27:28,080 --> 00:27:31,120 Speaker 1: been no further information about how the hackers got access 440 00:27:31,160 --> 00:27:34,520 Speaker 1: to the New York Post website. Anyway, it's yet another 441 00:27:34,560 --> 00:27:38,040 Speaker 1: fun example of how political events can drive terrible things 442 00:27:38,080 --> 00:27:44,399 Speaker 1: in technology. Sigh. This past August, US House Speaker Nancy 443 00:27:44,400 --> 00:27:48,320 Speaker 1: Pelosi visited Taiwan to meet with Morris Chang, the founder 444 00:27:48,400 --> 00:27:52,040 Speaker 1: of semiconductor company TSMC. And you might remember 445 00:27:52,080 --> 00:27:55,440 Speaker 1: that TSMC is responsible for the production of many 446 00:27:55,520 --> 00:27:58,399 Speaker 1: of the chips we rely upon in our electronics, including 447 00:27:58,600 --> 00:28:02,680 Speaker 1: most of the higher end chips.
The US has recently 448 00:28:02,720 --> 00:28:06,879 Speaker 1: passed legislation aimed at bootstrapping the semiconductor industry here in 449 00:28:06,960 --> 00:28:10,080 Speaker 1: the United States, and so it aims to shift some 450 00:28:10,160 --> 00:28:14,720 Speaker 1: of the dependence on Taiwan to US based facilities. And now, 451 00:28:14,840 --> 00:28:18,480 Speaker 1: the Financial Times reports that during the visit to Taiwan 452 00:28:18,520 --> 00:28:22,280 Speaker 1: back in August, Chang told Pelosi that the United States' 453 00:28:22,280 --> 00:28:26,359 Speaker 1: efforts are quote doomed to fail end quote. Now that 454 00:28:26,480 --> 00:28:29,640 Speaker 1: might be the case, but there are some other factors 455 00:28:29,680 --> 00:28:33,480 Speaker 1: that may have influenced Chang when he made such a proclamation, 456 00:28:33,520 --> 00:28:38,720 Speaker 1: assuming that the reporting is accurate. For example, Taiwan currently 457 00:28:38,920 --> 00:28:44,040 Speaker 1: enjoys a not entirely stable independence from mainland China, and 458 00:28:44,120 --> 00:28:47,720 Speaker 1: the Western world's reliance on semiconductors means that countries like 459 00:28:47,760 --> 00:28:51,160 Speaker 1: the United States have a vested interest in keeping Taiwan 460 00:28:51,360 --> 00:28:55,760 Speaker 1: free from Chinese interference. Therefore, if China were to make 461 00:28:55,800 --> 00:28:59,920 Speaker 1: any kind of aggressive moves towards Taiwan, that would likely 462 00:29:00,000 --> 00:29:03,880 Speaker 1: pull the US into what could become a dangerous conflict. 463 00:29:04,400 --> 00:29:07,280 Speaker 1: So it's the threat of the US's involvement that 464 00:29:07,360 --> 00:29:11,480 Speaker 1: keeps Taiwan temporarily safe.
But if the West were to 465 00:29:11,600 --> 00:29:15,360 Speaker 1: reduce its reliance on Taiwan when it comes to semiconductors, 466 00:29:15,960 --> 00:29:21,680 Speaker 1: then this silicon shield around Taiwan will weaken. Therefore, Chang 467 00:29:21,720 --> 00:29:26,080 Speaker 1: has an existential motivation to dismiss the US's efforts to 468 00:29:26,120 --> 00:29:30,880 Speaker 1: become independent with semiconductors. Now that doesn't mean he's wrong. 469 00:29:31,160 --> 00:29:34,360 Speaker 1: He might be right. We're very early in the United 470 00:29:34,360 --> 00:29:38,400 Speaker 1: States' effort to revitalize the semiconductor industry here in the States, 471 00:29:39,040 --> 00:29:42,680 Speaker 1: and it could turn into a total fiasco. It is 472 00:29:42,800 --> 00:29:45,520 Speaker 1: sure to have some bumpy spots along the road. That's 473 00:29:45,560 --> 00:29:48,280 Speaker 1: just the nature of reality. We just don't know where 474 00:29:48,320 --> 00:29:50,680 Speaker 1: that road ultimately is going to lead. We know the 475 00:29:50,720 --> 00:29:57,600 Speaker 1: intended destination is greater independence when it comes to producing semiconductors. Now, 476 00:29:57,640 --> 00:30:00,400 Speaker 1: considering Taiwan's situation, I think it's safe to say we 477 00:30:00,600 --> 00:30:05,120 Speaker 1: cannot assume Chang's projections on the matter are free from bias. 478 00:30:05,120 --> 00:30:09,080 Speaker 1: They're certainly not free from personal interest. Well, it's almost Halloween, 479 00:30:09,160 --> 00:30:12,560 Speaker 1: so how about some terrifying news.
A video from the 480 00:30:12,600 --> 00:30:17,480 Speaker 1: official Kestrel Defense page on the Chinese microblogging site 481 00:30:17,480 --> 00:30:21,520 Speaker 1: Weibo shows a large drone dropping off a four 482 00:30:21,640 --> 00:30:25,280 Speaker 1: legged robot similar to the kinds of robots you've seen 483 00:30:25,360 --> 00:30:29,320 Speaker 1: from Boston Dynamics, only this robot also happens to have 484 00:30:29,520 --> 00:30:33,520 Speaker 1: a machine gun. Ho ho ho, sorry, I'm mixing up 485 00:30:33,520 --> 00:30:38,720 Speaker 1: my holidays here. Anyway, the video demonstrates that this technology 486 00:30:39,360 --> 00:30:42,480 Speaker 1: is ready to go, at least according to the defense 487 00:30:42,480 --> 00:30:45,840 Speaker 1: company behind it. The robot and the gun would be 488 00:30:45,960 --> 00:30:49,120 Speaker 1: under human control, so this would be a remotely controlled robot, 489 00:30:49,160 --> 00:30:52,800 Speaker 1: not an autonomous one. You would have an operator capable 490 00:30:52,840 --> 00:30:56,239 Speaker 1: of maneuvering the robot and firing its weapon. And this 491 00:30:56,320 --> 00:30:59,320 Speaker 1: tech could potentially be used in battlefield situations where you 492 00:30:59,400 --> 00:31:03,440 Speaker 1: want to drop off robotic soldiers, say behind enemy lines 493 00:31:04,000 --> 00:31:07,160 Speaker 1: to attack in a different direction, or you know, in 494 00:31:07,280 --> 00:31:10,360 Speaker 1: other locations that are all intended to put pressure on 495 00:31:10,560 --> 00:31:14,160 Speaker 1: the enemy on multiple fronts. This is exactly the kind 496 00:31:14,160 --> 00:31:17,640 Speaker 1: of use that companies like Boston Dynamics recently pledged they 497 00:31:17,640 --> 00:31:22,480 Speaker 1: would not pursue: the weaponization of robotic platforms. Of course, 498 00:31:22,520 --> 00:31:24,520 Speaker 1: the U.
S. military is certainly hard at work 499 00:31:24,600 --> 00:31:28,440 Speaker 1: building these kinds of things itself, so this is something 500 00:31:28,600 --> 00:31:30,640 Speaker 1: that looks like it's going to be on the horizon 501 00:31:30,640 --> 00:31:34,760 Speaker 1: no matter what. And yes, this is terrifying because there's 502 00:31:34,760 --> 00:31:37,800 Speaker 1: really a worry that robotic forces are going to reduce 503 00:31:37,960 --> 00:31:42,680 Speaker 1: barriers that countries face before they engage in armed conflict. Right, 504 00:31:43,120 --> 00:31:47,080 Speaker 1: it might remove certain concerns and make it more likely 505 00:31:47,480 --> 00:31:50,560 Speaker 1: that we'll see more war. It's a lot easier to 506 00:31:50,600 --> 00:31:54,720 Speaker 1: sell your invasion to your population if that population isn't, 507 00:31:54,760 --> 00:31:57,320 Speaker 1: you know, seeing its own soldiers being put in harm's way. 508 00:31:57,800 --> 00:32:01,680 Speaker 1: See also Russia. There's also an additional fear that we 509 00:32:01,680 --> 00:32:06,720 Speaker 1: could see future technologies progress toward automation for navigation and combat. 510 00:32:07,240 --> 00:32:09,640 Speaker 1: That's something that's particularly scary when you keep in mind 511 00:32:09,640 --> 00:32:12,720 Speaker 1: that computer vision is by no means incapable of making mistakes. 512 00:32:13,160 --> 00:32:15,160 Speaker 1: So not only is it already scary to think of 513 00:32:15,200 --> 00:32:18,360 Speaker 1: a robot with a gun, it's even scarier to think 514 00:32:18,400 --> 00:32:20,800 Speaker 1: it's a robot with a gun that might think that 515 00:32:21,160 --> 00:32:25,120 Speaker 1: you're not on its side. Not great.
Recently, Nvidia 516 00:32:25,200 --> 00:32:29,320 Speaker 1: unveiled its forty series of graphics cards, the new flagship 517 00:32:29,360 --> 00:32:33,160 Speaker 1: cards that set the company's standard for performance, but problems 518 00:32:33,160 --> 00:32:36,320 Speaker 1: have already popped up with the RTX 4090 from 519 00:32:36,400 --> 00:32:39,040 Speaker 1: Nvidia itself, as there have been a few reports of 520 00:32:39,120 --> 00:32:42,800 Speaker 1: users discovering that a sixteen pin adapter used to connect 521 00:32:42,800 --> 00:32:47,360 Speaker 1: the card to the computer's power supply can overheat, which 522 00:32:47,400 --> 00:32:50,360 Speaker 1: can cause the adapter to melt or even catch fire. 523 00:32:50,960 --> 00:32:54,160 Speaker 1: Now Igor's Lab has released an article that reveals that 524 00:32:54,240 --> 00:32:57,440 Speaker 1: these adapters were poorly made in the first place, with 525 00:32:57,560 --> 00:33:02,200 Speaker 1: substandard soldering that can lead to these issues. Igor's Lab 526 00:33:02,240 --> 00:33:05,280 Speaker 1: has alerted Nvidia to the problem, which was likely 527 00:33:05,360 --> 00:33:08,000 Speaker 1: caused when the company relied on an assembly partner that 528 00:33:08,000 --> 00:33:10,960 Speaker 1: took some shortcuts. Gamers who are eager to get 529 00:33:11,080 --> 00:33:13,960 Speaker 1: Nvidia's new chips may want to hold off.
There is 530 00:33:14,360 --> 00:33:17,360 Speaker 1: the distinct possibility that Nvidia will hold a recall 531 00:33:17,680 --> 00:33:21,600 Speaker 1: and correct this issue before sending out new cards, 532 00:33:22,040 --> 00:33:24,800 Speaker 1: so it might be better to just wait. Or you 533 00:33:24,880 --> 00:33:27,560 Speaker 1: might want to wait for third party manufacturers to 534 00:33:27,960 --> 00:33:31,800 Speaker 1: offer their own forty series cards, because Nvidia's business strategy 535 00:33:32,000 --> 00:33:34,720 Speaker 1: is not just to manufacture the cards itself, but it 536 00:33:34,800 --> 00:33:39,000 Speaker 1: also licenses the design and the tech out to other manufacturers, 537 00:33:39,320 --> 00:33:42,920 Speaker 1: and if those manufacturers actually replace the adapter that 538 00:33:43,080 --> 00:33:47,400 Speaker 1: Nvidia includes in its kits, then it might solve the 539 00:33:47,440 --> 00:33:51,080 Speaker 1: problem as well, and thus you could end up with 540 00:33:51,080 --> 00:33:53,680 Speaker 1: a graphics card that is safer than the official 541 00:33:53,880 --> 00:33:57,080 Speaker 1: Nvidia version. I do think we're probably going to see 542 00:33:57,120 --> 00:34:00,840 Speaker 1: a recall and replacement process before long. But as of 543 00:34:00,840 --> 00:34:03,320 Speaker 1: the time I'm recording this, that has not yet been announced. 544 00:34:03,800 --> 00:34:06,360 Speaker 1: Gamers who have been anticipating the release of Call of 545 00:34:06,440 --> 00:34:09,920 Speaker 1: Duty Modern Warfare Two and who got their hands on 546 00:34:09,960 --> 00:34:12,920 Speaker 1: a physical copy of the game might be shocked to 547 00:34:13,000 --> 00:34:16,719 Speaker 1: learn that there's no game on that physical disc. In fact, 548 00:34:16,719 --> 00:34:19,920 Speaker 1: according to Eurogamer, there's just seventy two megabytes of 549 00:34:20,000 --> 00:34:23,359 Speaker 1: data on those discs.
Now, the game, it turns out, 550 00:34:23,440 --> 00:34:26,880 Speaker 1: is closer to thirty five gigabytes in size on the 551 00:34:26,920 --> 00:34:30,280 Speaker 1: PS5, and that can actually balloon up to one fifty 552 00:34:30,400 --> 00:34:34,360 Speaker 1: gigs once you install a day one patch and you 553 00:34:34,440 --> 00:34:38,600 Speaker 1: have all the packs for the game installed. So seventy 554 00:34:38,600 --> 00:34:41,840 Speaker 1: two megabytes to a hundred and fifty gigabytes is a huge gap. 555 00:34:42,000 --> 00:34:44,800 Speaker 1: What is going on? Well, it looks like the physical 556 00:34:44,880 --> 00:34:49,280 Speaker 1: disc really just directs machines to download the digital copy anyway. 557 00:34:49,320 --> 00:34:52,520 Speaker 1: So yes, you'll get a physical disc. That disc will 558 00:34:52,520 --> 00:34:54,319 Speaker 1: have like the logo and the art and all that 559 00:34:54,400 --> 00:34:58,480 Speaker 1: kind of stuff, but there's no game on the disc 560 00:34:58,600 --> 00:35:01,719 Speaker 1: and all it will do is direct you toward a 561 00:35:01,760 --> 00:35:04,719 Speaker 1: massive digital download. So if you live somewhere that has 562 00:35:04,800 --> 00:35:09,000 Speaker 1: lousy internet connectivity, or maybe your data plan has a 563 00:35:09,120 --> 00:35:11,759 Speaker 1: data cap to it, you might be shocked to learn 564 00:35:11,800 --> 00:35:15,000 Speaker 1: that your physical copy doesn't actually, you know, let you 565 00:35:15,040 --> 00:35:17,080 Speaker 1: experience the game. You still have to go through the 566 00:35:17,120 --> 00:35:18,799 Speaker 1: same steps that you would have had to go through 567 00:35:19,120 --> 00:35:21,520 Speaker 1: if you just purchased it digitally in the first place.
568 00:35:21,840 --> 00:35:24,000 Speaker 1: And that leads to the question why even have a 569 00:35:24,040 --> 00:35:26,640 Speaker 1: physical option if this is going to be the way 570 00:35:26,680 --> 00:35:29,319 Speaker 1: it works. That reminds me of a time when you 571 00:35:29,320 --> 00:35:33,680 Speaker 1: could find boxed copies of computer games and inside was 572 00:35:33,760 --> 00:35:36,720 Speaker 1: just a code where you could download the digital copy. 573 00:35:36,880 --> 00:35:39,800 Speaker 1: And I guess you bought the box so that you 574 00:35:39,840 --> 00:35:41,719 Speaker 1: would have something to put up on a shelf. I 575 00:35:41,719 --> 00:35:45,640 Speaker 1: mean, maybe for collectors, but I don't know. It 576 00:35:45,719 --> 00:35:48,920 Speaker 1: hits me the wrong way to have a disc for 577 00:35:48,960 --> 00:35:51,120 Speaker 1: a game when the game is not on the disc. 578 00:35:51,600 --> 00:35:56,360 Speaker 1: That just bugs me. Anyway. That's it for today's news 579 00:35:56,400 --> 00:35:59,520 Speaker 1: episode of tech Stuff. Hope you are all well. If 580 00:35:59,520 --> 00:36:03,160 Speaker 1: you have suggestions or any questions or anything like that, 581 00:36:03,280 --> 00:36:04,920 Speaker 1: you want to get in touch with me, there are 582 00:36:04,920 --> 00:36:07,040 Speaker 1: a couple of ways of doing that. One is to 583 00:36:07,120 --> 00:36:10,080 Speaker 1: download the iHeart Radio app. It's free to download 584 00:36:10,120 --> 00:36:12,319 Speaker 1: and use. You can navigate over to tech Stuff in 585 00:36:12,320 --> 00:36:15,120 Speaker 1: the search field.
There's a little microphone icon there. If 586 00:36:15,160 --> 00:36:17,120 Speaker 1: you click on that, you can leave a voice message 587 00:36:17,200 --> 00:36:19,600 Speaker 1: up to thirty seconds in length. Let me know if 588 00:36:19,600 --> 00:36:21,560 Speaker 1: you would like me to use it in a future episode, 589 00:36:22,080 --> 00:36:24,680 Speaker 1: or you can reach out on Twitter while Twitter is 590 00:36:24,719 --> 00:36:26,720 Speaker 1: still around. I have no idea what's going to happen 591 00:36:27,400 --> 00:36:31,000 Speaker 1: if Elon Musk closes the deal, so we'll see what happens. 592 00:36:31,600 --> 00:36:34,960 Speaker 1: But the handle on Twitter is TechStuff HSW 593 00:36:35,640 --> 00:36:45,160 Speaker 1: and I'll talk to you again really soon. Tech Stuff 594 00:36:45,239 --> 00:36:48,440 Speaker 1: is an iHeart Radio production. For more podcasts from 595 00:36:48,440 --> 00:36:52,200 Speaker 1: iHeart Radio, visit the iHeart Radio app, Apple Podcasts, 596 00:36:52,320 --> 00:36:54,320 Speaker 1: or wherever you listen to your favorite shows.