1 00:00:04,440 --> 00:00:12,360 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, 2 00:00:12,400 --> 00:00:16,320 Speaker 1: and welcome to Tech Stuff. I'm your host, Jonathan Strickland. 3 00:00:16,400 --> 00:00:20,360 Speaker 1: I'm an executive producer with iHeartRadio, and how the tech 4 00:00:20,400 --> 00:00:23,360 Speaker 1: are you? It's time for the Tech News for Thursday, 5 00:00:23,640 --> 00:00:28,479 Speaker 1: May twenty fifth, twenty twenty three. So, first thing, I 6 00:00:28,520 --> 00:00:31,880 Speaker 1: want to address some updates to a story I covered 7 00:00:31,960 --> 00:00:36,600 Speaker 1: on Tuesday, which was about an incident in Cardiff in 8 00:00:36,680 --> 00:00:41,600 Speaker 1: Wales where a pair of teenage boys died in a 9 00:00:41,840 --> 00:00:45,160 Speaker 1: traffic accident, and word had gotten around that they were 10 00:00:45,800 --> 00:00:48,880 Speaker 1: being pursued by police and that the pursuit 11 00:00:49,400 --> 00:00:52,000 Speaker 1: contributed to, or even caused, the accident, and there 12 00:00:52,000 --> 00:00:55,360 Speaker 1: were riots that followed. Now, initially, reports were that the 13 00:00:55,400 --> 00:00:59,000 Speaker 1: police weren't involved at all, that this was misinformation, 14 00:00:59,240 --> 00:01:03,720 Speaker 1: and that it spread rapidly and ended up creating the 15 00:01:03,760 --> 00:01:09,400 Speaker 1: situation that ultimately escalated into a riot. Since then, 16 00:01:09,400 --> 00:01:12,880 Speaker 1: it has come out that CCTV footage shows that 17 00:01:13,000 --> 00:01:15,640 Speaker 1: there was a police van that was following the boys. 18 00:01:16,200 --> 00:01:20,160 Speaker 1: The police in Cardiff have said that there still wasn't 19 00:01:20,280 --> 00:01:24,160 Speaker 1: a pursuit, but the footage shows that there was a 20 00:01:24,160 --> 00:01:27,800 Speaker 1: police van following behind the boys. The police have given 21 00:01:27,959 --> 00:01:31,160 Speaker 1: a timeline that suggests that the van actually turned off 22 00:01:31,640 --> 00:01:34,760 Speaker 1: from following them before the crash happened. I don't know 23 00:01:34,800 --> 00:01:36,880 Speaker 1: what the truth is at this point, but it is 24 00:01:36,920 --> 00:01:40,319 Speaker 1: important to follow up on it, because obviously the initial 25 00:01:40,800 --> 00:01:45,080 Speaker 1: statement was about this being misinformation that then spread rapidly 26 00:01:45,160 --> 00:01:48,280 Speaker 1: throughout the community, and it may turn out that it's 27 00:01:48,320 --> 00:01:52,080 Speaker 1: not misinformation at all. So we'll have to continue to 28 00:01:52,120 --> 00:01:56,200 Speaker 1: watch the story and see what happens. Obviously, if it 29 00:01:56,320 --> 00:01:59,680 Speaker 1: turns out that the police were misrepresenting what was going on, 30 00:02:00,520 --> 00:02:05,600 Speaker 1: it is going to make a situation where police relationships 31 00:02:05,640 --> 00:02:09,040 Speaker 1: with the community are already on shaky ground much worse. 32 00:02:09,240 --> 00:02:11,240 Speaker 1: So we will keep an eye out to see how 33 00:02:11,320 --> 00:02:17,240 Speaker 1: this story continues. Yesterday, Twitter attempted to serve as the 34 00:02:17,360 --> 00:02:22,440 Speaker 1: platform for GOP presidential hopeful Ron DeSantis as he made 35 00:02:22,480 --> 00:02:26,000 Speaker 1: his announcement that he was officially launching a presidential campaign.
36 00:02:26,480 --> 00:02:31,880 Speaker 1: This had been long anticipated; yesterday was just the formal announcement. Now, 37 00:02:31,880 --> 00:02:35,640 Speaker 1: I say Twitter attempted to serve as a platform because 38 00:02:35,720 --> 00:02:40,360 Speaker 1: the Twitter Space that Elon Musk created specifically for this 39 00:02:40,480 --> 00:02:45,079 Speaker 1: event became a cacophony when audio issues made it impossible 40 00:02:45,120 --> 00:02:48,560 Speaker 1: for anyone to say anything without it being a massive, 41 00:02:48,639 --> 00:02:52,760 Speaker 1: incomprehensible mess. Musk later said the problem was that Twitter's 42 00:02:52,840 --> 00:02:56,840 Speaker 1: servers were overloaded. Several tech news outlets have pointed out 43 00:02:56,919 --> 00:03:01,360 Speaker 1: that this event, while well attended virtually (it wasn't a 44 00:03:01,360 --> 00:03:04,919 Speaker 1: small event), still didn't come close to approaching other 45 00:03:05,720 --> 00:03:10,160 Speaker 1: large online-only events that had, you know, several times 46 00:03:10,160 --> 00:03:15,000 Speaker 1: more people in attendance but fewer technical glitches. Whatever 47 00:03:15,080 --> 00:03:18,880 Speaker 1: the cause of the glitches, the launch did not go smoothly. 48 00:03:18,960 --> 00:03:21,040 Speaker 1: Now I'm not going to comment on the political side 49 00:03:21,040 --> 00:03:24,239 Speaker 1: of this, and you're welcome. Instead, I just want to 50 00:03:24,240 --> 00:03:29,120 Speaker 1: say the problems, the technical problems, really didn't come as 51 00:03:29,160 --> 00:03:32,720 Speaker 1: much of a surprise to at least the grouchy people 52 00:03:32,800 --> 00:03:36,360 Speaker 1: like me out in the tech space, because Musk effectively 53 00:03:36,440 --> 00:03:40,960 Speaker 1: gutted Twitter over the course of his ownership of the platform, 54 00:03:41,560 --> 00:03:45,480 Speaker 1: and since then, essentially a skeleton crew has had to 55 00:03:45,520 --> 00:03:51,480 Speaker 1: scramble to meet whatever sometimes apparently arbitrary goals Musk comes 56 00:03:51,560 --> 00:03:53,120 Speaker 1: up with, or at least that's how it looks from 57 00:03:53,160 --> 00:03:56,360 Speaker 1: the outside. And I fully admit I am looking at 58 00:03:56,360 --> 00:03:59,360 Speaker 1: this from the outside. I could be one hundred percent 59 00:03:59,720 --> 00:04:04,480 Speaker 1: off base with these observations and assumptions. So I don't 60 00:04:04,520 --> 00:04:08,839 Speaker 1: want to suggest that my view is the sum 61 00:04:08,960 --> 00:04:12,960 Speaker 1: total of the truth. Just from the outside, it looks 62 00:04:13,040 --> 00:04:16,880 Speaker 1: like Musk keeps wanting Twitter to tackle these huge projects, 63 00:04:17,240 --> 00:04:21,119 Speaker 1: but with such a reduced workforce that it's kind 64 00:04:21,160 --> 00:04:24,160 Speaker 1: of setting the platform up to fail, which is unfortunate, 65 00:04:24,200 --> 00:04:26,400 Speaker 1: because I'm sure the people who are actually working on it, 66 00:04:26,960 --> 00:04:28,880 Speaker 1: you know, they don't want things to fail, and they're 67 00:04:28,920 --> 00:04:33,200 Speaker 1: working hard, but they're doing so with a lack of 68 00:04:33,240 --> 00:04:37,400 Speaker 1: assets and resources. All right, now it's time to talk 69 00:04:37,440 --> 00:04:42,680 Speaker 1: about AI again, and to talk about Sam Altman, CEO 70 00:04:42,800 --> 00:04:47,960 Speaker 1: of OpenAI, again.
Now you might remember that very 71 00:04:48,080 --> 00:04:53,040 Speaker 1: recently Altman appeared before the US Congress and said that AI 72 00:04:53,160 --> 00:04:57,120 Speaker 1: is a field where regulation is needed. And at the time, 73 00:04:57,160 --> 00:04:59,960 Speaker 1: I was kind of hopeful that this meant Altman was 74 00:05:00,279 --> 00:05:03,480 Speaker 1: really sincere in that belief and that he was going 75 00:05:03,560 --> 00:05:07,520 Speaker 1: to take an active role to really draft useful regulations. 76 00:05:08,400 --> 00:05:10,599 Speaker 1: But then you could also say, well, yeah, but Sam 77 00:05:10,640 --> 00:05:14,640 Speaker 1: Bankman-Fried said very similar things about cryptocurrency and regulations, 78 00:05:14,680 --> 00:05:17,760 Speaker 1: and look where he's at right now, so 79 00:05:18,360 --> 00:05:22,560 Speaker 1: maybe you shouldn't take these statements at face value. 80 00:05:23,000 --> 00:05:27,599 Speaker 1: Now, over in the EU, Altman's tune is slightly different 81 00:05:27,800 --> 00:05:30,560 Speaker 1: than it was in the US. I would say 82 00:05:30,560 --> 00:05:34,880 Speaker 1: that the EU has people who are far more skeptical 83 00:05:35,120 --> 00:05:40,159 Speaker 1: and concerned about AI than what you typically see here 84 00:05:40,160 --> 00:05:41,919 Speaker 1: in the United States. Not to say that people in 85 00:05:41,960 --> 00:05:45,320 Speaker 1: the US are totally cool with AI and have 86 00:05:45,400 --> 00:05:48,240 Speaker 1: no worries, but in the EU, I think it's 87 00:05:48,320 --> 00:05:52,479 Speaker 1: more prominent. And the EU has been advancing 88 00:05:52,760 --> 00:05:57,240 Speaker 1: a law called the AI Act, which categorizes artificial intelligence 89 00:05:57,279 --> 00:06:01,120 Speaker 1: into different buckets according to perceived risks. At 90 00:06:01,160 --> 00:06:05,040 Speaker 1: the very, very top of this are unacceptable risks, so 91 00:06:05,200 --> 00:06:09,640 Speaker 1: these would be AI applications that would potentially violate citizens' rights. 92 00:06:09,720 --> 00:06:13,080 Speaker 1: These would be applications that the EU would just 93 00:06:13,160 --> 00:06:17,000 Speaker 1: outlaw, period: you cannot use AI to do 94 00:06:17,040 --> 00:06:19,400 Speaker 1: these sorts of things. So this would be something like 95 00:06:20,560 --> 00:06:24,960 Speaker 1: China's social scoring system, for example. That's where each person 96 00:06:24,960 --> 00:06:30,679 Speaker 1: receives a score that really relates to how useful 97 00:06:30,760 --> 00:06:34,520 Speaker 1: and loyal they are according to the state. That would 98 00:06:34,520 --> 00:06:37,680 Speaker 1: be right out as a use of AI. Below this 99 00:06:38,120 --> 00:06:42,080 Speaker 1: is a category called high-risk AI systems. These 100 00:06:42,080 --> 00:06:46,279 Speaker 1: could potentially be useful, so they could have a social 101 00:06:46,440 --> 00:06:51,680 Speaker 1: benefit to them, but they're also potentially harmful. So because 102 00:06:51,720 --> 00:06:54,960 Speaker 1: of that, anything that falls into this category would need 103 00:06:55,000 --> 00:06:58,279 Speaker 1: to follow a strict set of rules and regulations in 104 00:06:58,400 --> 00:07:01,360 Speaker 1: order to be legal in the EU.
So, in other words, 105 00:07:01,520 --> 00:07:05,400 Speaker 1: a high-risk system would be allowable under EU law, 106 00:07:05,520 --> 00:07:09,080 Speaker 1: provided that the companies that were making and using those 107 00:07:09,120 --> 00:07:13,480 Speaker 1: systems abided by the rules and remained transparent and such. 108 00:07:14,560 --> 00:07:18,320 Speaker 1: Altman says that ChatGPT and the GPT large language 109 00:07:18,320 --> 00:07:22,720 Speaker 1: model would fall into this high-risk category as the 110 00:07:22,760 --> 00:07:26,240 Speaker 1: EU has defined it, and he objects to that. He 111 00:07:26,280 --> 00:07:29,960 Speaker 1: says that shouldn't be the case. And he also seems 112 00:07:30,000 --> 00:07:33,800 Speaker 1: to think that the rules and regulations are too restrictive 113 00:07:34,200 --> 00:07:38,200 Speaker 1: and that they are going to harm small companies that 114 00:07:38,280 --> 00:07:41,679 Speaker 1: wish to integrate AI. Now, keep in mind those small 115 00:07:41,720 --> 00:07:45,040 Speaker 1: companies integrating AI: I think Altman's looking at those small 116 00:07:45,080 --> 00:07:48,360 Speaker 1: companies as customers, right? These are companies that would be 117 00:07:48,640 --> 00:07:55,360 Speaker 1: essentially licensing OpenAI's platforms for their work. So, you know, 118 00:07:55,440 --> 00:07:57,920 Speaker 1: he has a vested interest in this. So I guess 119 00:07:57,920 --> 00:08:01,520 Speaker 1: what Altman appears to be saying, and this is my interpretation, 120 00:08:02,280 --> 00:08:05,920 Speaker 1: is that he's all for regulations if he has a 121 00:08:05,960 --> 00:08:08,560 Speaker 1: heavy hand in making them, so that, you know, they 122 00:08:08,560 --> 00:08:12,680 Speaker 1: don't actually impede his business. But if a country or 123 00:08:12,680 --> 00:08:16,880 Speaker 1: the European Union creates rules independently, he's ready to take 124 00:08:16,880 --> 00:08:21,520 Speaker 1: his toys and go home. Or really, to quote him, 125 00:08:21,640 --> 00:08:25,840 Speaker 1: he said, quote, if we can comply, we will, and 126 00:08:25,920 --> 00:08:29,600 Speaker 1: if we can't, we'll cease operating. We will try, but 127 00:08:29,640 --> 00:08:34,479 Speaker 1: there are technical limits to what's possible. End quote. Meanwhile, 128 00:08:34,520 --> 00:08:39,640 Speaker 1: Brad Smith, Microsoft's president, called on US lawmakers to form 129 00:08:39,720 --> 00:08:43,080 Speaker 1: rules that would limit or prevent integrating AI into critical 130 00:08:43,080 --> 00:08:48,520 Speaker 1: systems, like, say, the US power grid or various water infrastructure, 131 00:08:48,559 --> 00:08:51,400 Speaker 1: that kind of thing. He also called for AI companies 132 00:08:51,400 --> 00:08:55,079 Speaker 1: to be held accountable if and when their tools cause problems. 133 00:08:55,480 --> 00:08:59,400 Speaker 1: And considering how Microsoft has really invested heavily in Open 134 00:08:59,440 --> 00:09:04,520 Speaker 1: AI and integrated GPT into its Microsoft Bing product, this 135 00:09:04,559 --> 00:09:08,160 Speaker 1: is a pretty interesting take. But Smith said, quote, this 136 00:09:08,360 --> 00:09:12,120 Speaker 1: is the fundamental need to ensure that machines remain subject 137 00:09:12,240 --> 00:09:16,319 Speaker 1: to effective oversight by people, and the people who design 138 00:09:16,400 --> 00:09:22,000 Speaker 1: and operate machines remain accountable to everyone else. End quote.
139 00:09:22,720 --> 00:09:25,560 Speaker 1: So here's the thing: I agree with that. I think 140 00:09:25,679 --> 00:09:28,400 Speaker 1: that is a reasonable thing to call for. I think 141 00:09:28,440 --> 00:09:31,560 Speaker 1: the world in general needs to come up with rules 142 00:09:31,679 --> 00:09:36,040 Speaker 1: for the design and integration of AI, and limitations to 143 00:09:36,120 --> 00:09:39,720 Speaker 1: that, right? Like where AI should and shouldn't be used, 144 00:09:40,440 --> 00:09:42,679 Speaker 1: and how it should and shouldn't be used. And I 145 00:09:42,720 --> 00:09:45,040 Speaker 1: think those rules need to require companies to be as 146 00:09:45,080 --> 00:09:49,120 Speaker 1: transparent as possible, which is just getting more and more complicated as 147 00:09:49,160 --> 00:09:52,760 Speaker 1: these AI models get more and more complex. And also, 148 00:09:52,800 --> 00:09:56,800 Speaker 1: the rules need to make sense; they need to be effective. 149 00:09:57,120 --> 00:10:00,000 Speaker 1: They need to prevent companies from just having the protection 150 00:10:00,480 --> 00:10:04,560 Speaker 1: of rules being in existence as they continue to develop AI. 151 00:10:04,800 --> 00:10:07,360 Speaker 1: So by that I mean this: Okay, so if there's 152 00:10:07,480 --> 00:10:11,520 Speaker 1: no rule, like there's an absence of rules, companies can 153 00:10:11,559 --> 00:10:14,760 Speaker 1: actually be a little nervous as they operate. Right? You 154 00:10:14,840 --> 00:10:17,320 Speaker 1: have no rules, you have no oversight. But that means 155 00:10:17,360 --> 00:10:20,319 Speaker 1: that if you do something really bad, there's going to 156 00:10:20,360 --> 00:10:25,760 Speaker 1: be a big ruckus, and perhaps an overreaction to the ruckus, 157 00:10:25,800 --> 00:10:28,959 Speaker 1: which means that you end up harming yourself more than 158 00:10:29,000 --> 00:10:31,040 Speaker 1: you would have if you had just made a set of 159 00:10:31,120 --> 00:10:35,480 Speaker 1: rules and abided by them. Now, when you make rules, well, 160 00:10:35,520 --> 00:10:39,280 Speaker 1: that means that you do have these rules, these restrictions, 161 00:10:39,280 --> 00:10:43,000 Speaker 1: but usually not everything's covered, right? There are often gaps 162 00:10:43,120 --> 00:10:47,240 Speaker 1: or loopholes. So if you do something unanticipated, something 163 00:10:47,240 --> 00:10:50,640 Speaker 1: that isn't covered by the rules, your defense is, well, 164 00:10:50,720 --> 00:10:52,720 Speaker 1: there's nothing in the rules that says, you know, I 165 00:10:52,800 --> 00:10:56,079 Speaker 1: can't make AI that automatically denies credit to people who 166 00:10:56,120 --> 00:10:59,000 Speaker 1: come from such and such a place, because historically we 167 00:10:59,120 --> 00:11:04,280 Speaker 1: know all those folks default on loans, or whatever. Rules 168 00:11:04,280 --> 00:11:09,240 Speaker 1: that are intended to protect the public sometimes have an 169 00:11:09,280 --> 00:11:13,200 Speaker 1: odd way of protecting the perpetrators of bad deeds. That's 170 00:11:13,240 --> 00:11:15,600 Speaker 1: what I'm saying. Like, if there were no rules, then 171 00:11:15,600 --> 00:11:18,200 Speaker 1: you might have a much larger reaction. If there are 172 00:11:18,360 --> 00:11:21,920 Speaker 1: rules and whatever you did wasn't covered by them, then 173 00:11:21,920 --> 00:11:24,520 Speaker 1: you can say, hey, I was following the rules.
Yeah, 174 00:11:24,559 --> 00:11:26,880 Speaker 1: this bad thing happened, but I wasn't breaking the law. 175 00:11:27,360 --> 00:11:29,800 Speaker 1: So creating rules does need to be done, but it 176 00:11:29,800 --> 00:11:32,600 Speaker 1: needs to be done with care and critical thinking. And 177 00:11:32,679 --> 00:11:35,200 Speaker 1: it also has to be an ongoing process. It's not 178 00:11:35,240 --> 00:11:38,640 Speaker 1: something you do once and then you walk away. According 179 00:11:38,720 --> 00:11:42,040 Speaker 1: to a research firm called Watchful Technologies, TikTok has been 180 00:11:42,080 --> 00:11:46,200 Speaker 1: testing an AI chatbot on Apple mobile devices in the 181 00:11:46,280 --> 00:11:50,120 Speaker 1: TikTok app. The chatbot is called Taco, and I don't 182 00:11:50,120 --> 00:11:52,840 Speaker 1: know if it only works on Tuesdays. Oh, hang on, 183 00:11:52,880 --> 00:11:57,880 Speaker 1: it's actually spelled Tako. The chatbot is meant to help 184 00:11:57,920 --> 00:12:02,440 Speaker 1: with discovery, so users apparently activate this chatbot and converse 185 00:12:02,480 --> 00:12:05,280 Speaker 1: with it to help find stuff they like or to 186 00:12:05,360 --> 00:12:07,520 Speaker 1: answer questions they have. They might have a question like, 187 00:12:07,960 --> 00:12:12,240 Speaker 1: what does it mean if my toilet won't flush, or whatever, 188 00:12:12,280 --> 00:12:16,880 Speaker 1: and then the AI agent will find videos that somehow 189 00:12:16,920 --> 00:12:19,600 Speaker 1: relate to that kind of thing. Now, according to the researchers, 190 00:12:19,640 --> 00:12:22,920 Speaker 1: Tako's purpose seems mostly just to keep people on TikTok 191 00:12:23,080 --> 00:12:25,880 Speaker 1: longer and keep them watching videos. So it's not like 192 00:12:26,640 --> 00:12:29,439 Speaker 1: Tako's posing as a best friend or something like that, 193 00:12:29,520 --> 00:12:33,880 Speaker 1: but rather augmenting the recommendation algorithm to find stuff that's 194 00:12:33,920 --> 00:12:38,360 Speaker 1: going to maximize users' time on the app. Okay, we're 195 00:12:38,360 --> 00:12:39,920 Speaker 1: going to take a quick break. When we come back, 196 00:12:39,960 --> 00:12:51,240 Speaker 1: we've got some more tech news to cover. We're back. 197 00:12:51,559 --> 00:12:55,960 Speaker 1: So yesterday, on Wednesday, Meta held another round of layoffs, 198 00:12:56,040 --> 00:13:00,240 Speaker 1: hitting somewhere in the neighborhood of six thousand people this time. 199 00:13:00,280 --> 00:13:04,000 Speaker 1: The jobs affected were mostly on the business side of 200 00:13:04,040 --> 00:13:09,320 Speaker 1: Meta's operations, as opposed to, you know, the tech side. Reportedly, 201 00:13:09,880 --> 00:13:14,319 Speaker 1: morale is in pretty bad shape at Meta. There were 202 00:13:14,320 --> 00:13:16,439 Speaker 1: a couple of articles I saw this morning that were 203 00:13:16,440 --> 00:13:20,840 Speaker 1: saying things like Meta employees are trying to avoid being 204 00:13:21,040 --> 00:13:24,360 Speaker 1: included in a future round of layoffs by essentially creating 205 00:13:24,440 --> 00:13:28,400 Speaker 1: work, like they're manufacturing work for themselves to do. It's 206 00:13:28,480 --> 00:13:32,040 Speaker 1: kind of like that "the boss is coming, look busy" mentality.
207 00:13:32,600 --> 00:13:36,480 Speaker 1: Others at Meta appear to have no motivation to work 208 00:13:36,480 --> 00:13:38,440 Speaker 1: at all, because, you know, when you don't know whether 209 00:13:38,520 --> 00:13:40,560 Speaker 1: or not you're going to have a job the next day, 210 00:13:41,320 --> 00:13:44,000 Speaker 1: it can really do a number on you. I have 211 00:13:44,160 --> 00:13:47,640 Speaker 1: been there, and it is tough. Now, according to TechCrunch, 212 00:13:48,080 --> 00:13:52,840 Speaker 1: this most recent round of layoffs should theoretically be the 213 00:13:52,920 --> 00:13:58,200 Speaker 1: last major layoffs for the foreseeable future. Meta had indicated 214 00:13:58,240 --> 00:14:01,320 Speaker 1: that it was aiming to eliminate ten thousand jobs this 215 00:14:01,400 --> 00:14:05,320 Speaker 1: spring, total, across layoffs, and this one was the second 216 00:14:05,440 --> 00:14:08,200 Speaker 1: round of layoffs, so they have definitely hit that ten 217 00:14:08,240 --> 00:14:11,920 Speaker 1: thousand mark. And since late last year, Meta has handed 218 00:14:12,040 --> 00:14:15,679 Speaker 1: around twenty-one thousand staff their walking papers and has 219 00:14:15,720 --> 00:14:21,000 Speaker 1: also put a hiring freeze on thousands of open positions. 220 00:14:21,040 --> 00:14:24,640 Speaker 1: Mark Zuckerberg, Meta's CEO, has called twenty twenty three the 221 00:14:24,720 --> 00:14:29,040 Speaker 1: Year of Efficiency, indicating that Meta had a bloated workforce 222 00:14:29,480 --> 00:14:32,880 Speaker 1: that wasn't really representative of the actual amount of work 223 00:14:32,960 --> 00:14:35,280 Speaker 1: that needed to be done. In other words, we've got 224 00:14:35,320 --> 00:14:38,000 Speaker 1: more people than we have work for them to do. Also, 225 00:14:38,040 --> 00:14:41,640 Speaker 1: the company continues to face some pretty hefty costs, which 226 00:14:41,680 --> 00:14:44,680 Speaker 1: might be motivating some of these cutbacks. That includes a 227 00:14:44,800 --> 00:14:48,960 Speaker 1: more than one billion dollar fine that came down from 228 00:14:49,040 --> 00:14:52,000 Speaker 1: the EU earlier this week. To learn about that, just 229 00:14:52,040 --> 00:14:57,480 Speaker 1: listen to Tuesday's episode of Tech Stuff. Apple has announced 230 00:14:57,480 --> 00:15:02,560 Speaker 1: a truly ginormous deal with the company Broadcom that will 231 00:15:02,560 --> 00:15:05,760 Speaker 1: see these two companies making 5G components in the 232 00:15:05,880 --> 00:15:09,720 Speaker 1: United States. So, according to an article in Quartz, Apple 233 00:15:09,760 --> 00:15:13,080 Speaker 1: will invest somewhere in the neighborhood of four hundred thirty 234 00:15:13,360 --> 00:15:18,840 Speaker 1: billion dollars to boost US manufacturing, largely in the 5G 235 00:15:18,920 --> 00:15:21,800 Speaker 1: realm but also the connectivity space in general. So over 236 00:15:21,800 --> 00:15:24,400 Speaker 1: the last couple of years, Apple has been looking for 237 00:15:24,440 --> 00:15:29,360 Speaker 1: ways to decrease its reliance on Chinese manufacturing for various components.
238 00:15:29,800 --> 00:15:31,680 Speaker 1: There are a lot of different reasons to pull out 239 00:15:31,680 --> 00:15:35,360 Speaker 1: of China, ranging from optics, because it doesn't always look 240 00:15:35,400 --> 00:15:37,600 Speaker 1: good to be doing business with a country that has 241 00:15:37,640 --> 00:15:41,840 Speaker 1: a pretty awful human rights record, all the way to 242 00:15:42,480 --> 00:15:45,200 Speaker 1: practical things like supply chain issues, if you want to 243 00:15:45,200 --> 00:15:49,080 Speaker 1: be super cynical. But it's not really easy to just 244 00:15:49,600 --> 00:15:54,440 Speaker 1: extract yourself from China, largely because companies like Apple depend upon 245 00:15:54,520 --> 00:15:58,680 Speaker 1: the much, much lower costs of labor in 246 00:15:58,760 --> 00:16:02,960 Speaker 1: China to keep production costs down. Recently, a manufacturing facility 247 00:16:02,960 --> 00:16:06,800 Speaker 1: in India actually announced it would no longer manufacture Apple 248 00:16:06,840 --> 00:16:13,440 Speaker 1: components, because Apple's demands regarding costs of production meant that 249 00:16:13,680 --> 00:16:17,440 Speaker 1: this Indian company's profit margins were nearly nonexistent. 250 00:16:18,000 --> 00:16:21,480 Speaker 1: I mean, potentially the company would end up losing money 251 00:16:21,800 --> 00:16:24,120 Speaker 1: to make stuff for Apple, because Apple was saying, we're 252 00:16:24,120 --> 00:16:26,560 Speaker 1: not going to pay you more than X amount, and 253 00:16:26,720 --> 00:16:31,480 Speaker 1: meanwhile it costs Y amount to make the stuff. So 254 00:16:32,080 --> 00:16:35,520 Speaker 1: there are companies in other places where Apple has previously 255 00:16:35,560 --> 00:16:39,440 Speaker 1: tried to move to avoid working in China that have 256 00:16:39,520 --> 00:16:42,800 Speaker 1: already kind of balked because of this issue. You know, 257 00:16:42,840 --> 00:16:45,880 Speaker 1: unless Apple is willing to pay the 258 00:16:45,960 --> 00:16:51,560 Speaker 1: amount that is acceptable within that country for labor, 259 00:16:51,600 --> 00:16:53,440 Speaker 1: it's just not going to get done. So a move 260 00:16:53,480 --> 00:16:56,360 Speaker 1: out of China is likely going to mean increased prices 261 00:16:56,440 --> 00:17:01,120 Speaker 1: on items in the long run. Global economics are super complicated. 262 00:17:01,680 --> 00:17:04,680 Speaker 1: The attorneys general for several states here in the United 263 00:17:04,680 --> 00:17:08,399 Speaker 1: States have banded together to level a massive lawsuit against 264 00:17:08,400 --> 00:17:12,640 Speaker 1: a telecommunications company called Avid Telecom, and, according to the lawsuit, 265 00:17:12,720 --> 00:17:18,240 Speaker 1: this one telecommunications company facilitated billions of robocalls to people 266 00:17:18,240 --> 00:17:21,080 Speaker 1: who had previously signed up on the National Do Not 267 00:17:21,200 --> 00:17:24,480 Speaker 1: Call Registry. So that registry is supposed to protect the 268 00:17:24,520 --> 00:17:27,480 Speaker 1: people who sign up to it from receiving nuisance and 269 00:17:27,640 --> 00:17:31,720 Speaker 1: unwanted calls, primarily telemarketing calls, but also things like scams 270 00:17:31,720 --> 00:17:34,919 Speaker 1: and stuff. Citizens can actually designate the types of calls 271 00:17:34,920 --> 00:17:37,520 Speaker 1: that they are willing to receive.
So if you want, 272 00:17:37,560 --> 00:17:39,159 Speaker 1: and you're in the US, you 273 00:17:39,160 --> 00:17:41,879 Speaker 1: can go to the Do Not Call Registry, sign up 274 00:17:41,880 --> 00:17:46,240 Speaker 1: for free, and even indicate which calls you don't mind getting. Anyway, 275 00:17:47,119 --> 00:17:49,280 Speaker 1: the problem is that, at least according to this lawsuit, 276 00:17:49,800 --> 00:17:53,639 Speaker 1: Avid Telecom allowed telemarketing calls and scams and such to 277 00:17:53,680 --> 00:17:56,560 Speaker 1: go through when they absolutely should not have been able to. 278 00:17:57,160 --> 00:18:00,320 Speaker 1: The lawyer for Avid Telecom denies the charges, saying the 279 00:18:00,359 --> 00:18:03,480 Speaker 1: company acted in accordance with the law, and expressed disappointment 280 00:18:03,480 --> 00:18:06,040 Speaker 1: that all these attorneys general didn't just sit down for 281 00:18:06,119 --> 00:18:10,119 Speaker 1: a civilized discussion before bringing a nasty lawsuit. We'll have 282 00:18:10,160 --> 00:18:14,400 Speaker 1: to see where this goes from here. Sony held a 283 00:18:14,440 --> 00:18:19,200 Speaker 1: PlayStation event this week, a PlayStation Showcase event, and unveiled 284 00:18:19,240 --> 00:18:22,879 Speaker 1: a new handheld gaming device that will be capable of 285 00:18:22,880 --> 00:18:28,040 Speaker 1: playing any non-VR game streamed from a nearby PlayStation 5. 286 00:18:28,119 --> 00:18:31,680 Speaker 1: So essentially, you're running the game on your PlayStation 5; 287 00:18:31,720 --> 00:18:34,800 Speaker 1: you're just streaming it to this handheld device that's within 288 00:18:35,400 --> 00:18:38,960 Speaker 1: a certain range of that PS5. It doesn't even 289 00:18:39,000 --> 00:18:41,399 Speaker 1: have an official name yet, but the internal name for 290 00:18:41,440 --> 00:18:44,440 Speaker 1: the handheld is Project Q. So Project Q will be 291 00:18:44,480 --> 00:18:47,439 Speaker 1: dependent upon a console. It will not be able to 292 00:18:47,560 --> 00:18:51,159 Speaker 1: play games natively. You can't just take it on the 293 00:18:51,200 --> 00:18:53,960 Speaker 1: go like you would with a Nintendo Switch. It kind 294 00:18:54,000 --> 00:18:57,480 Speaker 1: of looks like someone took a modern PlayStation controller, cut 295 00:18:57,520 --> 00:19:00,439 Speaker 1: it in half, and then shoved an eight-inch screen 296 00:19:00,600 --> 00:19:03,679 Speaker 1: in between the two halves. I'm not crazy about this design, 297 00:19:03,680 --> 00:19:06,000 Speaker 1: but I'm also in the minority of folks who don't 298 00:19:06,040 --> 00:19:09,920 Speaker 1: like PlayStation controllers in general. I know, I'm bonkers. Anyway, 299 00:19:10,119 --> 00:19:12,159 Speaker 1: that's about all we know so far about Project Q. 300 00:19:12,520 --> 00:19:15,520 Speaker 1: Sony didn't have information on how much folks should expect 301 00:19:15,520 --> 00:19:18,359 Speaker 1: this to cost or when it will come out. If 302 00:19:18,400 --> 00:19:20,439 Speaker 1: it's on the more expensive side, that's kind of a 303 00:19:20,440 --> 00:19:22,760 Speaker 1: deal breaker in my book. But then, I also don't 304 00:19:22,800 --> 00:19:25,479 Speaker 1: own a PS5 yet. I might actually change that 305 00:19:25,520 --> 00:19:28,240 Speaker 1: next month,
because I'm thinking about buying myself one as 306 00:19:28,280 --> 00:19:32,960 Speaker 1: a birthday present, but I'm pretty sure I'll skip Project Q. Finally, 307 00:19:33,240 --> 00:19:37,840 Speaker 1: South Korea successfully launched a rocket carrying eight satellites earlier today. 308 00:19:37,920 --> 00:19:42,440 Speaker 1: The rocket, called Nuri, launched in the afternoon in South Korea, 309 00:19:42,520 --> 00:19:45,720 Speaker 1: and according to South Korea's Ministry of Science, it achieved orbit. 310 00:19:46,080 --> 00:19:49,199 Speaker 1: The ministry also reported that the primary satellite on board, 311 00:19:49,480 --> 00:19:53,120 Speaker 1: NEXTSat-2, has already established communications with Korea's 312 00:19:53,240 --> 00:19:57,399 Speaker 1: station in Antarctica. And as for the other satellites, at 313 00:19:57,400 --> 00:19:59,520 Speaker 1: the time of this recording, there were actually questions about whether 314 00:19:59,520 --> 00:20:02,960 Speaker 1: one of the microsatellites failed to deploy properly from the rocket. 315 00:20:03,000 --> 00:20:05,479 Speaker 1: I don't have more information on that just yet, but 316 00:20:05,560 --> 00:20:07,879 Speaker 1: South Korea is now the seventh country to achieve a 317 00:20:07,960 --> 00:20:11,000 Speaker 1: launch carrying more than a ton of payload into orbit. 318 00:20:11,480 --> 00:20:14,080 Speaker 1: And that's it for the Tech News. I hope you 319 00:20:14,119 --> 00:20:17,720 Speaker 1: are all well, and I'll talk to you again really soon. 320 00:20:23,880 --> 00:20:28,520 Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, 321 00:20:28,880 --> 00:20:32,560 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you listen 322 00:20:32,600 --> 00:20:37,399 Speaker 1: to your favorite shows.