Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? So before I talk about the news for Tuesday, March seven, twenty twenty three, I want to give y'all a heads up that tomorrow a special guest host takes over the show: Bridget Todd, also known as the host of There Are No Girls on the Internet, among many other things. And I'm really excited to have y'all hear this special episode. I don't want to spoil anything, but I'm definitely looking forward to everyone getting a chance to hear it. So make sure you tune in tomorrow to get that special episode, and I'll be back on the following Thursday. But now let's get to the news for today.

Speaker 1: First up, for a change, I really just have one story explicitly about AI today, and it's about Microsoft's incorporation of AI. This one was developed by OpenAI, and it has to do with Microsoft's products for enterprises, for businesses. All right, so first off, Microsoft had previously invested a billion dollars in OpenAI, and more recently the two organizations have entered into a type of partnership. Though details are scarce as to the extent of that partnership, some folks estimate that the agreement has Microsoft investing ten billion dollars with OpenAI over the next several years. Anyway, this week Microsoft has unveiled how it is integrating AI into suites like Power Platform, which is a platform with tools for developers, that kind of stuff, as well as Dynamics three sixty five, which is used for enterprise resource planning. So in other words, these aren't consumer-facing products. You know, if you don't happen to work in these fields, you'll never really encounter them. They are meant to be tools that businesses can use to optimize processes and stuff. So the idea is that the AI will assist companies and employees to do jobs, particularly those that are highly repetitive.
Speaker 1: In other words, Microsoft's pitch is that this AI augments what you're already doing, and thus the name they picked, Copilot, really makes sense. These tools are not intended to replace decision makers. Instead, it's supposed to make their jobs easier and less repetitive. And side note, my own personal opinion: I think Tesla could probably take notes on how Microsoft's naming approach went this way, because I think Copilot is a much more responsible naming convention than, say, full self driving, when the actual technology is not full self driving. Anyway, that being said, I haven't personally had any experience using Copilot. It's really for stuff that's outside my job duties, so I can't actually comment on the quality of the tool's performance, but it's supposed to do things like help marketing departments identify new customers. For example, let's say you've got a product or a service and you're doing pretty well, but you're leaving money on the table because you haven't identified people who would be really good customers. You never reached out to them because you weren't aware of them or whatever. This tool is meant to help people identify those potential customers and then craft communications that are likely to resonate with them, that sort of thing. Further, when building, say, a message to potential customers, Copilot will clearly label all the pieces of that communication that had been crafted by AI, which will give real human beings in whatever organization we're talking about a chance to review these communications to make sure that everything's accurate and appropriate, and, you know, it doesn't end up just being a message that says that, you know, robots should kill all humans or something.

Speaker 1: Last week, I talked about how there's a bill in the US House of Representatives that could potentially call for the ban of TikTok. Well, the Senate, which is the other chamber of the US Congress, plans to unveil a similar bill today. In both cases, the aim really does go beyond TikTok.
Speaker 1: I mean, obviously the app is a big reason for these proposals in both cases, but in neither case is it exclusively the target. So the Senate version is said to give the President of the US the authority to restrict technology from foreign companies if it's reasonable to suspect that those companies might be using the tech to compromise the US's national security. And y'all, this does get super tricky. I talked last week about how the ACLU claims that banning TikTok would amount to violating the First Amendment, which is the right to free speech, and I honestly don't know about that one way or the other. There's no doubt that as technology gets more advanced, and starts to lean heavier on stuff like cloud computing, and has the potential to serve as a data-gathering system, we should be concerned. In fact, we should already be concerned about this, because it's already happening all around us, whether we're talking about companies that are foreign or domestic. I mean, that is happening. That's what's going on with Meta, it's what's going on with Google. Pretty much any big tech company is doing this. But here, obviously, the concern is tech companies that are ultimately located outside the United States. From a national security perspective, I can understand that there's a fear that foreign companies could potentially pose a threat. We've already seen similar movements in the US against foreign companies. So, for example, the US banned telecom companies from using tech from the Chinese company Huawei out of concern that that could be used as a massive surveillance system. So Huawei makes all sorts of networking equipment, among other things, and the fear was that if telecom companies were integrating Huawei systems within their infrastructure, it could turn the telecommunications industry in the United States into a giant espionage service for the Chinese government. That was the fear. And you know, we in the United States would much rather the NSA do all the spying on communications within our country.
Speaker 1: Now, keep it American. Anyway, I haven't seen the proposed bill yet because it wasn't released while I was writing this. By the time you hear it, it may have already been released. I will say that according to reports, it does have bipartisan support, and we've seen other parts of the world, like the EU, take a stronger stance against letting information flow out of a region to foreign countries without restriction, right? The EU guards citizen data very carefully and is quick to jump on companies that don't follow the regulations within the EU, and I have long said that the United States kind of fell behind on this. So in some ways, I think this is correcting a problem that has been around for ages. And of course, if you look at China, it actually takes this whole concept to an extreme, right? It doesn't just limit what goes out, it limits what comes in. You know, it limits what citizens are able to access and see. You can't even access lots of different Western-based tools within China. So there's clearly a spectrum here between letting everything flow in and flow out, everyone access everything and let everything access us, and then the extreme of China, where you take really authoritative control over what can be seen and accessed. And I don't think either extreme is good. I think you need to find someplace in the middle where things work out properly. And I honestly don't know where this Senate bill is going to fall along that spectrum. I haven't had a chance to read it yet, but I'm sure I'll talk about this later on this week once we have more information, and I definitely want to see moves that will protect citizens without escalating into an issue where the United States just becomes even more isolationist and insular than we already are.
Speaker 1: Okay, do you remember Cambridge Analytica and that scandal? So essentially a political marketing company was using data that had been collected by an app developer, a Facebook app developer who had crafted a survey app for Facebook. Now, back in the days when this app was created, Facebook's API was not particularly careful about the kinds of information an app could gather, so it was possible to design an app to scoop up all sorts of data even if that information had nothing to do with whatever the app was doing. In the Cambridge Analytica case, this included getting a look at folks who hadn't even used the app themselves. And here's how it would work. Person A decides that they want to install this app and take the survey, partly because it's a paid survey, so they have an incentive, a monetary incentive, to participate. So they install the app on Facebook and they take the survey. Now, persons B through Z, who happen to be friends with Person A, they don't download the app. They don't install the app and take the survey. However, the app gets access to Person A's view as if it were Person A. So in other words, it can actually look at all the profiles of all the friends that Person A has as if it were Person A. So it can collect all this information on persons B through Z, even though none of them consented to have their data shared. They didn't have a say in this.

Speaker 1: Well, now Australia, nearly a decade after all this happened, is hearing a case about this. The High Court is going to hear a case that claims that as many as three hundred thousand people in Australia had their data harvested thanks to just fifty three people using the quiz. Think about that: fifty three people used this app, and in the process, around three hundred thousand people had their information exposed and gathered by the app developer. And further, the case claims that Facebook should be held accountable for allowing this data harvest to happen.
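To make that scope problem concrete, here's a minimal, purely hypothetical sketch in Python. None of these names (the api object, get_profile, get_friend_ids) are the real Facebook Graph API; they're just meant to illustrate how one consenting user's access was enough to read friends' profiles too.

```python
# Purely hypothetical sketch of the over-broad permission model described
# above -- these names are invented, not the real Facebook Graph API.

def harvest_via_one_user(api, token_for_person_a):
    """Collect profile data for Person A and for every friend of Person A,
    even though only Person A installed the app and consented."""
    harvested = []

    # Person A installed the survey app, so the app holds their token.
    harvested.append(api.get_profile(token_for_person_a, user_id="me"))

    # Under the old, permissive scope, that same token also exposed the
    # profiles of persons B through Z, who never installed anything.
    for friend_id in api.get_friend_ids(token_for_person_a):
        harvested.append(api.get_profile(token_for_person_a, user_id=friend_id))

    return harvested
```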
Speaker 1: Now, since this incident nearly a decade ago, Facebook has subsequently beefed up its restrictions. Further, the company is challenging the validity of the case brought against it, because at the time of the incident, Facebook says, it didn't have any commercial business set up in Australia, no staff, no offices, no real means of generating revenue within the country, nothing, and that as such, the law that concerns this whole case, the law that Facebook has been alleged to have broken, would not apply, because it has a prerequisite that the company that broke the law carried out business within Australia. So Facebook is saying, we didn't carry out business in Australia, we didn't have any offices here, we offered our platform here, but we weren't generating revenue. There was no business being done. Now, there have been lower courts that have disagreed with that conclusion, but Facebook's pushing it as part of their defense. Now, if it loses the case, the company faces a two point two million dollar fine, which, for Facebook, is not that much. So even if the High Court ultimately rejects Facebook's arguments and finds the company guilty, Facebook is not going to be taking a massive hit when it's compared to its overall revenue.

Speaker 1: And speaking of Facebook slash Meta, numerous outlets like Bloomberg have reported that more layoffs are imminent at the company. So if you remember, Meta downsized by thirteen percent late last year, and in actual numbers of employees, that's around eleven thousand people who lost their jobs. According to Bloomberg, that initial round of layoffs was really related to a reorganization effort, which largely included cutting entire teams and also managerial levels, the idea being that they wanted to flatten out the hierarchy, so they didn't want to have as many vertical levels between entry-level employees and top brass. So this was like a middle management clearinghouse kind of situation.
Speaker 1: However, Bloomberg says that the upcoming layoffs will be more about financial results than a reorganization, so it's possible we'll see departments that are not heavily associated with revenue affected the most. But time will tell. A lot of people think that this is going to unfold within the week or early next week, so we'll just have to keep an eye out. All right, we've got more news to cover, but first let's take a quick break.

Speaker 1: We're back. UDN reports that Intel has finalized its design of two nanometer and one point eight nanometer fabrication processes. Remember, a nanometer is a billionth of a meter. However, when you hear these numbers, it does not actually mean that Intel is about to churn out chips that have components that measure two nanometers or one point eight nanometers in size. It's actually just a naming convention at this point. Once upon a time, the numbers actually did refer to the size of the individual components on a chip, but we abandoned that several generations back. We kept the naming convention, which is odd because it's a naming convention where the numbers keep going down, but those numbers don't actually refer to the size of anything on the chip anymore. Also, I should mention that these processes have been designed but not actually built out. So in other words, Intel has come up with the approach they're going to take, but they haven't actually created the manufacturing process itself, so we're not going to get chips right away that have these new features on them. And if you're wondering why we don't go smaller anymore, really, when it comes down to it, I mean, some stuff does get shrunk from generation to generation, but not as much as what the naming convention would have you think. And the big reason for that comes down to quantum physics.
Speaker 1: If you make components small enough, then quantum mechanics comes into play and electrons will decide to do whatever the heck they want to do instead of following these careful pathways you built for them with your transistors and whatnot. That's being a little flippant, but it's also true. Anyway, Intel is introducing a lot of stuff all at once with the one point eight nanometer fabrication process, also known as the eighteen A process. The A stands for Angstrom, because, you know, you have to go down from the nanoscale to the Angstrom scale. Intel has a new transistor design with this eighteen A approach. It is shrinking some components, so some things are getting smaller, and it also has added backside power delivery. And as Tom's Hardware points out, there's a lot of stuff that has been incorporated into this new design. It's a bold and potentially risky approach. So typically what Intel does is follow a pretty standard tick-tock pattern, which means it comes up with a new design for chip architecture, and then it builds a chip based on that design. Then in the next generation it optimizes that design, and then after optimizing the previous design, Intel goes and makes another new design. So it goes tick, tock, tick, tock: new architecture, optimization, new architecture, optimization. Once upon a time, the architecture really did refer more to shrinking things down. Like, you would design an architecture, and then the next generation you would shrink it down. The next generation you would build a new architecture based on that new size you had reached, then you would shrink it down, that kind of thing. We're kind of beyond that at this point. Anyway, we're not likely to see any chips based off this new process for at least a year. Intel hopes to have it in place by early twenty twenty four.
Speaker 1: Earlier on in this episode, I talked about how Huawei has come under suspicion by the US government, which then pushed telecom companies to ditch Huawei equipment in their networks. Well, in another Huawei-related story, the company recently participated as an exhibitor at the Mobile World Congress event in Spain, and visitors to Huawei's booth were given personal security badges. They had to wear them while they were in the booth, and they were meant to hand the badges back when they were ready to leave the booth. Only some folks forgot to hand the badges back and ultimately walked off with them, and then some of them just left their badges lying around. And then of course people got a little curious, and they took the badge, which was on a lanyard, one of those extending lanyards where you've got the little plastic clip that has the spool inside it so you can extend and retract your badge. Well, someone found out that inside that plastic clip there was a tiny circuit board. Like, they popped the clip open, and besides the spool for the lanyard there was a chip. And so folks started to speculate what this circuitry could be. Was Huawei trying to track people? So the company said, essentially, yeah, because those badges were meant to track where people were going within the booth and how much time they were spending at specific parts of the booth, that kind of thing. This in turn would inform Huawei about what people were most interested in. Folks were expected to hand over the badges after the end of their visit, so it wasn't like Huawei was trying to bug someone and track all their movements throughout all of Mobile World Congress. They were just really, according to what they're saying, interested in what people were doing when they were actually visiting Huawei's booth. Now there's concern that individual badges could be connected to specific people, like if they were personalized.
Speaker 1: I don't know if they were personalized, because the report I read didn't go into detail, but if they were, then there's some concern about that data being linked to a specific individual. But Huawei has said that they're following their privacy policy, they're going to protect that information, and that that's not really what they're focused on. They're looking at what parts of their booth design were effective and what weren't. Honestly, I get the feeling this is a story that's more innocent than malicious. I didn't see anything to indicate that the circuitry could really do anything sinister. It looks like it's a passive RFID and Bluetooth system, so something that, when it passes within range of an emitter, can reflect back a response, which just gives the system information about where someone is and how long they are there. It didn't look like it was capable of doing more than that to me, but that was at a casual glance based upon some photographs, and I just think that this one is probably not that big of a deal. Honestly, I think Huawei is pretty much telling the truth here. However, it does really illustrate how Huawei has a reputation associated with it. Whether that reputation is fair or not is up for debate, but I think it shows that there's this kind of cloud of suspicion that's associated with the company.

Speaker 1: In a new segment that I'd like to call Y'all Got to Read This, I want to give a shout out to Wired. Wired published a piece that's titled Inside the Suspicion Machine, and it's about governments using systems that reduce people to data points for the purposes of making really important decisions regarding those people. It opens talking about a welfare system in Europe and how such an algorithmic approach can be used by administrators of these systems to determine who is taking advantage of the system.
Speaker 1: As Wired points out, it's not like it's being used to determine how much welfare a person requires, but rather how likely is it that this person is abusing the system. Like, it's taking an accusatory stance out of the gate. Well, this reminds me a lot of a story we covered recently about how a lot of HR leaders expect that they will at least partly be relying on AI and algorithms when it comes to making layoff decisions. So to do that, you kind of have to reduce people into a collection of data points so that an algorithmic system can actually make decisions and compare this set of numbers against that set of numbers. But there are a lot of dangers with this approach, including bias potentially leading to specific populations of people being disproportionately targeted and affected by these kinds of decisions. Now, the piece on Wired is extensive, it is thorough, it is a great read. So I recommend you go check out the piece. Again, it's on Wired, and it's titled Inside the Suspicion Machine. Also, just in the interest of full disclosure, I do not know any of the people who contributed to that article. I do not have any connection with Wired other than being a subscriber. It's just my recommendation as someone who reads a lot about tech.

Speaker 1: You know, in the early days of the pandemic, when everyone was scrambling to find ways to work in a suddenly decentralized, remote approach, conference tools like Zoom really took off, followed closely by people determined to disrupt work being done through these tools. Well, we haven't emerged from the disruption stage yet, because recently a meeting that was being held by the US Federal Reserve had to be canceled in progress due to a stream hijacker who blasted pornographic content to the attendees of the Zoom meeting, which was like a couple hundred people, I think.
Speaker 1: Now, you might remember there were similar stories from just a couple of years ago about people who found that they could access corporate meetings that failed to protect their links and also didn't use any kind of password protection. I saw a rapid adoption throughout the industry of things like protected links and passwords after that kind of thing happened, because it was easy to do. It was pretty easy to find links that were being shared publicly, but they weren't intended to be public meetings, and then people would just crash them. So this is kind of similar to that. The Federal Reserve's Brent Tjarks issued an apology for the incident and committed to finding out how the hijack happened and how to prevent it from happening again. Okay, I've got a couple more news stories, but before I get to those, let's take another quick break.

Speaker 1: We're back. So, Engadget has a piece about Honda's newest robot, which, unlike the Tesla Optimus robot, and unlike Honda's own retired Asimo robot, isn't humanoid. It's not built to look like a human. It doesn't have legs and arms. No, instead, Honda's robot is actually the third generation of its Autonomous Work Vehicle, or AWV, design. So it's a type of kind of self-guided electric wagon. It's meant to carry payloads and has a large enough flat surface that it can carry two pallets' worth of payload, and it can support up to two thousand pounds per go. It has a top speed of around ten miles per hour. That's actually fairly darned zippy for something like this, and it reportedly can go for twenty eight miles before needing a recharge. I don't know if that's under a full load. It might be less under a full load. But this is not the first Autonomous Work Vehicle, like I said, it's actually the third generation, and Honda says it is capable of working either by remote control, so you could have someone actually steering this remotely, or it can operate completely autonomously.
Speaker 1: Honda anticipates that the AWV will have utility in environments like warehouses and construction sites and that kind of thing, and that it's designed to also be able to travel over rough terrain, so a construction site would be fine. So Honda now is looking to form strategic partnerships with construction companies to kind of test the AWV in the real world. Like, let's actually put this thing on real construction sites and see if it ends up being a useful tool. I think that this is actually a great example of a robot that engineers build to perform a specific function, with a form that supports that function. I talked about this earlier when we chatted about the Optimus robot a few episodes ago, where building a humanoid robot is cool, like it's neat to see humanoid robots, no doubt about it. They are fascinating, they are incredibly complicated, it's impressive work, but it's not always, or not necessarily, the best tool for the job. Sometimes it's better to design a robot that is engineered to tackle a specific task as efficiently as possible, which may not have anything to do with the human form, and then you get great results. If you try to create a humanoid robot that can do anything, often it doesn't do anything very well. Like, it can do lots of different stuff, but it's not necessarily performing at the top level, whereas if you build specific purpose-built robots to do a small group of tasks, you can get much better results, typically. We're just at that stage, like humanoid robots are really complicated and difficult to pull off. So I think Honda's approach makes way more sense than companies like Tesla that are pushing this humanoid approach, at least in the near term. Maybe in the future we'll get to a point where building a humanoid robot that's really good at everything will be a relatively trivial task. But right now, just making them walk, or be able to open a door and walk through without falling over, is harder than it sounds.
Speaker 1: Finally, Japan's H three rocket didn't do so well this week in its first test flight. So this is a new launch vehicle that Japan's government has developed to put payloads into orbit. Competition's getting really tight when it comes to putting satellites up into space. So during its first launch, the second stage of the vehicle failed to ignite, so the administrators of Japan's space program gave the order for the vehicle to self-destruct. The vehicle did self-destruct. It came crashing down into the ocean off the coast of the Philippines, and Japan says there are no reports of any injuries or damage caused by the wreckage falling to Earth. But what it was meant to do was to put a satellite called the Advanced Land Observing Satellite three into orbit, and this satellite's purpose was to monitor conditions on the ground to help the government coordinate efforts in the wake of a disaster. And it also carried some other tools for the defense ministry that would be able to detect ballistic missile launches. And that makes a lot of sense, because if you know about Japan and North Korea, you know that North Korea will occasionally hold test launches and fire missiles into the Sea of Japan. So having a system that detects those sorts of things totally makes sense. But all of that obviously was lost because the vehicle failed to attain orbit and had to be put into self-destruct mode. Numerous administrators in Japan's space agency have expressed regret over the failure and apologies for the failed launch. There are even questions about how this might affect the government's space industry moving forward, which sounds incredible to me because, I mean, obviously any failure is regrettable, right? You don't want to ever see a failure, but technology does fail, and we're talking about rocket science. There's a reason you say, well, that's not rocket science, because rocket science is hard. It is really hard.
Speaker 1: But I think this also is tough because it follows on Japan's previous launch vehicle, the H two A, which had a reputation for being extremely reliable, with a successful launch rate of around ninety seven point eight percent. So when you're following that track record and your first launch out of the gate is a failure, I could see how this could be a real black mark on the program as a whole. So I'm not sure where Japan's gonna go based on this failure, but yeah, I thought I would cover that.

Speaker 1: And that's it for the news today, Tuesday, March seven, twenty twenty three. Just a reminder, tomorrow we have a special episode of Tech Stuff coming out, hosted by Bridget Todd of There Are No Girls on the Internet. I look forward to hearing your thoughts on that show, and I'll be back on Thursday with more tech news, and then it's off to Austin, Texas, where I'll be at South by Southwest. So if you see me in Austin over the weekend, you can wave. I'll probably be on my way somewhere, probably Torchy's Tacos if I'm honest, because yes, I am basic, and the taco of the month is the Roscoe, which is a chicken and waffles taco, and y'all, I got to get one. Okay, okay, sorry, kind of lost the thread there. If you have suggestions for topics I should cover in future episodes of Tech Stuff, please reach out. You can do so on Twitter. The handle for the show is TechStuffHSW, or you can reach out on the iHeartRadio app. It's free to download, it's free to use. You put Tech Stuff in the little search field, it'll pop right up, and you'll see on the Tech Stuff page there's a little microphone icon. If you click on that, you can leave me a voice message, up to thirty seconds in length, and I'll talk to you again really soon.

Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.