1 00:00:04,480 --> 00:00:12,360 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, 2 00:00:12,360 --> 00:00:16,080 Speaker 1: and welcome to Tech Stuff. I'm your host, Jonathan Strickland. 3 00:00:16,079 --> 00:00:19,919 Speaker 1: I'm an executive producer with iHeart Podcasts, and how the 4 00:00:20,040 --> 00:00:22,520 Speaker 1: tech are you? It's time for the tech news for 5 00:00:22,560 --> 00:00:26,920 Speaker 1: the week ending on Friday, June seventh, twenty twenty four. 6 00:00:27,600 --> 00:00:29,640 Speaker 1: You know, there have been a couple of products that 7 00:00:29,720 --> 00:00:33,240 Speaker 1: hit the market recently that really aim to either augment 8 00:00:33,600 --> 00:00:38,839 Speaker 1: or outright replace smartphones. And these products initially generated a 9 00:00:38,920 --> 00:00:42,640 Speaker 1: decent amount of buzz, but upon actual release, the reception 10 00:00:43,159 --> 00:00:47,239 Speaker 1: hasn't been universally positive. So one of these was the 11 00:00:47,360 --> 00:00:50,760 Speaker 1: Rabbit R one, and maybe you've seen one of these, 12 00:00:50,800 --> 00:00:53,480 Speaker 1: maybe you haven't. It's a little orange doohickey. It's 13 00:00:53,520 --> 00:00:56,960 Speaker 1: got some interesting user interface features on it, and it 14 00:00:57,080 --> 00:01:00,720 Speaker 1: got some folks excited back at CES early this year, 15 00:01:00,840 --> 00:01:04,680 Speaker 1: but media outlets like The Verge brought reality crashing down 16 00:01:04,720 --> 00:01:07,800 Speaker 1: on the R one with headlines like Rabbit R one 17 00:01:07,880 --> 00:01:12,440 Speaker 1: review: nothing to see here, by David Pierce. Another device 18 00:01:12,760 --> 00:01:17,679 Speaker 1: that has also seen critical dismissal is the Humane AI Pin. 19 00:01:18,080 --> 00:01:19,880 Speaker 1: So I thought I would start this episode with a 20 00:01:19,920 --> 00:01:22,920 Speaker 1: quick compare and contrast of these two products, paired with 21 00:01:22,959 --> 00:01:25,360 Speaker 1: some commentary about where we are with AI when it 22 00:01:25,400 --> 00:01:29,600 Speaker 1: comes to incorporating it into consumer goods. So first up, 23 00:01:29,760 --> 00:01:33,080 Speaker 1: the Rabbit R one. So this device wasn't intended to 24 00:01:33,120 --> 00:01:35,920 Speaker 1: be a replacement for a smartphone, even though a lot 25 00:01:35,959 --> 00:01:38,520 Speaker 1: of people were talking about it like that, because for 26 00:01:38,600 --> 00:01:41,479 Speaker 1: one thing, you can't make calls on an R one. 27 00:01:42,080 --> 00:01:44,440 Speaker 1: You were supposed to be able to set up your 28 00:01:44,520 --> 00:01:48,040 Speaker 1: various apps on this device, like to connect to your 29 00:01:48,600 --> 00:01:53,360 Speaker 1: various accounts, things like ride hailing apps and delivery services 30 00:01:53,400 --> 00:01:55,880 Speaker 1: and ticket apps, that kind of thing. And then the 31 00:01:56,000 --> 00:01:59,600 Speaker 1: idea was you could give a complex command to the 32 00:01:59,720 --> 00:02:02,600 Speaker 1: R one and it would take care of, you know, 33 00:02:02,720 --> 00:02:05,400 Speaker 1: pretty much all the rest. So you could say something like, 34 00:02:05,440 --> 00:02:08,320 Speaker 1: in an ideal situation, maybe you tell the R one 35 00:02:08,760 --> 00:02:11,800 Speaker 1: that you need a car to take you to your hotel 36 00:02:11,960 --> 00:02:15,119 Speaker 1: from the airport. You've flown into a new city.
You're 37 00:02:15,200 --> 00:02:18,560 Speaker 1: using the R one to call an Uber or 38 00:02:18,639 --> 00:02:20,760 Speaker 1: Lyft or something to pick you up from the airport, 39 00:02:20,800 --> 00:02:23,120 Speaker 1: take you to your hotel. Also to go ahead and 40 00:02:23,160 --> 00:02:25,840 Speaker 1: place an order for a pizza place you learned about 41 00:02:25,880 --> 00:02:27,560 Speaker 1: that's going to be near your hotel, so that can 42 00:02:27,600 --> 00:02:30,280 Speaker 1: be delivered, so you get that shortly after you've checked in, 43 00:02:30,560 --> 00:02:32,960 Speaker 1: and that you also want to buy a couple of 44 00:02:33,000 --> 00:02:36,079 Speaker 1: seats to a sporting event that's happening down the street 45 00:02:36,080 --> 00:02:38,520 Speaker 1: from the hotel. But according to reviews like the one 46 00:02:38,560 --> 00:02:41,360 Speaker 1: in The Verge, the reality was that the R one was 47 00:02:41,440 --> 00:02:45,200 Speaker 1: not nearly so versatile or useful, and it had issues 48 00:02:45,240 --> 00:02:49,000 Speaker 1: with a lot of stuff, including delivering accurate weather forecasts. 49 00:02:49,080 --> 00:02:51,919 Speaker 1: You know, sometimes the R one would pin your location 50 00:02:52,040 --> 00:02:54,480 Speaker 1: to being a place that was like thousands of miles away, 51 00:02:54,840 --> 00:02:58,040 Speaker 1: and the image recognition that was built into the device 52 00:02:58,080 --> 00:03:00,880 Speaker 1: also left a lot to be desired. The R one could 53 00:03:00,919 --> 00:03:04,000 Speaker 1: frequently misidentify stuff that you were pointing it at, and 54 00:03:04,040 --> 00:03:07,000 Speaker 1: that's not really useful either if you need to know, like, hey, 55 00:03:07,080 --> 00:03:08,800 Speaker 1: is this safe for me to eat? Or whatever it 56 00:03:08,840 --> 00:03:11,120 Speaker 1: may be. And while the price tag of one hundred 57 00:03:11,120 --> 00:03:13,680 Speaker 1: and ninety nine dollars wasn't as crazy as, you know, 58 00:03:13,880 --> 00:03:17,079 Speaker 1: like the cost of an Apple Vision Pro, two hundred 59 00:03:17,080 --> 00:03:18,840 Speaker 1: bucks is still a lot to pay for a gadget 60 00:03:18,880 --> 00:03:22,160 Speaker 1: that doesn't work very well. Then you have the Humane 61 00:03:22,200 --> 00:03:26,360 Speaker 1: AI Pin. That's a wearable smart device, although smart might be 62 00:03:26,520 --> 00:03:29,160 Speaker 1: generous, according to, hey, hey, look, it's David Pierce 63 00:03:29,200 --> 00:03:32,120 Speaker 1: of The Verge again. David Pierce had a rough, 64 00:03:32,200 --> 00:03:36,600 Speaker 1: like, April slash early May reviewing these kinds of devices. 65 00:03:36,800 --> 00:03:39,680 Speaker 1: But back in April, Pierce said that the Humane AI Pin 66 00:03:39,960 --> 00:03:44,160 Speaker 1: was quote not even close end quote to what was promised. 67 00:03:44,360 --> 00:03:47,680 Speaker 1: So this is a pin. You wear it like it's 68 00:03:47,720 --> 00:03:50,400 Speaker 1: a little lavalier microphone or something.
It doesn't have 69 00:03:50,440 --> 00:03:53,640 Speaker 1: a screen on the device, but it did have, or 70 00:03:53,720 --> 00:03:56,680 Speaker 1: does have, a little projection laser, so you can hold 71 00:03:56,720 --> 00:03:59,520 Speaker 1: your hand out and it will project a little interactive 72 00:04:00,160 --> 00:04:03,320 Speaker 1: menu on your hand, and you can, you know, use 73 00:04:03,400 --> 00:04:07,240 Speaker 1: touch, it's not really touch, like, it's gesture commands 74 00:04:07,280 --> 00:04:09,240 Speaker 1: to be able to tell it what you want to do. 75 00:04:09,320 --> 00:04:11,560 Speaker 1: So if you were playing a music track, you could 76 00:04:11,600 --> 00:04:13,680 Speaker 1: do that to change the volume, that kind of stuff. 77 00:04:13,920 --> 00:04:17,040 Speaker 1: You could also interact with it through voice commands, and 78 00:04:17,160 --> 00:04:19,680 Speaker 1: it was supposed to liberate you from the shackles 79 00:04:19,680 --> 00:04:22,400 Speaker 1: of your smartphone, so that you didn't have to have 80 00:04:22,480 --> 00:04:24,800 Speaker 1: your head down looking at a screen. You could be 81 00:04:25,040 --> 00:04:27,960 Speaker 1: keeping your phone in your pocket, walking around using this 82 00:04:28,000 --> 00:04:30,520 Speaker 1: device to do the stuff that you would typically whip 83 00:04:30,600 --> 00:04:33,040 Speaker 1: your phone out to do, and you'd still have your 84 00:04:33,040 --> 00:04:34,960 Speaker 1: phone on you, so you could use that if you 85 00:04:35,000 --> 00:04:38,200 Speaker 1: needed to. And the problem is it just didn't work 86 00:04:38,640 --> 00:04:42,320 Speaker 1: very well. It cost six hundred ninety nine dollars and 87 00:04:42,400 --> 00:04:45,520 Speaker 1: has an ongoing twenty four dollars subscription fee so that 88 00:04:45,560 --> 00:04:47,800 Speaker 1: you can use the cellular network, which is T-Mobile, 89 00:04:47,800 --> 00:04:49,760 Speaker 1: by the way, in order for this thing to work. 90 00:04:49,960 --> 00:04:53,039 Speaker 1: And Pierce was pretty brutal in his review. He essentially 91 00:04:53,040 --> 00:04:56,000 Speaker 1: said it was a totally broken device. Now, the New 92 00:04:56,040 --> 00:04:58,960 Speaker 1: York Times reports that Humane has quietly been looking around 93 00:04:58,960 --> 00:05:02,159 Speaker 1: for a potential buyer for the company, and allegedly one 94 00:05:02,160 --> 00:05:05,480 Speaker 1: of those entities they've looked at is HP. I recommend 95 00:05:05,480 --> 00:05:07,080 Speaker 1: that article in the New York Times, by the way, 96 00:05:07,120 --> 00:05:09,760 Speaker 1: it's titled This Is Going to Be Painful: How a 97 00:05:09,880 --> 00:05:13,600 Speaker 1: Bold AI Device Flopped. Anyway, I think the R one 98 00:05:13,920 --> 00:05:18,719 Speaker 1: and the Humane AI Pin both help illustrate why we're not 99 00:05:18,800 --> 00:05:21,400 Speaker 1: really at a spot where AI is useful enough to 100 00:05:21,400 --> 00:05:24,520 Speaker 1: be the primary interface for a consumer product just yet. 101 00:05:24,680 --> 00:05:27,680 Speaker 1: Not saying that AI isn't impressive, not saying that there 102 00:05:27,720 --> 00:05:30,880 Speaker 1: hasn't been a ton of incredible work done in that field, 103 00:05:31,200 --> 00:05:35,320 Speaker 1: but it's not fully baked.
And when you're talking about 104 00:05:35,400 --> 00:05:41,440 Speaker 1: really expensive consumer gadgets that end up not being reliable 105 00:05:41,800 --> 00:05:45,240 Speaker 1: or useful, that's a problem, right? Like, you would kind 106 00:05:45,240 --> 00:05:47,479 Speaker 1: of want to see it continue to go through development 107 00:05:47,520 --> 00:05:50,920 Speaker 1: for a while, maybe have some limited beta testing and 108 00:05:51,320 --> 00:05:55,120 Speaker 1: go that route before releasing a full consumer product that 109 00:05:55,240 --> 00:05:58,880 Speaker 1: could potentially scuttle an entire company. Now, it's not just 110 00:05:59,000 --> 00:06:02,599 Speaker 1: consumer electronics that are struggling with AI, of course. Late 111 00:06:02,720 --> 00:06:05,520 Speaker 1: last week, Google reportedly eased off a little bit with 112 00:06:05,600 --> 00:06:09,080 Speaker 1: its AI Overviews tool. That's the feature that uses AI 113 00:06:09,200 --> 00:06:12,680 Speaker 1: to provide summaries and answers to queries in a web search, 114 00:06:12,839 --> 00:06:15,279 Speaker 1: with the idea being that if the AI can answer 115 00:06:15,279 --> 00:06:17,240 Speaker 1: whatever your question is, then you don't have to do 116 00:06:17,279 --> 00:06:20,680 Speaker 1: all that pesky clicking through to a reputable source to 117 00:06:20,760 --> 00:06:25,040 Speaker 1: dig through, you know, edited and vetted information, which I mean, 118 00:06:25,080 --> 00:06:27,400 Speaker 1: I guess I can understand that a bit in some cases, 119 00:06:27,400 --> 00:06:31,760 Speaker 1: because goodness knows, I hate the trend that websites will 120 00:06:31,800 --> 00:06:35,200 Speaker 1: bury the lede of a story three quarters of the 121 00:06:35,200 --> 00:06:37,719 Speaker 1: way down a page in order to serve you more ads, 122 00:06:37,880 --> 00:06:39,480 Speaker 1: so that you have to scroll past a lot more 123 00:06:39,480 --> 00:06:41,600 Speaker 1: ads before you get to what it is you're looking for. 124 00:06:42,040 --> 00:06:44,359 Speaker 1: I see that in a lot of like pop culture 125 00:06:44,440 --> 00:06:47,240 Speaker 1: news sites, where like the first five paragraphs are like, 126 00:06:47,520 --> 00:06:50,680 Speaker 1: Star Wars originally came out in the nineteen seventies. Like, 127 00:06:50,800 --> 00:06:52,520 Speaker 1: I know this, I just want to know what the 128 00:06:52,560 --> 00:06:56,880 Speaker 1: next project is. Or, goodness knows, recipes. That's been a 129 00:06:56,960 --> 00:07:01,120 Speaker 1: problem for years, right, recipe sites do this all the time. Anyway. 130 00:07:01,360 --> 00:07:05,040 Speaker 1: The New York Times tested some searches last Friday and 131 00:07:05,120 --> 00:07:08,640 Speaker 1: found that AI Overviews was only popping up about once 132 00:07:08,680 --> 00:07:12,160 Speaker 1: in every six queries or so. So why is that? Well, 133 00:07:12,240 --> 00:07:15,200 Speaker 1: the conclusion that Nico Grant of The New York Times 134 00:07:15,280 --> 00:07:18,680 Speaker 1: arrived at is that the answers provided by AI Overviews are 135 00:07:18,720 --> 00:07:23,120 Speaker 1: not always reliable or even safe. This is nothing new. 136 00:07:23,160 --> 00:07:25,880 Speaker 1: If you've been following news about generative AI for a while, 137 00:07:25,920 --> 00:07:29,040 Speaker 1: of course, I mean you likely know AI can hallucinate, 138 00:07:29,240 --> 00:07:33,040 Speaker 1: or confabulate, or just make stuff up.
In other words, 139 00:07:33,400 --> 00:07:36,040 Speaker 1: that's not a good thing, particularly when you're using it 140 00:07:36,080 --> 00:07:38,920 Speaker 1: to augment search, and in some cases it can lead 141 00:07:38,960 --> 00:07:41,440 Speaker 1: to huge problems if the AI is recommending stuff that 142 00:07:41,600 --> 00:07:45,040 Speaker 1: just isn't safe. Google is continuing to develop the tool, 143 00:07:45,120 --> 00:07:47,320 Speaker 1: and the company has started to build in some protections 144 00:07:47,320 --> 00:07:50,680 Speaker 1: around certain categories of queries, such as stuff relating to 145 00:07:50,760 --> 00:07:54,480 Speaker 1: health matters. I'm honestly surprised at how aggressive companies like 146 00:07:54,560 --> 00:07:57,120 Speaker 1: OpenAI and Google have been when it comes to 147 00:07:57,160 --> 00:08:00,720 Speaker 1: pushing these AI models out to the public, simply because 148 00:08:00,920 --> 00:08:05,200 Speaker 1: when things go wrong, they can go spectacularly wrong. But 149 00:08:05,280 --> 00:08:07,800 Speaker 1: I guess if you snooze, you lose. Over the years, 150 00:08:07,840 --> 00:08:10,440 Speaker 1: I've mentioned a lot of threats that AI poses, from 151 00:08:10,480 --> 00:08:13,800 Speaker 1: potentially taking work away from, well, just about everyone, to 152 00:08:14,120 --> 00:08:18,040 Speaker 1: the science fiction disaster scenario of a malevolent AI determined 153 00:08:18,080 --> 00:08:20,960 Speaker 1: to wipe out humanity. That last one is still well 154 00:08:21,000 --> 00:08:24,480 Speaker 1: within the realm of science fiction. Thankfully, we're not at that point. 155 00:08:24,640 --> 00:08:28,640 Speaker 1: We haven't been putting AI in charge of potentially devastating weaponry, 156 00:08:28,680 --> 00:08:31,920 Speaker 1: for example, so hopefully we continue that trend. But one 157 00:08:31,960 --> 00:08:37,520 Speaker 1: threat AI definitely poses relates to electricity and resources. AI 158 00:08:37,640 --> 00:08:41,000 Speaker 1: requires a lot of power to operate, and with more 159 00:08:41,080 --> 00:08:45,760 Speaker 1: AI companies launching and the big AI companies growing rapidly, 160 00:08:46,200 --> 00:08:49,439 Speaker 1: this means that any large entity that's serious about being 161 00:08:49,480 --> 00:08:52,760 Speaker 1: a definitive AI company has to start to think about 162 00:08:52,840 --> 00:08:55,400 Speaker 1: where that juice is going to come from. Over at 163 00:08:55,440 --> 00:08:58,960 Speaker 1: OpenAI, that seems to include a company called Helion, 164 00:08:59,360 --> 00:09:03,800 Speaker 1: which aims to use nuclear fusion to generate electricity. That's 165 00:09:03,840 --> 00:09:08,720 Speaker 1: an amazing goal. Nuclear fusion would truly transform our world. 166 00:09:09,040 --> 00:09:11,720 Speaker 1: But it's a goal that's not currently viable. We've only 167 00:09:11,760 --> 00:09:14,520 Speaker 1: had a couple of cases where a fusion reaction even 168 00:09:14,559 --> 00:09:17,400 Speaker 1: produced more energy than it took to start the reaction 169 00:09:17,480 --> 00:09:19,400 Speaker 1: in the first place, and when you take a real 170 00:09:19,440 --> 00:09:22,079 Speaker 1: step back and you consider the whole picture, we're still 171 00:09:22,080 --> 00:09:25,800 Speaker 1: not really there yet.
So forging a deal with Helion 172 00:09:26,040 --> 00:09:29,480 Speaker 1: seems a tad premature, since fusion is not yet a 173 00:09:29,559 --> 00:09:32,560 Speaker 1: viable method for power generation here on Earth. So why 174 00:09:32,559 --> 00:09:36,880 Speaker 1: would OpenAI consider making such a deal? Well, one 175 00:09:36,920 --> 00:09:39,200 Speaker 1: factor that could have something to do with it is 176 00:09:39,240 --> 00:09:42,280 Speaker 1: that Sam Altman, who is the CEO of OpenAI, 177 00:09:42,679 --> 00:09:46,560 Speaker 1: also happens to be the chairman of Helion's board of directors, 178 00:09:46,840 --> 00:09:49,960 Speaker 1: and he has nearly four hundred million dollars invested in 179 00:09:50,000 --> 00:09:53,240 Speaker 1: the company. Now, to be fair to mister Altman, he 180 00:09:53,280 --> 00:09:57,599 Speaker 1: has said that he has recused himself from these negotiations. 181 00:09:57,920 --> 00:10:00,760 Speaker 1: That's a good thing, because it sounds like there's more 182 00:10:00,800 --> 00:10:03,600 Speaker 1: than just the potential for a conflict of interest here. 183 00:10:03,880 --> 00:10:07,800 Speaker 1: When the head of your very power thirsty company also 184 00:10:07,840 --> 00:10:10,760 Speaker 1: has hundreds of millions of dollars invested in a startup 185 00:10:10,880 --> 00:10:13,640 Speaker 1: power company, and then those two companies start to chat 186 00:10:13,679 --> 00:10:17,760 Speaker 1: with each other, obviously, eyebrows will naturally rise as a result. 187 00:10:18,080 --> 00:10:20,440 Speaker 1: I'm actually kind of reminded of the story of WeWork, 188 00:10:20,600 --> 00:10:23,240 Speaker 1: in which the CEO of that company would end up 189 00:10:23,480 --> 00:10:27,520 Speaker 1: leasing properties he personally owned to the company he ran 190 00:10:27,800 --> 00:10:31,840 Speaker 1: as CEO, which seemed shady to me then, and it 191 00:10:31,960 --> 00:10:34,360 Speaker 1: seems kind of shady to me now. But then, I 192 00:10:34,360 --> 00:10:37,040 Speaker 1: guess it's all academic at the moment, since Helion is 193 00:10:37,080 --> 00:10:41,000 Speaker 1: not currently generating electricity through fusion, and OpenAI reportedly 194 00:10:41,040 --> 00:10:43,640 Speaker 1: doesn't even have any data centers of its own. Instead 195 00:10:43,679 --> 00:10:47,320 Speaker 1: it relies on Microsoft's platforms. But still, it's a thing 196 00:10:47,400 --> 00:10:51,320 Speaker 1: that makes me go, hmm. Okay, before I make any 197 00:10:51,400 --> 00:10:54,680 Speaker 1: more references to outdated nineties music, we're gonna take a 198 00:10:54,760 --> 00:11:08,440 Speaker 1: quick break to thank our sponsors. We're back. So next up, 199 00:11:08,480 --> 00:11:11,040 Speaker 1: a watchdog group in the European Union is calling on 200 00:11:11,120 --> 00:11:13,680 Speaker 1: regulators to pass an order that would force Meta to 201 00:11:13,679 --> 00:11:16,720 Speaker 1: stop harvesting user data for the purposes of training AI 202 00:11:16,960 --> 00:11:20,320 Speaker 1: without first seeking user consent. Now, you might have heard 203 00:11:20,320 --> 00:11:23,559 Speaker 1: me talk about how Meta's policy on Instagram allows the 204 00:11:23,600 --> 00:11:27,040 Speaker 1: company to train AI models on user data without asking 205 00:11:27,040 --> 00:11:30,640 Speaker 1: permission first. In fact, you cannot opt out of that 206 00:11:31,040 --> 00:11:34,200 Speaker 1: if you are anywhere other than the European Union.
But 207 00:11:34,280 --> 00:11:37,920 Speaker 1: the EU is different, right? You know, the EU has 208 00:11:38,000 --> 00:11:41,560 Speaker 1: these laws in place that require companies to give 209 00:11:41,720 --> 00:11:45,040 Speaker 1: people more options. So in the EU, users would get 210 00:11:45,040 --> 00:11:47,960 Speaker 1: a little pop up about the matter so they could opt out 211 00:11:48,080 --> 00:11:51,199 Speaker 1: of it, and that was only if they actually read 212 00:11:51,240 --> 00:11:53,600 Speaker 1: the thing and saw the little hypertext link that would 213 00:11:53,679 --> 00:11:56,800 Speaker 1: let them go to a form where they could opt out. 214 00:11:57,080 --> 00:12:02,200 Speaker 1: The whole process was complicated and, in my opinion, misleading. Anyway, 215 00:12:02,200 --> 00:12:04,760 Speaker 1: it seems like Meta has purposefully designed a system to 216 00:12:04,920 --> 00:12:08,400 Speaker 1: discourage people from opting out. And from Meta's perspective, I 217 00:12:08,480 --> 00:12:11,840 Speaker 1: get it, because training AI is hard. You need a 218 00:12:12,080 --> 00:12:15,880 Speaker 1: lot of material to train your AI, and Meta has 219 00:12:15,920 --> 00:12:19,520 Speaker 1: access to a treasure trove of user generated content, and 220 00:12:19,559 --> 00:12:22,360 Speaker 1: it sure is a hassle to have to secure permission 221 00:12:22,600 --> 00:12:25,200 Speaker 1: from each user in order to use all their stuff. 222 00:12:25,440 --> 00:12:28,480 Speaker 1: A Meta spokesperson responded to the watchdog by saying that 223 00:12:28,559 --> 00:12:32,280 Speaker 1: Meta quote complies with privacy laws and our approach is 224 00:12:32,320 --> 00:12:35,280 Speaker 1: consistent with how other tech companies are developing and improving 225 00:12:35,320 --> 00:12:38,120 Speaker 1: their AI experiences in Europe. End quote. Now, I'm not 226 00:12:38,160 --> 00:12:39,840 Speaker 1: sure that last bit is going to be that much 227 00:12:40,200 --> 00:12:45,000 Speaker 1: help, because I suspect watchdog organizations are equally concerned about 228 00:12:45,040 --> 00:12:47,520 Speaker 1: these other tech companies. I don't think they just have 229 00:12:47,600 --> 00:12:50,640 Speaker 1: it out for Meta. The watchdog group argues that Meta 230 00:12:50,679 --> 00:12:53,400 Speaker 1: has failed to comply with EU law, and that an opt 231 00:12:53,440 --> 00:12:56,240 Speaker 1: out form is not the same thing as an opt 232 00:12:56,400 --> 00:12:59,440 Speaker 1: in form, which is absolutely true. It's one thing to 233 00:12:59,440 --> 00:13:02,480 Speaker 1: skip over opting out. It's another when you're asked point 234 00:13:02,520 --> 00:13:05,440 Speaker 1: blank, do you consent to this? Because I need you 235 00:13:05,480 --> 00:13:08,640 Speaker 1: to indicate you do by clicking this box, and then 236 00:13:08,679 --> 00:13:11,400 Speaker 1: if you click the box, well then that's on you. Anyway, 237 00:13:11,480 --> 00:13:13,719 Speaker 1: I'm sure this story will continue to develop, so I'll 238 00:13:13,800 --> 00:13:16,640 Speaker 1: keep an eye on it. The US Department of Justice 239 00:13:16,679 --> 00:13:20,600 Speaker 1: and the Federal Trade Commission, or FTC, are opening investigations 240 00:13:20,600 --> 00:13:23,960 Speaker 1: into three large tech companies in the AI space.
Those 241 00:13:24,000 --> 00:13:28,520 Speaker 1: are OpenAI, Microsoft, and Nvidia, and the investigations are 242 00:13:28,640 --> 00:13:32,359 Speaker 1: of an antitrust nature, you know, looking into monopolies 243 00:13:32,400 --> 00:13:35,520 Speaker 1: and such. Broadly, the concern is that Microsoft, OpenAI, 244 00:13:35,559 --> 00:13:38,480 Speaker 1: and Nvidia have dominant positions in the AI space and 245 00:13:38,559 --> 00:13:43,319 Speaker 1: perhaps have used that position to disadvantage would-be competitors. Reportedly, 246 00:13:43,400 --> 00:13:46,120 Speaker 1: the Department of Justice will investigate Nvidia, while the 247 00:13:46,240 --> 00:13:49,480 Speaker 1: FTC will actually look into Microsoft and OpenAI. This 248 00:13:49,559 --> 00:13:52,600 Speaker 1: follows on the heels of other antitrust investigations into 249 00:13:52,760 --> 00:13:57,080 Speaker 1: big tech like Meta, Amazon, and Google, and those investigations 250 00:13:57,080 --> 00:14:00,400 Speaker 1: spawned various lawsuits, and now it looks like the 251 00:14:00,800 --> 00:14:03,280 Speaker 1: era of regulators attempting to check the power of these 252 00:14:03,280 --> 00:14:06,360 Speaker 1: tech companies is at least continuing for the moment. This 253 00:14:06,440 --> 00:14:09,000 Speaker 1: is an election year, so who knows how long that 254 00:14:09,040 --> 00:14:12,080 Speaker 1: will last. We'll have to see. Thomas Claburn of The 255 00:14:12,160 --> 00:14:15,439 Speaker 1: Register reported this week that Microsoft was laying off another 256 00:14:15,480 --> 00:14:18,280 Speaker 1: one thousand or so employees, which seemed like a slap 257 00:14:18,280 --> 00:14:21,400 Speaker 1: in the face after the company's CEO just told shareholders 258 00:14:21,440 --> 00:14:25,000 Speaker 1: in April that Microsoft had experienced a record third quarter. 259 00:14:25,200 --> 00:14:27,280 Speaker 1: You know, hey, y'all, we're doing better than we ever 260 00:14:27,440 --> 00:14:30,120 Speaker 1: have in the third quarter, so let's celebrate by laying 261 00:14:30,120 --> 00:14:34,239 Speaker 1: off people across the company. Claburn reported that Microsoft reorganized 262 00:14:34,240 --> 00:14:37,400 Speaker 1: its mixed reality division this week, and that these cuts 263 00:14:37,400 --> 00:14:41,080 Speaker 1: wouldn't necessarily concentrate on just one division or department, but 264 00:14:41,160 --> 00:14:44,600 Speaker 1: instead would come across the entire company. One potential reason 265 00:14:44,640 --> 00:14:47,680 Speaker 1: for the cuts is that Microsoft is obviously making a 266 00:14:47,800 --> 00:14:50,360 Speaker 1: huge investment in AI, and when a company spends a 267 00:14:50,360 --> 00:14:52,840 Speaker 1: lot of money, that company often looks at other places 268 00:14:52,880 --> 00:14:55,760 Speaker 1: where it can cut costs, and one surefire way to 269 00:14:55,760 --> 00:14:58,200 Speaker 1: cut costs, at least in the short term, is to 270 00:14:58,240 --> 00:15:02,200 Speaker 1: reduce the headcount a bit. It can be a brutal business, y'all. 271 00:15:02,480 --> 00:15:05,160 Speaker 1: Matt Burgess of Wired has a great article titled The 272 00:15:05,200 --> 00:15:07,720 Speaker 1: Snowflake Attack May Be Turning into One of the Largest 273 00:15:07,800 --> 00:15:10,520 Speaker 1: Data Breaches Ever.
Since I did a pair of episodes 274 00:15:10,560 --> 00:15:13,400 Speaker 1: this week about the largest data breaches in US history, I 275 00:15:13,400 --> 00:15:15,680 Speaker 1: thought it was good to include this news item today. 276 00:15:15,880 --> 00:15:20,000 Speaker 1: So Snowflake is a cloud storage firm mostly catering to 277 00:15:20,360 --> 00:15:24,400 Speaker 1: other businesses. Reportedly, the Ticketmaster data breach is, when you 278 00:15:24,440 --> 00:15:27,840 Speaker 1: get down to it, a Snowflake data breach, as Ticketmaster 279 00:15:28,000 --> 00:15:32,080 Speaker 1: uses Snowflake to host enormous databases, like lots of other companies. 280 00:15:32,560 --> 00:15:35,920 Speaker 1: And that's Burgess's point. While only a couple of companies 281 00:15:35,960 --> 00:15:39,080 Speaker 1: have recently acknowledged they've experienced a data breach, and while 282 00:15:39,120 --> 00:15:42,440 Speaker 1: Snowflake reps have said only a quote unquote limited number 283 00:15:42,640 --> 00:15:46,040 Speaker 1: of customer accounts were accessed, the potential for truly massive 284 00:15:46,080 --> 00:15:49,880 Speaker 1: data breaches is daunting. Burgess writes that hackers are claiming 285 00:15:49,880 --> 00:15:53,320 Speaker 1: to possess millions of records belonging to other Snowflake customers, 286 00:15:53,320 --> 00:15:57,480 Speaker 1: which include big companies like Advance Auto Parts and QuoteWizard, 287 00:15:57,520 --> 00:16:00,840 Speaker 1: which is a subsidiary of LendingTree. Not everything is 288 00:16:00,880 --> 00:16:04,320 Speaker 1: confirmed yet, but it should certainly raise more concerns about 289 00:16:04,320 --> 00:16:06,480 Speaker 1: the issue. And this is one of the huge risks 290 00:16:06,600 --> 00:16:10,760 Speaker 1: of cloud computing and storage, because as cloud companies grow 291 00:16:10,920 --> 00:16:14,360 Speaker 1: and attract larger customers, they become ever more tempting targets 292 00:16:14,360 --> 00:16:17,760 Speaker 1: for hackers. It's a constant battle between security experts and 293 00:16:17,800 --> 00:16:21,040 Speaker 1: hackers to keep those systems safe, and vulnerabilities, even 294 00:16:21,240 --> 00:16:24,200 Speaker 1: just an employee who gives away too much information due 295 00:16:24,200 --> 00:16:27,680 Speaker 1: to social engineering, can really put everything at risk. Anyway, 296 00:16:27,720 --> 00:16:29,560 Speaker 1: I highly recommend the article if you want to learn 297 00:16:29,640 --> 00:16:32,840 Speaker 1: more about how the hackers targeted Snowflake in the first place. 298 00:16:33,320 --> 00:16:36,880 Speaker 1: Reuters reports that the startup Archer Aviation says the Federal 299 00:16:36,920 --> 00:16:40,240 Speaker 1: Aviation Administration, or FAA, here in the US has given 300 00:16:40,240 --> 00:16:42,920 Speaker 1: the company a certificate that will allow Archer Aviation to 301 00:16:42,960 --> 00:16:47,560 Speaker 1: begin commercial operations. So what does Archer Aviation do? Well, 302 00:16:47,640 --> 00:16:50,880 Speaker 1: it's an electric air taxi company, so it's got a 303 00:16:51,000 --> 00:16:54,560 Speaker 1: vertical takeoff and landing electric vehicle that looks a lot 304 00:16:54,600 --> 00:16:58,880 Speaker 1: like a huge drone.
It's called Midnight, and it's a 305 00:16:58,920 --> 00:17:02,320 Speaker 1: piloted aircraft, and it's meant to provide transportation in densely 306 00:17:02,400 --> 00:17:06,159 Speaker 1: populated urban areas by conveniently popping up over all that 307 00:17:06,240 --> 00:17:10,679 Speaker 1: congestion and whisking passengers to whatever destination they're going to. 308 00:17:10,840 --> 00:17:12,879 Speaker 1: So a lot of these companies are focusing on a 309 00:17:12,880 --> 00:17:15,920 Speaker 1: business model that would take passengers from like a downtown 310 00:17:16,080 --> 00:17:19,840 Speaker 1: location to the closest airport. Archer Aviation is now the 311 00:17:20,000 --> 00:17:22,400 Speaker 1: second air taxi company in the US to get such 312 00:17:22,440 --> 00:17:26,080 Speaker 1: a certificate from the FAA. However, Reuters also points out 313 00:17:26,240 --> 00:17:29,400 Speaker 1: that the FAA has yet to actually certify the Midnight 314 00:17:29,440 --> 00:17:32,960 Speaker 1: aircraft itself, which will have to happen before the FAA 315 00:17:33,000 --> 00:17:35,760 Speaker 1: can say, yeah, this aircraft meets our standards. Okay, I 316 00:17:35,800 --> 00:17:38,520 Speaker 1: got a couple of space news stories to wrap things up. 317 00:17:38,600 --> 00:17:42,040 Speaker 1: First up, after several delays and some malfunctions, a fully 318 00:17:42,200 --> 00:17:46,160 Speaker 1: crewed Boeing Starliner spacecraft has docked with the International 319 00:17:46,240 --> 00:17:48,919 Speaker 1: Space Station this week for the first time with a 320 00:17:48,960 --> 00:17:52,040 Speaker 1: crew aboard, that is. Previous attempts to launch a Starliner 321 00:17:52,119 --> 00:17:54,920 Speaker 1: with a crew inside it were scuttled after some problems 322 00:17:55,080 --> 00:17:58,199 Speaker 1: cropped up. This trip had its own share of technical issues, 323 00:17:58,240 --> 00:18:02,200 Speaker 1: mostly stemming from helium leaks, but helium is non toxic, 324 00:18:02,280 --> 00:18:05,520 Speaker 1: it's not flammable, it's not combustible. So despite some issues 325 00:18:05,560 --> 00:18:08,000 Speaker 1: with those leaks, the decision was made to go ahead 326 00:18:08,000 --> 00:18:10,800 Speaker 1: with the mission. It's been a really long journey for 327 00:18:10,920 --> 00:18:14,160 Speaker 1: the Boeing Starliner. Boeing has been developing this spacecraft 328 00:18:14,200 --> 00:18:16,960 Speaker 1: for like a decade, and it's good news for NASA. 329 00:18:17,160 --> 00:18:19,399 Speaker 1: NASA always wants to have options when it comes to 330 00:18:19,480 --> 00:18:23,119 Speaker 1: spacecraft to go to various locations. Obviously, Boeing will have 331 00:18:23,200 --> 00:18:26,080 Speaker 1: to address any issues that were revealed during this operation, 332 00:18:26,240 --> 00:18:28,560 Speaker 1: but so far you should be able to call this 333 00:18:28,600 --> 00:18:32,639 Speaker 1: one a successful test. Meanwhile, at SpaceX, that company launched 334 00:18:32,640 --> 00:18:35,520 Speaker 1: a Starship rocket yesterday and completed a successful test to 335 00:18:35,560 --> 00:18:39,440 Speaker 1: take the upper stage into orbit. Before it returned back 336 00:18:39,480 --> 00:18:42,879 Speaker 1: to Earth, it climbed to an altitude of about one hundred and thirty miles. 337 00:18:43,040 --> 00:18:46,360 Speaker 1: The booster for the launch vehicle, after it separated from 338 00:18:46,480 --> 00:18:48,920 Speaker 1: the upper stage, came back down and landed in the 339 00:18:48,960 --> 00:18:52,840 Speaker 1: Gulf of Mexico. The upper stage returned and splashed down 340 00:18:52,960 --> 00:18:55,960 Speaker 1: in the Indian Ocean. There were some technical issues during 341 00:18:56,000 --> 00:18:58,680 Speaker 1: the launch. It wasn't perfect. Upon re entry, the upper 342 00:18:58,720 --> 00:19:02,400 Speaker 1: stage lost some tiles and also suffered some other damage, 343 00:19:02,480 --> 00:19:04,840 Speaker 1: but the general goal of testing the vehicle and having 344 00:19:04,920 --> 00:19:09,640 Speaker 1: both stages retrievable worked out. The SpaceX Starship is part 345 00:19:09,680 --> 00:19:13,439 Speaker 1: of the long term SpaceX plan that includes the lofty 346 00:19:13,480 --> 00:19:16,280 Speaker 1: goal of taking human astronauts to Mars one day, and 347 00:19:16,320 --> 00:19:19,119 Speaker 1: there's still a ton to do before that can become 348 00:19:19,119 --> 00:19:23,399 Speaker 1: a reality, but still, pretty cool space news. That's it 349 00:19:23,560 --> 00:19:26,680 Speaker 1: for the tech news this week on Friday, June seventh, 350 00:19:26,680 --> 00:19:29,680 Speaker 1: twenty twenty four. I hope all of you out there 351 00:19:29,760 --> 00:19:32,920 Speaker 1: are well, and I'll talk to you again really soon. 352 00:19:38,880 --> 00:19:43,520 Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, 353 00:19:43,840 --> 00:19:47,560 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you listen 354 00:19:47,600 --> 00:19:52,080 Speaker 1: to your favorite shows.