1 00:00:04,440 --> 00:00:12,280 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, 2 00:00:12,280 --> 00:00:16,000 Speaker 1: and welcome to Tech Stuff. I'm your host, Jonathan Strickland. 3 00:00:16,120 --> 00:00:19,920 Speaker 1: I'm an executive producer with iHeartRadio. And how the tech 4 00:00:20,000 --> 00:00:23,480 Speaker 1: are you? It's time for the tech news for Thursday, 5 00:00:23,960 --> 00:00:29,720 Speaker 1: April twentieth, twenty twenty three. So it's four twenty, and 6 00:00:29,840 --> 00:00:33,480 Speaker 1: fittingly enough, our first story is one where technology went 7 00:00:33,680 --> 00:00:39,400 Speaker 1: up in smoke. So first up, SpaceX's Starship launched this 8 00:00:39,560 --> 00:00:43,760 Speaker 1: morning after scrubbing the previously scheduled launch earlier this week. 9 00:00:44,320 --> 00:00:47,840 Speaker 1: A frozen valve was the reason behind the earlier cancellation. 10 00:00:48,400 --> 00:00:53,320 Speaker 1: But today the world's largest, most powerful launch vehicle in 11 00:00:53,520 --> 00:00:58,760 Speaker 1: our history of space exploration lifted off its platform 12 00:00:58,840 --> 00:01:03,120 Speaker 1: in South Texas to hurtle toward orbit, though it did 13 00:01:03,120 --> 00:01:07,960 Speaker 1: not actually get there, because around four minutes after launch, 14 00:01:08,520 --> 00:01:13,679 Speaker 1: things got spicy. The first stage engines cut off as planned, 15 00:01:14,360 --> 00:01:18,040 Speaker 1: but the second stage failed to separate from the first 16 00:01:18,120 --> 00:01:21,720 Speaker 1: stage and the launch vehicle began spinning, and then there 17 00:01:21,840 --> 00:01:25,240 Speaker 1: was a burst of flame and an explosion, with the 18 00:01:25,360 --> 00:01:30,080 Speaker 1: vehicle breaking apart in the process. Or, as SpaceX's Twitter 19 00:01:30,120 --> 00:01:33,880 Speaker 1: feed put it, Starship just experienced what we call a 20 00:01:34,040 --> 00:01:39,479 Speaker 1: rapid unscheduled disassembly. The SpaceX team stressed that the main 21 00:01:39,600 --> 00:01:43,119 Speaker 1: goal of this test was to clear the launch tower, 22 00:01:43,560 --> 00:01:46,520 Speaker 1: you know, to actually get a launch and a liftoff, 23 00:01:46,600 --> 00:01:49,720 Speaker 1: and to clear the launch tower, which the Starship did. 24 00:01:50,160 --> 00:01:53,320 Speaker 1: So with that in mind, the test was a success. 25 00:01:53,440 --> 00:01:56,960 Speaker 1: It's easy to say otherwise because it subsequently exploded, 26 00:01:57,760 --> 00:02:01,000 Speaker 1: but it was clearly a disappointment that the spacecraft was 27 00:02:01,080 --> 00:02:04,880 Speaker 1: unable to deliver the payload into orbit as hoped. Regardless, 28 00:02:05,160 --> 00:02:08,640 Speaker 1: the launch vehicle has more tests in its future. Even 29 00:02:08,720 --> 00:02:12,399 Speaker 1: if the test had been completely successful, it still would 30 00:02:12,480 --> 00:02:14,960 Speaker 1: have lots more tests in the future. As to what 31 00:02:15,160 --> 00:02:17,840 Speaker 1: caused the failure, I'm sure there will be a full 32 00:02:17,880 --> 00:02:21,880 Speaker 1: accounting of it later. I saw early speculation that several 33 00:02:21,880 --> 00:02:24,359 Speaker 1: engines on one side of the first stage cut out early.
34 00:02:24,880 --> 00:02:30,880 Speaker 1: I've seen other speculation that SpaceX chose to detonate the spacecraft, 35 00:02:31,320 --> 00:02:34,040 Speaker 1: but as of the time I'm actually recording this, I 36 00:02:34,120 --> 00:02:37,000 Speaker 1: haven't seen official word, so I'm sure by the time 37 00:02:37,000 --> 00:02:39,000 Speaker 1: you listen to this that will have been cleared up. 38 00:02:39,320 --> 00:02:42,160 Speaker 1: But as I record this, that has not yet happened. Now, 39 00:02:42,200 --> 00:02:46,040 Speaker 1: we may as well get the other Musk-related stories 40 00:02:46,080 --> 00:02:49,320 Speaker 1: out of the way, because Elon Musk, obviously the head of SpaceX, 41 00:02:50,120 --> 00:02:55,240 Speaker 1: appeared at an advertising conference yesterday, Wednesday, with a 42 00:02:55,280 --> 00:02:58,520 Speaker 1: message for the crowd that Twitter is a safe place 43 00:02:58,639 --> 00:03:01,840 Speaker 1: for advertisers to do business. Also, he doesn't plan to 44 00:03:01,880 --> 00:03:05,240 Speaker 1: restrict the types of speech that brands may want to 45 00:03:05,280 --> 00:03:09,799 Speaker 1: try and avoid. So he's saying Twitter's safe, but there's 46 00:03:09,800 --> 00:03:13,440 Speaker 1: also no guarantee that a brand's messaging won't appear alongside 47 00:03:13,800 --> 00:03:18,240 Speaker 1: objectionable speech, which probably isn't much of a selling point 48 00:03:18,400 --> 00:03:22,360 Speaker 1: for many advertisers out there. He also sent the message, 49 00:03:22,639 --> 00:03:26,200 Speaker 1: don't tell us what to do. He essentially said, don't 50 00:03:26,240 --> 00:03:29,560 Speaker 1: try and dictate what Twitter should do. That's my job. 51 00:03:30,120 --> 00:03:32,919 Speaker 1: I'm not really sure if the strategy is going to work. 52 00:03:33,360 --> 00:03:36,800 Speaker 1: The company has seen some pretty drastic drops in revenue 53 00:03:36,920 --> 00:03:41,760 Speaker 1: from advertising anyway. A bunch of the regular customers with 54 00:03:42,000 --> 00:03:47,120 Speaker 1: advertising on Twitter dropped the platform entirely. A lot of 55 00:03:47,120 --> 00:03:50,200 Speaker 1: others stuck around, but they also reduced their spending on 56 00:03:50,240 --> 00:03:53,400 Speaker 1: the platform by as much as eighty percent. Of course, 57 00:03:53,880 --> 00:03:56,400 Speaker 1: some of that may have nothing to do with Twitter 58 00:03:56,440 --> 00:03:59,480 Speaker 1: as a platform. The decision might be more due to 59 00:03:59,520 --> 00:04:02,880 Speaker 1: the overall economic situation, where a lot of companies have 60 00:04:03,360 --> 00:04:09,360 Speaker 1: reined in marketing expenditures. Meanwhile, Microsoft announced its Smart Campaigns 61 00:04:09,400 --> 00:04:13,040 Speaker 1: with Multi-platform ad service, which is a way for 62 00:04:13,120 --> 00:04:17,400 Speaker 1: advertisers to use a tool to plan out their advertising 63 00:04:17,440 --> 00:04:21,440 Speaker 1: strategy on different platforms. They said that they're actually dropping 64 00:04:21,480 --> 00:04:25,520 Speaker 1: Twitter starting April twenty fifth.
Other platforms will still be 65 00:04:25,640 --> 00:04:30,960 Speaker 1: available through this tool, but Twitter will not. So advertisers 66 00:04:31,000 --> 00:04:34,400 Speaker 1: who use Microsoft's platform for things like ad strategy and 67 00:04:34,440 --> 00:04:37,320 Speaker 1: placement are going to have to rely on something else 68 00:04:37,400 --> 00:04:41,039 Speaker 1: with regard to Twitter. I'm sure that this will lead 69 00:04:41,040 --> 00:04:44,720 Speaker 1: to more speculation about if or when Twitter will fold. 70 00:04:45,480 --> 00:04:48,320 Speaker 1: I'm not ready to make any kind of prediction like that. 71 00:04:48,839 --> 00:04:50,760 Speaker 1: I'm not even going to predict that it will fold, 72 00:04:50,839 --> 00:04:54,560 Speaker 1: let alone when it will fold, because as I look 73 00:04:54,680 --> 00:04:58,200 Speaker 1: back over all the things that have happened with Twitter 74 00:04:58,480 --> 00:05:01,600 Speaker 1: since Musk first announced his intent to buy the company, 75 00:05:02,600 --> 00:05:05,080 Speaker 1: there is no way I could have anticipated all the 76 00:05:05,080 --> 00:05:07,720 Speaker 1: stuff that followed. So I sure as heck am not 77 00:05:07,839 --> 00:05:10,360 Speaker 1: going to try and predict the future on this one. 78 00:05:10,400 --> 00:05:14,480 Speaker 1: In Tesla news, the company issued a Q one update 79 00:05:14,600 --> 00:05:18,440 Speaker 1: letter in which Elon Musk announced that, collectively, the beta 80 00:05:18,520 --> 00:05:22,120 Speaker 1: program for Tesla's Full Self-Driving service, which, I feel 81 00:05:22,160 --> 00:05:25,520 Speaker 1: I need to mention, is not really full self-driving, 82 00:05:25,920 --> 00:05:29,720 Speaker 1: has now logged more than one hundred and fifty million miles, 83 00:05:30,160 --> 00:05:33,400 Speaker 1: and that really is an incredible achievement. You know, I 84 00:05:33,480 --> 00:05:36,839 Speaker 1: give Tesla a lot of flak, but credit where credit's due. 85 00:05:36,920 --> 00:05:40,440 Speaker 1: One hundred and fifty million miles traveled is a lot. Also, 86 00:05:40,520 --> 00:05:43,640 Speaker 1: with so many miles traveled, Tesla has gathered a truly 87 00:05:43,839 --> 00:05:47,360 Speaker 1: enormous amount of data that can then be fed back 88 00:05:47,400 --> 00:05:53,320 Speaker 1: into systems to improve performance. So theoretically, with more miles traveled, 89 00:05:53,560 --> 00:05:56,680 Speaker 1: the system gets more reliable and safer. You can kind 90 00:05:56,680 --> 00:05:59,720 Speaker 1: of think of every car that has Full Self-Driving 91 00:05:59,720 --> 00:06:03,680 Speaker 1: active as being kind of like a student driver, only 92 00:06:03,960 --> 00:06:07,400 Speaker 1: these students all share the same brain. So as one 93 00:06:07,600 --> 00:06:11,240 Speaker 1: car learns a lesson, the benefits of that lesson go 94 00:06:11,360 --> 00:06:15,800 Speaker 1: to the entire fleet of cars. Now, I say theoretically 95 00:06:16,279 --> 00:06:18,560 Speaker 1: because there are actually other factors we have to take 96 00:06:18,560 --> 00:06:22,359 Speaker 1: into consideration. So, for example, Tesla has made changes to 97 00:06:22,440 --> 00:06:26,599 Speaker 1: the hardware in the construction of their vehicles, such as 98 00:06:26,680 --> 00:06:31,440 Speaker 1: removing ultrasonic sensors in favor of optical ones, and these 99 00:06:31,440 --> 00:06:35,400 Speaker 1: hardware changes also affect performance.
So it's not just the 100 00:06:35,440 --> 00:06:40,119 Speaker 1: brains of these vehicles, it's the actual sensors as well. 101 00:06:40,600 --> 00:06:42,840 Speaker 1: It really doesn't matter how smart your vehicle is if 102 00:06:42,880 --> 00:06:47,360 Speaker 1: it can't sense an obstacle, for example. Still, it is 103 00:06:47,800 --> 00:06:53,440 Speaker 1: a legitimately impressive achievement. That being said, Tesla's investors are 104 00:06:53,520 --> 00:06:56,640 Speaker 1: not so happy with the company right now. Also in 105 00:06:56,640 --> 00:06:59,520 Speaker 1: that letter was the revelation that the company's gross margins 106 00:06:59,560 --> 00:07:03,160 Speaker 1: are below twenty percent. The net income for the company 107 00:07:03,200 --> 00:07:06,000 Speaker 1: is two point nine billion dollars. That's a big old 108 00:07:06,080 --> 00:07:08,919 Speaker 1: chunk of change. But this time last year, the company's 109 00:07:08,960 --> 00:07:12,360 Speaker 1: income was seven hundred million dollars more than that, so 110 00:07:12,920 --> 00:07:15,960 Speaker 1: it is a drop in year-over-year net income. 111 00:07:16,600 --> 00:07:20,400 Speaker 1: Tesla has cut vehicle prices several times since the start 112 00:07:20,480 --> 00:07:23,440 Speaker 1: of the year because demand has been low, so in 113 00:07:23,520 --> 00:07:27,800 Speaker 1: order to try and, you know, stimulate demand, they've cut 114 00:07:27,800 --> 00:07:30,320 Speaker 1: some prices. And my guess is the tough economy is 115 00:07:30,360 --> 00:07:33,880 Speaker 1: really playing a part in this, although another big part 116 00:07:34,320 --> 00:07:36,720 Speaker 1: is that other car companies are starting to finally make 117 00:07:36,800 --> 00:07:40,040 Speaker 1: up ground on Tesla, which for years had the advantage 118 00:07:40,040 --> 00:07:43,560 Speaker 1: of being all in on electric vehicles while most other 119 00:07:43,640 --> 00:07:46,800 Speaker 1: car companies were just kind of dipping their toe in 120 00:07:46,880 --> 00:07:49,360 Speaker 1: the water at the time. Well, that's starting to change 121 00:07:49,400 --> 00:07:52,640 Speaker 1: now, and now Tesla's facing greater competition in the electric 122 00:07:52,680 --> 00:07:56,480 Speaker 1: vehicle market. So Musk tried to get ahead of a 123 00:07:56,520 --> 00:08:01,120 Speaker 1: negative investor reaction. He said, quote, we've taken a view 124 00:08:01,160 --> 00:08:04,720 Speaker 1: that pushing for higher volumes and a larger fleet is 125 00:08:04,760 --> 00:08:09,080 Speaker 1: the right choice here versus a lower volume and higher 126 00:08:09,120 --> 00:08:13,120 Speaker 1: margin end quote. And honestly, I'm not sure he had 127 00:08:13,200 --> 00:08:15,840 Speaker 1: much other choice, because if not enough people are buying 128 00:08:15,880 --> 00:08:19,440 Speaker 1: Teslas right now after several price cuts, they sure as 129 00:08:19,480 --> 00:08:23,400 Speaker 1: heck wouldn't be buying cars that were priced even higher. 130 00:08:23,920 --> 00:08:26,760 Speaker 1: Investors still weren't happy, though, and the stock price is 131 00:08:26,800 --> 00:08:31,440 Speaker 1: down by around six to seven percent. 132 00:08:32,320 --> 00:08:34,439 Speaker 1: As I was working on this episode, it was hovering 133 00:08:34,679 --> 00:08:38,520 Speaker 1: just under a seven percent drop from the day before.
Okay, 134 00:08:38,640 --> 00:08:41,839 Speaker 1: now it's AI time, everyone's favorite time of the news, 135 00:08:42,559 --> 00:08:45,240 Speaker 1: and first up is a story I found more than 136 00:08:45,480 --> 00:08:49,640 Speaker 1: a little upsetting. I mean, like, this one genuinely disturbed me. 137 00:08:49,960 --> 00:08:53,160 Speaker 1: All right, first, let me give some backstory. There's a 138 00:08:53,200 --> 00:08:56,960 Speaker 1: Formula One racer named Michael Schumacher who was in a 139 00:08:57,160 --> 00:09:01,840 Speaker 1: terrible accident while skiing, not racing, and this happened a 140 00:09:01,920 --> 00:09:05,640 Speaker 1: decade ago, back in twenty thirteen. He was severely injured 141 00:09:05,679 --> 00:09:08,880 Speaker 1: in that accident. He received a brain injury in the process, 142 00:09:09,440 --> 00:09:12,520 Speaker 1: and since then he's been out of the limelight. His 143 00:09:12,600 --> 00:09:15,760 Speaker 1: family has stressed the need for privacy. They continue to 144 00:09:15,880 --> 00:09:20,000 Speaker 1: help him day by day. And that story alone is 145 00:09:20,000 --> 00:09:23,400 Speaker 1: a very tough one, right? To see someone have that 146 00:09:23,559 --> 00:09:27,719 Speaker 1: kind of injury, that's just terrible. But then we get 147 00:09:27,760 --> 00:09:30,440 Speaker 1: to something that is really hard to believe. A German 148 00:09:30,520 --> 00:09:37,240 Speaker 1: weekly magazine called Die Aktuelle was running a supposed interview 149 00:09:37,480 --> 00:09:42,640 Speaker 1: with Schumacher. Only it wasn't an interview with Schumacher; he 150 00:09:42,720 --> 00:09:47,760 Speaker 1: has not given any interviews since his accident in twenty thirteen. Rather, 151 00:09:48,400 --> 00:09:51,280 Speaker 1: it was an AI agent that had been instructed to 152 00:09:51,360 --> 00:09:56,800 Speaker 1: respond to questions as if it were Michael Schumacher. So 153 00:09:57,280 --> 00:10:01,360 Speaker 1: this weekly magazine published a quote unquote interview with an 154 00:10:01,400 --> 00:10:05,960 Speaker 1: AI chatbot impersonating an athlete who is still coping with 155 00:10:06,000 --> 00:10:08,719 Speaker 1: the effects of a brain injury. So to call it 156 00:10:09,360 --> 00:10:14,080 Speaker 1: tasteless and unethical seems to be a gross understatement. And 157 00:10:14,120 --> 00:10:15,800 Speaker 1: it kind of harkens back to what I was saying 158 00:10:15,840 --> 00:10:20,600 Speaker 1: earlier this week about concepts like right to identity and 159 00:10:20,760 --> 00:10:25,280 Speaker 1: right to personality. I mean, imagine if some news outlet 160 00:10:25,400 --> 00:10:28,960 Speaker 1: out there ran a supposed interview with you, but 161 00:10:29,040 --> 00:10:31,080 Speaker 1: it wasn't with you. It was with a chatbot that 162 00:10:31,240 --> 00:10:35,240 Speaker 1: was told to impersonate you, and then it's presented to 163 00:10:35,280 --> 00:10:38,000 Speaker 1: people as if it's a legitimate interview. That would be 164 00:10:38,720 --> 00:10:43,640 Speaker 1: an incredible violation. Schumacher's family are pursuing legal action against 165 00:10:43,679 --> 00:10:47,199 Speaker 1: the magazine. And y'all,
a lot of the concerns around 166 00:10:47,280 --> 00:10:51,280 Speaker 1: AI may be a little premature. Like, the worry that 167 00:10:51,320 --> 00:10:54,320 Speaker 1: AI is going to doom the human race might be 168 00:10:54,360 --> 00:10:57,960 Speaker 1: a bit of an overreaction as things are right now. 169 00:10:58,120 --> 00:11:01,160 Speaker 1: But when you see media outlets willing to go to 170 00:11:01,200 --> 00:11:04,040 Speaker 1: these kinds of lengths to generate a story, you realize 171 00:11:04,080 --> 00:11:07,800 Speaker 1: that AI could be dangerous. Sure, it is dangerous in 172 00:11:07,840 --> 00:11:12,720 Speaker 1: several ways, but human beings can be downright diabolical. Okay, 173 00:11:12,880 --> 00:11:15,000 Speaker 1: we've got a lot more stories to go to, but 174 00:11:15,120 --> 00:11:27,800 Speaker 1: first let's take a quick break. Okay, we're back. Let's 175 00:11:27,840 --> 00:11:31,880 Speaker 1: talk some more about AI. So, Reddit's CEO Steve Huffman 176 00:11:32,400 --> 00:11:35,240 Speaker 1: has a message for AI companies out there: if they 177 00:11:35,280 --> 00:11:38,560 Speaker 1: want to scrape content from Reddit as part of creating 178 00:11:38,800 --> 00:11:42,640 Speaker 1: large language models and training AI agents, they're going to 179 00:11:42,720 --> 00:11:45,080 Speaker 1: have to pay for that privilege. He spoke with The 180 00:11:45,120 --> 00:11:47,240 Speaker 1: New York Times and said Reddit plans to make an 181 00:11:47,280 --> 00:11:52,240 Speaker 1: exception in its API, its application programming interface, so that if 182 00:11:52,280 --> 00:11:56,600 Speaker 1: an AI company wants to access this tool, they'll have 183 00:11:56,679 --> 00:11:59,440 Speaker 1: to pay first. Now, if you're a developer who needs 184 00:11:59,480 --> 00:12:03,959 Speaker 1: access to that API for something other than training artificial intelligence, 185 00:12:04,360 --> 00:12:07,400 Speaker 1: good news, no charge for you. You get to use 186 00:12:07,440 --> 00:12:10,880 Speaker 1: the tool as intended. As for how much AI companies 187 00:12:10,920 --> 00:12:14,600 Speaker 1: will have to pay to get access, that really hasn't 188 00:12:14,640 --> 00:12:17,760 Speaker 1: been hashed out yet. Reddit has already served as a 189 00:12:17,840 --> 00:12:21,520 Speaker 1: treasure trove of human-centric data for companies like Google 190 00:12:21,640 --> 00:12:26,160 Speaker 1: and OpenAI while they were developing their respective artificial intelligence 191 00:12:26,240 --> 00:12:30,200 Speaker 1: large language models. Personally, I like this move by Reddit. 192 00:12:30,440 --> 00:12:33,960 Speaker 1: I mean, we all know that information has value. Now 193 00:12:34,040 --> 00:12:36,400 Speaker 1: if only the Reddit users could get a cut of 194 00:12:36,440 --> 00:12:38,960 Speaker 1: the action. But that's probably taking things a little too far. 195 00:12:39,840 --> 00:12:42,480 Speaker 1: In a kind of similar story, Google is leading the 196 00:12:42,559 --> 00:12:46,720 Speaker 1: charge in petitioning the Australian government to allow AI projects 197 00:12:46,760 --> 00:12:53,000 Speaker 1: to crawl Australian websites and to be used in Australian implementations.
198 00:12:53,080 --> 00:12:58,120 Speaker 1: So currently Australia's copyright laws are potentially a barrier, creating 199 00:12:58,520 --> 00:13:02,320 Speaker 1: pesky legal restrictions on what Google and other companies like 200 00:13:02,360 --> 00:13:05,959 Speaker 1: OpenAI would consider to be information that's free for 201 00:13:06,040 --> 00:13:10,560 Speaker 1: the taking and applications that should be fully open to them. 202 00:13:11,120 --> 00:13:14,840 Speaker 1: Google framed its request by saying Australia could really be 203 00:13:14,920 --> 00:13:19,000 Speaker 1: missing out, with one spokesperson stating quote, the lack of 204 00:13:19,080 --> 00:13:23,840 Speaker 1: such copyright flexibilities means that investment in and development of 205 00:13:24,160 --> 00:13:28,600 Speaker 1: AI and machine learning technologies is happening and will continue 206 00:13:28,600 --> 00:13:32,760 Speaker 1: to happen overseas end quote. That's according to The Guardian. 207 00:13:33,240 --> 00:13:35,680 Speaker 1: So Google essentially is saying, hey, if you don't relax 208 00:13:35,720 --> 00:13:38,520 Speaker 1: those laws, my dude, you're going to have to import 209 00:13:38,600 --> 00:13:41,440 Speaker 1: all those useful AI tools from somewhere else, because ain't 210 00:13:41,440 --> 00:13:42,840 Speaker 1: no one going to be here to do it in 211 00:13:42,880 --> 00:13:46,240 Speaker 1: Australia because your laws are whack. Now, I have no 212 00:13:46,360 --> 00:13:49,719 Speaker 1: clue if Google's approach will be effective, particularly in an 213 00:13:49,800 --> 00:13:53,679 Speaker 1: environment where there's an increased wariness surrounding AI and its 214 00:13:53,679 --> 00:13:58,080 Speaker 1: development in the first place. I think the tide in 215 00:13:58,559 --> 00:14:02,240 Speaker 1: public opinion on AI is starting to turn sour. 216 00:14:02,800 --> 00:14:06,280 Speaker 1: Bloomberg has published an article titled Google's Rush to Win 217 00:14:06,480 --> 00:14:10,480 Speaker 1: in AI Led to Ethical Lapses, Employees Say, and the 218 00:14:10,640 --> 00:14:15,640 Speaker 1: article cites Google staff using language like cringeworthy or calling 219 00:14:15,800 --> 00:14:20,680 Speaker 1: its AI chatbot a pathological liar. They were using these words 220 00:14:20,720 --> 00:14:24,080 Speaker 1: to describe Google Bard while it was still in development, 221 00:14:24,120 --> 00:14:27,600 Speaker 1: before it had been pushed out into a beta program. 222 00:14:28,000 --> 00:14:31,160 Speaker 1: They said the tool would frequently give unreliable or sometimes 223 00:14:31,160 --> 00:14:35,320 Speaker 1: outright dangerous information, while doing so in an authoritative voice, 224 00:14:35,320 --> 00:14:37,640 Speaker 1: so it sounds like it knows what it's talking about. So 225 00:14:37,760 --> 00:14:41,440 Speaker 1: one example given in the article is about scuba diving 226 00:14:41,480 --> 00:14:45,240 Speaker 1: procedures, and that supposedly Bard gave answers that would quote 227 00:14:45,800 --> 00:14:49,320 Speaker 1: likely result in serious injury or death end quote. So 228 00:14:49,360 --> 00:14:53,280 Speaker 1: that's really not good.
This was during the testing phase, 229 00:14:53,320 --> 00:14:55,400 Speaker 1: as I said, for Bard, and my guess is that 230 00:14:55,480 --> 00:14:59,120 Speaker 1: Google had planned to keep Bard under wraps and continue 231 00:14:59,200 --> 00:15:02,200 Speaker 1: to work on it in development for much, much longer. 232 00:15:02,760 --> 00:15:06,040 Speaker 1: But OpenAI's release of ChatGPT late last year 233 00:15:06,200 --> 00:15:10,280 Speaker 1: convinced Google's leadership to try the old move fast and 234 00:15:10,440 --> 00:15:14,160 Speaker 1: break things approach, or else the company would risk getting 235 00:15:14,240 --> 00:15:17,760 Speaker 1: left behind in the AI chatbot wars. And you know, 236 00:15:18,440 --> 00:15:21,880 Speaker 1: Google really hates it if it is not the dominant 237 00:15:21,880 --> 00:15:24,560 Speaker 1: player in whichever markets it's competing in. It likes to 238 00:15:24,600 --> 00:15:29,240 Speaker 1: have that comfortable sixty to ninety percent market share. So 239 00:15:29,320 --> 00:15:32,160 Speaker 1: the article reads kind of as a warning about AI 240 00:15:32,240 --> 00:15:37,240 Speaker 1: in general, that this push to create a snazzy tool 241 00:15:37,640 --> 00:15:41,760 Speaker 1: could result in harmful consequences, and that the resources required 242 00:15:41,760 --> 00:15:45,720 Speaker 1: to make certain that the development and deployment is following 243 00:15:45,800 --> 00:15:50,160 Speaker 1: really good, strong ethical guidelines, those just aren't in place, 244 00:15:50,360 --> 00:15:54,560 Speaker 1: or they're being ignored. Google reps told Bloomberg that quote 245 00:15:54,640 --> 00:15:58,360 Speaker 1: responsible AI remains a top priority at the company end quote. Now, 246 00:15:58,440 --> 00:16:01,360 Speaker 1: in my opinion, companies across the board have been far 247 00:16:01,440 --> 00:16:05,600 Speaker 1: too aggressive with these tools. OpenAI's leadership has even 248 00:16:05,720 --> 00:16:09,960 Speaker 1: warned media outlets about its own chatbot, saying it's 249 00:16:09,960 --> 00:16:14,080 Speaker 1: not reliable. But by that time the cat was already 250 00:16:14,120 --> 00:16:16,800 Speaker 1: out of the artificial bag and it was too late 251 00:16:16,840 --> 00:16:21,280 Speaker 1: to issue cautionary messages. Anyway, the Bloomberg article is well 252 00:16:21,280 --> 00:16:24,880 Speaker 1: worth a read and goes into much greater detail about 253 00:16:24,880 --> 00:16:27,640 Speaker 1: the conditions that led Google to rush Bard into a 254 00:16:27,680 --> 00:16:33,440 Speaker 1: beta test, arguably prematurely. Yesterday, Meta held another round of layoffs, 255 00:16:33,600 --> 00:16:36,000 Speaker 1: one of three batches the company plans to spread out 256 00:16:36,040 --> 00:16:38,560 Speaker 1: across the next few months, which will ultimately see another 257 00:16:38,640 --> 00:16:42,280 Speaker 1: ten thousand Meta employees cut. This is after Meta had 258 00:16:42,320 --> 00:16:45,600 Speaker 1: already cut eleven thousand jobs late last year. This time, 259 00:16:45,640 --> 00:16:48,320 Speaker 1: the layoffs affected engineers and tech teams, and as you 260 00:16:48,400 --> 00:16:51,560 Speaker 1: might imagine, the cuts have delivered another blow to employee 261 00:16:51,560 --> 00:16:55,000 Speaker 1: morale at Meta.
I mentioned in a previous news episode 262 00:16:55,000 --> 00:16:58,240 Speaker 1: that some Meta employees have become discouraged over recent months 263 00:16:58,280 --> 00:17:03,520 Speaker 1: due to several reasons. First, layoffs are demoralizing. Two, the 264 00:17:03,560 --> 00:17:06,520 Speaker 1: push to return to the office has created a hardship on 265 00:17:06,600 --> 00:17:10,359 Speaker 1: a lot of employees. And, see, that hardship doesn't appear 266 00:17:10,359 --> 00:17:12,919 Speaker 1: to be shared by Meta's top executive leadership, because a 267 00:17:12,920 --> 00:17:15,640 Speaker 1: lot of them seem to be absent from Meta's HQ. 268 00:17:15,840 --> 00:17:18,160 Speaker 1: Some of them have moved thousands of miles away while 269 00:17:18,160 --> 00:17:21,200 Speaker 1: still serving as executives, and yet the employees are forced 270 00:17:21,240 --> 00:17:24,920 Speaker 1: to go in. So yeah, news continues to be rough 271 00:17:25,240 --> 00:17:28,320 Speaker 1: for the staff of Meta. By the way, if you 272 00:17:28,359 --> 00:17:31,639 Speaker 1: were a Facebook user with an account any time between 273 00:17:31,720 --> 00:17:35,119 Speaker 1: May two thousand and seven and December twenty twenty two, 274 00:17:35,600 --> 00:17:39,000 Speaker 1: you can claim your share of a huge class action 275 00:17:39,200 --> 00:17:43,240 Speaker 1: lawsuit settlement. Now this has to do with Cambridge Analytica, 276 00:17:43,680 --> 00:17:46,280 Speaker 1: that scandal that rocked the tech world several years ago 277 00:17:46,800 --> 00:17:49,080 Speaker 1: and we still talk about it today. So in case 278 00:17:49,119 --> 00:17:52,760 Speaker 1: you don't remember that whole kerfuffle, a company that catered 279 00:17:52,800 --> 00:17:57,560 Speaker 1: to political campaigns, a data analytics company, leveraged a survey 280 00:17:57,680 --> 00:18:00,480 Speaker 1: app to gather a huge amount of information about Facebook 281 00:18:00,560 --> 00:18:04,240 Speaker 1: users without their consent. The app was taking advantage of 282 00:18:04,400 --> 00:18:08,160 Speaker 1: large gaps in Facebook's API, gaps that the company would 283 00:18:08,160 --> 00:18:12,320 Speaker 1: subsequently close, but the damage had already been done. Anyway, 284 00:18:12,520 --> 00:18:14,879 Speaker 1: there were a lot of legal proceedings that followed 285 00:18:14,920 --> 00:18:18,920 Speaker 1: the discovery of this transgression, including this class action lawsuit 286 00:18:18,920 --> 00:18:21,800 Speaker 1: that I mentioned, and the settlement was for a whopping 287 00:18:21,840 --> 00:18:25,960 Speaker 1: seven hundred and twenty five million dollars. Though Meta was 288 00:18:26,040 --> 00:18:28,000 Speaker 1: let off the hook in a sense, because the company 289 00:18:28,000 --> 00:18:30,760 Speaker 1: didn't actually have to admit any wrongdoing as part of 290 00:18:30,760 --> 00:18:34,159 Speaker 1: this settlement. But then I think it's kind of hard 291 00:18:34,320 --> 00:18:37,119 Speaker 1: to say with a straight face, Hey, I didn't do 292 00:18:37,160 --> 00:18:39,320 Speaker 1: anything wrong and no one was hurt. I'm just paying 293 00:18:39,400 --> 00:18:42,760 Speaker 1: seven hundred and twenty five million dollars out of the 294 00:18:42,800 --> 00:18:45,600 Speaker 1: kindness of my heart. Anyway, the settlement itself is actually 295 00:18:45,600 --> 00:18:48,280 Speaker 1: old news. The company reached an agreement late last year.
296 00:18:48,359 --> 00:18:52,240 Speaker 1: The news now is that you Facebook users out there 297 00:18:52,240 --> 00:18:55,240 Speaker 1: who had an account that was active during those times, 298 00:18:56,000 --> 00:18:58,680 Speaker 1: you can actually submit a claim, and you have until 299 00:18:58,680 --> 00:19:01,119 Speaker 1: August twenty fifth to do so if you want to 300 00:19:01,160 --> 00:19:07,159 Speaker 1: do it online. The website is facebookuserprivacysettlement.com/#submit-claim. 301 00:19:07,200 --> 00:19:15,160 Speaker 1: Finally, mark September twenty ninth, 302 00:19:15,160 --> 00:19:18,520 Speaker 1: twenty twenty three, on your calendars. That's when Netflix will officially 303 00:19:18,560 --> 00:19:21,639 Speaker 1: stop sending DVDs out by mail. For some of y'all, 304 00:19:21,760 --> 00:19:24,119 Speaker 1: what I said might actually sound strange, but not that 305 00:19:24,240 --> 00:19:27,840 Speaker 1: long ago, Netflix's business model was really just renting films 306 00:19:27,880 --> 00:19:31,119 Speaker 1: out through the mail. You would sign up for the service, 307 00:19:31,160 --> 00:19:33,240 Speaker 1: you would make a watch list of films and shows 308 00:19:33,280 --> 00:19:35,720 Speaker 1: you wanted to see, and then, based on what was available, 309 00:19:35,800 --> 00:19:38,800 Speaker 1: Netflix would mail you the DVD and then you would 310 00:19:38,800 --> 00:19:40,240 Speaker 1: watch it and you would put it back in its 311 00:19:40,280 --> 00:19:42,000 Speaker 1: envelope and drop it off in the mail and wait 312 00:19:42,040 --> 00:19:44,359 Speaker 1: for your next one. Or if you were like just 313 00:19:44,400 --> 00:19:46,720 Speaker 1: about everyone I know who had the service, you would do 314 00:19:46,800 --> 00:19:49,320 Speaker 1: this a couple times, and then on your third DVD 315 00:19:49,640 --> 00:19:51,760 Speaker 1: you would just have it sit on the coffee table in 316 00:19:51,800 --> 00:19:55,159 Speaker 1: the little red envelope, unopened and unwatched, and then you 317 00:19:55,160 --> 00:19:57,439 Speaker 1: would just continue paying for the privilege of giving this 318 00:19:57,520 --> 00:20:00,640 Speaker 1: DVD a temporary home. Or in fact, you might ultimately 319 00:20:00,680 --> 00:20:03,800 Speaker 1: get charged for the DVD because you kept it so 320 00:20:03,880 --> 00:20:07,480 Speaker 1: ding dang long. But anyway, Netflix's main business was all 321 00:20:07,520 --> 00:20:10,760 Speaker 1: in on this mail-centric model. The company was founded back 322 00:20:10,760 --> 00:20:13,639 Speaker 1: in nineteen ninety seven, and it didn't start streaming until 323 00:20:13,720 --> 00:20:17,400 Speaker 1: two thousand and seven, so for a decade and some change, 324 00:20:17,720 --> 00:20:20,560 Speaker 1: that's just how folks used Netflix. Now we're at the 325 00:20:20,600 --> 00:20:23,160 Speaker 1: end of an era, and it's funny, because I've actually 326 00:20:23,240 --> 00:20:26,159 Speaker 1: kind of gotten back into buying DVDs and Blu-ray discs. 327 00:20:26,560 --> 00:20:30,520 Speaker 1: I got tired of titles disappearing off various services, sometimes 328 00:20:30,560 --> 00:20:33,560 Speaker 1: popping up on other services that I don't subscribe to. 329 00:20:33,880 --> 00:20:37,720 Speaker 1: So now I'm back to purchasing physical media.
I guess 330 00:20:37,720 --> 00:20:41,560 Speaker 1: I'm just a human being, lost in time and lost 331 00:20:41,600 --> 00:20:46,600 Speaker 1: in space and in meaning. Hope you're all well, that's 332 00:20:46,640 --> 00:20:50,800 Speaker 1: it for this episode. I'll talk to you again really soon. 333 00:20:57,240 --> 00:21:01,879 Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, 334 00:21:02,200 --> 00:21:05,920 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you listen 335 00:21:05,960 --> 00:21:07,000 Speaker 1: to your favorite shows.