Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? It's time for the tech news for the week ending Friday, April 26, 2024. First up: I covered one really big news item earlier this week on Wednesday's episode, which is that the US Senate approved a bill that gives the Chinese company ByteDance one year to divest itself of TikTok or else face a nationwide ban. President Biden signed that bill into law, as was expected. The only other update I really have is that Reuters reports ByteDance would rather shut TikTok down than sell it off to someone else. Of course, that's not likely to happen right away. I mean, for one thing, they have a year, but ByteDance also plans to challenge the law on constitutional grounds, as do others. So we have a ways to go before this story ends one way or the other, whether courts uphold the law or find it unconstitutional.
Speaker 1: And if you're wondering why ByteDance would rather just scuttle all operations in the United States and call it a loss, it really comes down to that juicy recommendation algorithm that TikTok uses. You know, the one that former engineers for an earlier app called ninety nine fongs say is their work, and that they should be compensated for. ByteDance does not want to part with that algorithm. It is a key component of other apps that the company owns and operates, including the Chinese cousin to TikTok, Douyin. And that's it for the update on that story. This week, the US Federal Communications Commission, or FCC, reinstated some, but not all, net neutrality regulations in the United States. As a quick reminder, net neutrality covers a fairly broad spectrum of concepts, but you can boil it down to this: the rules mean all companies that provide internet infrastructure agree to let all traffic cross their networks without interference, no matter who it's from, to whom it's going, or what kind of device the endpoint is using.
Speaker 1: So under net neutrality, a provider would not give advantages to its own content while throttling traffic from competitors. If you got your internet service from Comcast, Comcast wouldn't be allowed to put its own services on a fast lane while inhibiting stuff from, you know, Netflix or Max or Hulu or whatever, or charging those companies an arm and a leg in order to also be in the fast lane. The measure passed three to two in the FCC, and it went right down party lines, which is pretty much par for the course these days: three Democratic appointees voted for it and two Republican appointees voted against it. Now, to be clear, this is not the first time the FCC has established net neutrality rules. It did so under the Obama administration, but then under the Trump administration the FCC revoked those earlier rules, so this should really indicate that these rules are far from set in stone. A tech analyst named Ming-Chi Kuo has posted on Medium that Apple has scaled back its production of the Vision Pro mixed reality headset.
Speaker 1: So, according to this post, Apple initially expected demand of 700,000 to 800,000 headsets, but this cut will bring it down to a little more than half of that, like 400,000 to 450,000 units. Further, the post says that Apple expects sales to dip even more next year, and that the company has adjusted its product roadmap as a result. Apple often releases updated models of its various tech platforms from year to year, like Macs and iPhones and that kind of thing, but the post says that the company is no longer planning to release a new headset in 2025. So could it be that mixed reality is one arena to which Apple's reality distortion field just doesn't extend? Now, in the past I've predicted that various Apple products wouldn't go very far. I mean, the iPad is a really big example of one where I whiffed big time. So I've been incredibly wrong before.
Speaker 1: But maybe the challenges of getting a broad audience for a mixed reality headset are too great even for Apple. And to be fair to the company, mixed reality is a very tough sell. The headsets are expensive, you have to wear them on your head, so that's got a strike against it for a lot of people, and there's a very limited number of applications for these headsets right now. Despite the fact that what they do appears to be really incredible, it's a very limited set of things that they do. So with all of these things taken into account, it's tough to get traction. It's a pretty big uphill battle, and I don't think Apple's gonna win it. Tesla has had a wild week that I think proves the point that if we live inside a computer simulation, there are some serious bugs in the programming. Tesla held its 2024 Q1 earnings call this week, and the company revealed that, at least according to The Verge, Tesla's net income is down 9 percent year over year and has hit a six-year low.
Speaker 1: Oh, and quote, net income attributable to common stockholders has slid 55 percent, end quote. That's according to Andrew Hawkins of The Verge, and he attributes this both to Tesla's price-cutting efforts and cooling demand for EVs among the buying public. Elon Musk says the auto industry in general has pivoted from EVs back to hybrids, and that certainly is going on here in America. We have seen a few different auto manufacturers switch gears, to use a pun, and favor hybrids over EVs for at least the near term. The earnings call included a section that teased Tesla's upcoming robotaxi product, which we're told we're going to hear more about on August 8. I'll remind you that, according to various states that have regulatory requirements for autonomous taxi services, Tesla has not yet begun the licensing procedures to get permission to operate such a service in those states. Now, maybe that's changed since the initial tease of the robotaxi service. Maybe Tesla has started to file that paperwork; I haven't heard that. But what happened after this earnings call is where things really went bonkers for me.
Speaker 1: So you had this earnings call, things sound dire, you're down significantly from the previous year, and yet the stock price surged 12 percent in trading on Wednesday. I told you, it's a mad world out there. The stockholder enthusiasm appears to be linked to things like the upcoming robotaxi effort, Musk claiming that Tesla isn't so much an auto manufacturer as it is a digital platform, as well as Musk's announcement that the company plans to introduce new models early next year, if not sooner. I have no idea if any of that is realistic. Keep in mind, we're also talking about the same company that delayed the launch of the Cybertruck numerous times before it finally rolled off lots. But the shareholders seem happy for now. In other Muskie news, Elon's AI company xAI (because Elon Musk continues to be obsessed with the letter x) is getting close to raising six billion, with a B, dollars in investments at a valuation of $18 billion, according to TechCrunch. Not that long ago, the numbers looked a lot different.
Speaker 1: It was more like an effort to raise $3 billion at a valuation of $15 billion, but various heavy hitters and groups charged forward to get a piece of the action, and xAI quickly tweaked those numbers. It's kind of like running a lemonade stand. You know, you've been selling tall, frosty glasses of tangy lemonade for, like, twenty-five cents, but then there's this great lemon blight that wipes out nearly all lemons except for the ones that you have, and it also coincides with a massive heat wave. You've got the market cornered on your sweet-and-sour sipper, so you hike the price to five bucks a shot. Anyway, the sales pitch for xAI sounds like something from a science fiction film where we find out how the killer robots got their start. Namely, Musk plans to leverage data from his various other companies to use as training information for the AI developed by xAI. Those businesses include X, formerly known as Twitter, which is a stakeholder in xAI, as well as SpaceX, Tesla, and even Neuralink, the brain-computer interface company that Musk has.
Speaker 1: So Musk's vision is a world in which all these other businesses are feeding his xAI with real-world data and experience, thus accelerating that AI's learning, and voila, we get miracle robots and such. I remain skeptical. Now, a quick hop away from Tesla will take us over to a story about Fisker, an electric vehicle company that, according to SFGate.com, is now on the brink of collapse. Huh, Business on the Brink. That would actually be a really good name for a podcast. Anyway, this week Fisker filed with the SEC and revealed that the company is pretty much broke. Back in March, it was unable to make an interest payment. Now, granted, it was a whopper of an interest payment, $8.4 million, but the filing revealed that this company, Fisker, is on borrowed time. So unless creditors give the company some serious slack or it hits some crazy financial windfall, the company will need to file for bankruptcy within 30 days. The filing doesn't sugarcoat things, either.
Speaker 1: Essentially, it says giving us more money is really risky because we haven't proven we're able to do what we need to do. But yeah, we've seen a few stories about startups attempting to establish a foothold in the auto industry. More often than not, it does not work out. Okay, we've got a lot more news stories to get through. Before we get to those, let's take a quick break to thank our sponsors. We're back, and we're going to take a little stroll over to Meta. Meta also held an earnings call this week, but investors were not nearly as charmed by the revelations as those who were following the Tesla announcements. The call began with talk about AI, to no big surprise, and then Zuckerberg also talked about the metaverse, which is funny, because you don't hear that term bandied about nearly as much these days. I think AI largely displaced the metaverse concept; it became the new thing that business leaders got excited about.
Speaker 1: But as CNBC's Ashley Capoot said, quote, he spent almost the entirety of his opening remarks focused on the many ways Meta loses money, end quote. Ms. Capoot, I believe you have spilt some tea all over the place. But yeah, Meta is very good at spending a lot of money on stuff that isn't making very much by comparison. For example, Reality Labs brought in $440 million in sales, and that's a lot of cheddar, but that same division racked up a financial loss of $3.85 billion. Anyway, Zuckerberg also seemed prepared for the financial blow that would follow this earnings call, because he acknowledged that investors are generally not very happy about Meta spending billions of dollars on projects that are not yet monetizable. Sure enough, in the wake of the earnings call, Meta's stock price dropped 19 percent in extended trading at some points. As for the future, Zuckerberg revealed that the company estimates capital expenditures for 2024 to be in the $35 billion to $40 billion range.
Speaker 1: That's more than what the company had previously estimated, so I guess making the future is real expensive. Microsoft also had an earnings call this week, and the company explained that the absolute rush to develop AI has meant the demand for cloud-based computing power is greater than the supply of resources. The report showed that Microsoft has increased capital expenditures by nearly 80 percent year over year in an effort to meet this demand and build out things like data centers, and the company is going to have to spend even more in the next couple of years in order to really feed the hungry beast that is AI. This is kind of like seeing Parkinson's law of data on steroids. Parkinson's law of data, in case you don't know, says that information will fill any space made available to it. It doesn't matter how big your library or filing cabinet or storage drive is; any means of retaining information will fill up.
Speaker 1: Like, I remember when we would get a computer with a hard drive of 128 megabytes or something, and I'd think, oh, we'll never fill all this up, which was ridiculous, obviously. Well, this seems to be that same kind of issue, except with compute power instead of storage: it just gets swallowed up as soon as you make it available. Microsoft keeps adding to it, and customers, in the form of companies running these AI implementations, just buy it right up. But the expenditures that Microsoft is making in order to meet this demand are so far actually outpacing income. That could be a sticky wicket. At some point you could hit a tipping point where you're spending more money than you could possibly bring back in, especially if some of these startup AI companies fizzle out. So we'll have to see where this all levels out. I'm sure there will be a point; I just don't know when that'll happen.
Speaker 1: Google is facing sanctions in South Korea after that nation's Fair Trade Commission, Korea's version of the FTC, determined that Google's practice of giving YouTube Premium subscribers access to YouTube Music as well essentially amounts to bundling these services together, an anti-competitive practice that has hurt domestic companies' streaming platforms. YouTube Music became the top music streaming service in the Korean market, surpassing even the native Korean services, and the commission launched an investigation into the matter last year and is now preparing to determine sanctions. Meanwhile, Korean companies are saying this is too little, too late, and that the whole determination process takes far too long to do any good, because by the time a decision is arrived at, the damage has already been done. Meanwhile, the US FTC, which in our case is the Federal Trade Commission, ruled that companies are no longer allowed to force employees into noncompete agreements. This applies across all industries, obviously, but inside the tech world it is a particularly huge development.
Speaker 1: Companies have long relied on noncompetes to prevent employees from jumping ship and working for a competitor, potentially bringing valuable skills and knowledge to that competitor. In fact, some of the podcasters you listen to may have noncompete clauses in their work agreements, you know, like ones you might be listening to right now. But the FTC says that kind of thing is not going to fly anymore. And as you might imagine, that decision has not exactly been met with universal approval among corporations out there, including tech companies. On the contrary, industry groups are already suing the FTC, arguing that the agency lacks the authority to make and enforce such a rule. The rule will go into effect on August 22 of this year, at which point all current noncompetes, at least for anyone who isn't in an executive policymaking position earning more than $150,000 a year, will be null and void, and companies will not be allowed to issue new ones. The lawsuits will continue until a final decision is made.
Speaker 1: My guess is that ultimately this is going to have to go all the way to the Supreme Court, and that's going to take years. So in the meantime, I say, make hay while the sun shines. Okay, I've got a couple of weird AI stories to get through. First up is a case in Maryland in which the athletic director for a high school was found to have used AI to impersonate the voice of that school's principal. The accusation goes that this athletic director fabricated an audio recording using an AI copy of the principal's voice, and this recording included some really awful stuff, like racist and antisemitic messaging. Apparently this all happened back in January, when the principal informed the athletic director that his contract was not going to be renewed; there were various issues and problems with this particular staff member. The athletic director then fabricated the recording and posted it to social media, which prompted a viral response and led to the principal being placed on leave. And now it's been revealed that the whole thing was a fraud.
Speaker 1: And I'm sure we'll never hear of any other outlandish, potentially criminal uses of AI and voice replication technology, he said sarcastically. The other weird AI story deals with Drake, and it's about a diss track. And yes, I know how lame it is to hear me say the phrase diss track. I have not earned any place at all within rap or hip hop culture. But anyway, this particular track was aimed at Kendrick Lamar, and it included a recording of an AI impersonation of Tupac Shakur, who, of course, died at the age of 25 way back in 1996. Shakur's estate issued a cease and desist against Drake and threatened legal action, saying that the estate did not give permission for the use of Shakur's audio likeness on this track. Shakur wasn't the only artist who was replicated for the purposes of hyping up Drake and/or dissing Lamar. Snoop Dogg was also copied by AI, and, you know, Snoop Dogg is very much alive.
Speaker 1: The issue really highlights the legal gaps that currently exist with regard to technology's ability to copy not just a specific work, but a person's voice or style. Those sorts of things aren't eligible for copyright. You cannot copyright a voice or a style; you have to have something that's set down in some sort of established format for it to be eligible for copyright. So in the absence of copyright protection, the best avenue for those who have been copied tends to relate to publicity rights, which don't really apply to most of us; that's really more something that applies to notable people. But anyway, I'm sure we'll never hear of any other versions of this very problem due to AI. Sarcasm, again. In good space news, NASA was able to fix some issues with the Voyager 1 spacecraft, which is currently the human-made object that is farthest from its home planet. Voyager is in interstellar space, and last year the spacecraft had a malfunction that prevented it from being able to send data back to Earth.
Speaker 1: So for months, engineers at NASA worked to fix this problem, and now Voyager can send at least some data relating to its health and operations back to us, and NASA hopes to get other operations online in the near future. The solution largely revolved around relocating code to different sections of the computer system's memory. That must have been hard, because we're not talking about a massive amount of space here. I mean, there's a massive amount of outer space, sure, but not a massive amount of computer memory. Voyager has long since completed its primary mission, which was, along with Voyager 2, to observe the outer planets of our Solar System, and now it's giving us information on the nature of interstellar space, which is pretty darn cool. Okay, I've got an article recommendation for y'all before I sign off. It's from TorrentFreak, it was written by Andy Maxwell, and it's titled "U.S. Know Your Customer Proposal Will Put an End to Anonymous Cloud Users," which is alarming.
Speaker 1: The piece explains the whole issue, including the reasons behind wanting to clamp down on anonymity in the first place, and how this could impact services such as VPNs. It's well worth your time. That's it for this week. I hope you're all well, and I'll talk to you again really soon. TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.