Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? It's time for the tech news for the week ending on Friday, February 2, 2024. This week, tech leaders, including the CEOs of Meta, X (formerly known as Twitter), Snap, Discord, and TikTok, all met with the Senate Judiciary Committee here in the United States, and it was a pretty sobering experience. The committee brought forward families who had experienced tragedies in which children died after they encountered harmful people, situations, or material on these platforms. The CEOs expressed their sympathy to the families. Mark Zuckerberg even gave a somewhat emotional apology, but the politicians found the response lackluster, as did many of the parents, who said the apologies rang hollow because there didn't seem to be any indication that these companies were going to do anything differently. The companies, of course, said that they had been implementing various new policies to protect children. Meta has recently done that, and so have some of the other platforms. But the senators demanded to know why Section 230, the US law that provides something of a legal shield for platforms, should protect these companies from being sued by the families.

Speaker 1: Now, just as a reminder, these CEOs didn't draft Section 230. In fact, Section 230 was drafted before any of these companies existed. The drafting was done by actual politicians, and Section 230 is not a blanket get-out-of-court-free card for platforms. Essentially, Section 230 says that a platform cannot be held responsible for material that's posted to that platform by users or other parties. So if a user posts something illegal onto a platform, that's the user's fault, not the platform's fault.
Speaker 1: Specifically, Section 230 says, quote, "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider," end quote. But this protection itself has limits. If a platform is violating federal criminal law, then it can be held accountable. So if someone posts illegal material and the platform is alerted to this, it's the platform's responsibility to remove that material. This includes stuff like intellectual property violations. If you were to post something that you don't have ownership of and the platform allowed it to stay up, the platform could be held accountable. This is why YouTube is so aggressive about removing videos that get a copyright removal request: it's safer to take something down that might or might not be a copyright violation than to leave it up long enough to determine whether it actually counts as one.

Speaker 1: Anyway, politicians on both sides of the aisle have issues with Section 230, despite not really seeming to understand what Section 230 is all about. Complicating matters is that these social platforms use algorithms to serve content up to users. If it were just a case of users seeing the stuff they personally followed, it might be a different story. But some legislators argue that because sites like Facebook or TikTok use an algorithm to determine what a user actually sees on their timeline, those platforms are effectively acting more like publishers, because they are curating material. And honestly, this is an argument that I think has some merit. It's hard to argue against it. I guess you could say that these platforms don't actually know what their algorithms are going to recommend to users, because it's completely dependent upon what those users have interacted with and expressed interest in in the past.
Speaker 1: So it's not like the platforms are aware of what the algorithms are doing; the whole process is automated. But at the same time, the platform itself is determining what someone sees versus all the stuff that's actually available. So it's a tricky situation. Anyway, a lot of senators weren't impressed with the CEOs and promised to pass laws that would weaken Section 230 and place heavy requirements on the platforms to protect children. There are actually multiple bills in various stages in the Senate and the House right now, and there are critics who say that these laws compromise privacy and security, that they can infringe upon the First Amendment, and that many of them could be difficult, if not impossible, to enforce. Worse yet, they could potentially pose a danger to people in the LGBTQ community, because not everyone agrees on what counts as being harmful toward children, and some see just the existence of, say, transgender people as a danger: just the fact that they exist, that's dangerous. So that in turn is a slippery slope. As I said, it's a complicated issue.

Speaker 1: Speaking of Zuckerberg, he gave some insight into Meta's AI strategy during an earnings call yesterday. He claimed that the amount of publicly shared information on the company's platforms represents a larger data set than what most AI companies use when training AI. Which is kind of funny, because Meta, with Facebook and Instagram and the rest, does not want anyone crawling its sites. They don't want any of that; they say it's against their terms of service. You cannot create bots to crawl these sites to gather up information. They can do it, though; it's just that no one else can. And so they're saying, hey, we have this huge repository of publicly shared pictures and videos and other stuff.
Speaker 1: It's way bigger than what anyone else has, so we could train up a very intelligent AI as a result of that. And Zuckerberg actually mentioned that general intelligence remains a goal, general intelligence in the form of AI, that is. I don't mean that Zuckerberg is seeking general intelligence; there are too many jokes there. So general intelligence would be AI that works in a way that's at least similar to how human intelligence works, that seems to be able to think for itself. It's a kind of strong AI. This kind of work, AI research and development, is really expensive, and Zuckerberg's team hinted that Meta could increase spending by as much as $9 billion this year compared to last year, and that this could just be the beginning of additional expenditures on research and development. Which sounds to me a lot like how Zuckerberg warned that Meta would be spending increasing amounts to build out the metaverse. For the record, they gave the metaverse very brief mention in their earnings call document; they referred to their metaverse efforts and put them under new business initiatives. I don't know how much momentum their metaverse efforts still have. To me, it feels like the whole metaverse craze has already significantly died down, right? That was the huge thing more than a year ago, and now people are not nearly as enthusiastic about it. But maybe I'm just not listening in the right corners. Maybe the Web3 enthusiasts and the metaverse enthusiasts are still going whole hog and I'm just not seeing it.

Speaker 1: One big news story that happened earlier this week is that Amazon called off its planned acquisition of the company iRobot, which is best known as the maker of the Roomba robot vacuum. This actually surprised me, but that's because I thought this deal had already gone through. I thought Amazon had already acquired iRobot; I was just under the impression that they had already completed this.
Speaker 1: The two companies originally agreed to this deal back in the summer of 2022, so yeah, I just figured that within a year and a half they had closed the deal. But no, they faced a lot of regulatory opposition, including here in the United States, and ultimately Amazon said that they just didn't see a way forward. So Amazon said, you know what, we're going to pay the $95 million fee it's going to cost us to walk away from this deal. And they did, and that's really bad news for iRobot. Once upon a time, the Roomba was the dominant product in robot vacuums. In fact, it was, for all intents and purposes, the only name in the space. But that didn't last, because obviously other companies were going to build competing products, and some of those products cost less, sometimes significantly less, than the Roomba. They also weren't necessarily as good as the Roomba, but for a lot of consumers out there, cheaper is more important than performance. So iRobot's market share declined year over year as more and more of these products came out. The Amazon acquisition would have given iRobot the financial security to stay in business and to really invest in innovation. But with the deal falling through, there's a big question mark hanging over iRobot and whether or not it can even continue to function, at least in the long term. Already, the company has announced it will hold layoffs that will affect around 350 employees, and iRobot's CEO Colin Angle is also stepping down, as he said that he and the board both felt that the company really needed a leader with, quote unquote, turnaround experience.
Speaker 1: I feel really bad for everyone affected by these layoffs, and I hope iRobot makes a recovery, because the company has been responsible for numerous innovations in the robotics space, the robot vacuum space in particular. There have been a lot of innovations that iRobot drove that then got adopted by the larger industry, and without iRobot there to forge that path, we may see a real decrease in innovation in that space.

Speaker 1: After a Tesla shareholder brought a lawsuit against Tesla five years ago, a judge in the state of Delaware has ruled that Elon Musk is not, in fact, entitled to a compensation package with a value of upwards of $55 billion. He's not going to get that. And like I said, it only took five years of languishing through the court system, because justice is swift here in the United States. The lawsuit argued that the board of directors was misusing company funds by enriching Elon Musk when that money probably should have gone somewhere else, either reinvested in the company or distributed among shareholders. The plaintiff argued that the directors were not really independent of Musk, that essentially they were his friends and cronies, and that he was using his personal connections among some of the board members to get this incredibly favorable compensation deal. The lawsuit also argued that Musk himself was effectively creating his own compensation package: Musk was the one who drew it up and then got the board to sign off on it. And gosh, I bet we all wish we could do that, right? AP News also has a really helpful graphic to assist in understanding exactly how much money $55 billion is. For one thing, it illustrates that you could spend a dollar for every second of every day and it would take you around 1,744 years to spend all that cash. Crazy, right? No one needs that much money. Nobody needs that much money.
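If you want to check that dollar-a-second figure yourself, it's quick arithmetic; here's a throwaway Python sketch (it ignores leap days, which is why the result lands at roughly 1,744 years):

```python
# $55 billion spent at one dollar per second, every second of every day.
SECONDS_PER_YEAR = 365 * 24 * 60 * 60  # ignoring leap days

package = 55_000_000_000  # dollars
print(f"{package / SECONDS_PER_YEAR:,.0f} years")  # -> 1,744 years
```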
Speaker 1: The graphic also showed that you could give $163 to every single person in the United States. It's just way too much money for any one person to have; I don't care who you are. Anyway, Musk, as is his custom, went to X to vent his spleen and said, "Never incorporate your company in the state of Delaware. I recommend incorporating in Nevada or Texas if you prefer shareholders to decide matters." Ironically, it was a shareholder who helped decide this matter, because it was a shareholder who brought the lawsuit forward in the first place. But yeah, I'm not surprised he's upset. Fifty-five billion dollars, like, when you think you're going to get it and it turns out you're not, and also when, a couple of years ago, you spent $44 billion to buy a company that you are steadily running into the ground... I mean, it's a lot to take in. And speaking of a lot to take in, we've got more news to cover, but before we get to that, let's take a quick break to thank our sponsors.

Speaker 1: Okay, we're back. So this week, Apple's Vision Pro mixed reality headset finally launched. As a reminder, this is the headset with a starting price point of $3,499, and according to various sources, Apple has sold more than 200,000 units so far. I should point out that the company first made the headsets available for preorder back in January, and the majority of those sales were made in the first couple of days that preorders were open. After that, they trailed off significantly. But it's also 200,000 units, which is more than what analysts estimate Apple actually has available, as in manufactured, so some folks are going to be waiting a little bit longer to get their new, very expensive toy. Brian Heater of TechCrunch has some articles about his experiences using the Vision Pro, and I recommend checking those out.
Speaker 1: He mentions that he felt some motion sickness on that first day, but he also admitted he's using this device a lot, because that's his job, right? To really put it through its paces and then report back on it. You might only be able to use it for two and a half hours, because that's how long the battery lasts before you have to recharge it, but he's been using it over and over and over again. He also makes a really good point: he says the average Vision Pro owner is also likely to go ham, at least initially, upon purchasing their mixed reality headset, because if you've spent $3,499 at minimum on something, you want to get your money's worth. Anyway, I still think Apple's product is really a niche one. Two hundred thousand units is a lot, right? We should not dismiss selling 200,000 of these things. But 200,000 units is a fraction of what you would see with, say, a new iPhone launch. And while all the reviews I've read praise the headset's passthrough video quality and image quality, I'm still not seeing a whole lot of conversation about any real killer apps. Maybe that will come in time, or maybe developers are going to hold back and wait for a larger user base before they invest the resources in developing apps for this thing. And if that's the case, we could really have a vicious cycle going on, sort of a catch-22, where developers won't make apps until there are more users, but users aren't going to buy the product until there are more apps running on it. That's a possibility. I don't know. Maybe I'm ultimately wrong; maybe this will become the next common household technology. But I don't think that's possible at that price point. Who knows, I've been wrong so many times before.
Speaker 1: The US Federal Communications Commission, or FCC, would very much like Congress to make it illegal for anyone to use AI-generated voices for the purposes of robocalls, pretty please. Apparently someone used an AI impersonation of Joe Biden's voice to make robocalls in the state of New Hampshire to target voters, presumably to attempt to suppress the vote by telling people not to bother voting. This spurred the FCC into action. We've all known for a while now that AI impersonations, you know, deepfakes and that sort of thing, could play a big part in the creation and spread of misinformation, and that's the FCC's main concern and why the agency wants to make this practice illegal. They can't do that themselves, obviously; they're not a lawmaking body. But they can make recommendations to Congress. The FCC proposes expanding the Telephone Consumer Protection Act, which was originally drafted in the nineties, so that it protects against this AI-generated voice robocall practice. And I'm all for it, because we're speeding down the road toward a pretty darn chaotic destination, and anything we can do to mitigate that would be really nice, in my opinion.

Speaker 1: Christopher Wray, the director of the US Federal Bureau of Investigation, is warning anybody who will listen that the danger of Chinese hackers and their ability to disrupt critical US infrastructure, such as electrical grids, is reaching an alarming height. It has been no secret that for years now, state-backed hackers have worked to infiltrate these systems. I remember reading articles from a decade ago about the discovery of Chinese code embedded in US infrastructure systems, these kinds of markers proving that Chinese hackers had managed to get access to these things. And there have been fears that this work could lead to massive cyber attacks in the future. Let's say there's some sort of event that ends up escalating, and then China gives the order to take advantage of all this infiltration.
Speaker 1: That's scary. You could have entire sections of the electrical grid shut down, or overwhelmed to the point where it breaks. So what's new here, if all this stuff has been going on for a while? Why is Wray speaking up now? Well, there's not that much that's new, but Christopher Wray's statements were made largely in frustration over the fact that despite these things not actually being breaking news, despite the fact that we have known about this stuff, we haven't done enough about it. So he's calling for action. He's pointing out how Chinese-backed hacker groups have already caused massive problems in the private sector; we've seen that over the last few years. Plus, we're in an election year here in the United States. It feels like we're always in an election year; it doesn't matter what year it is, it feels like it's an election year. But that becomes a huge incentive for disruption. We have seen Chinese- and Russian-backed groups attempt to influence US politics through hacking and misinformation campaigns. So I guess we'll see if anyone listens to him and actually mounts a decent response to this warning.

Speaker 1: An Australian company called Morse Micro has demonstrated a Wi-Fi protocol they call Wi-Fi HaLow, spelled H-a-L-o-w. This uses lower frequency radio signals to transmit Wi-Fi over greater distances. These radio signals have longer wavelengths, lower frequencies, than what you would typically see with your usual Wi-Fi transmitters, and the range they're getting is pretty impressive. We're talking three kilometers, or around 1.8 miles, which is a good distance to be able to transmit Wi-Fi signals. And they say their goal is to develop this technology so that it can facilitate communications for stuff like Internet of Things applications, because it wouldn't really be suitable for a high throughput application. The transmission speeds are down to around one megabit per second.
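The "longer wavelength" point is easy to put numbers on, since wavelength is just the speed of light divided by frequency. A quick sketch, assuming HaLow sits in sub-gigahertz spectrum around 900 MHz (the exact channels vary by region), compared against conventional Wi-Fi bands:

```python
# Wavelength = speed of light / frequency. Longer waves pass through
# obstacles like walls more readily than shorter ones.
C = 299_792_458  # speed of light in m/s

bands_hz = {
    "Wi-Fi HaLow (~900 MHz, assumed)": 900e6,
    "Conventional Wi-Fi, 2.4 GHz": 2.4e9,
    "Conventional Wi-Fi, 5 GHz": 5e9,
}

for name, freq in bands_hz.items():
    print(f"{name}: wavelength of about {C / freq * 100:.1f} cm")
# Roughly 33.3 cm for HaLow versus 12.5 cm and 6.0 cm for conventional Wi-Fi.
```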
Speaker 1: At that speed you could have some grainy video, for example, but that's probably the extent of it. But as a solution for IoT, you know, the Internet of Things, it could be a really helpful technology. You wouldn't need as great a density of transmitters, because of this extended range. The longer radio waves also have far better penetration through solid surfaces like walls. With high frequency, small wavelength Wi-Fi signals, walls are a big obstacle, right? Those ultra-high-frequency Wi-Fi solutions don't penetrate walls very well, so you lose signal as soon as there's any kind of barrier between you and the transmitter. But these signals penetrate really well, so you can be inside a building and still have access to them. As a bonus, the technology has lower power requirements than higher frequency Wi-Fi solutions. So it's a pretty interesting approach, but it remains to be seen if it will receive widespread adoption, because there are already a lot of different protocols for wireless data transmission out there. Some of them have even greater range than HaLow does, but they also have lower throughput; they can only transmit data at a lower rate. But I figure the more the merrier, if it means we can find an ideal balance between transmission range and transmission speed.

Speaker 1: Negotiations broke down between TikTok and the music label Universal Music Group, and the two parties were unable to agree on a new licensing agreement. The previous one expired earlier this week, on Wednesday, so as a consequence, UMG demanded that TikTok remove any UMG-published tracks from the platform. And there are some really big artists on the UMG label, including one of the biggest in the world, Taylor Swift. I'm sure TikTok fans are going to shake it off, shake it off, but they're not going to do it with the help of her music, because it's being removed from TikTok.
Speaker 1: UMG officials said that TikTok's proposed compensation rate was far below industry standards, and that the platform also has tons of AI-generated recordings that infringe upon artists' and music companies' intellectual property. That's a big problem. It's one of those problems we've been talking about for a while now: how AI-generated material can serve as not just a threat from a competition standpoint, but a threat in that it can use an artist's work to generate new work without compensating the original artist. And that's a problem. So TikTok fired back. You know, UMG said, hey, you're not paying enough, you pay less than other platforms do. TikTok says UMG just wanted more money for the label itself, and that any argument that this is harming artists is disingenuous, because UMG wasn't planning on sharing that money with artists anyway; it was just going to go to the company. Now, let me be clear: all companies are greedy, full stop. TikTok is greedy. UMG is also greedy. There are no good guys on the corporate side here. Maybe the artists, you could argue, are good guys. Wasn't it easier in your firefly-catching days? That's a shout-out to all you Swifties out there. So yeah, this is not an unusual thing to see happen, and I'm sure that a lot of TikTok users are frustrated by it. But it all comes down to the argument of who gets that money, and the sad thing is it's almost never the artists. That's the worst of it, right? It just comes down to which company ends up holding on to that money the most.

Speaker 1: We learned that Mercedes-Benz suffered a big old IT whoopsie this week. It's actually a whoopsie that happened something like four months ago and lasted that whole time; it was essentially an ongoing issue. At the heart of the matter is a GitHub token.
Speaker 1: There was a developer at Mercedes who apparently accidentally published this token in some publicly viewable source, and that meant anyone who went to that source and saw it could potentially use the token to access all sorts of very secret proprietary data, including stuff like designs for vehicles and source code for software. The fact that this system didn't have multi-factor authentication enabled really blows my mind. I mean, typically you would want this kind of critical information, essentially your trade secrets, to be locked up securely, with multi-factor authentication there to protect it. And you know, token access is part of that strategy, but it shouldn't be the start and end of it. Anyway, Mercedes has not yet said whether any third parties were able to access the various databases that the token would have granted access to. Apparently, the data you could access would have included keys to other repositories, including on Amazon Web Services and Microsoft Azure. So it's like, yeah, you've got the keys to this and only this, but inside there are all the other keys. Truly terrible stuff. There are some who wonder if Mercedes would even be able to detect whether anyone outside the company had accessed the information in the first place. This is wild stuff. And if you're wondering how bad this is, from a "how important could that data be" stance, I would put it at a solid nine out of ten, maybe even a ten. Like, this is real bad. The only reason it's not easily a ten right out of the gate is just that we don't know if anyone else accessed it. If they did, then it's a ten out of ten; it's as bad as it gets. Hopefully, for the company's sake, no one noticed the issue and exploited it. But that's kind of the weakest form of security you could possibly have: just hoping that no one notices.
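For what it's worth, this is exactly the class of mistake that automated secret scanning exists to catch before code ever gets published. Here's a minimal sketch of the idea; the regex covers the prefixed format GitHub uses for its tokens, but treat the pattern and the scanner itself as illustrative, not exhaustive:

```python
import re
import sys
from pathlib import Path

# Classic GitHub personal access tokens start with "ghp_"; related prefixes
# (gho_, ghu_, ghs_, ghr_) cover OAuth, user-to-server, and app tokens.
TOKEN_PATTERN = re.compile(r"gh[pousr]_[A-Za-z0-9]{36,}")

def scan(root: str) -> int:
    """Walk a directory tree and flag anything that looks like a token."""
    hits = 0
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        for match in TOKEN_PATTERN.finditer(text):
            hits += 1
            # Print only a prefix so the scanner itself doesn't leak secrets.
            print(f"{path}: possible token {match.group()[:8]}...")
    return hits

if __name__ == "__main__":
    # Exit nonzero when something is found, so a pre-commit hook can block.
    sys.exit(1 if scan(sys.argv[1] if len(sys.argv) > 1 else ".") else 0)
```

Real projects would lean on purpose-built scanning tools rather than a homemade regex, but the principle is the same.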
Speaker 1: Relying on nobody noticing is security through obscurity, and that's not a great strategy. Really bad. And I bet that software developer is having a really stressful week, is my guess. My heart goes out to them, because if it were truly an accident, I mean, it's a terrible accident, but it's not like someone was being malicious. Okay, we've got a couple more stories to get through, but first, let's take another quick break to thank our sponsors.

Speaker 1: We're back. Spotify's podcast strategy has long included a reliance on exclusive shows that would only publish on the Spotify platform. And I should mention right now, I guess, that Spotify clearly is a competitor to iHeart Podcasts; we all work in the same space. But the days of Spotify's exclusivity appear to be coming to an end, at least for all but one of Spotify's shows. The platform had actually gone down to just two exclusive shows; it had already opened up other shows for wider distribution, and the two shows that remained exclusive were The Joe Rogan Experience and another one called Call Her Daddy. But now Call Her Daddy can distribute on other audio platforms. Spotify is holding on to exclusive rights for the video version of the show, so you're not going to be seeing that on YouTube; that will remain on Spotify's platform. But the audio version will be available wherever you get your podcasts. So now The Joe Rogan Experience is the only exclusive audio podcast remaining on Spotify, and people are wondering how long that will stay true. You know, will The Joe Rogan Experience follow in the footsteps of these other shows and shift to wider distribution? Because once that contract comes up, they're going to have to decide how they're going to move forward with renewing it.
Speaker 1: Spotify has already spent tens of millions, hundreds of millions of dollars on these exclusive agreements, most of it on The Joe Rogan Experience. The rumor is that the deal for his exclusive contract with Spotify was somewhere in the $100 million range. Holy cats. My guess is that Spotify has not seen a great return on investment for most of its exclusive shows, which explains why they shifted to wider distribution. And that makes sense, right? If you're exclusive, then by the very nature of exclusivity, you have limited the reach of your podcast, because not everyone is going to use your platform. A lot of people might, but not everyone will. And we all know that for the majority of podcasts out there, the source of revenue is ads. I mean, we just had an ad break on this show. That's how we're able to pay for the production of these shows and to pay the people who work on them: through ad revenue. So if you are limiting who can actually listen to your show, you're limiting how much ad revenue you can generate. So it makes sense to shift to wider distribution for these other shows, and it makes sense that they probably weren't making enough money to pay off those exclusivity agreements. The question is, will The Joe Rogan Experience follow suit? Or will that be seen as such an important feather in Spotify's cap that they will hold on to it to retain exclusivity? Here at iHeart Podcasts, we tend to favor wide distribution from the get-go. We maybe looked at possible exclusive podcasts, but as far as I'm aware, we never did that, because we want folks to find our shows wherever they happen to get their podcasts. Right? We don't want to dictate how people have to listen to our shows. We want them to be able to access them however they prefer.
Speaker 1: It's kind of like how Netflix became known for making sure they developed an app for every single screen out there, with the exception of the Apple Vision Pro, because they specifically did not make an app for that. Anyway, could it be that we will see Spotify totally abandon exclusivity, at least for audio? Or will the company continue to spend truckloads of money to keep Joe Rogan? I do not know the answer to that.

Speaker 1: Last story here is that a hacker made use of Ars Technica as part of a malware campaign in a way that was very clever and, unfortunately, also difficult to detect and stop. Here's how Ars Technica's Dan Goodin explained what happened, quote: "A benign image of a pizza was uploaded to a third-party website and was then linked with a URL pasted into the about page of a registered Ars user. Buried in that URL was a string of characters that appeared to be random, but were actually a payload," end quote. So the picture itself was nothing; it could have been of anything. It didn't have to be of pizza; it could have been any kind of web asset. It was really just something the hacker could use to anchor this URL on the about page of this Ars Technica user. And the URL had this long string that would not impede the web page from loading; you would still get the page to load without an error. But that string appended to the URL was actually a set of instructions for malware that was already on victimized machines. So, in other words, the payload on Ars Technica did not contain malware itself. It contained a set of orders. The way this works is, step one, you infect a whole bunch of computers with malware through whatever means you have at your disposal. In this case, it was through infected USB drives; you get people to connect USB drives to different computers, which injects malware into the computer systems.
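To make the hiding trick concrete, here's a toy sketch of how orders can ride along on an otherwise valid URL. To be clear, this is a hypothetical encoding, not the actual campaign's scheme; the command string, the host names, and the query parameter are all made up for illustration.

```python
import base64

# Hypothetical instruction an operator wants to deliver to infected machines.
order = "mine:pool.example.net:3333"

# Base64-encode it and strip the padding so it reads as a random-looking
# string, then tack it onto a perfectly loadable image URL.
blob = base64.urlsafe_b64encode(order.encode()).decode().rstrip("=")
url = f"https://img.example.com/pizza.jpg?t={blob}"
print(url)  # the page hosting this link still loads without any error

# Malware already on a victimized machine scrapes the page, pulls out the
# trailing string, restores the padding, and decodes its orders.
padded = blob + "=" * (-len(blob) % 4)
print(base64.urlsafe_b64decode(padded).decode())  # -> mine:pool.example.net:3333
```

Nothing about the URL itself is malicious, which is exactly why this sort of thing is so hard to spot.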
Speaker 1: That malware essentially installs itself on those target computer systems, and then it's just awaiting further orders. Step two is you hide the orders for that malware at the end of an otherwise benign-looking URL, and you find someplace on the web where you can plant that URL. Really, you're just looking for a host, and it doesn't matter which one you choose; in this case, it was the about page for a user on Ars Technica. Step three, the instructions appended to that URL tell the infected machines what to do. They're actually seeking out those instructions; they receive the instructions and then they execute them. So the malware on these machines jumps into action. And then, step four, profit. In this particular instance, the malware was part of a cryptocurrency mining scheme. So chances are, anyone visiting Ars Technica is actually fine, because the first stage of the malware, again, had to be injected through these USB drives. Unless you had grabbed one of those USB drives and thoughtlessly plugged it into your computer, infecting it with malware, you're fine, because visiting the Ars Technica site, even visiting that particular page, would not have delivered any malware to your machine. It was only the instructions for malware that already existed on other machines. But this does illustrate how hackers can hide malware instructions in plain sight, and how incredibly difficult it can be to detect them.

Speaker 1: All right, that's it for the news. I do have one recommended article to read if you have a subscription to the Wall Street Journal. Rolfe Winkler has a piece titled "23andMe's Fall From $6 Billion to Nearly $0." It includes quotes from CEO Anne Wojcicki regarding her strategy to recover from the massive dip the company has faced in the wake of a devastating data breach, and it's a really interesting read.
Speaker 1: I know not everyone has access to the Wall Street Journal, but if you do, that article is worth reading just to kind of get a handle on Wojcicki. Who, I mean, I don't know, she's got so much money that I don't think I can relate to her on any human level. It's just beyond me. But it is interesting to see what you do when your company has gone from an incredibly high valuation, like $6 billion, to almost nothing. 23andMe's stock price now is under a dollar, and it is under threat of being delisted from the Nasdaq. It's a dire situation. So, interesting article, worth a read. In the meantime, I hope all of you are well, and I will talk to you again really soon.

Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.