Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio. And how the tech are you? I'm having a rough recovery day. I'm still getting over COVID. I'm no longer testing positive for COVID, but there have been some lingering symptoms, and some days are better than others. This is not one of the better days. But it is a Thursday, which means it's Tech News Day. So we're going to cover the tech news for October twelfth, twenty twenty-three, and like Tuesday's news episode, we're going to start off with a bunch of Twitter slash X news. First up, I mentioned that a lot of people, from media outlets to government agencies to watchdog groups, have been directing criticism at X for the proliferation of misinformation and disinformation, specifically in the wake of the ongoing war in Israel.
Speaker 1: It did not take long at all for bad information to flood the platform after Hamas launched attacks within Israel, and that bad information included everything from photos and footage of past conflicts being passed off as if they were from the most recent circumstances, to some accounts claiming that footage taken from modern video games actually represented circumstances in either Israel or Gaza. One critic who raised concerns about the platform in general was Thierry Breton, and I apologize for my pronunciation, but we're talking about a European Union commissioner who said that X was a place for people to quote disseminate illegal content and disinformation end quote. That could mean that X could be found guilty of violating the EU's Digital Services Act, or DSA. The DSA went into effect earlier this year, back in August, and it covers a lot of ground when it comes to a platform's responsibilities. The DSA expects that large platforms, ones like X slash Twitter or Facebook, or even platforms like Google and Amazon, have to take an active role in content moderation.
Speaker 1: Unlike the United States, which has Section two thirty, the EU's DSA holds these platforms at least somewhat accountable for the material that users post to those platforms. And the DSA gives the EU government the authority to fine companies that fail to comply with the law; they can get hit with a really hefty penalty, up to six percent of their global turnover, which for the purposes of this episode we'll just reference as revenue. There's a slight distinction there, but you'll get the idea. So last year, Twitter's global revenue was around four point four billion dollars, so the DSA essentially says they could fine Twitter six percent of that for failing to comply with the DSA. That would be around two hundred and sixty million dollars. That's not a small fine. Plus, if the EU finds that a platform continues to fall short of the DSA's requirements, the EU has the authority to suspend that platform's operations within the European Union, which of course would also mean the loss of even more revenue.
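The fine math above can be checked with a quick sketch. The four-point-four-billion-dollar revenue figure and the six percent cap are from the episode; the rest is just arithmetic.

```python
# Sanity check of the DSA fine ceiling described in the episode.
revenue = 4.4e9          # Twitter's reported 2022 global revenue, in dollars
max_fine_rate = 0.06     # DSA cap: up to 6% of global turnover
max_fine = revenue * max_fine_rate
print(f"${max_fine / 1e6:.0f} million")  # → $264 million
```

So the theoretical maximum penalty works out to roughly two hundred sixty-four million dollars.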
Speaker 1: If you can't operate in a major part of the world, you're missing out on a tremendous amount of revenue. X's CEO Linda Yaccarino has posted that Twitter has removed hundreds of accounts affiliated with Hamas and removed or labeled quote unquote tens of thousands of pieces of content. She also said that Twitter slash X has not received any information from Europol about illegal content on the platform. She's saying, well, we're being accused of something, but no one has actually come to us with these concerns; it's the first we're hearing about it. Essentially, the company is relying on the quote unquote community, which essentially means volunteers who are flagging posts that might violate the DSA or X's own policies. But then X kind of has to rely on volunteers, because Elon Musk effectively gutted the content moderation team by laying off most of them in the wake of his acquisition. In related news, the Federal Anti-Discrimination Agency, or FADA, is leaving X. This German anti-racism group explained its decision to bounce off the platform.
Speaker 1: They claim that incidents of hate speech have quote increased, particularly end quote, ever since Elon Musk took control of Twitter. FADA said that cases of anti-trans and homophobic content, and racism, misogyny, anti-Semitism, and other examples of hateful speech are all on the rise, and that ultimately this forced the leaders of FADA to ask the question: does it make sense to stay on X? Is it providing more of a benefit than a cost? And clearly the decision was that it just wasn't worth sticking around, or, as FADA's statement puts it, quote in our opinion, X is no longer an acceptable environment end quote. Now, I've said this before, but Twitter has, by the way, always been a deeply flawed platform. It's not like Elon Musk took it over and then it went from being a shining beacon to the absolute worst. It had problems well before Elon Musk entered the picture. It's just that those problems have arguably magnified since Musk took over. Anyway, back in the day, if you will, Twitter was put to effective use by various activists and humanitarian groups and news outlets.
Speaker 1: It was seen as a very useful tool, and worth the investment of time it takes to create and cultivate and maintain a Twitter account. But Musk's capricious leadership and the massive changes he's made to X have convinced a lot of those entities to just pull up stakes and leave X behind. They're saying it no longer makes sense to stay there. And now we're actually starting to see what effect this has had on the organizations that decided to leave. So let's take NPR, National Public Radio, for example. In April of this year, NPR made the decision to leave X. This was right around the time that Musk actually renamed Twitter to X. And to be clear, NPR made this decision for the official NPR accounts; they did not mandate it for everybody who works for NPR, but they decided they were going to stop their accounts on Twitter or X. The platform at that time had appended a label to NPR, and the label was US state-affiliated media.
Speaker 1: Now, Twitter's own definition for state-affiliated media was quote outlets where the state exercises control over editorial content through financial resources, direct or indirect political pressures, and or control over production and distribution end quote. So clearly that definition is meant to designate media outlets that are serving as a mouthpiece for a government. The content produced by those types of outlets tends to be heavily biased in favor of whichever government is in charge. I mean, that's clear, right? It's the kind of designation that you might give to a media outlet in places like Russia or China, where you have authoritarian leaders who crack down on any voices that oppose the state. Now, NPR does receive money directly from the US federal government, but it makes up less than one percent of its budget, and it is clear that the government doesn't have any influence on the content that NPR produces. So the crux of what makes a media outlet a state-affiliated media outlet wasn't there, but that's what Twitter was labeling NPR.
Speaker 1: So NPR decided, that's it, we don't have to deal with this, we're gone. And as it left, it was clear that Musk had an axe to grind against NPR. So six months later, now that we're in the present day, leadership at NPR has circulated a memo to the company's staff. In that memo, the leaders addressed the impact that the organization has seen since leaving X, and it's not that much. According to Nieman Reports, the NPR memo says that traffic to NPR has dropped by only a single percentage point as a result of leaving Twitter slash X. NPR also pointed out that even before all this kerfuffle, traffic from Twitter amounted to less than two percent of all of NPR's traffic, and that explains how the organization found the decision to leave Twitter to not be difficult at all. The message is one that X definitely does not want to hear: that staying on Twitter just isn't worth the effort. The negligible amount of traffic NPR got from Twitter was not worth the frustration of operating on Twitter.
Speaker 1: I suspect other media outlets may also reevaluate their position on the platform, particularly as it deals with more allegations of hosting disinformation and illegal content. But obviously it's not going to be the same for everybody, right? Some media outlets may say, well, yeah, but we have a more significant percentage of our traffic coming from Twitter; that's going to make it a much harder decision to leave. In fact, I would be surprised if a media outlet chose to leave in the wake of a revelation that Twitter was providing a significant percentage of traffic to that outlet. But for those that are in a position similar to NPR's, I would not be surprised to see more of those outlets follow suit. Okay, we're going to take a quick break. When we come back, we're going to have some AI news, the other big story that continues to unfold throughout twenty twenty-three. But first let's take a break to thank our sponsors. Okay, we're back, so let's talk about artificial intelligence in general and generative AI in particular.
Speaker 1: So first up, OpenAI has plans to make it less expensive for developers to build applications on top of OpenAI's models. As it turns out, using artificial intelligence, like using it and incorporating it into your work, tends to be really expensive, because AI requires a lot of computational resources in order to work, and that runs the whole computational gamut, from processing power to computer memory to things like storage. Not every developer has easy access to all of those things. It's entirely possible to come up with a potentially brilliant idea for how to incorporate AI into an app, but not be able to afford the cost of bringing that idea to life. So OpenAI plans to offer options that can mitigate some of those costs, like offering memory storage, for example, which is apparently a huge cost for developers who are working in the field of AI. That's just part of OpenAI's strategy. The company also plans to unveil some computer vision tools, which can make it possible for developers to create applications that include some element of image analysis.
Speaker 1: You know, it's pretty easy for humans to describe stuff that's included inside an image, but it tends to be a pretty heavy lift for a computer, and the computer vision tools simplify this and give developers access to capabilities that they otherwise likely wouldn't be able to create on their own. According to various news outlets, the company is going to roll out these features around November sixth. That's during the company's first-ever developer conference, which is taking place in San Francisco, California, on November sixth. And this matters to you and me because it likely means we're going to be seeing a rush of new applications that incorporate AI, at least to some extent, in the not-too-distant future. So, like it or not, the AI revolution is going to intensify before much longer. Now, that's going to have some hefty real-world consequences, even if we assume that every AI application is benign, or, goodness, let's say that it's even helpful. Let's be super optimistic.
Speaker 1: Let's say that the incorporation of AI is genuinely helpful in all of its implementations. There would still be consequences we would have to face, even with, like, perfect AI. TechSpot reported on a research paper written by Alex de Vries titled "The Growing Energy Footprint of Artificial Intelligence," and de Vries argues that while much research has been focused on how much energy AI large language models require during their training phase, and it's a lot, like when you are training a large language model you're spending a great deal of money just on the electricity to power that process, what's less studied, but arguably just as important, is the amount of electricity needed during normal operations. Like, you've stopped training; now you're just deploying applications built on top of those large language models. De Vries is saying, well, that's still going to put a pretty hefty demand on electricity generation, especially as more and more of these apps are developed and deployed. So an analysis firm called SemiAnalysis estimated that if Google were to incorporate AI into every single search, which it cannot do.
Speaker 1: I want to be clear about that; it lacks the physical capability of powering that. But if Google were to try to do that, it would require electricity equivalent to somewhere in the neighborhood of twenty-nine point two terawatt hours on an annual basis, which is pretty much the same amount of electricity that Ireland uses per year. So to power AI-enhanced Google searches, you would need the same amount of electricity that Ireland needs for a full year. That's incredible. It also reminds me a lot of cryptocurrency mining. We've talked a lot about how the process of mining crypto coins, particularly proof-of-work crypto coins, requires a ton of electricity. Proof of stake less so, but it still requires some; proof of work requires a lot. That's the kind that, like, Bitcoin uses. And we have already talked about how, in the past at least, mining operations were creating a power need equivalent to what we would see for some nations in Europe on an annual basis. Pretty crazy. Well, the same thing is true of AI, but as I said, Google doesn't have that capability.
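To put that twenty-nine point two terawatt-hour estimate in perspective, it can be restated as a continuous power draw. The annual figure is SemiAnalysis's estimate as reported in the episode; the hours-per-year divisor assumes a non-leap year.

```python
# Restate the 29.2 TWh/year estimate as an average continuous power draw.
annual_energy_twh = 29.2             # SemiAnalysis estimate, per the episode
hours_per_year = 365 * 24            # 8,760 hours (non-leap year)
avg_power_gw = annual_energy_twh * 1e3 / hours_per_year  # TWh -> GWh, then / hours
print(f"{avg_power_gw:.2f} GW continuous")  # → 3.33 GW continuous
```

In other words, running AI on every Google search would mean drawing roughly three and a third gigawatts around the clock, which is the scale of several large power plants running continuously.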
Speaker 1: This is more of a theoretical kind of discussion. Google lacks the equipment that would be needed to be able to do this, and in fact, the companies that make the equipment, like the processors that would power this kind of thing, aren't producing chips at that scale. Like, they could not produce enough chips, in the short term anyway, to power that. So this is really, again, just a hypothetical situation. But the point is that AI is a power-hungry technology, not in the sense of, you know, Skynet taking over the world, but in the sense that it requires a ton of power for it to work. And it's something that people like de Vries think developers need to really take into consideration. Does it make sense to incorporate AI into an application, knowing the energy demands of that particular application? And if it makes sense, then sure.
Speaker 1: But if it doesn't make sense, then don't just throw it in there just to have AI as part of your app's value proposition, because you're going to be costing yourself a lot more money due to the fact that there's so much electricity needed to power the darn thing. Okay, moving on to some other news. Microsoft got some bad news in the mail last month, which we are now learning about due to a filing with the Securities and Exchange Commission of the United States. The Internal Revenue Service, the dreaded IRS, sent a notice to Microsoft that the company had failed to pay a few bills. It was called a Notice of Proposed Adjustment, which is a, you know, fairly benign-sounding term, but it had some massive consequences. According to the IRS, Microsoft owes about twenty-nine billion dollars, twenty-eight point nine billion to be more specific, and that's in back taxes from the two thousand four to twenty thirteen period, as well as penalties associated with the failure to pay those back taxes over the years. Microsoft says the whole thing is a misunderstanding.
Speaker 1: In fact, they say they outright disagree with the Notices of Proposed Adjustment, aka NOPA, so I guess they're saying NOPA to NOPA. They're going to pursue an appeals process with the IRS, and if that doesn't go Microsoft's way, they plan on bringing the whole matter to court. I would be shocked if this doesn't ultimately go to court at some point. And yeah, it's going to take a minute for all that to hash out, so we'll have to check back in, I don't know, like five years or so. The State of Utah has filed a lawsuit against TikTok, and the charges are probably pretty much what you would expect. First and foremost, Utah's governor, Spencer Cox, argues that TikTok harms children, both by getting them hooked on the platform so that they're spending countless hours on it, and also by serving up harmful material. The lawsuit cites research that links social media activity, like prolonged activity on social media, with poor mental health. However, I think it is very important to note the distinction between causation and correlation here.
287 00:19:27,040 --> 00:19:30,760 Speaker 1: So does staying on social media for a long time 288 00:19:31,440 --> 00:19:35,480 Speaker 1: lead to poor mental health? Possibly, But it's also possible 289 00:19:35,760 --> 00:19:39,680 Speaker 1: that people who have poor mental health tend to use 290 00:19:39,720 --> 00:19:43,840 Speaker 1: social media for longer than other people do, and that 291 00:19:43,880 --> 00:19:47,280 Speaker 1: would mean that using social media a lot doesn't necessarily 292 00:19:47,359 --> 00:19:50,639 Speaker 1: lead to poor mental health. But perhaps poor mental health 293 00:19:50,920 --> 00:19:54,240 Speaker 1: makes you more inclined to use social media. It's similar 294 00:19:54,240 --> 00:19:57,320 Speaker 1: to how people who commit violence may also play a 295 00:19:57,320 --> 00:20:00,440 Speaker 1: lot of violent video games, but that doesn't necessarily mean 296 00:20:00,480 --> 00:20:04,760 Speaker 1: that playing violent video games makes you violent. Again, there 297 00:20:04,760 --> 00:20:08,199 Speaker 1: could be a correlation, but no causation there. We don't know, 298 00:20:08,960 --> 00:20:12,359 Speaker 1: is my point. So I think making legislation on something 299 00:20:12,359 --> 00:20:17,800 Speaker 1: where there's not a clear causal link is problematic. You're 300 00:20:17,840 --> 00:20:21,800 Speaker 1: not likely to actually address whatever underlying issue is at stake, 301 00:20:21,880 --> 00:20:27,480 Speaker 1: But anyway, Alex Horrik of TikTok contested the charges brought 302 00:20:27,520 --> 00:20:32,800 Speaker 1: against the platform by the state of Utah. 
He said 303 00:20:33,160 --> 00:20:37,359 Speaker 1: that TikTok quote has industry leading safeguards for young people, 304 00:20:37,480 --> 00:20:41,480 Speaker 1: including an automatic sixty minute time limit for users under 305 00:20:41,520 --> 00:20:45,359 Speaker 1: eighteen and parental controls for teen accounts end quote, So 306 00:20:45,960 --> 00:20:50,080 Speaker 1: he's saying we already have protections in place. These are 307 00:20:50,119 --> 00:20:54,720 Speaker 1: here to prevent young people from being harmed by TikTok. 308 00:20:54,960 --> 00:20:58,040 Speaker 1: According to AP News, the lawsuit also brings up a 309 00:20:58,040 --> 00:21:01,000 Speaker 1: concern that we've seen many, many times from lawmakers here 310 00:21:01,040 --> 00:21:04,800 Speaker 1: in the United States that is related to the relationship 311 00:21:04,840 --> 00:21:10,320 Speaker 1: between TikTok, the American company, and Byte Dance, TikTok's Chinese 312 00:21:10,320 --> 00:21:14,600 Speaker 1: parent company, arguing that maybe there are deeper links there 313 00:21:14,680 --> 00:21:18,760 Speaker 1: than what TikTok claims. This is a song we've seen 314 00:21:18,800 --> 00:21:22,840 Speaker 1: again and again, so not a big surprise here, but 315 00:21:22,920 --> 00:21:25,800 Speaker 1: we'll keep an eye on where this lawsuit goes and 316 00:21:25,840 --> 00:21:28,480 Speaker 1: how it unfolds in the future. Got one more than 317 00:21:28,480 --> 00:21:30,920 Speaker 1: I'm going to tell before we get into another quick 318 00:21:31,000 --> 00:21:33,960 Speaker 1: break it has been a hot minute since I've talked 319 00:21:33,960 --> 00:21:37,840 Speaker 1: about FTX, the cryptocurrency exchange that famously went up in 320 00:21:37,880 --> 00:21:40,800 Speaker 1: flames almost a year ago. Next month will be a 321 00:21:40,920 --> 00:21:46,200 Speaker 1: year since FTX went under. 
FTX's co-founder Sam Bankman- 322 00:21:46,320 --> 00:21:51,240 Speaker 1: Fried, aka SBF, is in court to face charges that he 323 00:21:51,280 --> 00:21:55,440 Speaker 1: committed several fraud related crimes in his role as CEO 324 00:21:55,680 --> 00:22:01,720 Speaker 1: of FTX. Yesterday, Caroline Ellison, the former CEO of Alameda Research, 325 00:22:02,040 --> 00:22:07,680 Speaker 1: which was intrinsically connected to FTX, testified that FTX engaged 326 00:22:07,720 --> 00:22:11,600 Speaker 1: in some pretty darn shifty behavior, including an attempt to 327 00:22:11,760 --> 00:22:16,120 Speaker 1: bribe a Chinese official. So, as Ellison tells the story, 328 00:22:16,440 --> 00:22:21,879 Speaker 1: Chinese authorities froze FTX's funds on the OKX and Huobi 329 00:22:21,960 --> 00:22:27,520 Speaker 1: exchanges. So apparently someone at Alameda Research, which, by 330 00:22:27,520 --> 00:22:30,280 Speaker 1: the way, was an investment company that specialized in the 331 00:22:30,280 --> 00:22:35,679 Speaker 1: crypto market and had been co-founded by SBF. Apparently 332 00:22:36,040 --> 00:22:41,560 Speaker 1: this unnamed person ended up doing business with 333 00:22:41,560 --> 00:22:45,520 Speaker 1: someone else who was suspected of money laundering. So the 334 00:22:45,600 --> 00:22:49,320 Speaker 1: Chinese authorities locked down around a billion dollars' worth of 335 00:22:49,359 --> 00:22:53,280 Speaker 1: FTX assets, and this prompted folks at FTX to try 336 00:22:53,320 --> 00:22:55,720 Speaker 1: and come up with various strategies on how to free 337 00:22:55,840 --> 00:23:00,119 Speaker 1: up that money that otherwise was just locked away. It 338 00:23:00,200 --> 00:23:03,080 Speaker 1: sounds like first they started off with some fairly reasonable, 339 00:23:03,400 --> 00:23:07,399 Speaker 1: and more importantly legal, options, but apparently those did not 340 00:23:07,440 --> 00:23:11,960 Speaker 1: get any results.
So the leaders, including SBF himself, began 341 00:23:12,000 --> 00:23:15,359 Speaker 1: to consider alternatives, and while they did not necessarily approve 342 00:23:15,440 --> 00:23:19,360 Speaker 1: them initially, further down the line they did, and that 343 00:23:20,160 --> 00:23:24,840 Speaker 1: ultimately resulted in offering a one hundred and fifty million 344 00:23:24,840 --> 00:23:29,040 Speaker 1: dollar bribe to a Chinese official to grease the wheels 345 00:23:29,040 --> 00:23:34,719 Speaker 1: and get that money unlocked. Not only did the company 346 00:23:34,720 --> 00:23:38,000 Speaker 1: apparently actually do this, it also listed the expenditure in 347 00:23:38,040 --> 00:23:41,679 Speaker 1: its State of Alameda documentation. Now, they didn't go so 348 00:23:41,760 --> 00:23:45,399 Speaker 1: far as to label it bribery costs; that might have 349 00:23:45,920 --> 00:23:50,240 Speaker 1: been a red flag. Instead, it was filed under notable 350 00:23:50,359 --> 00:23:56,119 Speaker 1: slash idiosyncratic PNL stuff. And here's the wild thing, y'all. 351 00:23:56,480 --> 00:23:59,280 Speaker 1: The bribery thing isn't even one of the legal charges 352 00:23:59,359 --> 00:24:04,600 Speaker 1: brought against SBF. Right, that's not even a material component 353 00:24:04,760 --> 00:24:08,600 Speaker 1: of this trial. The whole story really just becomes part 354 00:24:08,640 --> 00:24:13,119 Speaker 1: of establishing SBF's character and leadership decisions and willingness to 355 00:24:13,240 --> 00:24:18,560 Speaker 1: engage in illicit activities. Ellison, for her own part, has 356 00:24:18,600 --> 00:24:22,719 Speaker 1: pled guilty to several fraud charges already, so things are 357 00:24:22,760 --> 00:24:26,440 Speaker 1: not looking so great for SBF at the moment. Okay, 358 00:24:26,880 --> 00:24:28,800 Speaker 1: we're going to take another quick break.
When we come back, 359 00:24:28,800 --> 00:24:31,560 Speaker 1: I've got, you know, quite a few more 360 00:24:31,920 --> 00:24:43,719 Speaker 1: news stories, so stick around. We'll be right back. Okay, 361 00:24:43,800 --> 00:24:45,399 Speaker 1: we have come back, and now for a couple of 362 00:24:45,440 --> 00:24:48,160 Speaker 1: stories about how the US government is trying to make 363 00:24:48,200 --> 00:24:51,439 Speaker 1: companies be more transparent when it comes to stuff like, 364 00:24:52,240 --> 00:24:55,600 Speaker 1: you know, bills and service fees. So first up is the 365 00:24:55,640 --> 00:24:59,800 Speaker 1: Federal Trade Commission, or FTC, which has taken aim at 366 00:25:00,080 --> 00:25:03,439 Speaker 1: so-called junk fees. So these are those hidden fees 367 00:25:04,240 --> 00:25:06,920 Speaker 1: that won't pop up in a transaction until it's right 368 00:25:06,960 --> 00:25:09,840 Speaker 1: at the moment of purchase. So the customer might get 369 00:25:09,880 --> 00:25:13,080 Speaker 1: a price quote on a product or a service and 370 00:25:13,119 --> 00:25:17,040 Speaker 1: then they'll start the whole process of purchasing it or 371 00:25:17,080 --> 00:25:19,720 Speaker 1: agreeing to it, and right when they're at that 372 00:25:19,840 --> 00:25:23,119 Speaker 1: last moment to complete a transaction, a bunch of weird 373 00:25:23,200 --> 00:25:28,199 Speaker 1: fees pop up and the cost increases, sometimes significantly. So 374 00:25:28,240 --> 00:25:31,600 Speaker 1: this happens in lots of different kinds of transactions.
Personally, 375 00:25:31,800 --> 00:25:34,000 Speaker 1: I see it most whenever I use a ticket broker 376 00:25:34,400 --> 00:25:37,480 Speaker 1: to purchase a seat for an event, whether it's a 377 00:25:37,520 --> 00:25:41,160 Speaker 1: show or a concert or whatever it may be. Sometimes 378 00:25:41,560 --> 00:25:43,920 Speaker 1: the fees end up being as much as the cost 379 00:25:43,960 --> 00:25:46,280 Speaker 1: of the ticket itself, so it doubles the amount I 380 00:25:46,320 --> 00:25:48,119 Speaker 1: would have to spend if I wanted to, I don't know, 381 00:25:48,800 --> 00:25:53,359 Speaker 1: go see Thumpasaurus perform Struttin' live. Actually, that's probably 382 00:25:53,359 --> 00:25:55,960 Speaker 1: a bad example because Thumpasaurus tends to play small 383 00:25:55,960 --> 00:25:59,199 Speaker 1: to medium venues. Those are less dependent on ticket brokers. 384 00:25:59,240 --> 00:26:01,560 Speaker 1: But you get what I'm saying. You can also find 385 00:26:01,600 --> 00:26:05,320 Speaker 1: it in other places, like vacation rentals, right, so Airbnb 386 00:26:05,480 --> 00:26:09,280 Speaker 1: has been brought up as a possible perpetrator of these 387 00:26:09,359 --> 00:26:12,600 Speaker 1: kinds of junk fees. Car rentals can be 388 00:26:12,640 --> 00:26:15,920 Speaker 1: like that too, where you'll see one price quoted as 389 00:26:15,960 --> 00:26:18,679 Speaker 1: you log into a car rental company's site, but as you 390 00:26:18,760 --> 00:26:22,000 Speaker 1: go through, it suddenly balloons the further you get into 391 00:26:22,000 --> 00:26:24,280 Speaker 1: the process.
Not all car rental companies are like that, 392 00:26:24,320 --> 00:26:27,080 Speaker 1: by the way. I've noticed that a few have started 393 00:26:27,119 --> 00:26:31,000 Speaker 1: to already post what the full price would be on 394 00:26:31,040 --> 00:26:35,119 Speaker 1: that initial screen, and I think that's admirable, because not 395 00:26:35,200 --> 00:26:37,200 Speaker 1: every one of them does that. A lot of them 396 00:26:37,680 --> 00:26:39,560 Speaker 1: start off with a price where you're like, oh wow, 397 00:26:39,600 --> 00:26:42,080 Speaker 1: it's like eighty bucks for the full week. That's not bad. 398 00:26:42,560 --> 00:26:44,280 Speaker 1: Then you click through and you're like, oh wait, it's 399 00:26:44,359 --> 00:26:46,440 Speaker 1: actually like two hundred and seventy dollars for the full 400 00:26:46,440 --> 00:26:48,720 Speaker 1: week once you add all these fees in. So the 401 00:26:48,840 --> 00:26:53,720 Speaker 1: FTC rule would force companies to include any additional fees within 402 00:26:53,800 --> 00:26:56,760 Speaker 1: the advertised prices themselves. They could no longer hide them 403 00:26:57,080 --> 00:27:00,320 Speaker 1: until the end, and they would have to explain what 404 00:27:00,480 --> 00:27:04,800 Speaker 1: those fees are actually for. Like, why does that fee exist? 405 00:27:05,119 --> 00:27:09,000 Speaker 1: What is that fee going to? Those convenience fees 406 00:27:09,080 --> 00:27:11,199 Speaker 1: drive me nuts because I'm wondering who the heck 407 00:27:11,200 --> 00:27:14,600 Speaker 1: this is convenient for, because it's not me. Anyway, the rule is also 408 00:27:14,640 --> 00:27:18,920 Speaker 1: supposed to indicate which, if any, fees could be refundable. 409 00:27:19,200 --> 00:27:22,240 Speaker 1: So all of this has to be done if the FTC 410 00:27:22,760 --> 00:27:25,119 Speaker 1: adopts these rules.
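As a quick aside on the arithmetic of that rental example, the gap between a drip-priced quote and an all-in price is easy to sketch in code. This is purely my own toy illustration, not anything from the FTC proposal; the fee names and amounts below are made up.

```python
# Toy sketch of drip pricing versus the all-in price the proposed FTC rule
# would require up front. Fee names and amounts are invented for the example.

def all_in_price(base, fees):
    """Return the total price plus an itemized breakdown of every fee."""
    total = base + sum(fees.values())
    breakdown = [f"{name}: ${amount:.2f}" for name, amount in fees.items()]
    return total, breakdown

# A rental advertised at $80 for the week that balloons at checkout:
fees = {
    "facility charge": 55.00,
    "concession recovery fee": 75.00,
    "convenience fee": 60.00,
}
total, breakdown = all_in_price(80.00, fees)
print(total)  # 270.0 -- the number the rule would require to be advertised
```

Under the proposed rule, that two-hundred-seventy figure, not the eighty, would be the advertised price, and each line of the breakdown would need an explanation of what the fee is actually for.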
I'm sure the companies are not eager 411 00:27:25,160 --> 00:27:28,840 Speaker 1: to comply with any rule like this if it is 412 00:27:28,880 --> 00:27:32,160 Speaker 1: in fact adopted. But as a consumer, I am personally 413 00:27:32,200 --> 00:27:34,840 Speaker 1: fully in favor of making companies be more transparent when 414 00:27:34,840 --> 00:27:38,480 Speaker 1: it comes to costs to customers. Now, on a related note, 415 00:27:38,680 --> 00:27:44,200 Speaker 1: the US Federal Communications Commission, or FCC, has transparency rules 416 00:27:44,200 --> 00:27:47,840 Speaker 1: that are going to impact the major Internet service providers 417 00:27:47,880 --> 00:27:51,479 Speaker 1: in the United States starting in April of 418 00:27:51,520 --> 00:27:55,199 Speaker 1: twenty twenty four. So starting then, providers will have to 419 00:27:55,240 --> 00:27:58,960 Speaker 1: include a broadband consumer label, and this label will break 420 00:27:59,040 --> 00:28:05,280 Speaker 1: down the provider's services. It has to include the service's features, 421 00:28:05,600 --> 00:28:09,399 Speaker 1: its limitations, any restrictions that are in place, as well 422 00:28:09,480 --> 00:28:12,680 Speaker 1: as the price for the service. So that means the 423 00:28:12,720 --> 00:28:17,399 Speaker 1: label should show customers stuff like what broadband speed is 424 00:28:17,480 --> 00:28:20,840 Speaker 1: available in their area, how much is the introductory rate 425 00:28:21,040 --> 00:28:24,479 Speaker 1: for enrolling in that service, what are the actual rates 426 00:28:24,600 --> 00:28:28,040 Speaker 1: once you're done with the introductory period. Are there any 427 00:28:28,119 --> 00:28:32,320 Speaker 1: data allowances or data caps?
On that service, 428 00:28:32,720 --> 00:28:34,680 Speaker 1: it would need to have all of that listed out, 429 00:28:34,680 --> 00:28:38,720 Speaker 1: plus any other fees that would come into play, and that, 430 00:28:38,960 --> 00:28:41,280 Speaker 1: in theory, should make it easier for customers to do 431 00:28:41,360 --> 00:28:45,760 Speaker 1: some comparison shopping. However, I should also add that for 432 00:28:45,840 --> 00:28:48,200 Speaker 1: a lot of folks in the United States, comparison shopping 433 00:28:48,360 --> 00:28:52,400 Speaker 1: is more of a dream than a reality. It's a 434 00:28:52,480 --> 00:28:55,160 Speaker 1: luxury a lot of people don't have, because in a lot 435 00:28:55,160 --> 00:29:00,560 Speaker 1: of locations, including where I live, there's not much choice. Technically, 436 00:29:00,640 --> 00:29:05,080 Speaker 1: I have two providers in my area that provide 437 00:29:05,560 --> 00:29:10,640 Speaker 1: actual broadband speed service, and only one of them can 438 00:29:10,760 --> 00:29:14,320 Speaker 1: provide service at greater than fifty megabits per second. The 439 00:29:14,360 --> 00:29:16,520 Speaker 1: other one, I think, maxes out right at twenty five, 440 00:29:16,560 --> 00:29:18,680 Speaker 1: which I think is the 441 00:29:18,680 --> 00:29:21,960 Speaker 1: threshold for broadband.
So it's not like I even have 442 00:29:22,760 --> 00:29:25,719 Speaker 1: a different option to go with if I wanted to, 443 00:29:26,080 --> 00:29:28,680 Speaker 1: at least not one that would work for what I 444 00:29:28,760 --> 00:29:31,960 Speaker 1: do for a living. So that's kind of a bummer. 445 00:29:31,960 --> 00:29:33,880 Speaker 1: But at least I would have a better understanding of 446 00:29:33,920 --> 00:29:36,560 Speaker 1: what I'm spending my money on and what all those 447 00:29:36,600 --> 00:29:39,320 Speaker 1: fees are going to, and also things like how much 448 00:29:39,320 --> 00:29:42,960 Speaker 1: it would cost to terminate the service early. Maybe suddenly 449 00:29:43,000 --> 00:29:47,960 Speaker 1: I get an opportunity to use a different ISP; 450 00:29:48,680 --> 00:29:51,080 Speaker 1: how much would it cost for me to cancel my 451 00:29:51,280 --> 00:29:54,480 Speaker 1: agreement with my current provider? And as you might suspect, 452 00:29:54,520 --> 00:29:58,600 Speaker 1: the broadband companies have pushed hard against this move. They've 453 00:29:58,600 --> 00:30:01,080 Speaker 1: claimed that it would be too difficult to implement, and 454 00:30:01,120 --> 00:30:03,840 Speaker 1: that really just gives the FCC more ammunition, right, because 455 00:30:03,920 --> 00:30:06,800 Speaker 1: if listing out all your fees is something that you 456 00:30:06,880 --> 00:30:10,800 Speaker 1: say is too complicated, there's definitely a lack of transparency 457 00:30:10,840 --> 00:30:13,800 Speaker 1: going on, among other issues. So I don't think it 458 00:30:13,960 --> 00:30:17,680 Speaker 1: is an argument that does the broadband providers any favors. 459 00:30:18,360 --> 00:30:21,000 Speaker 1: Next up, we've got a story about eBay and a 460 00:30:21,040 --> 00:30:26,840 Speaker 1: potential two billion, with a b, dollar fine for the company.
461 00:30:27,080 --> 00:30:30,360 Speaker 1: So what did eBay do that would warrant such a 462 00:30:30,480 --> 00:30:33,640 Speaker 1: hefty fine? Well, it all has to do with the 463 00:30:33,680 --> 00:30:38,200 Speaker 1: facilitation of sales of products that violate environmental laws in 464 00:30:38,240 --> 00:30:43,920 Speaker 1: the United States, namely defeat devices, aka gadgets that, when 465 00:30:43,920 --> 00:30:47,440 Speaker 1: you install them in a diesel powered vehicle, give the 466 00:30:47,560 --> 00:30:51,920 Speaker 1: driver the ability to quote unquote roll coal. So if 467 00:30:51,920 --> 00:30:56,200 Speaker 1: you've ever seen a car or a truck blast out 468 00:30:56,240 --> 00:31:00,320 Speaker 1: a huge cloud of black exhaust all in one burst, 469 00:31:01,040 --> 00:31:04,760 Speaker 1: that is rolling coal. I had not even heard about 470 00:31:04,760 --> 00:31:08,120 Speaker 1: that practice until just one day someone did it to me. 471 00:31:08,840 --> 00:31:11,760 Speaker 1: I had been walking back from our office back when 472 00:31:11,760 --> 00:31:15,000 Speaker 1: it was at Ponce City Market, and it was a 473 00:31:15,040 --> 00:31:17,479 Speaker 1: three mile walk, and it was a hot day, and 474 00:31:17,520 --> 00:31:19,560 Speaker 1: I was taking a little break, and as I was 475 00:31:19,560 --> 00:31:22,160 Speaker 1: sitting down on a bench, a big old pickup truck 476 00:31:22,240 --> 00:31:25,080 Speaker 1: drove past me and then belched out a ton of 477 00:31:25,200 --> 00:31:29,960 Speaker 1: black smoke, rolling coal. Well, here's the thing.
These devices 478 00:31:30,080 --> 00:31:33,400 Speaker 1: violate the Clean Air Act, and if authorities find a 479 00:31:33,440 --> 00:31:38,720 Speaker 1: retailer selling them, they can fine that retailer five thousand, five hundred eighty 480 00:31:38,760 --> 00:31:45,080 Speaker 1: dollars per sale, so it's cumulative, and the EPA has 481 00:31:45,120 --> 00:31:48,080 Speaker 1: set its sights on eBay. Now, eBay has said the 482 00:31:48,120 --> 00:31:52,080 Speaker 1: lawsuit is quote unquote entirely unprecedented. Now that might be true. 483 00:31:52,120 --> 00:31:55,120 Speaker 1: It might be entirely unprecedented, but that doesn't mean that 484 00:31:55,160 --> 00:31:58,480 Speaker 1: it's unwarranted. It also doesn't mean that the EPA won't 485 00:31:58,520 --> 00:32:00,920 Speaker 1: continue to do that sort of thing with other companies. 486 00:32:01,280 --> 00:32:04,480 Speaker 1: But eBay also claims it has actively removed listings for 487 00:32:04,520 --> 00:32:07,400 Speaker 1: such illegal devices for years and that it has a 488 00:32:07,440 --> 00:32:09,719 Speaker 1: ban on those kinds of devices. Those are not allowed 489 00:32:09,720 --> 00:32:13,920 Speaker 1: to be sold on eBay, according to eBay's representatives. So 490 00:32:14,040 --> 00:32:16,200 Speaker 1: we'll have to see where this goes from here and 491 00:32:16,280 --> 00:32:19,080 Speaker 1: whether or not it actually ends up going to trial, 492 00:32:20,120 --> 00:32:24,400 Speaker 1: and whether eBay actually has the receipts to show that 493 00:32:24,560 --> 00:32:27,800 Speaker 1: it has cracked down on this practice, or if, as 494 00:32:27,840 --> 00:32:32,160 Speaker 1: the EPA alleges, eBay has allowed for the sale of 495 00:32:32,200 --> 00:32:37,000 Speaker 1: these devices. Google announced it stopped a distributed denial of 496 00:32:37,080 --> 00:32:40,480 Speaker 1: service attack, or DDoS, that was the largest ever 497 00:32:40,640 --> 00:32:44,200 Speaker 1: on record.
In fact, according to Google, this attack was 498 00:32:44,240 --> 00:32:48,120 Speaker 1: seven and a half times larger than the previous record holder, 499 00:32:48,480 --> 00:32:51,400 Speaker 1: and the attack was blasting targets with three hundred and 500 00:32:51,520 --> 00:32:56,160 Speaker 1: ninety eight million requests per second. So, just in case 501 00:32:56,200 --> 00:32:58,800 Speaker 1: you don't know what a denial of service attack is, 502 00:32:59,600 --> 00:33:04,200 Speaker 1: it's basically a tactic meant to overwhelm a target server, 503 00:33:04,800 --> 00:33:08,040 Speaker 1: like a web server. So in its most basic form, a 504 00:33:08,120 --> 00:33:12,000 Speaker 1: denial of service attack will see attackers send out requests 505 00:33:12,080 --> 00:33:16,160 Speaker 1: to a target. The target is compelled to respond to 506 00:33:16,200 --> 00:33:20,200 Speaker 1: the requests because that's how the Internet works. So by 507 00:33:20,240 --> 00:33:24,080 Speaker 1: sending millions or even hundreds of millions, or even billions 508 00:33:24,120 --> 00:33:27,880 Speaker 1: of requests to a target, the target can get locked 509 00:33:28,000 --> 00:33:30,880 Speaker 1: up trying to respond to all these requests. It can 510 00:33:30,920 --> 00:33:34,480 Speaker 1: even shut down as a result of this. A distributed 511 00:33:34,480 --> 00:33:37,880 Speaker 1: denial of service attack is the same thing, except the 512 00:33:37,920 --> 00:33:42,239 Speaker 1: attacks are distributed across a network of attackers.
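The flood mechanic described there can be reduced to a toy model. This is my own illustration, not anything from Google's report: a server that can answer only a fixed number of requests per second starves legitimate traffic the moment the flood exceeds that capacity, and the capacity figure below is a made-up example.

```python
# Toy model of a request flood: once incoming requests exceed what the
# server can answer per second, legitimate requests get crowded out.

def served_fraction(capacity_rps, incoming_rps):
    """Fraction of incoming requests the server can actually answer."""
    if incoming_rps <= capacity_rps:
        return 1.0
    return capacity_rps / incoming_rps

# Normal day for a hypothetical server: everything gets served.
print(served_fraction(100_000, 50_000))  # 1.0

# A flood at the scale Google reported (398 million requests per second)
# aimed at that same hypothetical server: almost nothing gets through.
print(round(served_fraction(100_000, 398_000_000), 6))  # 0.000251
```

Distributing the flood across many machines, as in a DDoS, doesn't change this arithmetic; it just makes the sources much harder to identify and block.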
It's not 513 00:33:42,360 --> 00:33:45,080 Speaker 1: just one source of attack, and that makes it a 514 00:33:45,200 --> 00:33:48,480 Speaker 1: much more effective means of attacking a target, because if 515 00:33:48,560 --> 00:33:51,320 Speaker 1: the attacks were coming from one source, the target might 516 00:33:51,400 --> 00:33:54,920 Speaker 1: identify that source and then block traffic coming from that 517 00:33:55,080 --> 00:33:59,640 Speaker 1: source and then protect itself. If it's distributed across lots 518 00:33:59,640 --> 00:34:04,479 Speaker 1: of machines, it can be more challenging to actually protect 519 00:34:04,520 --> 00:34:08,040 Speaker 1: yourself against that kind of attack. Google says it detected 520 00:34:08,080 --> 00:34:11,200 Speaker 1: the attack and then launched mitigation efforts, and that included 521 00:34:11,239 --> 00:34:15,319 Speaker 1: reaching out to peers and starting to patch a vulnerability 522 00:34:15,320 --> 00:34:20,239 Speaker 1: in the HTTP/2 protocol stack. The attackers were 523 00:34:20,320 --> 00:34:24,719 Speaker 1: leveraging this vulnerability, and it's present in some but not 524 00:34:24,800 --> 00:34:28,480 Speaker 1: all servers. So server administrators have been informed they should 525 00:34:28,760 --> 00:34:31,440 Speaker 1: check to see if their machines are vulnerable, and if 526 00:34:31,440 --> 00:34:34,000 Speaker 1: those machines are vulnerable, to be on the lookout for 527 00:34:34,000 --> 00:34:37,680 Speaker 1: a security patch from whatever vendor made that machine in 528 00:34:37,800 --> 00:34:41,960 Speaker 1: order to close off this route of attack. And finally, 529 00:34:42,360 --> 00:34:46,560 Speaker 1: for some sky-is-falling news, the US Federal Aviation 530 00:34:46,640 --> 00:34:50,239 Speaker 1: Administration, or FAA, has a warning about all those satellites 531 00:34:50,280 --> 00:34:53,040 Speaker 1: that we're putting up in orbit.
So I talked earlier 532 00:34:53,080 --> 00:34:56,359 Speaker 1: this week about how companies like SpaceX and Amazon are 533 00:34:56,440 --> 00:35:02,240 Speaker 1: launching thousands of satellites, constellations of satellites, up into orbit. 534 00:35:02,719 --> 00:35:06,480 Speaker 1: And the warning that the FAA is giving us is 535 00:35:06,560 --> 00:35:09,640 Speaker 1: the old saying: what goes up must come down, even 536 00:35:09,680 --> 00:35:11,839 Speaker 1: if it happens to be in low Earth orbit. So 537 00:35:11,880 --> 00:35:16,080 Speaker 1: the FAA did an analysis and estimates that up to 538 00:35:16,200 --> 00:35:20,640 Speaker 1: twenty eight thousand pieces of satellites could fall to Earth 539 00:35:20,920 --> 00:35:24,279 Speaker 1: each year based upon how much stuff we're putting up there, 540 00:35:25,000 --> 00:35:28,719 Speaker 1: and that these could potentially represent an actual threat to 541 00:35:28,800 --> 00:35:33,240 Speaker 1: people and property. So the FAA estimates that the chance 542 00:35:33,280 --> 00:35:37,560 Speaker 1: of being hurt or worse from a piece of falling 543 00:35:37,640 --> 00:35:41,080 Speaker 1: satellite equipment could be point six per year, and that 544 00:35:41,160 --> 00:35:44,680 Speaker 1: means that it could possibly happen every two years or so, 545 00:35:44,840 --> 00:35:48,640 Speaker 1: just when you look at the big picture of statistics. 546 00:35:49,640 --> 00:35:52,640 Speaker 1: And if that's not scary enough, the FAA also indicated 547 00:35:52,680 --> 00:35:55,400 Speaker 1: that falling satellite pieces could potentially be a threat to 548 00:35:55,440 --> 00:36:00,960 Speaker 1: aircraft in mid flight, though this represents an even smaller likelihood. 549 00:36:01,440 --> 00:36:05,839 Speaker 1: The FAA indicated that regulation could limit some of this risk.
550 00:36:06,120 --> 00:36:08,879 Speaker 1: So if some regulations were put into place that would 551 00:36:08,960 --> 00:36:12,480 Speaker 1: restrict how many satellites companies can put into orbit or 552 00:36:12,760 --> 00:36:16,440 Speaker 1: how they operate those satellites, maybe that would end up 553 00:36:16,600 --> 00:36:20,440 Speaker 1: helping this. But the FAA also stressed it only has 554 00:36:20,520 --> 00:36:22,239 Speaker 1: authority to do that kind of thing here in the 555 00:36:22,360 --> 00:36:26,120 Speaker 1: United States. There are other nations that are launching satellites 556 00:36:26,120 --> 00:36:29,880 Speaker 1: into orbit, and so there is potentially a need for 557 00:36:29,960 --> 00:36:33,600 Speaker 1: some sort of international body that could set global regulations 558 00:36:33,640 --> 00:36:36,480 Speaker 1: on satellites and launches and that kind of thing, and 559 00:36:36,800 --> 00:36:40,520 Speaker 1: the proliferation of space debris. SpaceX, by the way, says 560 00:36:40,560 --> 00:36:45,480 Speaker 1: that the FAA's analysis is flawed and that the company's satellites, 561 00:36:45,520 --> 00:36:49,560 Speaker 1: their Starlink satellites, are far more likely to burn up 562 00:36:49,560 --> 00:36:52,120 Speaker 1: on reentry because they're not terribly big. So they're like, 563 00:36:52,239 --> 00:36:55,440 Speaker 1: this is a moot point, because if our satellites are falling, 564 00:36:55,719 --> 00:36:57,920 Speaker 1: they're not going to make it to the ground. They 565 00:36:57,920 --> 00:37:01,759 Speaker 1: will burn up before they get that far, and so 566 00:37:01,800 --> 00:37:04,959 Speaker 1: there would be nothing significant landing on Earth afterward. They've 567 00:37:04,960 --> 00:37:10,120 Speaker 1: also raised some other objections, so again, interesting, scary.
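One quick sanity check on that point six per year figure from a moment ago. This is just my own back-of-the-envelope arithmetic, not the FAA's: an expected rate of 0.6 injuries per year works out to an average gap of 1 / 0.6 years between events, which is where the "every two years or so" framing comes from.

```python
# Back-of-the-envelope check on the FAA figure quoted in the episode:
# an expected rate of 0.6 events per year implies a mean interval of
# 1 / 0.6 years between events.

expected_events_per_year = 0.6
mean_interval_years = 1 / expected_events_per_year
print(round(mean_interval_years, 2))  # 1.67 -- roughly every other year
```

So "every two years or so" is a slight rounding up; the estimate actually implies about one event every year and eight months, on average.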
My 568 00:37:10,200 --> 00:37:12,640 Speaker 1: guess is that the truth is somewhere in the 569 00:37:12,640 --> 00:37:17,120 Speaker 1: middle here. I do think that the odds of being 570 00:37:17,160 --> 00:37:20,080 Speaker 1: struck by space debris are pretty darn low if you're 571 00:37:20,120 --> 00:37:23,600 Speaker 1: not in space, probably somewhere along the same lines as 572 00:37:23,600 --> 00:37:28,480 Speaker 1: getting struck by lightning. Although obviously you can help avoid 573 00:37:28,520 --> 00:37:30,800 Speaker 1: getting hit by lightning by not being out and about 574 00:37:30,840 --> 00:37:33,680 Speaker 1: while there's a lightning storm. It's a lot harder to 575 00:37:33,719 --> 00:37:36,279 Speaker 1: predict when a piece of a satellite's going to come 576 00:37:36,280 --> 00:37:39,120 Speaker 1: plummeting down. So I get it. It is kind of scary. 577 00:37:39,600 --> 00:37:41,920 Speaker 1: Now before I go, I do have an article to recommend. 578 00:37:42,239 --> 00:37:46,200 Speaker 1: This article is titled How AI Reduces the World to Stereotypes. 579 00:37:46,480 --> 00:37:49,400 Speaker 1: It was written by Victoria Turk and it was published 580 00:37:49,520 --> 00:37:54,280 Speaker 1: on Restofworld dot org. So Turk is mainly talking about 581 00:37:54,360 --> 00:37:59,520 Speaker 1: generative AI and how text to image systems tend to 582 00:37:59,520 --> 00:38:04,760 Speaker 1: perpetuate and amplify biases and stereotypes.
So the article 583 00:38:04,840 --> 00:38:07,520 Speaker 1: follows an experiment in which a team at Rest of 584 00:38:07,600 --> 00:38:12,279 Speaker 1: World used the text to image generative AI tool 585 00:38:12,360 --> 00:38:16,560 Speaker 1: Midjourney to produce pictures associated with specific places or types 586 00:38:16,600 --> 00:38:19,600 Speaker 1: of people, and then analyzed those images and looked for 587 00:38:19,640 --> 00:38:23,759 Speaker 1: the inclusion of elements that reinforce biases and stereotypes. So 588 00:38:24,000 --> 00:38:27,560 Speaker 1: I recommend that article again. It's titled How AI Reduces 589 00:38:27,600 --> 00:38:32,640 Speaker 1: the World to Stereotypes, by Victoria Turk, at Restofworld dot org. 590 00:38:33,120 --> 00:38:36,360 Speaker 1: That's it for me today. I hope you are all well. 591 00:38:36,480 --> 00:38:39,840 Speaker 1: Stay safe out there, take care of yourselves both physically 592 00:38:39,880 --> 00:38:42,880 Speaker 1: and mentally. It is a really tough time right now, 593 00:38:43,280 --> 00:38:52,480 Speaker 1: and I'll talk to you again really soon. Tech Stuff 594 00:38:52,560 --> 00:38:57,080 Speaker 1: is an iHeartRadio production. For more podcasts from iHeartRadio, visit 595 00:38:57,120 --> 00:39:00,680 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you listen to 596 00:39:00,719 --> 00:39:01,640 Speaker 1: your favorite shows.