Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It is time for the tech news for October tenth, twenty twenty three, or ten ten twenty three, or, if you're in the UK, ten ten twenty three. Let's start off with a bunch of X news. That is not stuff that used to be news but isn't news anymore, not that kind of ex-news. I'm talking about news about X, formerly known as Twitter. We've got a bunch of stories about it. So first up is one about misinformation and disturbing content. During times of turmoil, we tend to see a lot more misinformation and outright disinformation flood social media platforms. That's kind of par for the course, and that is the case as Israel has gone to war with Hamas. It should come as no surprise that this war brings with it numerous instances of falsehoods and misleading information spreading online across all different platforms. Even in the pre-Elon Musk days, Twitter would have to step up to deal with this sort of thing.
Speaker 1: But right now it's even harder, because Musk's X has famously made severe job cuts to the content moderation teams. In fact, most reports say that those teams are effectively just gone, that you might as well say there is no content moderation team at X at this point, because it's been whittled down so much. That means there's not really a dedicated team in place to handle a crisis like this, and that means that these instances of misinformation and disinformation and disturbing material can proliferate with very few checks to prevent it from happening. Making matters more complicated are the various changes Musk has made to X in recent weeks. So, for example, he decided that article headlines really mess with the aesthetic he's going for. So now when you post an article on X, it shows the lead image and maybe a small blurb, but you don't get a headline.
Speaker 1: You don't get information about what the article is about, which makes it harder for reporters to actually get their stories out on X and have people read them, because those images don't necessarily convey the information that tells folks why they should read a particular story. For outlets that haven't bowed to Musk's demand that they pay for a verification check, their reach has been sort of artificially limited, or maybe you could argue that if they paid for the verification, their reach would be boosted. Musk also hasn't helped matters, because he has this awful tendency to promote accounts that have been linked to misinformation in the past, including one that has posted outright antisemitic messaging. So you can imagine, in a war between Israel and Hamas, that's going to have some repercussions. Now, you contrast this with the role that Twitter played, say, back during the Arab Spring in the early twenty tens.
Speaker 1: Back then, people were able to use Twitter to communicate. They used it to organize. They used it to make sure aid was getting to places where it was needed and that, you know, emergency services could respond to the right place. And, you know, it played a pivotal role in that particular, you know, very important political event. But with the chaotic mess that is X today, it's a very different story. X's algorithm promotes posts that drive engagement, and unfortunately those posts often include ones with wildly inaccurate or false claims, as well as disturbing imagery via photos or videos. It's enough to have some open source intelligence experts, or OSINT experts, say that X is essentially a lost cause for their line of work because of the sheer volume of disinformation present on the platform. You'd be spending all your time trying to verify a single point of information. Meanwhile, floods of more posts are coming in, and you would never be able to get to a point where you actually know what's going on.
Speaker 1: So a lot of people are saying it's just not even worth it to try and vet information on the former Twitter right now. Meanwhile, the United States Securities and Exchange Commission, or SEC, has sued Elon Musk again. You might wonder why. Well, back in May of this year, the SEC served a subpoena to Musk to compel him to appear before the SEC and provide testimony with regard to his acquisition of Twitter. The SEC is concerned that some people may have committed securities fraud during the whole Twitter acquisition process. So this is an investigation into the acquisition and the deals that surrounded it, looking to see if any illegal activity was involved. Musk was supposed to appear before the SEC back in September, but he protested it and then he didn't show up. So now the SEC has sued Musk in an effort to compel him to provide testimony. This is not the first time the SEC and Musk have clashed, not by a long shot. Musk has earned a reputation as a sort of a bad boy in the eyes of the SEC.
Speaker 1: So back in twenty eighteen, Musk posted to, you know, Twitter that he had secured funding that would take the company Tesla private at the staggering price of four hundred and twenty dollars per share. Now, this was probably in part a weed joke, a bad one. Musk has a certain fixation associated with Mary Jane, as well as the letter X. But the SEC takes this kind of stuff seriously, because if the chairman of the board of directors for a company claims that the company has secured enough funding to go private and then doesn't do that, what it looks like is market manipulation. Because think about it. If Tesla's shares were trading at, I don't know, let's say Tesla shares were trading at three hundred dollars per share, but the chairman of the board announces that they've secured funding to take Tesla private at four hundred twenty dollars per share, well, let's say you're an investor.
Speaker 1: You might think, I need to sink as much money as I possibly can into Tesla right now, because there's going to be a buyout, and that means like one hundred and twenty dollars on top of what I'm paying right now, so I need to buy as many shares as possible. This ends up being a rush of investment, which drives stock prices up. But if there's no deal there, well, that just means that the chairman manipulated the stock price of their own company. That's a huge no-no. So the SEC filed a lawsuit against Musk. Eventually, Musk and the SEC settled out of court. Musk had to pay twenty million dollars, and he was told to step down as the chair of Tesla's board and also to agree to not do that kind of thing again. He did later contest this and argue that he was forced into this agreement and that it was all an infringement upon his free speech. You know, Elon Musk is very, very, very concerned about free speech when it applies to him, for a guy who frequently holds the title of the wealthiest person in the world.
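The share-price arithmetic in that "funding secured" example can be sketched in a few lines. This is purely a toy illustration of the incentive a buyout announcement creates; the three-hundred and four-hundred-twenty dollar figures are the round numbers from the example above, not actual Tesla prices, and the share count is made up.

```python
# Toy illustration of why a false "funding secured" announcement looks like
# market manipulation. All figures are the hypothetical round numbers from
# the discussion above, not real Tesla prices.

market_price = 300.0   # what shares trade at before the announcement
buyout_price = 420.0   # the announced going-private price per share

# If the buyout were real, buying at the market price now would lock in
# this apparent guaranteed profit per share:
premium_per_share = buyout_price - market_price
print(premium_per_share)  # 120.0

# That apparent premium triggers a rush of buying, which pushes the market
# price up toward the buyout price -- even if no deal actually exists.
shares_bought = 1000  # hypothetical position
expected_profit = shares_bought * premium_per_share
print(expected_profit)  # 120000.0
```

The point of the sketch is that the "profit" only exists if the announcement is true; if it isn't, the price was driven up by the announcement alone.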
Speaker 1: Honestly, the SEC fine was a slap on the wrist. This new lawsuit really just comes down to making Musk testify in front of the SEC. And since the SEC has legitimate power to subpoena people, the general wisdom is that Musk eventually will have to do it, because there's not really a legal reason why he could have it thrown out. Now, we're not done with X just yet. The platform also has a new feature for those who subscribe and get that little verification check. Now a verified account holder can restrict posts so that only other verified accounts can reply to those posts. So if you don't have a verified account, well, then your input is not welcome there. You do not get to be part of the discourse. Yay, free speech. Musk says that the control should help with spam bots, which could actually be a thing. And to be fair to X and to Musk, I'm being very, very snarky here, but let's be honest: Twitter has actually done similar things in the past. It's not like this is totally new. It just wasn't a verification thing.
Speaker 1: But back in twenty twenty, Twitter introduced a feature that would let you limit replies to your post to just accounts that you follow. So you could say, all right, only people that I follow can reply to this, so you cut out all the trolls and stuff, unless you're following a lot of trolls, I guess. So if you were really active on Twitter but you were only interested in having a discussion with the people that you were following, then you could do that. Or you could limit the replies so that only accounts that were mentioned in the original tweet could reply, so it becomes more of an actual one-on-one conversation that just happens to be publicly viewable. So I do get a little snarky about this where it turns into a paid-for feature, but it's not exactly unprecedented, so I'm not that het up about it. It does seem to me like it's another tactic to try and drive users to subscribe to get that verified status. And rounding out our X news block is that Linda Yaccarino, the CEO of X, canceled her appearance at the Tech Live conference, which is happening next week.
Speaker 1: So Tech Live is a Wall Street Journal hosted event, and Yaccarino was scheduled to speak there, but X sent the Journal a statement saying, quote, Linda Yaccarino will be unable to attend the WSJ Tech Live conference next week. With the global crisis unfolding, Linda and her team must remain fully focused on X, end quote. Presumably the global crisis alluded to is this war between Israel and Hamas, and now X has been inundated with horrific and misleading material. Certainly it would have been difficult to field the pointed questions that attendees might have had for X's CEO. The Verge points out that when she attended their Code conference, she quote deflected most questions, end quote, so I imagine she wasn't keen to repeat that experience in an even more critical setting. I do hope that X's response to the crisis involves re-establishing a content moderation team and strategy. While I am very much an outspoken critic of Elon Musk and X, it's not that I want to see the platform fail.
Speaker 1: I would much rather see it improve and make some decisions that end up demonstrably improving the service, protecting the people who use it, and making it useful for as many people as possible. I just very much have the opinion that that has not been the direction of the company for the past year, really. Okay, we're going to take a quick break. When we come back, we've got some more tech news to talk about. We're back. So I mentioned in a previous tech news episode that the US Federal Communications Commission, or FCC, has put net neutrality back on the table. This was something that was a big deal during Obama's administration and then got tossed out during Trump's administration, and now it's coming back around again. So Ars Technica has a great article titled "Net neutrality's court fate depends on whether broadband is telecommunications." Okay, so, y'all, this gets a bit frustrating, because it mostly ends up being about how we define things, and arguably this ultimately becomes an arbitrary distinction, right, that we call something one thing versus another thing.
Speaker 1: And yet that arbitrary distinction can determine whether the FCC actually has authority to reinstate net neutrality or not. The gist of it is that any move by the FCC to regulate Internet service providers is likely to result in lawsuits against the FCC brought by those same ISPs, which all contend that the FCC doesn't actually have the authority to make those regulations. And this all has to do with the classification of broadband services. If you classify them as a telecommunications service, then that means they fall under Title II of the Communications Act, and the FCC has authority to regulate that industry. They would count as common carriers, so they'd have to follow certain rules, and the FCC would have the authority to enforce those rules, or at least to make those rules, and then you have other elements of the government enforce them. But right now, the ISP broadband industry tends to fall under the classification of information services.
Speaker 1: This was a distinction created back when the Communications Act was being hashed out, out of concern that if you were to heavily regulate a new industry, that of Internet service providers, it could end up hurting innovation and adoption. And so, in an effort to avoid that, these were called information services, not telecommunications services. So the argument now is that these companies are far more than just information services companies; they are telecommunications companies, and thus they should be classified as such, and the FCC would have authority. Obviously, the companies would oppose that. They would much prefer to operate without that regulatory authority over them. In fact, a lot of the companies are also telecommunications companies, like they have a telecommunications business on top of their ISP business, and yeah, I mean, they already have the experience of working under regulations, so they would much prefer it if they didn't have to do that.
Speaker 1: There was a recent report stating that the FCC may end up facing a reversal from the Supreme Court if it were to attempt to assert this authority, saying that the Supreme Court's conservative makeup at this point means that more than likely it would be overturned. However, the Ars Technica piece mentions that the legal experts who wrote this were paid by the broadband industry, or at least by a couple of companies that are broadband companies, so you could argue that maybe there's a bit of poisoning the well going on here. Anyway, it should come as no surprise that we're seeing opposition from ISPs to this move. It pretty much comes with the territory, so we'll have to keep an eye on this and see how it develops. I'm somewhat skeptical that it would go all the way to the Supreme Court. The court would have to choose to take up the argument, and I don't know that it would. So I think it is possible that that report was somewhat alarmist and meant to try and dissuade the FCC from pursuing this line of action in the first place. But we'll have to see.
251 00:15:50,280 --> 00:15:52,120 Speaker 1: A couple of weeks ago, I talked about how the 252 00:15:52,480 --> 00:15:55,960 Speaker 1: video game development world had turned on the company Unity. 253 00:15:56,680 --> 00:16:00,359 Speaker 1: Unity makes a popular, or at least a formerly popular 254 00:16:00,800 --> 00:16:03,840 Speaker 1: game engine. So you can think of a game engine 255 00:16:04,000 --> 00:16:06,480 Speaker 1: as a set of tools that game developers can use 256 00:16:06,480 --> 00:16:09,640 Speaker 1: when they're building out the game that they want to make. 257 00:16:10,080 --> 00:16:13,000 Speaker 1: So the developers don't have to create everything from scratch 258 00:16:13,080 --> 00:16:15,360 Speaker 1: because they have this set of tools they can use. 259 00:16:15,800 --> 00:16:19,320 Speaker 1: And as long as the set of tools supports whatever 260 00:16:19,360 --> 00:16:22,960 Speaker 1: the gameplay is that the developers have in mind, everything 261 00:16:23,280 --> 00:16:26,440 Speaker 1: can go fairly smoothly. I mean, there'll always be hiccups 262 00:16:26,480 --> 00:16:29,400 Speaker 1: and development, but you don't have to create everything from 263 00:16:29,440 --> 00:16:34,760 Speaker 1: scratch anyway. Not too long ago, Unity changed its policy 264 00:16:34,960 --> 00:16:38,280 Speaker 1: and said it would start charging developers what amounts to 265 00:16:38,840 --> 00:16:44,400 Speaker 1: royalties as defined by installations. So, in other words, at 266 00:16:44,440 --> 00:16:48,320 Speaker 1: a certain point, when someone installs a game that runs 267 00:16:48,320 --> 00:16:53,760 Speaker 1: on Unity, that counts toward the royalties, and developers will 268 00:16:53,760 --> 00:16:57,000 Speaker 1: have to pay Unity a certain amount of money for 269 00:16:57,080 --> 00:17:02,280 Speaker 1: that installation. 
Speaker 1: Initially, Unity was saying it would count every single installation, which means that if you, as a player, installed a game, then you uninstalled it, then you installed it again, that would count as two installations. So if you really hated a game company, ironically, you could hurt it by buying a copy of their game, installing it, uninstalling it, and doing that again and again, over and over, because that would mean that the company would have to pay out the royalties for each of those installations. Now, eventually Unity walked that back and said, okay, well, we'll only count the first install per device. So if someone installs it on, like, eighteen computers, that will still count as eighteen installations, but if they were then to uninstall and reinstall on top of that, those would not count as additional ones. Still, this created a big kerfuffle in the game development world.
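The difference between those two counting policies is easy to see in a few lines of code. This is an illustrative sketch only, not Unity's actual billing logic; the function names and device IDs are invented, and real install tracking would obviously be far more involved.

```python
# Illustrative sketch (not Unity's actual billing code) of the two counting
# policies described above: every install event versus only the first
# install per device. Each event is represented as just a device ID string.

def count_every_install(events):
    """Original policy: every (re)install event is billable."""
    return len(events)

def count_first_install_per_device(events):
    """Walked-back policy: reinstalls on a known device don't count."""
    return len(set(events))  # deduplicate by device ID

# One angry player install-bombing on a single machine, versus a player
# legitimately installing on eighteen distinct machines:
reinstall_bomb = ["device-1"] * 50                  # 50 reinstalls, 1 device
many_machines = [f"device-{n}" for n in range(18)]  # 18 distinct devices

print(count_every_install(reinstall_bomb))             # 50
print(count_first_install_per_device(reinstall_bomb))  # 1
print(count_first_install_per_device(many_machines))   # 18
```

Under the original policy the reinstall bomb generates fifty billable events; under the revised policy it generates one, while the eighteen distinct machines still count as eighteen.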
Speaker 1: It flew in the face of Unity's earlier philosophy. Like, Unity had made a fairly big deal that it wasn't going to charge for installations or sales of games or whatever, that there would just be kind of a flat fee that you would pay for a certain level of access to the Unity engine and development tools, editing tools, that kind of thing. So there was this big negative reaction in the game development world. And now the CEO of Unity, a guy named John Riccitiello, has become the former CEO of Unity. He resigned from the company in a fairly sudden move, and now a guy named James M. Whitehurst is serving as the interim CEO and president of Unity while the company searches for a permanent replacement. How this is going to affect Unity in the long haul obviously remains to be seen. I'm not sure how the game development community will respond. I know that there was a lot of trust lost as a result of this recent change, so I think it's going to take a lot of work to repair those relationships. And I don't know how that meshes with the company's business strategy.
Speaker 1: Right, like, if the business strategy depends upon pursuing this kind of revenue generation approach, then it's going to be an uphill battle to repair those relationships. And now for a tiny bit of possible generative AI news; the jury's still out a little bit. As befits the god of mischief, The Verge reports that a promotional poster for season two of the Loki series on Disney Plus may have been made, at least in part, by generative AI. Further, it appears that the poster uses a stock image from Shutterstock as the background, so it's this background image that was presumably created through generative AI. The image in question appears to be called "Surreal Infinity Time Spiral Space Antique," a great name, and the details in the image have some of the telltale signs of generative AI, like superfluous little scribble marks and stuff that make it look like it might have been created by AI, which can get a little messy when it's making stuff; little details can end up being kind of messy when you look at them very closely.
Now, this is a problem 324 00:20:31,840 --> 00:20:34,280 Speaker 1: not necessarily for Disney, which, as far as I know, 325 00:20:34,640 --> 00:20:37,520 Speaker 1: followed the rules for this stock image. Like, you can 326 00:20:37,760 --> 00:20:40,200 Speaker 1: purchase stock images for commercial use if you have the 327 00:20:40,280 --> 00:20:42,080 Speaker 1: right license, and as far as I know, Disney did 328 00:20:42,119 --> 00:20:46,680 Speaker 1: all of that. But Shutterstock has a policy that does 329 00:20:46,720 --> 00:20:50,800 Speaker 1: not allow for AI generated images on the platform unless 330 00:20:51,280 --> 00:20:55,240 Speaker 1: they were created through the platform's own AI generative tool. 331 00:20:55,920 --> 00:20:58,560 Speaker 1: And you might think, well, that seems like gatekeeping, but 332 00:20:58,640 --> 00:21:02,840 Speaker 1: really it's to ensure a chain of IP ownership, because 333 00:21:02,880 --> 00:21:06,000 Speaker 1: things get really complicated. This is a brave new world 334 00:21:06,040 --> 00:21:09,560 Speaker 1: of AI generated content. So let's say you've got an artist. 335 00:21:10,280 --> 00:21:15,000 Speaker 1: This artist uses some other generative AI tool to help 336 00:21:15,160 --> 00:21:19,840 Speaker 1: create their image. They do some work themselves, the AI 337 00:21:19,920 --> 00:21:23,239 Speaker 1: generative tool does some of the work. Then they end 338 00:21:23,359 --> 00:21:27,720 Speaker 1: up submitting that image to Shutterstock, and Shutterstock 339 00:21:27,760 --> 00:21:30,920 Speaker 1: puts it up in its marketplace. Well, there could potentially 340 00:21:31,000 --> 00:21:34,320 Speaker 1: be a very messy disagreement as to who or what 341 00:21:34,680 --> 00:21:38,840 Speaker 1: actually owns the IP of that image.
So you could 342 00:21:38,920 --> 00:21:41,399 Speaker 1: make the argument that the artist who submits that image 343 00:21:41,400 --> 00:21:44,240 Speaker 1: to Shutterstock might not technically have the legal right 344 00:21:44,280 --> 00:21:49,000 Speaker 1: to do it, because this other generative AI tool took 345 00:21:49,080 --> 00:21:53,360 Speaker 1: part in making that image. It's all very muddy, right? 346 00:21:53,440 --> 00:21:57,520 Speaker 1: It's indicative of how generative AI makes stuff like intellectual 347 00:21:57,560 --> 00:22:03,200 Speaker 1: property and copyright and trademarks and all of these related subjects 348 00:22:03,840 --> 00:22:06,600 Speaker 1: much more complicated than they already were. 349 00:22:07,440 --> 00:22:10,920 Speaker 1: And then there's the issue of human artists, right, who 350 00:22:10,960 --> 00:22:14,959 Speaker 1: are, number one, concerned that work that normally would go 351 00:22:15,040 --> 00:22:18,560 Speaker 1: to a person is at least in part being handled 352 00:22:18,600 --> 00:22:23,280 Speaker 1: by algorithms, and, number two, that those algorithms require training 353 00:22:23,320 --> 00:22:27,760 Speaker 1: material in order to work, right? So the portfolios from 354 00:22:27,800 --> 00:22:31,560 Speaker 1: human artists are in a way being used to help 355 00:22:31,640 --> 00:22:36,800 Speaker 1: eliminate work for those human artists. Disney hasn't commented on 356 00:22:36,920 --> 00:22:40,320 Speaker 1: any of this so far, and you know, you could 357 00:22:40,440 --> 00:22:44,159 Speaker 1: argue that maybe Disney didn't know that it was an 358 00:22:44,200 --> 00:22:47,120 Speaker 1: AI generated image.
See, when it's on Shutterstock and it's 359 00:22:47,160 --> 00:22:52,280 Speaker 1: being used through the actual AI tool on Shutterstock, the 360 00:22:52,359 --> 00:22:55,800 Speaker 1: image gets a label that indicates that it was at 361 00:22:55,920 --> 00:22:59,639 Speaker 1: least in part AI generated, so you're not 362 00:22:59,680 --> 00:23:04,000 Speaker 1: caught by surprise when you're looking at Shutterstock's own library of 363 00:23:04,040 --> 00:23:07,959 Speaker 1: AI generated images. But this one didn't have that. It 364 00:23:08,000 --> 00:23:10,639 Speaker 1: appears to have been a case where someone was able 365 00:23:10,680 --> 00:23:15,159 Speaker 1: to slip in an AI generated image and Shutterstock didn't 366 00:23:15,480 --> 00:23:18,399 Speaker 1: detect that and ran it as a normal one. So 367 00:23:18,440 --> 00:23:20,679 Speaker 1: you could argue Disney didn't know, although a lot of 368 00:23:20,720 --> 00:23:24,040 Speaker 1: the reports I saw said, come on, they knew, because, 369 00:23:24,080 --> 00:23:27,879 Speaker 1: again, it had those telltale signs of generative AI. 370 00:23:28,720 --> 00:23:31,200 Speaker 1: All right, I've got a little bit more news for you, 371 00:23:31,320 --> 00:23:33,720 Speaker 1: but before we can get to that, let's take another 372 00:23:33,960 --> 00:23:47,000 Speaker 1: quick break to thank our sponsors. Okay, we're back. So 373 00:23:47,920 --> 00:23:52,480 Speaker 1: SpaceX's Starlink was never the only game in town when 374 00:23:52,520 --> 00:23:57,680 Speaker 1: it comes to satellite provided Internet connectivity.
In fact, satellite 375 00:23:57,880 --> 00:24:01,640 Speaker 1: connectivity has been around for quite some time, but SpaceX 376 00:24:01,960 --> 00:24:06,200 Speaker 1: and Starlink use fleets, constellations if you will, of 377 00:24:06,240 --> 00:24:09,320 Speaker 1: tiny satellites as a new way to achieve that outcome, 378 00:24:09,400 --> 00:24:15,480 Speaker 1: as opposed to having a satellite in essentially geostationary orbit. 379 00:24:16,160 --> 00:24:19,800 Speaker 1: You have this fleet of satellites, and as they pass overhead, 380 00:24:19,960 --> 00:24:23,680 Speaker 1: your antenna locks onto one, and then when it starts 381 00:24:23,680 --> 00:24:25,960 Speaker 1: to pass out of view, it locks onto the next one. 382 00:24:26,480 --> 00:24:32,320 Speaker 1: So that's kind of SpaceX's approach with Starlink. And now 383 00:24:32,520 --> 00:24:38,600 Speaker 1: Amazon has launched literally a competitor to SpaceX's Starlink service. 384 00:24:39,240 --> 00:24:43,000 Speaker 1: Amazon's version is called Kuiper, named after the Dutch astronomer 385 00:24:43,119 --> 00:24:47,200 Speaker 1: Gerard Kuiper, whose name also graces the Kuiper Belt. It's 386 00:24:47,240 --> 00:24:49,320 Speaker 1: a disc in our solar system made up of these 387 00:24:49,640 --> 00:24:54,719 Speaker 1: small bodies of mostly frozen volatiles like ammonia and methane. Anyway, 388 00:24:54,800 --> 00:24:58,680 Speaker 1: last Friday, an Atlas five launch vehicle carried a pair 389 00:24:58,880 --> 00:25:03,080 Speaker 1: of Amazon prototype Kuiper satellites in its payload and 390 00:25:03,160 --> 00:25:05,520 Speaker 1: delivered them to orbit, and Amazon said it was able 391 00:25:05,560 --> 00:25:07,720 Speaker 1: to make contact with both of them within an hour 392 00:25:07,920 --> 00:25:13,040 Speaker 1: of them having reached orbit.
Amazon's plan is to use 393 00:25:13,080 --> 00:25:17,199 Speaker 1: a constellation of more than three thousand of these satellites 394 00:25:17,440 --> 00:25:22,520 Speaker 1: to provide Internet communication links, similar to how SpaceX's Starlink 395 00:25:22,720 --> 00:25:26,240 Speaker 1: does it. Satellite connectivity is a huge help to 396 00:25:26,320 --> 00:25:29,320 Speaker 1: people in remote areas who might find 397 00:25:29,320 --> 00:25:34,320 Speaker 1: it difficult to get terrestrial Internet service, because ISPs are, 398 00:25:34,920 --> 00:25:41,160 Speaker 1: let us say, reluctant to provide infrastructure to remote areas, 399 00:25:42,280 --> 00:25:46,520 Speaker 1: so it is a real benefit. However, astronomers are really concerned 400 00:25:47,119 --> 00:25:50,520 Speaker 1: that as we fill out our orbits with small spacecraft, 401 00:25:50,520 --> 00:25:53,400 Speaker 1: it's going to make it harder to actually do astronomical 402 00:25:53,440 --> 00:25:57,480 Speaker 1: research here on Earth. It certainly also creates a lot 403 00:25:57,480 --> 00:26:01,360 Speaker 1: of potential space debris if it's not handled properly. One 404 00:26:01,400 --> 00:26:05,320 Speaker 1: thing I find really interesting is that this is explicitly 405 00:26:05,400 --> 00:26:09,240 Speaker 1: an Amazon project; it is not a Blue Origin project. 406 00:26:09,480 --> 00:26:14,360 Speaker 1: So if you recall, Jeff Bezos founded Amazon dot Com, 407 00:26:14,920 --> 00:26:18,679 Speaker 1: and while he no longer runs Amazon, he oversees the 408 00:26:18,760 --> 00:26:23,640 Speaker 1: private space company Blue Origin. Yet Kuiper remains an Amazon project, 409 00:26:23,760 --> 00:26:26,320 Speaker 1: not a Blue Origin project. So I just thought that 410 00:26:26,440 --> 00:26:30,000 Speaker 1: was kind of interesting.
The FTC recently released a report 411 00:26:30,320 --> 00:26:33,560 Speaker 1: that reveals victims have lost a total of around two 412 00:26:33,600 --> 00:26:38,680 Speaker 1: point seven billion dollars to scams on social media platforms 413 00:26:39,000 --> 00:26:42,800 Speaker 1: since twenty twenty one, so in two years, two point 414 00:26:42,920 --> 00:26:47,159 Speaker 1: seven billion dollars. A lot of the scams fall into 415 00:26:47,280 --> 00:26:51,600 Speaker 1: things like misleading ads for products. On the rare occasion 416 00:26:51,640 --> 00:26:54,040 Speaker 1: that I pop on over to Facebook, I see these 417 00:26:54,080 --> 00:26:56,960 Speaker 1: things all the time in my feed. I'll scroll down 418 00:26:57,240 --> 00:27:00,560 Speaker 1: and I'll see some ad that's claiming to be a 419 00:27:00,600 --> 00:27:03,720 Speaker 1: fire sale or a clearance sale of some sort. And 420 00:27:04,160 --> 00:27:06,440 Speaker 1: it's funny because if I go away and I come back, 421 00:27:06,480 --> 00:27:09,120 Speaker 1: I'll see an ad that uses the exact same image, 422 00:27:09,480 --> 00:27:13,520 Speaker 1: very similar verbiage, but it'll have a different business name 423 00:27:13,600 --> 00:27:17,560 Speaker 1: associated with it. And I remember seeing one specifically about 424 00:27:17,600 --> 00:27:21,679 Speaker 1: a going out of business sale for some vinyl record store, 425 00:27:22,160 --> 00:27:23,919 Speaker 1: and I actually did a search to see if the 426 00:27:24,000 --> 00:27:27,560 Speaker 1: store even existed. It doesn't, and then I saw the 427 00:27:27,600 --> 00:27:30,560 Speaker 1: exact same images and a very similar ad but with 428 00:27:30,640 --> 00:27:33,520 Speaker 1: a different company name, same thing. They all claim to 429 00:27:33,560 --> 00:27:37,119 Speaker 1: be selling stuff at absurdly low prices.
If you do 430 00:27:37,160 --> 00:27:41,080 Speaker 1: a little searching, you'll see that the claimed sale prices 431 00:27:41,160 --> 00:27:44,639 Speaker 1: are usually like a fraction of what the actual market 432 00:27:44,720 --> 00:27:48,440 Speaker 1: value is for whatever product is in question. And 433 00:27:49,240 --> 00:27:53,040 Speaker 1: a lot of these images are actually using pictures of 434 00:27:53,080 --> 00:27:56,520 Speaker 1: like artists' works, like one of a kind pieces that 435 00:27:56,600 --> 00:28:00,000 Speaker 1: an artist has produced, but the ad is showing it 436 00:28:00,080 --> 00:28:02,920 Speaker 1: off as if it were some sort of mass produced product. 437 00:28:03,520 --> 00:28:06,240 Speaker 1: If you actually were to order something from one of 438 00:28:06,280 --> 00:28:09,480 Speaker 1: those sites, it typically unfolds in one of two ways. 439 00:28:09,960 --> 00:28:13,080 Speaker 1: You either eventually get something that is a very cheap 440 00:28:13,240 --> 00:28:17,640 Speaker 1: knockoff of whatever the original item was, or you get 441 00:28:17,680 --> 00:28:19,880 Speaker 1: nothing at all, or maybe you get a box with 442 00:28:19,960 --> 00:28:23,920 Speaker 1: just some random junk in it. While these are some 443 00:28:24,000 --> 00:28:26,639 Speaker 1: of the most widespread scams, they tend to be on 444 00:28:26,680 --> 00:28:29,200 Speaker 1: the smaller scale when it comes to how much money 445 00:28:29,240 --> 00:28:33,000 Speaker 1: a person might actually lose personally, because it's normally not 446 00:28:33,280 --> 00:28:36,639 Speaker 1: that much, not in the grand scheme of things. For the 447 00:28:36,760 --> 00:28:39,640 Speaker 1: really big hits to the wallet, the scams that take 448 00:28:39,960 --> 00:28:43,240 Speaker 1: the most from victims, you have to turn to classic 449 00:28:43,320 --> 00:28:48,720 Speaker 1: investment scams.
I think the proliferation of cryptocurrency, of blockchain, 450 00:28:48,880 --> 00:28:51,160 Speaker 1: of NFTs, and all that kind of stuff along those 451 00:28:51,200 --> 00:28:55,760 Speaker 1: lines has really created the perfect environment for scams. Now, 452 00:28:55,880 --> 00:28:58,720 Speaker 1: to be clear, I'm not saying everything related to crypto 453 00:28:59,440 --> 00:29:03,360 Speaker 1: is outright a scam; at least I don't feel comfortable 454 00:29:03,400 --> 00:29:05,800 Speaker 1: saying that definitively. I think a lot of stuff with 455 00:29:05,840 --> 00:29:09,040 Speaker 1: crypto is a scam, and I definitely do not have 456 00:29:09,080 --> 00:29:12,200 Speaker 1: a high opinion of crypto in general. But what I 457 00:29:12,240 --> 00:29:16,480 Speaker 1: am saying is that crypto is kind of cryptic. It's 458 00:29:16,520 --> 00:29:19,719 Speaker 1: hard to understand. It's got a lot of elements to 459 00:29:19,760 --> 00:29:23,080 Speaker 1: it that are difficult to describe to someone who is 460 00:29:23,240 --> 00:29:27,080 Speaker 1: new to the concept, and as such, it is a 461 00:29:27,160 --> 00:29:30,320 Speaker 1: perfect opportunity for a con artist. Right? The scammer can 462 00:29:30,400 --> 00:29:33,080 Speaker 1: count on the fact that the target doesn't have a 463 00:29:33,120 --> 00:29:36,520 Speaker 1: full understanding of the subject matter, and by depending upon 464 00:29:37,080 --> 00:29:40,360 Speaker 1: base human traits like greed, the con man can set 465 00:29:40,440 --> 00:29:42,239 Speaker 1: up a trap that a lot of people are going 466 00:29:42,280 --> 00:29:46,160 Speaker 1: to fall into. Overall, the FTC's report reinforces something that 467 00:29:46,200 --> 00:29:48,760 Speaker 1: I think most of us already know, which is that 468 00:29:49,120 --> 00:29:53,120 Speaker 1: scams are abundant on social media.
So it really does 469 00:29:53,160 --> 00:29:56,000 Speaker 1: pay to be a critical thinker and to ask important questions, 470 00:29:56,040 --> 00:29:59,719 Speaker 1: even just to yourself, before you start sinking money into anything, 471 00:29:59,720 --> 00:30:03,320 Speaker 1: whether it's a product that seems to be priced way 472 00:30:03,360 --> 00:30:07,560 Speaker 1: below its value, or some investment opportunity or something along 473 00:30:07,600 --> 00:30:11,440 Speaker 1: those lines. Ask these questions, do some research. I've also 474 00:30:11,480 --> 00:30:13,600 Speaker 1: seen this, by the way, targeting people who are doing 475 00:30:13,640 --> 00:30:18,360 Speaker 1: things like job searching, which really grinds my gears, y'all, 476 00:30:18,400 --> 00:30:21,959 Speaker 1: because obviously anyone who's doing a job search is trying 477 00:30:22,000 --> 00:30:25,280 Speaker 1: to make an improvement in their life. They're actually making 478 00:30:25,840 --> 00:30:31,239 Speaker 1: a real effort, and that requires a lot of investment of 479 00:30:31,280 --> 00:30:33,920 Speaker 1: your time and energy, right? It's not easy to do. 480 00:30:34,120 --> 00:30:37,440 Speaker 1: Anyone who has actively gone into a job search knows 481 00:30:37,920 --> 00:30:41,400 Speaker 1: that it can be a grueling experience. So to prey upon 482 00:30:41,560 --> 00:30:44,720 Speaker 1: people who are in this vulnerable position and who likely 483 00:30:44,920 --> 00:30:48,880 Speaker 1: have the most to lose and can least afford to 484 00:30:48,920 --> 00:30:53,080 Speaker 1: lose any of it, it just really hits me 485 00:30:53,120 --> 00:30:57,640 Speaker 1: as being pure evil. It's hard to... I don't really 486 00:30:57,720 --> 00:30:59,840 Speaker 1: believe in evil per se, but I mean, that's about as close as 487 00:31:00,440 --> 00:31:03,800 Speaker 1: I can get to it.
Business Insider reports that Meta, 488 00:31:04,320 --> 00:31:06,560 Speaker 1: the company that owns Facebook and Instagram and all that, 489 00:31:07,000 --> 00:31:11,040 Speaker 1: has been hiring famous folks, particularly people who are famous 490 00:31:11,080 --> 00:31:15,000 Speaker 1: on the Internet, or celebrities as I like to call them, 491 00:31:15,360 --> 00:31:20,400 Speaker 1: to model for the company's AI assistants. So the idea 492 00:31:20,440 --> 00:31:23,360 Speaker 1: is that in the future, you'll be able to interact 493 00:31:23,440 --> 00:31:27,320 Speaker 1: with an AI assistant on Meta's various platforms, and the 494 00:31:27,360 --> 00:31:30,720 Speaker 1: assistant will be modeled after, I don't know, like mister Beast, 495 00:31:31,200 --> 00:31:34,200 Speaker 1: And then I assume mister Beast will sue you if 496 00:31:34,240 --> 00:31:37,479 Speaker 1: you ask him to recommend a good burger joint or something. Anyway, 497 00:31:37,480 --> 00:31:40,560 Speaker 1: Meta is also working with quote unquote real celebrities too, 498 00:31:40,840 --> 00:31:45,280 Speaker 1: like Tom Brady, and apparently agreeing to be part of 499 00:31:45,320 --> 00:31:49,560 Speaker 1: this ends up being some big money. The Information reports 500 00:31:49,600 --> 00:31:54,640 Speaker 1: that one creator earned five million dollars paid out over 501 00:31:54,720 --> 00:31:58,120 Speaker 1: two years for putting in the equivalent of six hours 502 00:31:58,160 --> 00:32:03,720 Speaker 1: of work in a studio. Five million dollars for six 503 00:32:03,760 --> 00:32:07,640 Speaker 1: hours of work. Meta, why haven't you reached out to me, 504 00:32:08,240 --> 00:32:11,920 Speaker 1: you know, beloved online personality Jonathan Strickland? I'm sure we 505 00:32:11,960 --> 00:32:15,200 Speaker 1: could cut a deal.
I'm sorry for all those times 506 00:32:15,240 --> 00:32:18,200 Speaker 1: I called you a predatory, dangerous company that trades upon 507 00:32:18,280 --> 00:32:21,960 Speaker 1: personal information to make heaps of cash while simultaneously making 508 00:32:22,160 --> 00:32:25,760 Speaker 1: us miserable. I'm sure we can come up with a 509 00:32:25,800 --> 00:32:28,440 Speaker 1: figure that will convince me to compromise my stance on 510 00:32:28,480 --> 00:32:31,680 Speaker 1: the matter. No, I'm just kidding. There's no such figure. 511 00:32:31,720 --> 00:32:34,400 Speaker 1: They couldn't offer me enough money anyway. As it stands, 512 00:32:34,640 --> 00:32:38,080 Speaker 1: the AI assistants are just text based for the moment, 513 00:32:38,160 --> 00:32:41,000 Speaker 1: so I guess they just text like Tom Brady and 514 00:32:41,080 --> 00:32:44,560 Speaker 1: mister Beast. But obviously the plan in the future is 515 00:32:44,560 --> 00:32:49,280 Speaker 1: to incorporate AI generated video assistants, probably animated ones if 516 00:32:49,280 --> 00:32:51,960 Speaker 1: I had to guess, complete with a voice, a 517 00:32:51,960 --> 00:32:55,320 Speaker 1: synthesized voice, that belongs to these, you know, famous people, 518 00:32:55,920 --> 00:32:57,720 Speaker 1: and y'all, I've got a lot of thoughts about this. 519 00:32:57,800 --> 00:33:00,880 Speaker 1: I'm actually concerned that it will create an even more 520 00:33:01,000 --> 00:33:08,160 Speaker 1: dangerous parasocial tendency. In general, we already see parasocial relationships 521 00:33:09,200 --> 00:33:13,440 Speaker 1: among fans and famous people. That never turns out well; 522 00:33:13,520 --> 00:33:17,200 Speaker 1: like, it's bad, it's not healthy, and I fear that 523 00:33:17,320 --> 00:33:20,360 Speaker 1: this is going to take it a step further. But then, 524 00:33:20,400 --> 00:33:22,600 Speaker 1: what do I know.
I'm not important enough to be bribed, 525 00:33:22,760 --> 00:33:28,840 Speaker 1: so who really cares what I think? Finally, I've got 526 00:33:28,880 --> 00:33:31,640 Speaker 1: a trio of articles to recommend to you all today. 527 00:33:32,200 --> 00:33:35,800 Speaker 1: So first up is a piece by Amanda Hoover. It's 528 00:33:35,800 --> 00:33:40,320 Speaker 1: over on Wired and it's titled New York's Airbnb Ban 529 00:33:40,640 --> 00:33:45,120 Speaker 1: Is Descending Into Pure Chaos. So, just for context, New 530 00:33:45,200 --> 00:33:47,880 Speaker 1: York City passed some rules to crack down on folks 531 00:33:48,080 --> 00:33:51,280 Speaker 1: who are renting out living space for short term stays 532 00:33:51,320 --> 00:33:54,400 Speaker 1: in the area, because this has contributed to a real 533 00:33:54,520 --> 00:33:58,800 Speaker 1: estate crisis in the city. Setting aside properties for visitors 534 00:33:59,040 --> 00:34:02,040 Speaker 1: means that New York City residents don't have a shot 535 00:34:02,080 --> 00:34:06,040 Speaker 1: at those homes, and there's a shortage of real estate 536 00:34:06,080 --> 00:34:09,200 Speaker 1: for people who actually need a place to live. So 537 00:34:09,239 --> 00:34:11,719 Speaker 1: the piece goes into detail about how the city's move 538 00:34:11,880 --> 00:34:15,520 Speaker 1: to really cut back on Airbnb's operation within New York 539 00:34:15,560 --> 00:34:19,920 Speaker 1: City has now precipitated an effort among former Airbnb hosts 540 00:34:20,200 --> 00:34:22,879 Speaker 1: to dive into a sort of black market for short 541 00:34:22,920 --> 00:34:26,600 Speaker 1: term visitors. It's really interesting. Next up, we have a 542 00:34:26,600 --> 00:34:30,680 Speaker 1: piece by Alex Koma of the Washington City Paper; that's 543 00:34:30,880 --> 00:34:35,600 Speaker 1: Koma for Alex's last name.
The article is titled The 544 00:34:35,680 --> 00:34:40,399 Speaker 1: Rent Is Too Damn Algorithmic, so it kind of dovetails 545 00:34:40,440 --> 00:34:42,600 Speaker 1: with what I was just talking about, in a way; 546 00:34:42,680 --> 00:34:46,640 Speaker 1: it's a related issue. This is about an attorney 547 00:34:46,680 --> 00:34:50,360 Speaker 1: general's fight with a company called RealPage. That company's 548 00:34:50,360 --> 00:34:54,680 Speaker 1: business is to help landlords set rent prices in various regions. 549 00:34:55,000 --> 00:34:59,840 Speaker 1: So the idea is that this algorithm helps landlords set 550 00:35:00,239 --> 00:35:04,719 Speaker 1: rent so that it is in a competitive range with 551 00:35:04,880 --> 00:35:08,839 Speaker 1: other landlords in the space. They don't price themselves out 552 00:35:08,840 --> 00:35:11,960 Speaker 1: of the market, and they don't underprice themselves to a 553 00:35:12,000 --> 00:35:16,319 Speaker 1: point where they're cutting into their revenue too much. But 554 00:35:17,520 --> 00:35:20,760 Speaker 1: the argument is that this has actually created an anti 555 00:35:20,800 --> 00:35:25,360 Speaker 1: competitive landscape, because all the landlords are using this particular tool, 556 00:35:26,120 --> 00:35:29,960 Speaker 1: and the algorithm essentially ends up just being a price 557 00:35:30,160 --> 00:35:35,080 Speaker 1: fixing tool where everybody has kind of agreed on a 558 00:35:35,200 --> 00:35:40,200 Speaker 1: very narrow range of rent for specific areas and specific properties, 559 00:35:40,760 --> 00:35:43,520 Speaker 1: and that means that there's no real competition going on, 560 00:35:44,360 --> 00:35:50,120 Speaker 1: and it essentially becomes collusion through algorithm. Now, the last 561 00:35:50,280 --> 00:35:53,239 Speaker 1: article that I have to recommend to you today is 562 00:35:53,280 --> 00:35:57,719 Speaker 1: from The Guardian.
It was written by Pramod Acharya and 563 00:35:57,920 --> 00:36:01,440 Speaker 1: Michael Hudson, and my apologies for my attempt to pronounce 564 00:36:01,480 --> 00:36:05,640 Speaker 1: that name, but it is titled Revealed: Amazon linked to 565 00:36:05,760 --> 00:36:09,600 Speaker 1: trafficking of workers in Saudi Arabia. This is a very 566 00:36:09,640 --> 00:36:13,239 Speaker 1: sobering report. There are a lot of horrific experiences and 567 00:36:13,320 --> 00:36:18,120 Speaker 1: tragic circumstances that are listed within this article. But I 568 00:36:18,120 --> 00:36:20,759 Speaker 1: think it's really important to read, because we need to 569 00:36:20,800 --> 00:36:24,640 Speaker 1: know the full impact that these huge tech companies can 570 00:36:24,719 --> 00:36:27,480 Speaker 1: have all over the world, not just on our own 571 00:36:27,520 --> 00:36:31,600 Speaker 1: personal lives, but how they're operating in other parts 572 00:36:31,640 --> 00:36:35,160 Speaker 1: of the world. How much of the company's success is 573 00:36:35,239 --> 00:36:39,680 Speaker 1: predicated upon human suffering? I think it's important to know that. 574 00:36:39,960 --> 00:36:42,040 Speaker 1: I think you need to know that so that 575 00:36:42,080 --> 00:36:45,879 Speaker 1: when you're making choices, you're making informed choices. And 576 00:36:46,280 --> 00:36:52,080 Speaker 1: if you're fine with making choices knowing these things, then 577 00:36:52,120 --> 00:36:54,640 Speaker 1: you know, that's your own personal decision. But at least 578 00:36:54,680 --> 00:36:59,120 Speaker 1: you're doing it with knowledge, as opposed to remaining blissfully 579 00:36:59,200 --> 00:37:05,960 Speaker 1: ignorant and potentially perpetuating harm through your choices. You know, 580 00:37:06,040 --> 00:37:10,359 Speaker 1: not on purpose, but effectively.
You know, purpose doesn't really 581 00:37:10,440 --> 00:37:13,839 Speaker 1: matter when you're talking about human suffering, right? You need 582 00:37:13,880 --> 00:37:17,680 Speaker 1: to really think about how to mitigate and 583 00:37:17,920 --> 00:37:22,879 Speaker 1: end that. So, sorry for my soapboxing, but it's 584 00:37:22,960 --> 00:37:25,319 Speaker 1: just that this stuff is important to me, and I think 585 00:37:25,360 --> 00:37:30,160 Speaker 1: that this article is a great revelation of how Amazon's 586 00:37:30,280 --> 00:37:34,680 Speaker 1: operations in this one particular part of the world really 587 00:37:34,960 --> 00:37:40,640 Speaker 1: lead to some awful conditions that need to be understood 588 00:37:40,719 --> 00:37:43,719 Speaker 1: and revealed, and companies need to be held accountable for 589 00:37:43,800 --> 00:37:48,840 Speaker 1: these kinds of operations. They need to actually take responsibility 590 00:37:48,880 --> 00:37:51,840 Speaker 1: for that, because otherwise, you know, we're just going to 591 00:37:51,880 --> 00:37:56,399 Speaker 1: see it perpetuated in Saudi Arabia and elsewhere, and that's 592 00:37:56,520 --> 00:38:00,480 Speaker 1: just really not acceptable, at least not in my opinion. Okay, 593 00:38:00,520 --> 00:38:05,080 Speaker 1: I'm done proselytizing. I hope y'all have a great 594 00:38:05,200 --> 00:38:14,719 Speaker 1: day, and I'll talk to you again really soon. Tech 595 00:38:14,800 --> 00:38:19,200 Speaker 1: Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, 596 00:38:19,520 --> 00:38:23,240 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you listen 597 00:38:23,239 --> 00:38:24,320 Speaker 1: to your favorite shows.