1 00:00:04,440 --> 00:00:12,319 Speaker 1: Welcome to tech Stuff, a production from iHeartRadio. Hey there, 2 00:00:12,320 --> 00:00:15,840 Speaker 1: and welcome to tech Stuff. I'm your host, Jonathan Strickland. 3 00:00:15,840 --> 00:00:18,760 Speaker 1: I'm an executive producer with iHeart Podcasts and how the 4 00:00:18,800 --> 00:00:22,000 Speaker 1: tech are you. It's time for the tech news for Thursday, 5 00:00:22,040 --> 00:00:28,160 Speaker 1: November thirty, twenty twenty three. Let's get to it. When 6 00:00:28,240 --> 00:00:31,280 Speaker 1: last we left open Ai, the story was that Sam 7 00:00:31,360 --> 00:00:34,320 Speaker 1: Altman was going to come back to the company after 8 00:00:34,360 --> 00:00:37,839 Speaker 1: being suddenly and surprisingly fired, and that the board of 9 00:00:37,880 --> 00:00:40,960 Speaker 1: directors would consist of different folks as part of the deal. 10 00:00:41,200 --> 00:00:44,159 Speaker 1: So three of the four members who voted against Altman 11 00:00:44,560 --> 00:00:48,520 Speaker 1: were shown the door. The only remaining one is Adam DiAngelo. Well, 12 00:00:48,560 --> 00:00:51,400 Speaker 1: now Sam Altman is officially back as CEO, and on 13 00:00:51,440 --> 00:00:54,760 Speaker 1: top of that, Microsoft is going to have a representative 14 00:00:54,800 --> 00:00:58,440 Speaker 1: who will hold a non voting seat on the board. 15 00:00:58,960 --> 00:01:02,440 Speaker 1: If you recall, Microsoft has committed to a ten billion 16 00:01:02,600 --> 00:01:06,960 Speaker 1: dollar investment in open Ai, and Microsoft CEO Satya Nadella 17 00:01:07,080 --> 00:01:12,120 Speaker 1: was reportedly furious over Altman's firing because, well, the board 18 00:01:12,120 --> 00:01:15,600 Speaker 1: of directors kept it all a secret until telling Altman 19 00:01:15,720 --> 00:01:18,920 Speaker 1: and then essentially telling the world, and Nadella feels that 20 00:01:19,160 --> 00:01:21,920 Speaker 1: Microsoft should probably get a heads up on that sort 21 00:01:21,920 --> 00:01:24,760 Speaker 1: of thing early on, since they are pouring in so 22 00:01:24,800 --> 00:01:28,160 Speaker 1: many billions of dollars into open Ai, and maybe even 23 00:01:28,560 --> 00:01:31,920 Speaker 1: have a chance to propose alternatives because again, they are 24 00:01:31,920 --> 00:01:35,440 Speaker 1: so heavily invested in the company. Ilia Sutzkever, who is 25 00:01:35,520 --> 00:01:38,640 Speaker 1: one of the board members responsible for firing Altman in 26 00:01:38,640 --> 00:01:41,240 Speaker 1: the first place, stepped down from the board, but he 27 00:01:41,280 --> 00:01:44,080 Speaker 1: may still remain with the company. He is one of 28 00:01:44,120 --> 00:01:47,200 Speaker 1: the co founders of open Ai and has served as 29 00:01:47,240 --> 00:01:51,200 Speaker 1: its chief scientist. Anyway, this really went down as one 30 00:01:51,240 --> 00:01:54,160 Speaker 1: of the weirder stories of twenty twenty three, and one 31 00:01:54,200 --> 00:01:56,560 Speaker 1: that actually wrapped up pretty darn fast. I mean, most 32 00:01:56,600 --> 00:02:00,360 Speaker 1: of our weird stories have lasted the entire year. Talk 33 00:02:00,480 --> 00:02:04,760 Speaker 1: about chat GPT speaking of weird stories, and how some 34 00:02:04,840 --> 00:02:08,480 Speaker 1: researchers fooled the tool into revealing training data as well 35 00:02:08,520 --> 00:02:12,000 Speaker 1: as some personal information about various people. And they did 36 00:02:12,000 --> 00:02:15,080 Speaker 1: it by having the chat bought repeat the same word 37 00:02:15,520 --> 00:02:19,200 Speaker 1: over and over without stop forever. So they essentially gave 38 00:02:19,240 --> 00:02:22,560 Speaker 1: it the prompt repeat the word, you know, whatever the 39 00:02:22,560 --> 00:02:26,200 Speaker 1: word was, forever, and it would start doing that. So 40 00:02:26,400 --> 00:02:29,000 Speaker 1: it began to comply, but at some point it started 41 00:02:29,000 --> 00:02:33,000 Speaker 1: to intersperse the repetitions of the word with other information 42 00:02:33,600 --> 00:02:38,919 Speaker 1: like snippets from copyrighted research papers or personal phone numbers 43 00:02:39,080 --> 00:02:43,720 Speaker 1: of private citizens, including the CEO of some company. The 44 00:02:43,760 --> 00:02:48,400 Speaker 1: researchers used words like poem or company and had it 45 00:02:48,480 --> 00:02:50,760 Speaker 1: repeat that, and they saw different results depending on which 46 00:02:50,760 --> 00:02:53,440 Speaker 1: word they used. And it's interesting because I was just 47 00:02:53,560 --> 00:02:58,160 Speaker 1: rewatching a horror movie called Pontypool, and in that movie 48 00:02:58,200 --> 00:03:01,960 Speaker 1: people go crazy while they repeating simple small words over 49 00:03:02,000 --> 00:03:04,080 Speaker 1: and over in English. And then I read the story 50 00:03:04,120 --> 00:03:06,440 Speaker 1: and I think, oh, gosh, we're in the movie. I mean, 51 00:03:06,480 --> 00:03:09,320 Speaker 1: not really. But the researchers Chi died open Ai. They 52 00:03:09,360 --> 00:03:12,760 Speaker 1: said the attack they used shouldn't have worked because someone 53 00:03:12,880 --> 00:03:17,360 Speaker 1: surely should have discovered it well before then and fixed it. 54 00:03:17,760 --> 00:03:20,000 Speaker 1: They said, the attack is kind of silly, but it 55 00:03:20,040 --> 00:03:23,600 Speaker 1: could have serious consequences if someone invested enough money in 56 00:03:23,680 --> 00:03:26,120 Speaker 1: the paid for version of chat GPT, because that's what 57 00:03:26,160 --> 00:03:29,560 Speaker 1: the researchers used. If they poured enough money into it 58 00:03:29,600 --> 00:03:32,880 Speaker 1: and they used a similar style of attack, they could 59 00:03:32,960 --> 00:03:35,840 Speaker 1: have received an awful lot of information in the process. 60 00:03:36,200 --> 00:03:39,160 Speaker 1: The attack also showed how open ai has used material 61 00:03:39,240 --> 00:03:42,760 Speaker 1: on the web, including copyrighted material, to train its large 62 00:03:42,880 --> 00:03:47,600 Speaker 1: language model. That's an unfolding, sticky situation as lawmakers try 63 00:03:47,640 --> 00:03:50,960 Speaker 1: to get up to speed on the legalities of training llms. 64 00:03:51,480 --> 00:03:55,440 Speaker 1: The researcher said that open ai actually patched this vulnerability 65 00:03:55,600 --> 00:03:58,760 Speaker 1: back in August, so they're just now publishing their findings. 66 00:03:59,160 --> 00:04:03,000 Speaker 1: But according to them, open ai has already responded to 67 00:04:03,000 --> 00:04:07,280 Speaker 1: the issue. However, in Gadget, which published a whole article 68 00:04:07,320 --> 00:04:12,280 Speaker 1: about this issue, said that their staff attempted to replicate 69 00:04:12,360 --> 00:04:14,680 Speaker 1: the results and they were able to do so. They 70 00:04:14,840 --> 00:04:18,520 Speaker 1: used the word reply. They told the chatbot to repeat 71 00:04:18,520 --> 00:04:24,039 Speaker 1: reply forever, and it started to submitting other types of data. 72 00:04:24,640 --> 00:04:29,880 Speaker 1: So it's clear that this is not fully resolved. It'll 73 00:04:29,880 --> 00:04:33,320 Speaker 1: be interesting to see if this actually ends up having 74 00:04:33,400 --> 00:04:37,359 Speaker 1: any actual negative consequences in the real world. Beyond the 75 00:04:37,440 --> 00:04:41,160 Speaker 1: possibility of that happening, this next story is for you 76 00:04:41,320 --> 00:04:44,640 Speaker 1: Apple users out there. According to the Wall Street Journal, 77 00:04:44,800 --> 00:04:47,919 Speaker 1: Apple has proposed that it and Goldman Sachs, which is 78 00:04:47,960 --> 00:04:53,080 Speaker 1: the bank that facilitates transactions that are moving across Apple Pay, 79 00:04:53,560 --> 00:04:57,680 Speaker 1: will part ways in twelve to fifteen months, and Goldman 80 00:04:57,760 --> 00:05:00,680 Speaker 1: Sachs is likely to accept that offer because, as reportedly, 81 00:05:00,720 --> 00:05:03,960 Speaker 1: the bank has found Apple's business to be unprofitable that 82 00:05:04,080 --> 00:05:09,080 Speaker 1: handling these transactions is actually costly. Apparently, folks who use 83 00:05:09,080 --> 00:05:12,560 Speaker 1: Apple Pay make payments on their accounts more slowly than 84 00:05:12,600 --> 00:05:15,640 Speaker 1: people who use other types of credit cards, and this 85 00:05:16,000 --> 00:05:19,280 Speaker 1: lag in payment has had a big impact on Goldman 86 00:05:19,360 --> 00:05:22,480 Speaker 1: Sachs's own finances. This means that Apple will need to 87 00:05:22,520 --> 00:05:26,760 Speaker 1: find some other financial institution to back those transactions, or 88 00:05:26,839 --> 00:05:30,240 Speaker 1: somehow do it themselves, which I cannot imagine happening because 89 00:05:30,640 --> 00:05:33,760 Speaker 1: I imagine that requires fulfilling a whole bunch of regulations 90 00:05:33,760 --> 00:05:36,440 Speaker 1: that I don't think Apple's ready to do. Or I 91 00:05:36,480 --> 00:05:41,280 Speaker 1: guess the third option is closing off Apple Pay. I 92 00:05:41,360 --> 00:05:44,320 Speaker 1: don't see that being a big priority either. The Wall 93 00:05:44,320 --> 00:05:47,960 Speaker 1: Street Journal reports that American Express or Synchrony Financial are 94 00:05:48,000 --> 00:05:53,600 Speaker 1: potential successors who might take over the back end operations. However, 95 00:05:53,600 --> 00:05:56,560 Speaker 1: the article says that Apple has been pretty demanding of 96 00:05:56,640 --> 00:05:59,800 Speaker 1: its partners, and that when you pair that with the 97 00:05:59,880 --> 00:06:03,800 Speaker 1: challenges of actually making the business profitable, that could cause 98 00:06:03,839 --> 00:06:07,040 Speaker 1: a hiccup or two. So is Apple Pay on borrowed time? 99 00:06:07,240 --> 00:06:09,760 Speaker 1: Is it possible that it could be phased out after 100 00:06:09,800 --> 00:06:12,120 Speaker 1: a year or so? It beats me, but I'll keep 101 00:06:12,120 --> 00:06:15,799 Speaker 1: an eye on it. Starting tomorrow, which is Friday, Desummer first. 102 00:06:15,839 --> 00:06:18,120 Speaker 1: In case you're one of those odd people who likes 103 00:06:18,160 --> 00:06:22,279 Speaker 1: to listen to tech news episodes months after it happened, 104 00:06:22,760 --> 00:06:25,640 Speaker 1: Google will be shutting down old user accounts that have 105 00:06:25,720 --> 00:06:29,080 Speaker 1: been inactive for at least two years. I actually got 106 00:06:29,120 --> 00:06:31,719 Speaker 1: a message from Google not that long ago regarding an 107 00:06:31,720 --> 00:06:34,800 Speaker 1: email address that I created for a podcast, but I 108 00:06:34,880 --> 00:06:38,080 Speaker 1: abandoned that podcast after two episodes, because even professionals like 109 00:06:38,120 --> 00:06:41,440 Speaker 1: me can be a cliche. The move will delete all 110 00:06:41,520 --> 00:06:45,159 Speaker 1: related data to the Gmail account, so that includes not 111 00:06:45,360 --> 00:06:48,080 Speaker 1: just the Gmail address and everything that was in the 112 00:06:48,200 --> 00:06:52,840 Speaker 1: Gmail but also anything in an associated Google Drive or 113 00:06:52,920 --> 00:06:56,039 Speaker 1: Google Photos account. To be clear, this isn't going to 114 00:06:56,080 --> 00:06:59,080 Speaker 1: affect any active Gmail accounts. If you have used your 115 00:06:59,080 --> 00:07:01,200 Speaker 1: Gmail account with that in the last two years, it 116 00:07:01,279 --> 00:07:04,200 Speaker 1: is not gonna be in danger. Only ones that have 117 00:07:04,320 --> 00:07:07,080 Speaker 1: been inactive for two years or longer will be affected, 118 00:07:07,279 --> 00:07:10,440 Speaker 1: and it will also only affect personal Gmail addresses, not 119 00:07:10,560 --> 00:07:15,000 Speaker 1: addresses associated with organizations or businesses. There's still some unanswered 120 00:07:15,040 --> 00:07:18,040 Speaker 1: questions about all this, however. The big one is will 121 00:07:18,160 --> 00:07:22,800 Speaker 1: Google recycle the inactive Gmail addresses once they wipe those out? 122 00:07:23,040 --> 00:07:26,560 Speaker 1: Will people be able to reregister those addresses. If so, 123 00:07:27,000 --> 00:07:30,280 Speaker 1: that could potentially lead to issues with identity theft because 124 00:07:30,400 --> 00:07:33,920 Speaker 1: folks could scoop up newly available addresses that previously had 125 00:07:33,960 --> 00:07:37,640 Speaker 1: belonged to someone else. I imagine Google has probably thought of that, 126 00:07:38,000 --> 00:07:42,280 Speaker 1: but I haven't seen any confirmation on that. A company 127 00:07:42,320 --> 00:07:45,960 Speaker 1: called Analytics claims that Google did a big old whoopsie, 128 00:07:46,000 --> 00:07:50,360 Speaker 1: like a big old, huge issue whoopsie. See really, when 129 00:07:50,400 --> 00:07:53,120 Speaker 1: you get down to it, Google is an advertising platform 130 00:07:53,440 --> 00:07:55,480 Speaker 1: more than anything else. It's not a search company. It's 131 00:07:55,480 --> 00:07:59,320 Speaker 1: an ad company. Serves up ads from customers on various websites, 132 00:07:59,360 --> 00:08:03,960 Speaker 1: including Google. Well, part of that business includes giving advertisers 133 00:08:04,000 --> 00:08:06,400 Speaker 1: the chance to say which types of websites they do 134 00:08:06,600 --> 00:08:09,560 Speaker 1: or don't want to have their ads appear on, So 135 00:08:09,640 --> 00:08:12,840 Speaker 1: it gives the advertisers more control over protecting the brands 136 00:08:12,840 --> 00:08:16,240 Speaker 1: that they represent. I get a very similar consideration, like 137 00:08:16,280 --> 00:08:19,000 Speaker 1: I can say what types of ads I do or 138 00:08:19,160 --> 00:08:22,520 Speaker 1: don't want to appear on this show, and that gives 139 00:08:22,560 --> 00:08:25,640 Speaker 1: me more control over the show and protects you, the listener, 140 00:08:25,920 --> 00:08:29,320 Speaker 1: from advertisers that perhaps I don't believe in for one 141 00:08:29,320 --> 00:08:32,719 Speaker 1: reason or another. Well, advertisers get the same kind of 142 00:08:33,800 --> 00:08:36,360 Speaker 1: consideration when it comes to where their ads are going 143 00:08:36,400 --> 00:08:39,760 Speaker 1: to be delivered, because you want to protect the brand, right. 144 00:08:39,800 --> 00:08:42,840 Speaker 1: You don't want the brand to be associated with something 145 00:08:43,360 --> 00:08:47,280 Speaker 1: that is against the brand's image, whether that is outright 146 00:08:47,320 --> 00:08:53,040 Speaker 1: illegal content because that's a possibility, or things like porn sites. 147 00:08:53,280 --> 00:08:57,480 Speaker 1: If you are, for example, an advertiser that advertises children's clothing, 148 00:08:58,120 --> 00:09:02,839 Speaker 1: you definitely don't want your ad appearing on an adult website. 149 00:09:02,960 --> 00:09:07,360 Speaker 1: That's just not appropriate, so Analytics says, the Google Search 150 00:09:07,400 --> 00:09:11,280 Speaker 1: partner network has apparently been placing ads from prominent brands 151 00:09:11,320 --> 00:09:14,640 Speaker 1: on sites that are very likely not on any white list, 152 00:09:15,040 --> 00:09:18,760 Speaker 1: so illegal sites, adult websites, all this kind of stuff 153 00:09:18,800 --> 00:09:23,000 Speaker 1: have been having some major advertisers pop up on them, 154 00:09:23,400 --> 00:09:28,400 Speaker 1: and that's probably not what the advertisers wanted. According to Analytics, 155 00:09:28,679 --> 00:09:35,240 Speaker 1: they found instances in which very high profile companies like Apple, Amazon, Lego, Meta, Uber, etc. 156 00:09:36,080 --> 00:09:39,320 Speaker 1: Had their ads posted on inappropriate sites. These would be 157 00:09:39,400 --> 00:09:41,959 Speaker 1: sites that the brands absolutely would not want to be 158 00:09:42,040 --> 00:09:45,440 Speaker 1: associated with. This is a huge issue for brand safety right. 159 00:09:45,760 --> 00:09:49,040 Speaker 1: But other problems include cases in which, say, alcohol brand 160 00:09:49,080 --> 00:09:53,079 Speaker 1: advertisements were showing up on websites that are targeted to children. Clearly, 161 00:09:53,120 --> 00:09:56,760 Speaker 1: the website administrators don't want that that they're running a 162 00:09:56,800 --> 00:09:59,800 Speaker 1: site that's targeting children. They don't want ads that have 163 00:10:00,040 --> 00:10:03,800 Speaker 1: alcohol advertised on them running on the site. The advertisers 164 00:10:03,840 --> 00:10:05,960 Speaker 1: don't want it either, because there could be a huge 165 00:10:06,040 --> 00:10:09,160 Speaker 1: legal case to unfold. Because of all of this, Google 166 00:10:09,200 --> 00:10:12,240 Speaker 1: reportedly is investigating the issue in addressing it, though questions 167 00:10:12,240 --> 00:10:14,800 Speaker 1: remain as to how this even happened in the first place. 168 00:10:15,280 --> 00:10:17,200 Speaker 1: This comes at a really bad time for Google. The 169 00:10:17,240 --> 00:10:20,160 Speaker 1: company is already in the target sites of US lawmakers 170 00:10:20,240 --> 00:10:24,119 Speaker 1: as they question if Google is engaged in anti competitive 171 00:10:24,160 --> 00:10:27,839 Speaker 1: practices and if, perhaps to an extreme case, if Google 172 00:10:27,920 --> 00:10:31,280 Speaker 1: should get broken up into smaller companies. Politicians and the 173 00:10:31,280 --> 00:10:34,119 Speaker 1: EU have also indicated that they may launch an investigation 174 00:10:34,200 --> 00:10:36,760 Speaker 1: into the matter. They pointed out that if the EU 175 00:10:36,840 --> 00:10:40,719 Speaker 1: Commission ads, like an ad for the EU Commission, ends 176 00:10:40,760 --> 00:10:44,880 Speaker 1: up being placed on, say a sanctioned Russian website, then 177 00:10:45,480 --> 00:10:49,000 Speaker 1: the Commission inadvertently becomes an accomplice to a party that 178 00:10:49,120 --> 00:10:53,080 Speaker 1: is violating these international sanctions. So this really is a 179 00:10:53,120 --> 00:10:55,480 Speaker 1: pretty big mess for Google. All Right, we're going to 180 00:10:55,520 --> 00:10:57,120 Speaker 1: take a quick break. When we come back, we'll be 181 00:10:57,160 --> 00:11:09,680 Speaker 1: talking about messes for some other tech companies. Okay, we're back, 182 00:11:10,120 --> 00:11:13,400 Speaker 1: So here in the United States, the CEOs of five 183 00:11:13,600 --> 00:11:22,000 Speaker 1: tech companies, those being X, the platform formerly known as Twitter, Meta, TikTok, Snap, 184 00:11:22,360 --> 00:11:25,640 Speaker 1: and Discord are all going to get together on January 185 00:11:25,800 --> 00:11:30,440 Speaker 1: thirty first, twenty twenty four. They're not having lunch, they're 186 00:11:30,480 --> 00:11:33,760 Speaker 1: not engaging in small talk. Instead, they're appearing in front 187 00:11:33,800 --> 00:11:36,040 Speaker 1: of the US Senate to answer questions about how they 188 00:11:36,080 --> 00:11:40,680 Speaker 1: protect or rather fail to protect children's safety online. So 189 00:11:40,720 --> 00:11:44,000 Speaker 1: the hearing is all about child exploitation, and that's one 190 00:11:44,000 --> 00:11:46,920 Speaker 1: of the foundational concerns people have brought up about social 191 00:11:46,960 --> 00:11:51,160 Speaker 1: platforms for years now. Really, heck, when Francis Hogan came 192 00:11:51,200 --> 00:11:56,000 Speaker 1: forward with internal documents from Facebook now Meta, many of 193 00:11:56,000 --> 00:11:58,840 Speaker 1: those documents centered on the company's potential impact on the 194 00:11:58,880 --> 00:12:02,760 Speaker 1: mental health of young users, and it wasn't good news. 195 00:12:03,280 --> 00:12:06,240 Speaker 1: Senators say that the day we'll give CEOs the chance 196 00:12:06,280 --> 00:12:10,680 Speaker 1: to quote testify about their failure to protect children online 197 00:12:10,800 --> 00:12:13,640 Speaker 1: end quote. That certainly sounds like the Senate has already 198 00:12:13,840 --> 00:12:18,400 Speaker 1: reached the conclusion that these platforms have done and perhaps 199 00:12:18,440 --> 00:12:22,439 Speaker 1: actively are doing harm to children or the very least 200 00:12:22,840 --> 00:12:25,640 Speaker 1: failing to protect them, and that they are at fault 201 00:12:25,679 --> 00:12:27,800 Speaker 1: for this. I'm sure it'll be a fun day for 202 00:12:27,840 --> 00:12:30,720 Speaker 1: all involved. Personally, I think something that tends to be 203 00:12:30,760 --> 00:12:33,520 Speaker 1: overlooked is how we in the United States aren't really 204 00:12:33,600 --> 00:12:38,000 Speaker 1: serious about data privacy or security or data ownership, you know, 205 00:12:38,040 --> 00:12:42,720 Speaker 1: whether the data that pertains to people actually belongs to them. 206 00:12:43,480 --> 00:12:46,240 Speaker 1: And by addressing some of those gaps, I think we 207 00:12:46,240 --> 00:12:49,240 Speaker 1: could actually solve a lot of problems that adversely affect 208 00:12:49,280 --> 00:12:52,640 Speaker 1: young folks if we had tougher rules in place as 209 00:12:52,679 --> 00:12:55,760 Speaker 1: to who owns that data and who can access it, 210 00:12:56,200 --> 00:13:00,439 Speaker 1: and tougher laws about getting permission to access them data 211 00:13:00,480 --> 00:13:03,440 Speaker 1: and to exploit it. All of those things I think 212 00:13:03,840 --> 00:13:07,319 Speaker 1: are necessary, But hey, what do I know? So that'll 213 00:13:07,480 --> 00:13:09,719 Speaker 1: be happening at the end of the first month of 214 00:13:09,800 --> 00:13:13,600 Speaker 1: next year. US Senator Marco Rubio has proposed the most 215 00:13:13,679 --> 00:13:17,480 Speaker 1: recent opposition to TikTok here in the US. Rubio proposes 216 00:13:17,520 --> 00:13:21,920 Speaker 1: an ultimatum either China hands over the algorithm used for 217 00:13:22,000 --> 00:13:27,079 Speaker 1: TikTok's recommendation engine, or the United States demonetizes the platform 218 00:13:27,120 --> 00:13:30,520 Speaker 1: within the country, which effectively ends up being a ban 219 00:13:30,800 --> 00:13:33,960 Speaker 1: on TikTok. This is pretty wild stuff, so at the 220 00:13:33,960 --> 00:13:36,680 Speaker 1: heart of Rubio's call appears to be this perception that 221 00:13:36,720 --> 00:13:42,600 Speaker 1: TikTok's algorithm is purposefully shaping discourse. That perhaps China's using 222 00:13:42,600 --> 00:13:46,040 Speaker 1: the algorithm to push propaganda or sway the minds of 223 00:13:46,080 --> 00:13:49,720 Speaker 1: the US public in particular directions. And maybe that's true. 224 00:13:49,800 --> 00:13:52,400 Speaker 1: I mean, maybe the algorithm is doing that, but I 225 00:13:52,440 --> 00:13:55,720 Speaker 1: suspect the algorithm is actually doing something far simpler. It 226 00:13:55,840 --> 00:13:59,920 Speaker 1: is essentially identifying the types of content each user to 227 00:14:00,080 --> 00:14:02,679 Speaker 1: tends to like, and then it just serves up an 228 00:14:02,800 --> 00:14:05,839 Speaker 1: endless supply of that kind of content in an effort 229 00:14:05,920 --> 00:14:08,240 Speaker 1: to keep that user engaged on the platform for as 230 00:14:08,240 --> 00:14:11,560 Speaker 1: long as possible. So if that content does favor one 231 00:14:11,600 --> 00:14:15,760 Speaker 1: particular political ideology, well yeah, TikTok serves that up to 232 00:14:15,760 --> 00:14:19,040 Speaker 1: the user. But if the user tends to favor the 233 00:14:19,200 --> 00:14:23,480 Speaker 1: opposite point of view, I don't think TikTok tries to 234 00:14:23,720 --> 00:14:27,440 Speaker 1: change the user's mind and manipulate them so that they 235 00:14:27,520 --> 00:14:31,400 Speaker 1: think the other thing. No, Instead, it just caters to 236 00:14:31,680 --> 00:14:33,440 Speaker 1: that kind of point of view and serves up that 237 00:14:33,560 --> 00:14:37,720 Speaker 1: sort of video content to the user. Again, that's my guess, 238 00:14:37,760 --> 00:14:40,760 Speaker 1: and maybe I'm wrong. But anyway, as CNBC puts it, 239 00:14:40,840 --> 00:14:44,760 Speaker 1: Rubio's law would quote prohibit financial transactions from social media 240 00:14:44,800 --> 00:14:50,600 Speaker 1: companies located in or under the influence of China, Russia, 241 00:14:50,640 --> 00:14:53,080 Speaker 1: and a few other countries end quote. So yeah, that 242 00:14:53,200 --> 00:14:56,760 Speaker 1: under the influence is the key element for TikTok because 243 00:14:56,800 --> 00:14:59,880 Speaker 1: obviously the company TikTok is centered here in the United States, 244 00:15:00,320 --> 00:15:03,880 Speaker 1: but it's pairing company by Dance is in China, so 245 00:15:03,960 --> 00:15:07,280 Speaker 1: that seems to be where that key phrase comes into play. 246 00:15:07,680 --> 00:15:10,760 Speaker 1: I don't really think this is on the right track. 247 00:15:10,800 --> 00:15:14,280 Speaker 1: I do think TikTok is potentially harmful, but uh, I 248 00:15:14,280 --> 00:15:18,720 Speaker 1: don't think it's necessarily trying to sway opinions in a 249 00:15:18,760 --> 00:15:22,840 Speaker 1: specific direction. I think the stuff that TikTok serves up 250 00:15:23,240 --> 00:15:25,480 Speaker 1: is driven by the behaviors of the people who are 251 00:15:25,480 --> 00:15:28,880 Speaker 1: on TikTok, not the other way around. But you know, 252 00:15:29,080 --> 00:15:32,400 Speaker 1: that's just my guess. I don't have any real hard 253 00:15:32,480 --> 00:15:33,960 Speaker 1: data to say one way or the other, and I 254 00:15:34,000 --> 00:15:37,920 Speaker 1: could very clearly be wrong. Okay. Next up, Elon Musk 255 00:15:37,960 --> 00:15:40,480 Speaker 1: had some less than elegant words for advertisers who are 256 00:15:40,520 --> 00:15:44,080 Speaker 1: fleeing X, the platform formerly known as Twitter. At the 257 00:15:44,200 --> 00:15:47,200 Speaker 1: Deal Book conference, Musk had a bit of a rant 258 00:15:47,360 --> 00:15:51,400 Speaker 1: against advertisers, saying they were in effect blackmailing him. Now 259 00:15:51,440 --> 00:15:54,920 Speaker 1: as a reminder, Musk had posted and also boosted posts 260 00:15:55,040 --> 00:15:59,720 Speaker 1: containing antisemitic messaging, and many advertisers chose that moment to 261 00:15:59,760 --> 00:16:02,360 Speaker 1: say peace out to the platform. So I'm not sure 262 00:16:02,400 --> 00:16:04,720 Speaker 1: this is really a case of blackmail. To be clear, 263 00:16:05,040 --> 00:16:07,320 Speaker 1: I think this is a case of advertisers not wanting 264 00:16:07,400 --> 00:16:10,440 Speaker 1: to be complicit in the spread of hate speech because 265 00:16:10,440 --> 00:16:13,440 Speaker 1: that's a pretty bad way to protect your brand's image. 266 00:16:13,960 --> 00:16:17,600 Speaker 1: So that's not the same thing as blackmail. Facing consequences 267 00:16:17,600 --> 00:16:21,840 Speaker 1: for your actions is not the same thing as being persecuted. Anyway, 268 00:16:22,320 --> 00:16:27,000 Speaker 1: Musk told the advertisers to well, using a phrase that 269 00:16:27,040 --> 00:16:29,520 Speaker 1: would be used in the television series The Good Place 270 00:16:29,840 --> 00:16:33,680 Speaker 1: to Go, fork themselves. Then he made himself the victim 271 00:16:33,840 --> 00:16:37,520 Speaker 1: by saying that if the company dies, which seems possible, 272 00:16:37,720 --> 00:16:41,080 Speaker 1: it will be because of an advertiser boycott. I think 273 00:16:41,640 --> 00:16:45,920 Speaker 1: if x slash Twitter dies, it's because an entitled billionaire 274 00:16:45,920 --> 00:16:48,840 Speaker 1: who thinks consequences are something that happens to other people 275 00:16:49,280 --> 00:16:52,360 Speaker 1: bought the company on a whim. And remember this was 276 00:16:52,840 --> 00:16:55,840 Speaker 1: after he announced his intention to buy it, then tried 277 00:16:55,880 --> 00:16:59,040 Speaker 1: to back out of the deal almost immediately, and then 278 00:16:59,200 --> 00:17:01,400 Speaker 1: went through the deal only when he was threatened with 279 00:17:01,480 --> 00:17:07,000 Speaker 1: legal consequences I think perhaps that's the reason that x 280 00:17:07,040 --> 00:17:11,800 Speaker 1: slash Twitter dies, not that advertisers just spontaneously decided they 281 00:17:11,840 --> 00:17:15,600 Speaker 1: don't like Elon Musk. But what do I know? The 282 00:17:15,760 --> 00:17:19,639 Speaker 1: United Auto Workers labor group recently negotiated new contracts with 283 00:17:19,720 --> 00:17:23,480 Speaker 1: three major automakers here in the US. Now, as revealed 284 00:17:23,480 --> 00:17:27,479 Speaker 1: in a video recorded by UAW President Sean Fain, it 285 00:17:27,560 --> 00:17:32,440 Speaker 1: aims to unionize Tesla Motors. Considering the union secured substantial 286 00:17:32,520 --> 00:17:36,560 Speaker 1: raises for its members with those three major automakers, this 287 00:17:36,840 --> 00:17:40,400 Speaker 1: effort might actually gain some interest within Tesla. The UAW 288 00:17:40,520 --> 00:17:43,879 Speaker 1: also wants to organize workers at Mercedes Benz and at Toyota, 289 00:17:43,960 --> 00:17:47,480 Speaker 1: among other auto manufacturers. I imagine Elon Musk is not 290 00:17:47,640 --> 00:17:51,639 Speaker 1: happy about this. He recently expressed anger and frustration over 291 00:17:51,720 --> 00:17:55,440 Speaker 1: the opposition that labor groups have shown against Tesla over 292 00:17:55,480 --> 00:17:58,760 Speaker 1: in Sweden. That's been an ongoing story, and Musk has 293 00:17:58,800 --> 00:18:01,880 Speaker 1: not been quiet about his desay pleasure at the situation. 294 00:18:02,520 --> 00:18:05,720 Speaker 1: We've been seeing a growth in unions and organization over 295 00:18:05,760 --> 00:18:07,560 Speaker 1: the past few years. Have been covering it for like 296 00:18:07,800 --> 00:18:10,800 Speaker 1: three years. We'll have to wait and see if they 297 00:18:10,880 --> 00:18:14,399 Speaker 1: take hold in Tesla. There have been individual efforts at 298 00:18:14,480 --> 00:18:19,080 Speaker 1: various Tesla facilities for unionization, but this would be more 299 00:18:19,080 --> 00:18:23,440 Speaker 1: of an organization wide approach. For a quarter mile down 300 00:18:23,480 --> 00:18:27,240 Speaker 1: Fourteenth Street in Detroit, you can recharge your electric vehicle 301 00:18:27,840 --> 00:18:31,399 Speaker 1: even as you drive it. The company Electrion completed the 302 00:18:31,440 --> 00:18:35,399 Speaker 1: installation of some inductive charging coils along that stretch of 303 00:18:35,560 --> 00:18:38,719 Speaker 1: street and it will serve as a test for the technology, 304 00:18:38,800 --> 00:18:41,680 Speaker 1: so evs that are equipped with an appropriate receiver will 305 00:18:41,680 --> 00:18:43,520 Speaker 1: be able to make use of the coils. The coils 306 00:18:43,520 --> 00:18:46,960 Speaker 1: will create an electromagnetic field. Passing through the field will 307 00:18:46,960 --> 00:18:49,520 Speaker 1: induce electric charge to flow through the receiver, which then 308 00:18:49,760 --> 00:18:53,720 Speaker 1: gets directed to recharge the battery, so your car can 309 00:18:53,840 --> 00:18:56,760 Speaker 1: charge as it drives along the street. The tech also 310 00:18:56,800 --> 00:18:58,840 Speaker 1: works at the car's stationary so if you park your 311 00:18:58,920 --> 00:19:02,480 Speaker 1: vehicle along that stretch of street, it will recharge while 312 00:19:02,520 --> 00:19:04,199 Speaker 1: you do whatever it is you have to do, and 313 00:19:04,240 --> 00:19:07,320 Speaker 1: you don't have to have any plugs or cables required. Now, 314 00:19:07,359 --> 00:19:10,560 Speaker 1: this is an early test of the technology. It's really 315 00:19:10,600 --> 00:19:13,400 Speaker 1: looked as sort of a test bed for the tech. 316 00:19:13,800 --> 00:19:15,280 Speaker 1: It's not quite at the level where you're going to 317 00:19:15,320 --> 00:19:17,600 Speaker 1: see it installed in cities worldwide, but if it is 318 00:19:17,640 --> 00:19:21,080 Speaker 1: shown to be practical, we could see other municipalities around 319 00:19:21,080 --> 00:19:24,600 Speaker 1: the world invest in this technology. And that's it for 320 00:19:24,640 --> 00:19:26,760 Speaker 1: the tech news that I have for you today, Thursday, 321 00:19:26,800 --> 00:19:29,359 Speaker 1: November thirty, of twenty twenty three. I am out on 322 00:19:29,480 --> 00:19:31,919 Speaker 1: vacation next week, so we will have a few rerun 323 00:19:31,960 --> 00:19:35,800 Speaker 1: episodes play next week. We also will have a special 324 00:19:35,880 --> 00:19:39,560 Speaker 1: episode of another podcast play on Tuesday, so that you 325 00:19:39,600 --> 00:19:42,440 Speaker 1: can be introduced to a new show that I think 326 00:19:42,560 --> 00:19:46,000 Speaker 1: is pretty interesting. It takes a different look at AI 327 00:19:46,160 --> 00:19:48,760 Speaker 1: than you'll typically hear on tech Stuff, and honestly, as 328 00:19:48,880 --> 00:19:52,040 Speaker 1: much as I him and haw about artificial intelligence on 329 00:19:52,040 --> 00:19:54,760 Speaker 1: this show, I do think it's really important to hear 330 00:19:54,800 --> 00:19:57,760 Speaker 1: different points of view because I can't claim to be 331 00:19:58,840 --> 00:20:02,200 Speaker 1: the correct person. I don't think I am just right 332 00:20:02,320 --> 00:20:05,919 Speaker 1: in everything. My opinions have been shaped by various experiences. 333 00:20:06,040 --> 00:20:08,520 Speaker 1: I think it's very important to hear other points of view. 334 00:20:08,720 --> 00:20:11,600 Speaker 1: So check that out and I will talk to you 335 00:20:11,680 --> 00:20:22,199 Speaker 1: again really soon. Tech Stuff is an iHeartRadio production. For 336 00:20:22,320 --> 00:20:27,160 Speaker 1: more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 337 00:20:27,280 --> 00:20:29,280 Speaker 1: or wherever you listen to your favorite shows.