Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Thursday, May the fourth be with you. Yeah, big Star Wars nerds over here at iHeart, as you might imagine. So we've got a whole email chain with podcasters you're probably familiar with, and a lot of producers that you may have heard of but, you know, aren't always on mic, sharing hilarious Star Wars related photographs. I wish I could share them with you, but instead we're gonna go through a whole bunch of tech news.

Speaker 1: Now, on Tuesday I talked about how the Writers Guild of America, or WGA, which represents writers for TV and film here in the United States, is now on strike. The negotiations between the WGA and various Hollywood studios could not reach a satisfactory agreement, so here we are. And I also talked about how one of the concerns that the WGA has, and one of the reasons that I'm covering the story on TechStuff at all, has to do with generative AI. Now I've got a little bit more information thanks to Vice, which has an article titled "GPT-4 Can't Replace Striking TV Writers, But Studios Are Going to Try." A sidebar: Vice is trying to avoid bankruptcy right now, and that company actually has some really great investigative journalism across all brackets of news. So those of y'all who don't use ad blockers on your browsers might want to head over to Vice and scan the articles. I have no connection to Vice whatsoever. I just hate to see a company that has really pushed for some great journalism, and I mean, there's other stuff on there too, but there's some really great journalism on there, and I hate to see them in trouble.
Speaker 1: Anyway, back to the story. Vice's Chloe Xiang explains that one of the WGA's demands was for studios to promise that AI would not become part of the creative experience for film and television, that AI would not be used to do stuff like punch up a script or generate a draft of a story, or otherwise perform work that should be done by a union-represented writer. The WGA's concern is that studios would lean on AI to do some of the early work. It's not that AI would completely replace human writers, but rather that studios might use AI to do, like, a very rough draft or even just a treatment, and then would hand that over to a human writer to polish it up into something that's actually usable. But the human writer would be doing so at a lower rate, because the writer technically wouldn't be coming up with their own ideas; instead, they're refining an already existing idea. So this is kind of like if you've got a script and you go out and you get a writer to come in and read the script and punch it up a little bit: that second writer isn't going to be making the same amount as the primary writer did. And so what the writers are saying is, you're just trying to push more work over to AI, and then we're left to pick up these terrible ideas that AI is creating, and we're expected to make something of them, and at a lower rate of pay.

Speaker 1: The studios have balked at the WGA's demands, saying that this decision should be made year by year, because they are framing AI as a sort of advancement in tech, and the industry shouldn't be ignoring advancements like that. The WGA counters with the observation that studios have been pushing writers toward a more gig economy model, so kind of a work-for-hire situation where writers have to hustle gig to gig, with writers' rooms getting smaller in the process, which in turn means there are even fewer opportunities for writers to find work. And y'all, I'm a Gen X fella.
Speaker 1: I grew up before the gig economy was really a widespread thing, and I don't think I have seen a more destructive environment that, you know, doesn't directly tie into really terrible stuff like racism and misogyny. The gig economy is terrible. It clearly forces tons of people to work themselves to exhaustion while they constantly stress about whether the money they make this week will pay the rent next week. And meanwhile, the companies that are benefiting from those workers are always figuring out ways to keep more and more of the cash for themselves. Now, you could just say same as it ever was, but it's really more blatant with the gig economy. Anyway, this is the reason unions exist: to provide workers protection in aggregate, because on a one-on-one basis there is an undeniable power imbalance between the employers and the employed. So I'm glad the WGA exists to fight these fights and to lay this groundwork. I don't know how that's going to turn out. Obviously the strike is still in progress; no agreement has been reached. I don't know if the studios will budge on this, but I think it is important to set those rules now, because even if you argue that generative AI at the moment isn't good enough to even do this, that wouldn't stop studios from trying, and it doesn't stop them from trying again as the AI gets better. So setting out rules to protect unionized workers makes sense.

Speaker 1: Over in Washington, DC, the US government is going to play host to executives from companies like Microsoft and Alphabet, that's Google's parent company, in order to hold a discussion about AI. Theoretically, this meeting will explore the possible risks that AI could pose and the methods that companies should use to guard against those risks. But y'all, I've watched politicians talk to tech experts before, and it often feels like the two sides are speaking totally different languages.
Speaker 1: Hopefully the questions that these executives are going to get asked are going to be relevant, and hopefully the answers they give will be direct, but honestly, that's a long shot. The motivation behind the talks, however, is a very good one: how can we make best use of AI while avoiding as many serious problems as possible? Already with generative AI we're seeing numerous issues, ranging from copyright concerns, plagiarism, and answer accuracy, including the tendency for AI to quote unquote hallucinate and generate wrong answers, to misinformation, and there's a lot more to really consider. And that's just generative AI; there are lots of other versions of AI we need to think about too. There are also concerns about how it could impact the job market, as we see with the WGA strike, as well as the CEO of IBM's recent revelation that the company may consider AI to fill as many as seventy-eight hundred empty positions over the next few years, rather than using a human being in those jobs. There is a lot of ground to cover here, and I'm sure next week I'll have some sort of update to this story.

Speaker 1: The White House isn't the only US government entity looking into potential risks with AI. The chief of the Federal Trade Commission, or FTC, Lina Khan, has said that the agency is investigating whether AI could create more disparity between companies, saying that it could potentially enhance the power of dominant firms, according to Reuters. So, in other words, it could become an anti-competitive risk. The FTC chair said that the agency is looking into how AI could make some problems much worse, such as allowing companies to more effectively commit fraud. For example, in the wake of spectacular collapses like the cryptocurrency exchange FTX, there's increased pressure to uncover and address fraud before it can have a massive impact on investors and customers. Khan also pointed out that AI tools could potentially decrease competition by allowing for price fixing strategies across industries. And that's not a far-fetched idea.
Speaker 1: We've actually seen something similar to that play out in microcosm, because in Texas there's an ongoing issue where landlords have been using a specific company in order to set rental prices, and there are class action lawsuits saying that this amounts to price fixing, that it's a collusive effect across landlords who are then setting rental prices at exorbitant levels because there's no competition. There's nowhere to go where you could pay less rent somewhere else, because this centralized service has removed all competition, and so you effectively have price fixing, even if the landlords themselves didn't intentionally set out to do that. That's the effect that this approach has. Oh, brave new world, to have such artificial intelligence in it.

Speaker 1: The Verge reports that Microsoft is unveiling some new bells and whistles with the Bing chatbot. Now, as you may recall, Microsoft is leveraging OpenAI's ChatGPT to power the Bing chatbot, which, unlike OpenAI's ChatGPT, actually has access to real-time information. ChatGPT's info cuts off at twenty twenty one; it doesn't have information more recent than that. Bing is different. Now, apparently, users on Edge will be able to use Bing to do stuff like find a restaurant and book a reservation, all within Bing chat, without having to go back and forth between multiple websites, like review sites and the restaurant's own site, and then maybe something like OpenTable or whatever. So you might have a conversation like, where's a good place to take my partner for their birthday? They are allergic to shellfish. And then the chatbot would make a few recommendations, and then you might say, okay, well, what's the menu for place X? And it gives you some examples, and then you say, great, make a reservation for Friday night at nine p.m. for two, and it takes care of it. It's a lot like what Google showed off at a Google I/O event a few years ago with its Assistant feature.
Speaker 1: In that demonstration, the company showed how the Assistant could pose as an actual human being and make reservations on your behalf over the phone, which was, in my opinion, kind of creepy. Bing's approach is obviously a bit different. It's web based; it's not necessarily making phone calls on your behalf. The Verge also says that another feature lets you use Bing to find content to watch without having to search around to see what platform has what. So instead of saying, oh, I want to watch the latest episode of Barry, but I don't remember that it's on HBO Max, how do I find it? Do I just start searching? Do I go to different platforms? Do I look to see if it's on their pages? With this, you would just tell Bing chat and it would send you to the right location. You would just say, I want to watch the latest episode of Barry, and boom, it sends you to HBO Max. So it's useful in that regard as well. And if you want to learn more about the various features that are being unveiled, you can go to The Verge. They have an article titled "Microsoft's Bing chatbot gets smarter with restaurant bookings, image results, and more." Speaking of "and more," we'll have more after we take this quick break.

Speaker 1: We're back, and now we switch from AI to some Meta stories. Not Twitter this time, Meta. So first up, France's antitrust agency has leveled a mandate against Meta: the company has to change how it gives access to ad verification partners. So essentially, what the agency is arguing is that Meta has a dominant position when it comes to online advertising and has used that to dictate terms to partners, and moreover, that Meta needs to be more transparent with its analytics and to allow more ad verification companies access to those analytics. Ad verification companies, what they do is, I mean, it's in the name. They're essentially trying to verify that an ad campaign is doing what it's supposed to do.
Speaker 1: So an ad verification company monitors analytics to make sure that the reach is what you wanted, like you're hitting as many people as your ad campaign was supposed to, and that the results are positive ones, that you're getting more traffic and more commerce, whatever it may be, because of those ads. However, to be able to do that, you have to have access to the data, and according to the French regulators, the problem is that Meta would offer that kind of access to major partners, so big advertising verification companies could get access to that data and actually do this, but smaller ones were denied that opportunity. This ultimately puts those smaller companies at a greater disadvantage compared to big players in the space. It is anti-competitive. And this is not just based on the agency's opinion. An ad verification company called Adloox, a smaller ad verification company located in France, actually sought access to Meta's analytics and received a denial. So Adloox subsequently brought this case to the antitrust agency, and things progressed from there. The agency says that Meta has two months to change its approach and adjust the rules for ad verification access, or else face, I don't know, some sort of repercussions, I imagine. I couldn't actually find out what the consequences would be for Meta if it failed to do that; it would probably be some form of actual legal action against the company.

Speaker 1: Back here in the States, Meta faces more government scrutiny. The FTC, you might remember them from earlier in the episode, says that Meta broke the rules and violated a privacy order that was issued in twenty twenty with regard to how the company collects and uses personal data, specifically data belonging to kids. And so yesterday the FTC proposed barring Meta from monetizing data belonging to kids completely. Meta would not be allowed, it would be illegal for Meta, to exploit the personal information that came from kids.
Speaker 1: You might remember that a couple of years ago, Meta was in the hot seat when a whistleblower came forward with accusations that the company was regularly pursuing business strategies that could directly or indirectly harm people, including children. In fact, at that time, Meta was actively developing a version of Instagram targeting kids, because currently Meta's platforms are intended for those who are at least thirteen years old. The FTC is essentially saying that Meta has proven repeatedly to be a poor steward of personal information, and that there exists a need to update the earlier privacy agreements to place further restrictions on how Meta collects data and how it can then use that information, particularly with regard to children. Plus, the FTC says that Meta should not launch any new products or make substantive changes to existing ones until there is confirmation that those products comply with the terms of this agreement. Meta has thirty days to respond, and Meta spokesperson Andy Stone made no bones about it. He said that this is a political stunt and the company is going to push back against updating this agreement, which is not a big surprise. And I don't know if I agree that it's fully a political stunt. I think it partly is, but I also think it's something that is genuinely needed. So yes, I think in part this is about scoring political points, but I also think that it is beyond time for us to consider how to best protect private information from being exploited in ways that have, you know, truly negative consequences further down the line. While the FTC is pushing Meta to back off of exploiting the personal data belonging to kids, some senators are taking an even tougher stance.
Speaker 1: Four senators, two of them Democrats and two of them Republicans, so it's bipartisan, have introduced legislation that would ban social media platforms from allowing anyone under the age of thirteen to have an account, and further, anyone under the age of eighteen would need permission from a legal guardian before being allowed to make an account. On top of that, for those kids who are between the ages of thirteen and eighteen, things would have to work differently on those platforms. Namely, services like Facebook or TikTok or whatever would not be allowed to use algorithms to recommend content to those users. That is a huge change. As I'm sure you all know, these platforms rely on algorithms to select and serve content that the platform determines is most likely to keep that specific person engaged and stuck to the site or service, whatever keeps the eyeballs glued to that screen as long as possible so that more ads can be served to them. That is the name of the game. And if this legislation were to be adopted into law, the social networks would not be able to treat kids the same way they treat adults. However, there are a lot of steps that have to happen before this proposal can turn into an actual law, and there are some debates within government as to how these restrictions might infringe upon civil liberties, like, you know, how could they potentially violate the First Amendment, for example. And if you can make an argument, a valid one, that such a measure would be in violation of the Constitution, then that's, you know, a non-starter, right? The legislation is not going to be able to go anywhere. Also, we have to keep in mind there are obviously tech lobbyists who spend a lot of time and a lot of money bending the ear of politicians in an effort to minimize regulations and to get as favorable a political environment as possible to do their business. So by no means is this new proposal absolutely bound to become law.
Speaker 1: It might become law, but there's nothing guaranteeing it. Then we have the state of Montana here in the United States, which figuratively told the rest of the world, "hold my beer." So Montana's governor, Greg Gianforte, had previously supported a bill that would effectively ban TikTok in the state of Montana. But then folks pointed out that this bill, by singling out TikTok, was likely to be deemed unconstitutional, because here was a state government acting out against a specific company. It looks like a targeted attack, because it kind of is. So then Gianforte issued what's called an amendatory veto. Essentially, this proposes different language for the bill, and if the state legislature adopts that language and changes the bill so that it no longer singles out TikTok, the governor agrees to sign the bill into law. Otherwise, the governor will veto the legislation, since, you know, constitutional issues would ultimately mean that the law would eventually get overturned by a court anyway. But here's the problem. This new language, by removing the specific reference to TikTok but keeping some of the other vague passages, would technically ban all social networks in Montana for everyone, every single social network, because the legislation says that it would be illegal for any social media application to facilitate, quote, "the personal information or data to be provided to a foreign adversary or a person or entity located within a country designated as a foreign adversary," end quote. Okay, well, that's a big problem, because it applies even if you're not scraping data, even if you're not, you know, the recipient of a fire hose of information directly from the platform, because obviously that's the fear with TikTok, right? The concern about TikTok, at least in the United States, is that TikTok is collecting enormous amounts of information and funneling it to a Chinese company, which in turn might be sharing that with the Chinese government, and that that is potentially a risk to national security.
Speaker 1: That's the concern about TikTok. But the way this language is formed, it means that if anyone in a country that's a quote unquote foreign adversary, so, for example, anyone in China, could log into, say, Facebook and then check out, say, my account on Facebook, well, they would see my name. That's my personal information. If I hadn't, you know, hidden it, they would see things like my birth date. That's my personal information. That's personal info I have on Facebook that anyone in China could see if they went to my page. Well, according to the rules set down, or the language set down, in this proposed legislation, that would be against the law in Montana, which means that social platforms would have to just stop operating in Montana or find some way to prevent anyone in a, quote unquote, you know, foreign adversarial country from being able to access data belonging to Montana citizens. It creates a spectrum-wide ban. So, yeah, this is a real issue. It actually really shows how hard it is to create legislation that does not include vague language that could have unintended consequences. Because, I mean, how do you defend against that? If you've passed a bill that has this measure in it, and then someone says, well, people in China can see my information, so you have to block Facebook, you know, what are they going to say? Like, no, we didn't mean it like that? I mean, it's just, it's a mess. Oh boy.

Speaker 1: Speaking of TikTok, by the way, Eric Han, who was the head of TikTok's US trust and safety operations, is leaving the company. This had been rumored for a while, and apparently he has said that his role was essentially a poisoned chalice, that he had the unenviable job of being the person responsible for leading efforts to work with the US government in order to avoid a nationwide ban on TikTok, and he felt he was essentially set up to fail and to be the scapegoat for that failure.
Speaker 1: It's kind of hard to disagree with that, because the US government recently has appeared to have kind of a laser focus on banning TikTok. For the record, I do think TikTok poses some risks. I think that TikTok is potentially a dangerous thing when it comes to data collection. But as we just talked about with the previous news item, all social networks are risky like that. It's not just TikTok. I mean, TikTok presents a more clear and present danger, at least as a perception. I don't know if that's the reality. I honestly don't know if people in China have that direct access to TikTok's user data. But even if they don't, without strict privacy laws in place within the United States, all of our user data is still being gathered and bought and sold. So it may mean that there's an extra step involved, but it doesn't really matter, because the Chinese government could get hold of all that kind of information across all different platforms just by spending the money. It doesn't have to have that app implanted in the US for this to happen. So I get Han's position here, and it really kind of is unfair when you think of it that way. And again, it's not that I think TikTok isn't dangerous. I think TikTok is, but then I also think all social media potentially is. It's that we need to address privacy laws in the United States, and we've needed to do it for decades, and whether it ever happens or not, I don't know. My guess is what will happen is we'll see some sort of massive ban on TikTok, which is not gonna solve the problem, because there are gonna be all these other sources of data, you know, data siphons, that are going to continue the issue. The problem will still exist. Younger people will be very upset because TikTok was destroyed, and, uh, the United States will continue to do nothing about protecting online privacy. So yeah, big old mess there. All right, I'm done ranting about that.
Speaker 1: I'm gonna drink some chamomile tea and we'll be back after this quick break.

Speaker 1: Okay, we're back, and now to talk about Microsoft again. So Microsoft continues to make some decisions that I think are unwise. I think ultimately some of the company's strategies are going to result in regulatory agencies around the world giving the company a serious rap on the knuckles with a ruler. I mean, we're already seeing that unfold in Europe with certain Microsoft policies. Well, that company seems to be doubling down. Like, I just don't understand the logic behind the decisions. So The Verge reported on this issue and cited Reddit users who are saying that Microsoft is pushing out a change to IT admins in various organizations that are Microsoft customers, a change to how certain links work when they're posted in stuff like Outlook, which is email if you're not familiar, and Microsoft Teams, which is Microsoft's video conferencing tool, similar to something like Zoom. The company says it's going to make it so that links shared in these services will push users to Microsoft Edge if they click on them. So, in other words, it won't matter what browser you have set as your default browser for your operating system. Instead, if you click on a link in Teams or in Outlook, it's going to open up an Edge browser window and go to that web page. Now, y'all, I don't feel I need to point out that this is falling right in line with the sort of anti-competitive accusations regulators have been leveling at Microsoft recently: that by tying your various products together so tightly and ignoring things like someone's setting of a default browser, it is inherently anti-competitive.
Speaker 1: Also, while I'm not sure what the scope is on these situations, like I don't know if it's going to affect every single link that's shared or if it's just certain ones, I do know there are tools that I rely upon in my job that are optimized for a different browser than Edge. So if you want to use one of those tools and you want things to go smoothly, you have to use a specific browser to do it. I'm sure you've all encountered this. Back in the old days, it used to be that web pages would be optimized for one browser versus another, and if you were to go to that web page using a different browser, it looked terrible. We've largely moved away from that, but we still have certain web-based tools that are designed to work seamlessly with specific browsers and maybe not so well with others. So opening a link that would send me to an Edge-based browser, well, that's gonna send me over the edge. Of course, worst case scenario, I could just copy the link in my email, go into my preferred browser, and paste the link into the URL bar. But that's madness. I should just click the link and be able to go straight to the default browser and see what I want. So, as you might imagine, IT admins are not super thrilled about this change. Not only is it something that's frustrating to the admins themselves, but also, you've got to remember, these are the folks who have to communicate those changes throughout their various organizations, and then they are the ones who get blamed for it, even though they're not the ones who made the policy change; they're just communicating it. But my gosh, I have seen cases where the person in charge of IT becomes the recipient of so much abuse because something stops working the way it's supposed to. But it's out of the IT admin's hands, because it's ultimately coming from the provider, in this case Microsoft, so it's not really their fault.
Speaker 1: If the IT admin does happen to be at an organization that is a Microsoft 365 Enterprise customer, because there are different levels of business customer for Microsoft 365, well, good news: at the Enterprise level they can actually change that policy. They don't have to use the Edge behavior. They can turn that off and allow people to continue to open up links in their default browsers. But if they're working at a company that has a Microsoft 365 for Business account instead of a Microsoft 365 Enterprise account, well, those business customers are going to be subjected to this change. So I guess what I'm saying is, good luck in those courtrooms, Microsoft.

Speaker 1: Over at Google, some employees have expressed, let's say, a little bit of consternation regarding the company's extensive cutbacks while Sundar Pichai, the company's CEO, took home a cool two hundred twenty-six million dollars in compensation last year. As you might imagine, internal communication tools played host to numerous memes that aligned on a few basic messages: you know, that Google morale is dropping, the company has laid off thousands of employees, the company has reduced perks and benefits across the organization, and yet the CEO is one of the highest paid in the United States. The disparity is hard to just ignore here, and it definitely makes it more challenging for Google leadership to create messaging around things like sacrifice and cutbacks in the face of a tough economy when the head honcho could pull a Scrooge McDuck if he wanted and go swimming around in a big old vault of money.

Speaker 1: Finally, The Independent reports that the Climate Action Against Disinformation, or CAAD, coalition has found multiple instances of Google running advertisements against YouTube videos that contain misinformation about the climate crisis. Back in twenty twenty one, Google updated its policy and said it would no longer serve ads against content that contradicts the scientific consensus on climate change.
Speaker 1: And so what the coalition found was that Google has failed to enforce its own policy and, in fact, the company has profited off of videos that actively spread climate misinformation. And of course the channels pushing the misinformation benefit too, because their channels are monetized, so they're getting money as well, and that means there's a financial incentive to continue creating misleading content on YouTube, because if you can monetize it and you can get popular, then you're gonna make money. Google reps say that while their policy does state that such videos should not have ads served against them, which is probably something that advertisers want as well, since brands are typically not super keen on being associated with messages of climate change denial, their systems aren't perfect and some stuff will slip through. Google took the list that the coalition submitted, which had something like a hundred different videos that were in violation of the policy but were still monetized, and subsequently demonetized those videos. But I think we can draw a couple of conclusions based on this incident. One, it is genuinely hard to enforce policies on a platform that receives such a huge amount of content every single minute. It's actually at a point where it's literally impossible for humans to review everything, so it is difficult to catch it all and some stuff might slip through. On that part, you can kind of see Google's point of view. However, two, if no one calls for the company to be held accountable, and if there's money to be made, we should not be surprised when we find videos that violate its policies. So eternal vigilance is what is called for, because, as the coalition found by bringing this to Google's attention, Google then did go and demonetize those videos. But if it hadn't done that, those videos probably would have stayed monetized for who knows how long. So yeah, we have to hold these companies accountable to their own frickin' policies.
Speaker 1: This isn't even about holding a company accountable to the law; it's about holding a company accountable to the things it says it will do. All right, and that's it for the tech news for today. I'm sad I didn't have any fun Star Wars news. It was kind of hard to talk about Star Wars tech, because either I'm talking about special effects technology or I'm talking about fictional technology that we only wish existed but doesn't actually exist. Maybe next year, we can hope. I hope you are all well, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.