Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio. And how the tech are you? It's time for the tech news for Thursday, April twenty-eighth, two thousand twenty-two, and we're gonna start off with some news about Meta. The company had its first quarter report, and the results gave new confidence to investors. Now, you might remember that we found out that by the end of last year, Meta had seen a decline in users for the first time, and that, coupled with a couple of other tough pills to swallow, ended up casting a big shadow across the company. And those tough pills included stuff like Zuckerberg acknowledging that TikTok was far more effective at attracting young users than Meta's properties were, including stuff like Instagram and Facebook, and that Meta had been spending billions of dollars on its metaverse plans, but those plans are not gonna come to fruition for several years.
Speaker 1: I mean, Zuckerberg himself said it will be many years before this becomes a revenue-generating part of the business; this is literally us building it out. And that meant that for a while folks were kind of seeing doom and gloom for the company because, again, so much of our focus is on the short term. But now Meta reports that they saw an uptick in users this past quarter; there was actually an increase. Now, The Verge reports that the growth in users, while definitely there, is still the lowest growth rate in the company's history, with the Facebook app growing by four percent in the first quarter. The company also revealed that ad revenue is likely still going to be fairly soft for the next few months, largely because of how Apple updated iOS so that users can opt out of some tracking data that Meta had been relying upon in order to deliver targeted ads.
Speaker 1: Targeted ads have super high value to advertisers, so without that guarantee, Facebook, you know, they can still sell ads, but the targeted ads were the ones that were bringing in the most money, so it means having to sell a kind of lower tier of advertising because of this issue with Apple. It's interesting to me that investors appear to be responding more to the uptick in user numbers than to the announcement that revenue might still be a bit slow. Also, Zuckerberg indicated that the company is going to ease back on some of its expenditures this year, probably because a lot of folks don't really like thinking about long-term investments and they prefer to see numbers go up quarter over quarter. And Bloomberg reports that the easing off is a relative thing, really: apparently the target spending for the year was somewhere around a staggering ninety-five billion dollars, and now it's going to be somewhere between eighty-seven and ninety-two billion dollars, which, I mean, again, that's from the Bloomberg report, and that's just bonkers to me.
Speaker 1: Anyway, it continues to astound me how the company's share value can fluctuate dramatically over metrics that, at least to my eyes, don't seem to indicate actual real success or meaningful growth. Like, if it's not bringing in revenue, or if it's not bringing in an amount of revenue proportional to the growth in users, I don't see where the value is. But then I'm not a finance person, and I fully admit that I likely just don't have a good enough understanding of investment. This is a me problem more than, say, a Meta problem, I guess. Now, I do have another Meta story, and this is a positive one, and I know, I'm surprised too. Some researchers with Meta's AI division collaborated with researchers at the University of Illinois Urbana-Champaign to tackle a hard problem. That's a pun, you just don't know it yet. And that problem is finding ways to make concrete that end up cutting way back on carbon dioxide emissions. There's a huge carbon footprint in concrete production.
Speaker 1: I actually did an episode about concrete not too long ago and talked about how the production of cement is a really carbon-intensive process. Concrete production contributes about eight percent of all carbon dioxide emissions globally each year. So finding ways to make concrete, which is undeniably useful stuff, I mean, it's important, while also cutting back on CO2 emissions would be a hugely important component in our plans to achieve carbon-neutral status in the future. But a huge challenge, a huge problem, is that there are multiple variables that you can tweak when you're producing concrete. You could break it down and say that concrete is essentially made up of four things: water, aggregate, cement, and typically some other substances that allow for the creation of concrete. Which means that you can tweak those different factors, right, in different measurements, and determine, all right, well, let's see, if we put less cement in and more of this other stuff in, what happens?
Speaker 1: But because you've got four variables, there are a lot of different ways to play with that, and that is one of the reasons why it can get really challenging, because you can spend a lot of time playing around with all these different variables. So the researchers used AI to train a model on a thousand different concrete formulas and then derive what would be likely to be the most efficient approach that would still yield a strong and reliable concrete. Because sometimes you can change this stuff up and you will get concrete, but it takes too long to dry, it's not as strong, and yeah, maybe it didn't create as much CO2, but it might not be useful.
Speaker 1: So they fed all these formulas into the system, and the system itself produced new formulas, and the team picked the five most promising ones to continue testing and tweaking. They modified the AI-generated formulas slightly to improve them, and ultimately the team created a new formula that could replace up to half of the cement needed to produce any given amount of concrete, using other materials like fly ash and slag in its place. In addition, the formula was supposed to exceed all strength metric requirements, which it did, and that meant that the concrete produced should be more than strong and resilient enough to serve as concrete while that cement requirement is still reduced dramatically. Meta then teamed up with a concrete company called Ozinga to refine this formula even further and to move into real-world testing, because it's one thing to say that mathematically this is what it means; it's another thing to actually find out in the real world.
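To give a rough sense of the loop just described, generate candidate mixes, score them, keep the most promising handful, here's a minimal sketch. Everything in it is hypothetical: the scoring functions, the coefficients, and the ingredient weightings are made up for illustration, whereas the actual research trained a model on real formula data.

```python
import random

# Toy stand-in for a trained surrogate model. The real project learned
# from about a thousand actual concrete formulas; this just encodes the
# trade-off the episode describes: cement adds strength but also CO2,
# while supplementary binders like fly ash and slag add strength more
# cheaply, carbon-wise. All coefficients are invented.
def predicted_strength(mix):
    return 40 * mix["cement"] + 30 * (mix["fly_ash"] + mix["slag"])

def co2_score(mix):
    # Lower is better; cement dominates the (toy) carbon footprint.
    return 1.0 * mix["cement"] + 0.1 * (mix["fly_ash"] + mix["slag"])

def random_mix(rng):
    # A candidate formula as ingredient fractions that sum to 1.
    keys = ["water", "aggregate", "cement", "fly_ash", "slag"]
    parts = [rng.random() for _ in keys]
    total = sum(parts)
    return dict(zip(keys, (p / total for p in parts)))

def top_candidates(n_candidates=1000, keep=5, min_strength=10.0, seed=0):
    rng = random.Random(seed)
    mixes = [random_mix(rng) for _ in range(n_candidates)]
    # Drop anything that misses the (toy) strength requirement, then
    # rank by carbon proxy and keep the five most promising formulas.
    viable = [m for m in mixes if predicted_strength(m) >= min_strength]
    return sorted(viable, key=co2_score)[:keep]

for mix in top_candidates():
    print(round(co2_score(mix), 3), round(predicted_strength(mix), 1))
```

In the real workflow those five survivors would then go to human experts and physical testing, as the episode notes, rather than being trusted on model scores alone.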
Speaker 1: There's still more work to be done, including finding out if there's a way to create this kind of concrete that dries a little faster, because that would speed up construction efforts when you're making use of this concrete. Otherwise, you have to wait for it to cure before you can start laying down more. Here's some Elon Musk Twitter news that has nothing to do with him buying the platform. So back in two thousand eighteen, the U.S. Securities and Exchange Commission, or SEC, charged Musk with fraud after he tweeted that he had secured enough funding to take Tesla private. The SEC's allegations were that Musk had not actually secured such funding, that the announcement ended up having a massive effect on Tesla's stock price, and that Musk was essentially manipulating the stock market, which, I should add, is a big no-no. So there was a lawsuit, and ultimately Musk settled out of court, and in the process he signed a document that required him and Tesla to each pay twenty million dollars in civil fines.
Speaker 1: Plus, Musk had to step down as chairman of Tesla's board of directors, and from that point forward he was supposed to seek pre-approval for any tweets relating to his businesses before he actually posted them. Well, recently Musk has been trying to get that settlement thrown out, but a federal judge said no dice, it stands. The judge argued that the SEC's allegations were fair and warranted, that Musk has no evidence that he actually secured the funding he said he had, and that he signed this settlement voluntarily and must therefore abide by its rules. The judge also wrote, quote, Musk may wish it were otherwise, but he remains subject to the same enforcement authority, and has the same means to challenge the exercise of that authority, as any other citizen, end quote. Right to repair advocates in the United States can celebrate a victory. Apple has now made self-service repair for iPhone twelve and thirteen models available in the United States.
Speaker 1: That means that Apple will allow people to order repair manuals, proprietary tools needed to access Apple products, and even official Apple parts through the Self Service Repair Store. Now, that's not to say that your average iPhone user is really going to be able to pop open the hood and change the air filter on their phone or whatever, you get what I mean. Making repairs will require a certain level of knowledge, skill, and expertise, so it's far more likely that independent electronics repair stores will make use of this offering, giving iPhone users options when it comes to fixing problems with their phones. Apple later plans to roll out similar offerings for Mac computers, and this is a pretty big step for Apple, which for years has attempted to maintain a walled garden and require users to go through Apple and Apple alone when it comes to repairs. So good job, Apple. We've got a few more news items to get through.
Speaker 1: But before we do that, let's take a quick break.

Speaker 1: Down under, Amazon has declined to acquiesce to the government's request to take a look-see at the company's product search system and algorithms. So the government wants to see if Amazon favors its own in-house products over third-party merchant products, in other words, giving itself an unfair advantage in the Amazon marketplace. This is the same accusation Amazon has faced in places like India and the United States. Amazon has refused to share that information, or at least declined to send it to the Australian government, and the company denies that it has ever given preferential treatment to its own products. That's not likely to fly with the Australian Competition and Consumer Commission, or the ACCC. Now, one thing that is very much different in Australia compared to some other markets, like specifically the United States, is that Amazon is still a relatively young player in that country, and, in fact, it's not the largest online marketplace in Australia.
Speaker 1: In fact, according to the ACCC, Amazon brought in just one quarter of the sales that eBay saw in Australia last year. However, the ACCC is concerned about any platform that engages in what it views as anti-competitive or unfair practices, even if that platform isn't as dominant down there as it is here. Now, the only other thing I know about Amazon in Australia is that the company has seen a lot of returns of boomerangs, but then I'm led to understand that's what they're supposed to do. Related to Amazon, let's talk about Twitch, because Amazon owns Twitch. According to Bloomberg, Twitch is considering a couple of moves that could dramatically impact streamers and the experience of watching live streams on the platform. First, it sounds like Twitch wants to encourage streamers to run more ads during live streams. Obviously, that would generate more revenue for the platform. Another possible strategy is that Twitch will change the amount of subscription revenue that streamers get to keep.
Speaker 1: So right now, if you subscribe to a Twitch streamer, let's say that, you know, you fork over six bucks a month in order to follow a particular streamer, the streamers keep seventy percent of the revenue generated from audience subscriptions to their channel. But according to Bloomberg, Twitch is considering dropping that down so it's a fifty-fifty split: fifty percent goes to the streamer and the other fifty percent goes to Twitch itself. That is probably not going to go over so well with the streaming community at large, certainly not for some of the more active streamers who really depend upon Twitch for their livelihood. Now, whether this might prompt an exodus from Twitch to other platforms like YouTube remains to be seen. Some, like some of the really big ones, might be signed to exclusive contracts that lock them into Twitch for a while. For other, more moderate content creators, it may be just a question they have to ask themselves. It also may turn out that the company will reconsider these moves.
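For a concrete sense of what a change in the split means per subscriber, here's a quick bit of arithmetic. The subscription price and subscriber count are illustrative, and this ignores taxes, fees, and regional pricing, so treat it as a simplified sketch rather than Twitch's actual payout math.

```python
# Simplified per-subscriber math for two possible revenue splits.
SUB_PRICE = 5.99  # roughly the "six bucks a month" tier mentioned above

def streamer_cut(subscribers, streamer_share):
    """Monthly sub revenue a streamer keeps at a given share (0 to 1)."""
    return round(subscribers * SUB_PRICE * streamer_share, 2)

# A hypothetical streamer with 1,000 subscribers,
# under a 70/30 split versus a 50/50 split:
print(streamer_cut(1000, 0.70))  # 4193.0
print(streamer_cut(1000, 0.50))  # 2995.0
```

That gap, nearly twelve hundred dollars a month in this toy example, is why a move to fifty-fifty would matter so much to streamers who rely on subscription income.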
Speaker 1: They haven't yet been implemented, so there's no, you know, set-in-stone plan to go this way. Also in gaming, Axios reports that far-right extremism is growing in the gaming space and that the systems present in games are not up to the task of moderating players and addressing the problem. Axios cites a couple of reports. One came from the Extremism and Gaming Research Network and was published in December two thousand twenty-one. There was another report from two thousand nineteen from the Anti-Defamation League, and these reports paint a pretty troubling picture. They show that more than half of the people who have experienced harassment in online multiplayer games believe that they were targeted because of things like their race, their gender, their sexual orientation, and so on. The two thousand twenty-one report indicates that games are sorely lacking when it comes to ways that they can manage the problem.
Speaker 1: This sets them apart from platforms like Facebook, Twitter, you know, other social network platforms that have spent a lot of money trying to deal with this problem, to varying degrees of success, but the gaming space in general hasn't done that yet. There are also indications that white supremacist ideology is on the rise within the online gaming world in general, so it does paint a pretty troubling picture. It doesn't mean that, you know, games are bad themselves, but rather that there may need to be a shift in how companies end up moderating the players on their online platforms. In Michigan, several parties led by Ford and Sidewalk Infrastructure Partners have raised one million dollars in capital funding to develop a connected roadway for the purposes of autonomous cars. So the roadway will connect Detroit to Ann Arbor, Michigan, and it was previously announced in twenty twenty, but the recent news marks the first time serious money is being put towards this project.
Speaker 1: So the goal is to create a dedicated roadway for self-driving vehicles, and the roadway will actually include hardware that will allow for communication between the vehicle and the underlying infrastructure. And this is something I'm really interested in; it's something that we've been talking about for years now, and that's that most self-driving cars that we think about today are, for the most part, largely self-contained systems. It's like having your own PC and not having it connect to anything else. The PC is able to do some incredible stuff, but it's restricted to its own abilities. So the car alone is responsible for operating the vehicle, monitoring the surrounding environment, avoiding collisions, and all that kind of stuff.
Speaker 1: However, if you're able to pair an autonomous vehicle with a smart infrastructure, well, the vehicles on the road can communicate with the road itself, and then, by extension, with other autonomous vehicles, which can allow for faster traffic, less congestion, and safer operations, because, like, everyone knows what everyone else is doing already, right? All the cars are aware of all the other cars on that infrastructure and can operate at a higher rate of speed with a much, much lower probability of something going wrong, apart from things like, you know, a flat tire. Stuff like that is still obviously an issue, but the cars would be able to react very quickly and collectively, so you wouldn't have things like pile-ups. Now, obviously, building out infrastructure is time-consuming, it's expensive, and it's challenging, and we already have tons of roads here in the United States, and building out, replacing, or upgrading existing infrastructure is a huge endeavor.
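For a rough picture of that vehicle-to-infrastructure idea, here's a toy sketch. The message fields, class names, and relay design are entirely hypothetical, not the actual protocol planned for the Michigan corridor; it just illustrates the core point that the roadway can collect each vehicle's state and share it with every other vehicle.

```python
from dataclasses import dataclass

# Hypothetical message a vehicle might report to roadside hardware.
@dataclass
class VehicleState:
    vehicle_id: str
    position_m: float  # distance along the corridor, in meters
    speed_mps: float   # current speed, meters per second

class RoadwayHub:
    """Toy stand-in for the corridor's connected infrastructure."""
    def __init__(self):
        self.states = {}

    def report(self, state):
        # Each vehicle reports its latest state to the roadway.
        self.states[state.vehicle_id] = state

    def traffic_picture(self, vehicle_id):
        # The roadway relays everyone *else's* state back to each car,
        # so every vehicle "knows what everyone else is doing."
        return [s for vid, s in self.states.items() if vid != vehicle_id]

hub = RoadwayHub()
hub.report(VehicleState("car-a", position_m=120.0, speed_mps=31.0))
hub.report(VehicleState("car-b", position_m=95.0, speed_mps=29.5))
print([s.vehicle_id for s in hub.traffic_picture("car-a")])  # ['car-b']
```

A car-only system has to infer all of this from its own sensors; the appeal of the shared-infrastructure model is that this traffic picture arrives ready-made, which is what enables the tighter, faster coordination described above.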
Speaker 1: Anyone who has lived somewhere that had massive road work being done on a local highway knows that it can take an excruciatingly long time for that work to be completed. Still, if this project in Michigan ends up being a big success, it could serve as a model for future projects, leading to a safer and more effective autonomous vehicle ecosystem. Finally, over in the UK, the government is revising laws regarding public service broadcasting, and these laws will now also cover streaming services like Netflix, Apple TV, and more that are operating within the UK. So the rules will include a restriction on broadcasting quote-unquote harmful content. Now, I'm not entirely certain what constitutes harmful content, like what is defined as harmful content, but if a streaming service is found guilty of broadcasting harmful content within the UK, it can be fined up to five percent of its revenue. And it sounds to me as though the process is: a viewer watches a streaming service and sees something on that streaming service that the viewer believes is harmful.
Speaker 1: The viewer then contacts the UK's communications regulatory agency, known as the Office of Communications but better known as Ofcom, and presumably Ofcom would then review that material to determine if it does in fact constitute harmful content and then take action. But I'm not entirely certain; I'll have to keep an eye on it. And that's the news for Thursday, April twenty-eighth, two thousand twenty-two. Hope you're all well. If you have any questions or suggestions for me, send me a message on Twitter. The handle for the show is TechStuffHSW, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.