Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Tuesday, June twentieth, twenty twenty three.

Speaker 1: Time.com has an article by Billy Perrigo titled "Exclusive: OpenAI Lobbied the EU to Water Down AI Regulation," and I think that article is worth a read. Now, it's not exactly surprising, because if you've been following the news, OpenAI's CEO Sam Altman has been proactive in engaging with politicians around the world, ostensibly to help create sensible regulations for the burgeoning AI industry. But as it turns out, and again this is not really a surprise, Altman's work has mostly focused on how to create regulations that have a minimal impact on OpenAI's business. In other words, I think Altman recognized that regulations are unavoidable, so the next best thing is to have a hand in creating those regulations so that OpenAI can still pursue business with as few barriers in its way as possible. Now, you might remember that the EU created different designations for AI, ranging from minimal risk, or no risk at all, all the way up to unacceptable risk. Anything that's deemed an unacceptable risk would be unlawful in the EU. Again, that makes sense; unacceptable kind of suggests that, right? Anyway, the original draft of the EU's regulations would possibly have put generative AI products like OpenAI's GPT-3 language model and ChatGPT, the chatbot, into the high-risk category, at least for specific applications. High risk is the rank just below unacceptable, and it would have meant that OpenAI would have to adhere to stricter regulations and rules. But last year OpenAI sent a seven-page white paper to the EU and also lobbied extensively to have the rules tweaked so that OpenAI's products don't count as high-risk applications.
Speaker 1: Instead, OpenAI and a few other large, established companies would have products classified as foundation models, you know, like these are the pillars of AI and thus they are above reproach. That's me being a little snarky, but only because we've seen multiple times now how chatbots like ChatGPT can cause real problems. Anyway, the lobbying actually worked. The amended regulation language carves out space for OpenAI and some other companies like it. Critics say that OpenAI has essentially neutered the legislation and that the company is being a bit two-faced, that it's calling for regulation on the one hand and secretly negotiating for a favorable deal in the background.

Speaker 1: On a related note, TechCrunch's Ingrid Lunden reports that a consumer group called the European Consumer Organisation, aka BEUC, is calling upon European Union regulators to actively investigate what risks generative AI actually poses to EU citizens. That initialism reflects the original French name of the organization, which, no, I am not going to attempt to say, because I feel like I get enough abuse already. And y'all, I think that sounds perfectly reasonable. I mean, as it stands, the appearance is that lawmakers are relying heavily on companies like OpenAI saying that they pinky promise not to do anything harmful and that their tools have appropriate protections in place already, so there's no need for further regulation. But a thorough investigation could put those claims to the test and then either confirm that those claims are true or prove them to be meritless. I think either way, you need to have that understanding before you actually start to create regulations. BEUC also submitted a report titled "Ghost in the Machine," addressing the consumer harms of generative AI, and as the name of the report spells out, BEUC's position is pretty clear: generative AI does represent potential harm to citizens.
Speaker 1: Now, interestingly, the report is mostly about stuff like ensuring fair competition in the marketplace, limiting the spread of disinformation, and ensuring accessibility. So, in other words, it's not an alarmist report. This isn't a paper that's saying AI is going to destroy us all, that it's all going to end in a post-apocalyptic scenario. That's not what this report says. So it's not that sort of inflammatory language that we frequently find in the media and in some reports here in the United States. It's really more about how AI could potentially create problems for citizens and thus needs to be investigated more thoroughly before the EU passes actual regulations. Like, how can you regulate something if you don't fully understand what it may or may not be able to do? And I think that's a pretty reasonable conclusion, particularly since it's not unusual to see leaders struggle with the implications of technologies that they don't fully understand or have experience using.

Speaker 1: Apple has joined Google in sending out a message to employees saying it's a no-no to use AI chatbots like ChatGPT and Google Bard for internal company purposes. So Apple employees are not to rely on those bots to help them do work, whether that might be organizing notes into a presentation or maybe helping a developer build out code. You may recall that Google also told its employees the same thing earlier, and that involved the odd situation of Google saying, hey, you know that tool that we're creating and trying to sell to businesses so that they can use it for business purposes? Yeah, don't use that for our business. It kind of makes me think of my aunt who covered all her furniture in plastic because it was quote unquote for company that never seemed to visit. It wasn't furniture that was meant for sitting on. Anyway, I would point to big companies making these sweeping rules as proof that perhaps regulators need to approach their work carefully.
Speaker 1: Because if the big companies are shying away from the tech, including the big companies that are making the tech in the first place, that should really be a red flag. And it's doubly awkward because OpenAI launched an iOS app version of ChatGPT recently. And since I've dogged on chatbots for three stories in a row, I want to say I don't think this technology is inherently bad or that it doesn't serve a purpose. I just don't think it has proven to be trustworthy, and considering the data harvesting practices of companies, I also question whether using it for sensitive business purposes, or any sensitive purpose for that matter, is a good idea.

Speaker 1: While I can agree with Apple leadership on the whole generative AI thing, I do have to call them out for continuing to try and trademark the image of an actual apple. And you might be thinking, huh, you know, apples existed a long time before Apple the company did. Otherwise, why would you name the company Apple if there weren't apples already? And apples are grown all over the world, and taking a photo of an apple shouldn't be a big deal. So how could Apple the company trademark an image of the apple, the fruit? And I guess the answer to that is they could repeatedly try to do this. Mashable's Cecily Mauran has a piece on this that goes over Apple's quest to trademark a black-and-white image of a Granny Smith apple in various countries, including in Switzerland, and Mauran points out that if Apple succeeds in doing this, the company could conceivably force any other organization that happens to have an apple in its logo to change it. Which, y'all, this just seems outright bizarre to me. It's also only slightly tech related, so we're gonna move on now.
Speaker 1: Last week I talked about how thousands of communities on Reddit, also known as subreddits, had gone dark in protest of how Reddit changed its API policy, and how those changes are effectively forcing developers of several popular Reddit apps to shut those apps down. So basically, Reddit has set a very high fee for apps that reference the platform a lot, and the really popular apps reference Reddit on such a scale that the fee would equal several million dollars a year, which these developers can't pay, and so the apps have gone dark. Well, that protest, which was to last two days of these various subreddits going dark, has now continued for at least thousands of subreddit communities. This is in part a response to how Reddit's leadership has chosen to respond to the protests, and by that I mean the leadership has largely dismissed the protests as being nothing more than an inconvenience.

Speaker 1: Now, a criminal hacker group known as ALPHV, that's "Alpha" but with a V at the end instead of an A, also known as BlackCat, is saying that Reddit leadership needs to pay a four and a half million dollar ransom, as well as ditch the changes to its API policy, or the group is going to release internal data, to the tune of eighty gigabytes of it. Now, we already know this group was successful in breaching company systems, because Reddit actually confirmed the breach back in February. What BlackCat did not do, which is what most ransomware groups will do, is encrypt Reddit's databases. Typically, that's what ransomware groups do, right? They lock down databases by encrypting them, and they say, we'll give you the decryption key, but first you have to pay a ransom. Well, they didn't do that in this case. All they did was access a bunch of systems and apparently steal around eighty gigabytes' worth of information. Now this group is saying, hey, Reddit, pay up and change your ways, or else your data goes public.
Speaker 1: Reddit's probably not going to comply with this demand, and you could chalk this up to a hacker group taking advantage of a recent community flare-up to grab some publicity for itself, rather than an earnest attempt to blackmail the company. The protests on Reddit continue in the meantime, and they've taken a rather absurd and meme-centric approach, which, considering we're talking about Internet culture, is par for the course. While more than half of the previously dark subreddits have returned, the outrage on Reddit has not gone away, and in some of those returned subreddits, a particular British comedian and TV host has taken a prominent role in the protests, and initially did so without his own involvement or consent. I'm talking about John Oliver. Redditors on various subreddits are flooding pages with photoshopped images of John Oliver, and other redditors are upvoting those images, which effectively drowns out anything that, you know, isn't John Oliver and would otherwise be relevant to the subreddit community. John Oliver himself approves of the move. He even posted a long Twitter thread filled with selfies of himself to serve as fodder for the photoshopped memes. Reddit leaders have allegedly contacted some of the more popular subreddits that remained dark and threatened to remove the moderators of those communities and force those communities to reopen. So this protest move seems like it's the next step to keep the subreddits from serving a useful purpose. Yes, they can reopen, but the users can flood them with just garbage in an effort to continue the protest, and presumably that would also end up discouraging advertisers from doing business with the platform, which ultimately is a pressure point that could work on Reddit management.

Speaker 1: Okay, we've got some more tech news to cover, but before we get to that, let's take a quick break.

Speaker 1: We're back. So here in the United States, the Senate has reintroduced a bill called the Platform Accountability and Transparency Act, or PATA.
Speaker 1: The bill was originally drafted in twenty twenty one. It was introduced in twenty twenty two, but it didn't proceed to a vote, and now it's back again. The purpose of this bill is not one that I would immediately jump against, I guess, the way I would with a lot of bills that involve the web, because often I see proposed legislation that I think is terribly misguided. This one, I don't feel that way about. Anyway, the purpose of it is to require social media companies to be more transparent with their advertising libraries, as well as give information on things like their recommendation algorithms and how those work. So if you've ever wondered, well, how the heck did this particular thing go viral? Like, it's obviously really, really popular, but why did this one go viral while this other, similar piece of content went unnoticed? What's the difference? That's the kind of question this bill aims to help answer, by requiring platforms to divulge how certain types of content get boosts from algorithms, and thus that content gets served up to more people and has a better chance of going viral. Now, obviously, just serving that content up to people isn't enough. You have to actually get people to engage with the material, and that's when true virality happens. But obviously, if more people are able to see a particular piece of content, especially early on, then the more likely it is to go viral. The bill would also require platforms to be transparent about content moderation policies and practices. And on top of that, journalists and researchers who use publicly available data from these platforms would also receive legal protections which would limit their liability, as long as they followed specific privacy and security requirements.
Speaker 1: So, in other words, as long as you're handling the data properly and you're not allowing it to just disseminate without control, then you should be protected from legal action based upon whatever your research or journalistic investigation uncovers. One element that previously had been in the bill has now gone missing. Once upon a time, there was a section in this proposed bill that would have removed Section two thirty protections for platforms if they refused to comply with legal data requests. If you'll recall, Section two thirty grants online platforms legal protections from being held liable for the stuff that users post to those platforms. So here's an example. If Dishonest John posts something illegal to Facebook, it's not Facebook that's legally accountable for that material, assuming that Facebook is also following the rules and making a reasonable effort to remove illegal stuff from its platform when it gets notified about it. The earlier version of this bill said that platforms could lose Section two thirty protections if they did not comply with lawful data requests. However, now that's out of the bill, so the teeth have been taken out of it, which probably is a good thing, because there have been a lot of attacks on Section two thirty, some of which I think aren't really, you know, relevant, and this could have been used as sort of a backdoor approach to getting rid of Section two thirty. I'm not going to argue that Section two thirty should never go away. Maybe it should, but it needs to be done in a way that doesn't involve finding a loophole, right? You need to find a way to address it directly and discuss what the merits are, where it's failing if it is failing, and how that should be addressed, rather than just saying, ah, I created a shortcut so we can work a way around it. Anyway, the bill has bipartisan support in the Senate, but it still has to go to a vote.
Speaker 1: If it does pass a vote, the House of Representatives would then have to discuss it, make any changes to the language of the bill they wanted, and then vote on that. And if that vote passed, then it would go back to the Senate, and they would look at any of the changes and vote again, and then eventually it would go to the President to be signed into law. So there's still a long way to go toward making social networks more transparent and accountable, and there's no guarantee that it will actually become law, but it's on its way.

Speaker 1: Also here in the United States, the Federal Communications Commission, or FCC, would really like to know more about data caps, specifically why we have data caps. Data caps, of course, being a limit to the amount of data you can download in a given amount of time. So a lot of providers will have a plan in place where you can access a certain amount of data, and then once you hit that limit, you might be throttled or charged more to access more information on top of that. Now, the FCC is not saying that data caps are unnecessary. In fact, that's the whole point of the FCC asking questions in the first place: they're not coming at this with a decision already in mind. The agency wants to understand data caps both from the perspective of providers as well as consumers. So are data caps necessary, and if they are, what purpose do they serve? Do they have an unfair impact on consumers? There's no denying that data consumption rates have blown up over the last decade. In fact, the pandemic really did a number on data use, because millions of people were forced to work remotely. So the FCC is aiming to understand data cap practices better. Do they serve a purpose beyond just driving revenue for providers? Then the FCC can make rules to ensure that data caps don't prevent accessibility for consumers and that they don't create unfair market conditions that discourage competition.
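To make the kind of cap-and-throttle plan just described concrete, here is a minimal illustrative sketch. It is not any real provider's billing logic, and the allowance, overage rate, and speeds are made-up example numbers:

```python
# Illustrative only: a toy model of a data-cap plan, not any real ISP's terms.
# The cap size, overage price, and speeds below are hypothetical numbers.

from dataclasses import dataclass

@dataclass
class DataCapPlan:
    cap_gb: float = 1000.0        # monthly data allowance (hypothetical)
    overage_per_gb: float = 0.50  # charge per GB past the cap (hypothetical)
    throttled_mbps: float = 1.0   # reduced speed once the cap is hit (hypothetical)
    full_mbps: float = 300.0      # normal speed (hypothetical)

    def speed(self, used_gb: float) -> float:
        """Speed a subscriber gets, given how much data they've used this month."""
        return self.full_mbps if used_gb < self.cap_gb else self.throttled_mbps

    def overage_charge(self, used_gb: float) -> float:
        """Extra charge for any usage beyond the cap."""
        return max(0.0, used_gb - self.cap_gb) * self.overage_per_gb

plan = DataCapPlan()
print(plan.speed(850.0), plan.overage_charge(850.0))    # under the cap: 300.0 Mbps, 0.0
print(plan.speed(1200.0), plan.overage_charge(1200.0))  # over the cap: 1.0 Mbps, 100.0
```

The point of the toy model is just that both levers the episode mentions, throttling and overage charges, hinge on one number: usage measured against the cap.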
Speaker 1: So it's too early to say, yay, data caps are on the way out. They may not be. There may very well be legitimate purposes for them beyond "we make money this way," right? So we'll see how this unfolds.

Speaker 1: All right, now let's go back over to the European Union. You might remember that the EU created rules that will require companies to adopt the USB-C standard in devices like smartphones and tablets and, you know, other things like laptops as well, and that this decision is forcing companies like Apple to make significant changes in order to stay in the European market. Namely, Apple would have to either ditch the Lightning port entirely and switch it out for USB-C, or include USB-C along with Lightning ports on its devices, which would obviously add bulk to those gadgets. And now the EU Parliament has voted to require manufacturers to make it possible for consumers to easily replace the batteries in those types of devices. So, in other words, if you went out and bought yourself a smartphone, you as a consumer should be able to open up a little section on the back of that phone, pull out the battery, and put in a new one. This would be a huge change. Being able to pop open your iPhone and swap out the battery not only can give you extended use, but obviously you can replace the battery if your original one no longer holds a full charge. With these rules in place, by necessity, companies would have to change the design of gadgets so that you could actually access the battery in the first place. And again, companies like Apple use a lot of proprietary fasteners that require special tools if you want to get into an iPhone's guts. Apple has made a lot of allowances and concessions to regulators by making these tools more readily available to independent repair people, but it's still not something that the average person can easily do.
Speaker 1: And Apple typically isn't keen on just letting anyone pop open an Apple device, partly to discourage hackers from making Apple products do stuff they weren't intended to do, because Apple has always been about controlling the experience of using its products. And it's also partly, or perhaps even mostly, an effort to create a closed ecosystem where Apple determines who can access a device, and this is a way of generating revenue, right? Because if you prevent just anyone from being able to open the device, then they have to go to specific repair shops and repair people, and those often end up being licensed by Apple, which means that Apple gets money from this, and it becomes a way to keep generating revenue by preventing just anyone from being able to open it up and mess around. Now, this new law goes into effect in twenty twenty seven, which makes sense; there's a reason it's that far out. They have to give manufacturers time to design products that will comply with these rules. And this rule may not be, you know, the most wonderful thing in the world for all gadget lovers out there, because it may mean that we start to see gadgets get a little chonky again. A big reason for preventing people from accessing the innards of these devices is that form factors have become so small and so slim that it requires manufacturers to design components that cram together in such a way that you can't really separate them. Like, the battery ends up being literally built into the phone, which means it's impossible to remove the battery without breaking the phone. So this could mean that the smartphones of the future, at least in Europe, get a bit more hefty, but the trade-off would be that you'd be able to pop out the battery and swap it for a new one anytime you like. So we'll have to see how this impacts the design of gadgets that are sold in the European Union.
Speaker 1: Okay, we're going to take another quick break. When we come back, I've got about four more stories to cover.

Speaker 1: All right, we're back again. When HDTVs first began to take off in the consumer market, you really had two major technologies in competition: LCD screens and plasma televisions, and each of them had their own pros and cons. Plasma televisions were really great with resolution and color, especially on mid-sized television screens. However, they were also not as bright, and they had the possibility of screen burn-in, which was another kind of negative for plasma screens. They also were more expensive to manufacture than LCD screens, and by extension, they were much more expensive for consumers to purchase. Those costs got bigger as the screens themselves got bigger, and they consumed more energy than LCDs. So while plasma screens had their fans, like people who just passionately loved the quality of plasma over LCD, by twenty thirteen plasma had kind of faded away. Philips stopped making them in twenty thirteen, and they pretty much died out as a result. So while LCDs enjoyed a decade more in the spotlight, they are now also kind of going off to live on the farm and pet the rabbits, if you know what I'm saying. Technologies like LED and OLED screens are dominant, and LCD research and development is yesterday's news, according to industry insiders. They're saying, like, yeah, LCD has pretty much gone as far as it can, like that technology is sort of maxed out, and thus there's no real reason to pour money into R&D; instead, you would be researching other technologies. However, I don't want to actually say that LCD stuff is literally going away, like it's dying off like plasma did. I'm having a bit of fun with this, but it's not quite to that level. LCD will likely still be part of LED-backlit LCD TVs.
Speaker 1: It's just that those televisions are going to be the lower-end televisions, because they'll max out in resolution and quality, but they'll still represent sort of the bargain end of high-definition televisions, and the future of ultra-high-resolution television really rests on the metaphorical shoulders of OLED tech. So thanks for all your hard work, LCD. Enjoy your retirement.

Speaker 1: Now, today I learned about muons, which I previously believed to be the sound that was made by cows. Sorry, it was Father's Day this past weekend here in the United States, and I think I've got some residual dad jokes stuck to me. But no, muons are not the sound made by cow-ons. Muons are subatomic particles. They are similar to electrons, but they weigh a lot more than electrons do. And the muons that visit our little planet come from interactions between cosmic rays and little particles way up in our atmosphere, and thousands of them hit every square meter of the Earth every minute. The particles themselves zip along at pretty dang incredible speed. They're not quite moving at the speed of light; you can't really hit the speed of light if you've got mass, and that's one of the tricky things about the speed of light. But they are super duper fast, and they're teeny teeny tiny, and they can penetrate solid stuff like the ground, by a lot, like miles. Now, scientists in Japan have created muon detectors that collectively can serve as a kind of underground GPS. Researchers at the University of Tokyo showed that, using muon detectors, they could calculate a receiver's position in the basement of a six-story building. So the thought is this technology could potentially be used in areas where other means of navigation, like GPS, would be useless, because those signals can't penetrate to the depth needed for them to be useful.
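To illustrate the general idea behind that kind of positioning, here is a minimal sketch of plain multilateration: a receiver working out where it is from distances to reference stations with known coordinates. This is not the University of Tokyo team's actual algorithm or hardware model; the station positions and ranges below are made up, and the assumption that muon timing can be converted into station-to-receiver distances is just that, an assumption for illustration.

```python
# Illustrative sketch only: generic multilateration from known reference stations,
# not the researchers' actual method. Assumes range measurements are already available.

import numpy as np

def locate(reference_positions, distances, guess, iterations=50):
    """Gauss-Newton least-squares estimate of an unknown position from
    distances to several reference stations with known coordinates."""
    x = np.array(guess, dtype=float)
    refs = np.array(reference_positions, dtype=float)
    d = np.array(distances, dtype=float)
    for _ in range(iterations):
        diff = x - refs                       # vectors from each station to the estimate
        est = np.linalg.norm(diff, axis=1)    # predicted station-to-estimate distances
        residual = est - d                    # mismatch with the measured distances
        jacobian = diff / est[:, None]        # derivative of each range w.r.t. position
        step, *_ = np.linalg.lstsq(jacobian, residual, rcond=None)
        x -= step
    return x

# Four hypothetical surface detectors (meters) and a receiver about 20 m underground.
stations = [(0, 0, 0), (30, 0, 0), (0, 30, 0), (30, 30, 1)]
true_pos = np.array([12.0, 9.0, -20.0])
ranges = [np.linalg.norm(true_pos - np.array(s)) for s in stations]

print(locate(stations, ranges, guess=(15, 15, -10)))  # converges to roughly [12, 9, -20]
```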
Speaker 1: The tech could be used to monitor underground volcanic activity, or to plot a path for an unmanned underground vehicle, that kind of stuff, and it would rely on naturally occurring muons. Like I said, thousands of these are hitting every square meter, like ten thousand every minute hitting a square meter. So the tech really is all about coordinating the efforts of various receiving stations for the purposes of mapping and navigation. I can't pretend to understand the subtleties of how all this works, but I do plan to look into it a lot more and do a full episode about it in the future. I just don't want to say anything else here where I end up making an assumption that's completely off base and incorrect. So in the meantime, you should know that while cow-ons do not make muons, a quark is the sound made by a duck.

Speaker 1: Space.com reports that scientists at the Large Hadron Collider are zeroing in on the answer to one of the most perplexing questions in cosmic history: why do we live in a universe made of matter instead of antimatter? Now, keep in mind we humans gave matter and antimatter their names, so really you could say, why is our universe made up of this stuff rather than that other stuff, which also existed? When matter and antimatter encounter one another, they annihilate each other. So if there had been equal amounts of matter and antimatter created at the beginning of the universe, the whole universe should have poofed out of existence in a big old explosion of energy and radiation. But for some reason, matter won out. Why, though? Why was the creation asymmetrical? Because based on our understanding of the standard model of physics, it should have been symmetrical.
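For a rough sense of the energy released in that kind of annihilation, here is a small worked example using the familiar mass-energy relation. The electron-positron case and its numbers are standard textbook values added here purely as an illustration, not figures from the episode:

```latex
% Electron-positron annihilation into two photons: e^+ + e^- \to 2\gamma.
% Each particle carries a rest energy of m_e c^2 \approx 0.511\ \text{MeV},
% so annihilation at rest releases
E = m c^2, \qquad
E_{e^+ e^-} = 2\, m_e c^2 \approx 2 \times 0.511\ \text{MeV} \approx 1.022\ \text{MeV},
% shared between the two photons.
```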
Speaker 1: The scientists have been conducting research in this area, creating antimatter in labs at the Large Hadron Collider and trying to get a better understanding of why this imbalance existed and what that means with regard to our understanding of cosmology and physics in general. It's pretty cool stuff, and if you want to learn more, I recommend reading the article "Large Hadron Collider may be closing in on the universe's missing antimatter" by Keith Cooper at Space.com.

Speaker 1: Finally, Suzuki Motor Corporation and SkyDrive, a startup company in Japan that's in the flying car biz, jointly announced intentions to produce flying cars starting next year, like the spring of twenty twenty four. SkyDrive is going to use Suzuki's manufacturing facilities to assemble the flying cars. These flying cars look a little bit like oversized quadcopter drones, except a quadcopter is called that because you have four sets of rotors, one on the end of each arm; you've got a central unit and four arms that go off at different angles, and there's a set of rotors at the end of each one. This one would actually have three sets of rotors in like a triangular formation at the end of each arm, so you'd have twelve sets total instead of just four. Anyway, it sounds like the goal is to have some flying cars available in time for the twenty twenty five World Exposition, which is taking place in Osaka, Japan. As for how they would be used, like, what would be the practical purpose of these flying cars? Your guess is as good as mine. The most frequent suggestion I have seen for flying cars is that they might be used to transport people from one area to an airport, or back from an airport to that area, which sounds kind of like a park-and-fly situation.
Speaker 1: Based on that description, I think you would have to drive to a facility that has a landing pad, and then you would board a flying car, get whisked away to the airport, and then you would go through the joy that is navigating through an airport. I remain somewhat skeptical that that's actually a good use of technology and resources, but that might be because I live in a city where it's not that hard to get to and from the major airport. So maybe if I lived in Los Angeles or Manhattan or Tokyo, maybe I would feel differently, but as it stands, I'm not sure that it's a viable business model. But I don't know. Maybe I'll be proven wrong.

Speaker 1: And that wraps up the tech news for Tuesday, June twentieth, twenty twenty three. I hope you are all well. Just a heads up, I will be out next week taking a vacation. My plan right now is to have some reruns run next week, but I'll be back the following week with brand new episodes, so I just want to make you aware of that heading into it. I'll probably mention it again at the end of the other episodes this week, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.