1 00:00:04,400 --> 00:00:07,800 Speaker 1: Welcome to TechStuff, a production from iHeartRadio. 2 00:00:12,039 --> 00:00:14,600 Speaker 1: Hey there, and welcome to TechStuff. I'm your host, 3 00:00:14,720 --> 00:00:18,000 Speaker 1: Jonathan Strickland. I'm an executive producer with iHeartRadio. And 4 00:00:18,040 --> 00:00:24,040 Speaker 1: how the tech are you? Today is Thursday, February twenty-two. 5 00:00:24,239 --> 00:00:27,840 Speaker 1: Let us get to the tech news. Should be a 6 00:00:28,160 --> 00:00:30,760 Speaker 1: nice quick episode today. And by now, I'm sure you've 7 00:00:30,800 --> 00:00:36,080 Speaker 1: all heard how Mark Zuckerberg has gifted his underlings, I'm sorry, peons, 8 00:00:36,920 --> 00:00:41,919 Speaker 1: I'm sorry, employees, with a new designation. Yes, they are 9 00:00:42,040 --> 00:00:47,360 Speaker 1: called Metamates. No thanks, I hate it. He also 10 00:00:47,440 --> 00:00:50,040 Speaker 1: said that a new corporate value for the company goes 11 00:00:50,080 --> 00:00:55,680 Speaker 1: something like this: Meta, Metamates, me, which means that 12 00:00:55,960 --> 00:00:59,520 Speaker 1: employees should think about what's important in that order, namely, 13 00:01:00,040 --> 00:01:03,440 Speaker 1: first comes the company, then come the coworkers, then 14 00:01:03,600 --> 00:01:06,520 Speaker 1: comes yourself. And I just want to say, that's kind 15 00:01:06,520 --> 00:01:09,600 Speaker 1: of messed up. All right, it's Jonathan soapbox time. 16 00:01:09,640 --> 00:01:12,720 Speaker 1: You all get out the popcorn. We're gonna go early today. 17 00:01:12,760 --> 00:01:16,320 Speaker 1: So you have likely heard the phrase "the show must 18 00:01:16,400 --> 00:01:19,559 Speaker 1: go on," which is an old theater phrase, and it's 19 00:01:19,640 --> 00:01:23,600 Speaker 1: toxic as all get out. So let me explain.
The 20 00:01:23,600 --> 00:01:27,720 Speaker 1: phrase suggests that the most important element here is the 21 00:01:27,760 --> 00:01:31,400 Speaker 1: show, that the show is more important than the people 22 00:01:31,440 --> 00:01:35,160 Speaker 1: who are involved in making the show happen. So if 23 00:01:35,240 --> 00:01:38,240 Speaker 1: you are a performer and you're sick or you're hurt, 24 00:01:38,560 --> 00:01:42,720 Speaker 1: you gotta suck it up because the show must go on. 25 00:01:43,600 --> 00:01:46,200 Speaker 1: The show is way more important than your personal health 26 00:01:46,200 --> 00:01:48,960 Speaker 1: and well-being. And if you happen to have the 27 00:01:49,000 --> 00:01:52,120 Speaker 1: temerity to think that your own welfare is at least 28 00:01:52,160 --> 00:01:57,440 Speaker 1: as important as the show, then you are an egomaniacal diva. 29 00:01:57,600 --> 00:02:02,160 Speaker 1: Well, I've got news for you. All actors are egomaniacal divas. 30 00:02:02,720 --> 00:02:06,200 Speaker 1: We just are. It's just that the ones who value their 31 00:02:06,200 --> 00:02:11,600 Speaker 1: own welfare can often be portrayed as being the real jerks. Now, 32 00:02:11,639 --> 00:02:15,200 Speaker 1: to be fair, some of them are real jerks. Some 33 00:02:15,240 --> 00:02:18,120 Speaker 1: of them go beyond valuing their welfare and they think 34 00:02:18,160 --> 00:02:22,000 Speaker 1: that the world revolves around them. But that's beside the 35 00:02:22,000 --> 00:02:25,440 Speaker 1: point. Anyway, I feel this "show must go on" mentality 36 00:02:25,520 --> 00:02:29,800 Speaker 1: has kind of benefited organizations at the cost of their 37 00:02:29,880 --> 00:02:34,640 Speaker 1: own employees' health and wellness for, like, ever, and I 38 00:02:34,680 --> 00:02:37,760 Speaker 1: don't think that's a good mindset.
And that doesn't even 39 00:02:37,800 --> 00:02:40,639 Speaker 1: take into account the fact that Metamates is an 40 00:02:40,760 --> 00:02:45,400 Speaker 1: awful thing to call your employees. That seems dehumanizing to me, 41 00:02:45,760 --> 00:02:50,720 Speaker 1: maybe because I'm still reeling from Meta's commercial during the 42 00:02:50,760 --> 00:02:54,440 Speaker 1: Super Bowl, and I guess I shouldn't be surprised. So 43 00:02:54,919 --> 00:02:58,280 Speaker 1: I think we could say that the spirit of the 44 00:02:58,360 --> 00:03:03,320 Speaker 1: message is that an organization really only succeeds if the 45 00:03:03,400 --> 00:03:06,960 Speaker 1: people who are working in that organization are working together 46 00:03:07,080 --> 00:03:10,560 Speaker 1: toward a common goal, and that if everyone had a 47 00:03:10,600 --> 00:03:15,040 Speaker 1: more kind of self-centered approach, then nothing would get accomplished. 48 00:03:15,320 --> 00:03:17,400 Speaker 1: And I can get behind that. I can understand 49 00:03:17,440 --> 00:03:21,000 Speaker 1: the idea that, hey, this enterprise only works if we 50 00:03:21,080 --> 00:03:24,240 Speaker 1: all work together. I get that. It's just that this 51 00:03:24,320 --> 00:03:28,480 Speaker 1: particular phrasing doesn't read that way to me. It almost 52 00:03:28,520 --> 00:03:31,799 Speaker 1: sounds more like, this is what's important, and you are 53 00:03:31,919 --> 00:03:36,360 Speaker 1: like third on the list. Yuck. Earlier this week, I 54 00:03:36,400 --> 00:03:39,720 Speaker 1: talked about how research shows a significant percentage of hacker 55 00:03:39,760 --> 00:03:44,680 Speaker 1: activity originates out of Russia. No big surprise there.
Well, yesterday, 56 00:03:44,720 --> 00:03:48,320 Speaker 1: the US intelligence community published an alert saying that Russian 57 00:03:48,360 --> 00:03:53,400 Speaker 1: hackers have been compromising defense contractors and subcontractors in an 58 00:03:53,400 --> 00:03:57,800 Speaker 1: effort to steal information regarding US defenses. The alert said 59 00:03:57,840 --> 00:04:01,000 Speaker 1: that a pattern of regular targeting of these companies dates 60 00:04:01,000 --> 00:04:04,480 Speaker 1: back to at least January twenty twenty, and according to 61 00:04:04,520 --> 00:04:08,440 Speaker 1: the alert, the data in question relates to, quote, US 62 00:04:08,520 --> 00:04:14,600 Speaker 1: weapons platforms development and deployment timelines, vehicle specifications, and plans 63 00:04:14,600 --> 00:04:19,719 Speaker 1: for communications infrastructure and information technology, end quote. That is 64 00:04:19,800 --> 00:04:23,679 Speaker 1: really bad news for the United States. And I'm definitely 65 00:04:23,680 --> 00:04:27,440 Speaker 1: not surprised at the fact that Russia 66 00:04:27,440 --> 00:04:30,240 Speaker 1: has been targeting those systems. Cyber warfare is just a 67 00:04:30,320 --> 00:04:33,520 Speaker 1: fact of life right now. Also, I can guarantee you 68 00:04:33,720 --> 00:04:37,280 Speaker 1: that the United States is trying and potentially succeeding at 69 00:04:37,320 --> 00:04:40,159 Speaker 1: doing the exact same thing to Russia. We just don't 70 00:04:40,240 --> 00:04:45,360 Speaker 1: hear about that typically. But I would be shocked 71 00:04:45,680 --> 00:04:48,599 Speaker 1: to learn that we don't have anything related to those 72 00:04:48,680 --> 00:04:52,000 Speaker 1: kinds of activities going on here in the States.
The 73 00:04:52,080 --> 00:04:56,000 Speaker 1: upsetting thing is hearing how these attacks have actually been effective, 74 00:04:56,520 --> 00:05:00,400 Speaker 1: because a lot of them apparently are using basic 75 00:05:00,680 --> 00:05:05,320 Speaker 1: tactics that could easily be defeated with just the bare 76 00:05:05,680 --> 00:05:09,320 Speaker 1: minimum amount of attention and care. Stuff that folks should 77 00:05:09,360 --> 00:05:13,200 Speaker 1: be aware of by now, like spear phishing attempts. You 78 00:05:13,240 --> 00:05:15,400 Speaker 1: know what phishing attempts are. That's when someone's trying to 79 00:05:15,440 --> 00:05:19,520 Speaker 1: trick you into giving up information. Spear phishing is a 80 00:05:19,520 --> 00:05:22,760 Speaker 1: little bit more targeted, right? You are using what you 81 00:05:22,800 --> 00:05:27,520 Speaker 1: know about your target to improve the chances of fooling 82 00:05:27,520 --> 00:05:30,359 Speaker 1: them into giving you information. And you would think that 83 00:05:30,480 --> 00:05:34,440 Speaker 1: defense contractors would have a pretty good process to identify 84 00:05:34,560 --> 00:05:37,159 Speaker 1: and prevent that kind of thing from happening. But I 85 00:05:37,200 --> 00:05:39,919 Speaker 1: guess it just proves that people continue to be the 86 00:05:40,000 --> 00:05:44,480 Speaker 1: weakest link in any security system. So we won't 87 00:05:44,520 --> 00:05:47,520 Speaker 1: have a perfect information security system until we get rid 88 00:05:47,560 --> 00:05:49,560 Speaker 1: of people, I guess, is what I'm saying. Not that 89 00:05:49,640 --> 00:05:51,760 Speaker 1: I want to get rid of people. That kind of 90 00:05:52,560 --> 00:05:56,680 Speaker 1: ends up making the whole thing moot.
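[Editor's aside: for anyone curious what catching those basics can look like in practice, here is a toy sketch of one classic phishing tell: a sender whose display name claims a trusted organization while the actual email domain doesn't match. This is purely illustrative; the organization name and domain are made up, and real mail filtering is far more sophisticated than this.]

```python
# Toy heuristic: flag emails whose display name claims a trusted org
# but whose sender address comes from a different domain.
# "Acme Corp" and "acme.com" are hypothetical examples.

TRUSTED = {"acme corp": "acme.com"}  # org name -> expected sender domain

def looks_like_spoof(display_name: str, sender: str) -> bool:
    """Return True if the display name claims a trusted org
    but the sender's domain doesn't match that org's domain."""
    domain = sender.rsplit("@", 1)[-1].lower()
    for org, expected_domain in TRUSTED.items():
        if org in display_name.lower() and domain != expected_domain:
            return True  # claims the org, but sent from the wrong domain
    return False
```

A check this crude obviously misses most attacks, but it illustrates why basic hygiene (verifying that the sender is who they claim to be) defeats so many phishing attempts.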
If you live 91 00:05:56,720 --> 00:05:59,719 Speaker 1: in an apartment in the United States, well, the 92 00:06:00,040 --> 00:06:02,240 Speaker 1: FCC just gave you more options when it comes to 93 00:06:02,279 --> 00:06:07,719 Speaker 1: Internet service providers. And here's the sitch. Sometimes landlords, someone 94 00:06:07,760 --> 00:06:13,320 Speaker 1: who owns a multi-tenant building of some sort, they 95 00:06:13,360 --> 00:06:17,080 Speaker 1: will make a sweetheart deal with a specific Internet service 96 00:06:17,120 --> 00:06:20,400 Speaker 1: provider, and that makes that ISP the only 97 00:06:20,520 --> 00:06:24,279 Speaker 1: provider for that building. So anyone living inside the building 98 00:06:24,680 --> 00:06:27,960 Speaker 1: has no choice when it comes to an ISP. 99 00:06:28,520 --> 00:06:31,720 Speaker 1: They either pay for service from that one provider that 100 00:06:31,800 --> 00:06:34,359 Speaker 1: the landlord made an agreement with, or they have to 101 00:06:34,400 --> 00:06:38,840 Speaker 1: go without internet service. Now, technically this was already against 102 00:06:38,880 --> 00:06:41,400 Speaker 1: the rules in the United States, but there were various 103 00:06:41,440 --> 00:06:44,320 Speaker 1: loopholes that landlords and ISPs could use 104 00:06:44,360 --> 00:06:48,040 Speaker 1: to essentially ignore the rules. Now the FCC, 105 00:06:48,440 --> 00:06:52,400 Speaker 1: which is the Federal Communications Commission, has passed a four 106 00:06:52,760 --> 00:06:58,440 Speaker 1: to zero vote to close those loopholes. That is awesome, 107 00:06:58,720 --> 00:07:02,360 Speaker 1: particularly since the FCC currently has an equal split of 108 00:07:02,400 --> 00:07:06,920 Speaker 1: Democrat and Republican reps sitting on it.
So it's very 109 00:07:07,040 --> 00:07:10,720 Speaker 1: rare to find a topic where both of those parties 110 00:07:10,720 --> 00:07:15,520 Speaker 1: will actually align and support it in a unanimous vote. 111 00:07:15,600 --> 00:07:21,200 Speaker 1: That, like, never happens. So that's nice. And, you know, 112 00:07:21,600 --> 00:07:23,720 Speaker 1: there's still one empty seat on the FCC. 113 00:07:24,240 --> 00:07:25,880 Speaker 1: That's something that a lot of people are kind of 114 00:07:25,920 --> 00:07:29,000 Speaker 1: irritated about, because they would love to see the FCC 115 00:07:29,160 --> 00:07:32,920 Speaker 1: at full operation, particularly because that would mean you would 116 00:07:32,960 --> 00:07:36,120 Speaker 1: have a tie-breaking vote, a three to two vote, 117 00:07:36,360 --> 00:07:40,400 Speaker 1: and because we're talking about a Democratic administration, it 118 00:07:40,400 --> 00:07:44,080 Speaker 1: would mean a Democrat sitting in that extra seat, but 119 00:07:44,200 --> 00:07:47,120 Speaker 1: that has not been filled as of yet. Anyway, the 120 00:07:47,160 --> 00:07:50,720 Speaker 1: hope is that now that these loopholes have been closed, 121 00:07:50,760 --> 00:07:52,720 Speaker 1: competing ISPs will be able to get 122 00:07:52,760 --> 00:07:55,880 Speaker 1: access to those buildings that previously had been off limits 123 00:07:55,880 --> 00:07:59,400 Speaker 1: to them, and tenants will have more choices when it comes to 124 00:08:00,040 --> 00:08:03,360 Speaker 1: choosing an internet service provider. And here in America, a 125 00:08:03,360 --> 00:08:05,960 Speaker 1: lot of places have very limited options when it comes 126 00:08:06,000 --> 00:08:09,160 Speaker 1: to internet service.
For example, at my home in the 127 00:08:09,200 --> 00:08:13,200 Speaker 1: city of Atlanta, I essentially have two broadband options for 128 00:08:13,280 --> 00:08:16,400 Speaker 1: cable-based internet, and then one other one if I 129 00:08:16,480 --> 00:08:20,600 Speaker 1: wanted satellite and to have the slowest internet ever. Folks 130 00:08:20,600 --> 00:08:23,880 Speaker 1: who live in apartments typically don't even get that limited 131 00:08:23,960 --> 00:08:26,480 Speaker 1: amount of choice. So here's hoping this means we're going 132 00:08:26,520 --> 00:08:28,440 Speaker 1: to see a big change for people who are living 133 00:08:28,440 --> 00:08:31,760 Speaker 1: in apartment buildings and such, though there are several consumer 134 00:08:31,800 --> 00:08:34,600 Speaker 1: advocacy groups that are saying we're likely to see more 135 00:08:34,679 --> 00:08:37,760 Speaker 1: loopholes pop up, and what we really need is a 136 00:08:37,880 --> 00:08:43,440 Speaker 1: full FCC that is capable of taking larger steps toward 137 00:08:43,480 --> 00:08:48,400 Speaker 1: preventing that practice from continuing. Well, we all know that 138 00:08:48,520 --> 00:08:51,880 Speaker 1: Meta slash Facebook has been hurt pretty badly by Apple's 139 00:08:51,960 --> 00:08:56,880 Speaker 1: change to iOS. That is, Apple gave iOS users 140 00:08:56,920 --> 00:09:01,720 Speaker 1: the opportunity to opt out of app tracking, or app 141 00:09:01,800 --> 00:09:04,640 Speaker 1: based tracking, and that meant that iPhone owners and the like 142 00:09:04,720 --> 00:09:07,920 Speaker 1: could decide to not feed into Facebook's tried and true 143 00:09:07,960 --> 00:09:11,600 Speaker 1: method of gobbling up user data to aid in its 144 00:09:11,800 --> 00:09:16,280 Speaker 1: business of targeted advertising.
So the company reportedly lost out 145 00:09:16,320 --> 00:09:19,880 Speaker 1: on like ten billion dollars of revenue last year because 146 00:09:19,920 --> 00:09:24,199 Speaker 1: of this change to iOS. Well, now Alphabet slash Google 147 00:09:24,480 --> 00:09:27,520 Speaker 1: is following suit. The company announced yesterday that it is 148 00:09:27,640 --> 00:09:30,679 Speaker 1: updating its own policies to restrict how apps can 149 00:09:30,720 --> 00:09:34,640 Speaker 1: share user data with third parties. But Alphabet's approach is 150 00:09:34,679 --> 00:09:37,520 Speaker 1: going to be a little bit different. In fact, in 151 00:09:37,600 --> 00:09:41,040 Speaker 1: the announcement, there is a passage that reads, quote, other 152 00:09:41,200 --> 00:09:45,200 Speaker 1: platforms have taken a different approach to ads privacy, bluntly 153 00:09:45,280 --> 00:09:49,520 Speaker 1: restricting existing technologies used by developers and advertisers, end quote. 154 00:09:50,000 --> 00:09:52,760 Speaker 1: I didn't put the sass in that. No, the sass is there. 155 00:09:52,800 --> 00:09:55,880 Speaker 1: It looks like it's targeting Apple, saying Apple's using a 156 00:09:56,280 --> 00:10:01,000 Speaker 1: hammer instead of, say, a scalpel. So Google's approach will 157 00:10:01,000 --> 00:10:03,360 Speaker 1: allow for some sort of wiggle room, it seems. And 158 00:10:03,360 --> 00:10:05,360 Speaker 1: the company has also said it's not going to implement 159 00:10:05,400 --> 00:10:07,800 Speaker 1: these changes for another two years, and that will give 160 00:10:07,840 --> 00:10:11,680 Speaker 1: developers and advertisers time to prepare for that and to 161 00:10:11,760 --> 00:10:15,800 Speaker 1: create new plans and strategies.
So it doesn't sound like 162 00:10:15,840 --> 00:10:18,160 Speaker 1: it's going to be quite as severe a change as 163 00:10:18,240 --> 00:10:21,120 Speaker 1: Apple's approach, but it could still pose a threat to 164 00:10:21,160 --> 00:10:24,959 Speaker 1: revenue for companies like Meta down the line. Okay, we've 165 00:10:25,000 --> 00:10:27,360 Speaker 1: got some more news stories to cover. Before we get 166 00:10:27,360 --> 00:10:37,800 Speaker 1: to those, let's take a quick break. Hi. Are you 167 00:10:37,880 --> 00:10:40,839 Speaker 1: Canadian, or do you use a Canadian bank? No, this 168 00:10:40,880 --> 00:10:43,160 Speaker 1: isn't an ad. This is a news story. 169 00:10:43,440 --> 00:10:45,600 Speaker 1: If you are a Canadian or use a Canadian bank, 170 00:10:45,640 --> 00:10:48,360 Speaker 1: you might have been experiencing some pretty rough issues with 171 00:10:48,440 --> 00:10:52,640 Speaker 1: your bank recently. So yesterday, on Wednesday, online services 172 00:10:52,640 --> 00:10:56,320 Speaker 1: went down for five major banks in Canada, including the 173 00:10:56,400 --> 00:10:59,760 Speaker 1: Royal Bank of Canada. Customers found it impossible to perform 174 00:10:59,760 --> 00:11:03,520 Speaker 1: basic tasks like e-transfers and mobile banking, and 175 00:11:03,559 --> 00:11:06,440 Speaker 1: there were reports of people being unable to pay for 176 00:11:06,480 --> 00:11:09,679 Speaker 1: purchases at the point of sale due to this outage. 177 00:11:10,160 --> 00:11:13,320 Speaker 1: It happened yesterday evening, and as of this recording, there 178 00:11:13,320 --> 00:11:16,000 Speaker 1: has not yet been an explanation as to what happened. 179 00:11:16,640 --> 00:11:19,360 Speaker 1: And in some cases, it sounds like services are 180 00:11:19,400 --> 00:11:23,200 Speaker 1: continuing to be disrupted.
There is some speculation about 181 00:11:23,240 --> 00:11:26,319 Speaker 1: what happened, but since that speculation lacks any evidence as 182 00:11:26,360 --> 00:11:28,079 Speaker 1: of the time I'm recording this, I'm not going to 183 00:11:28,160 --> 00:11:31,240 Speaker 1: talk about it until we learn more. We have seen 184 00:11:31,320 --> 00:11:34,960 Speaker 1: big outages in services for other online stuff before, but it 185 00:11:35,280 --> 00:11:38,839 Speaker 1: is unusual to see something that's specifically affecting a single 186 00:11:39,040 --> 00:11:43,400 Speaker 1: sector, banking in this case, and stretching across multiple institutions, 187 00:11:43,400 --> 00:11:45,960 Speaker 1: which is pretty freaky. I'll be sure to follow up 188 00:11:45,960 --> 00:11:48,240 Speaker 1: on this once we know more about what is going on. 189 00:11:48,880 --> 00:11:51,840 Speaker 1: In the state of California, lawmakers are working on legislation 190 00:11:51,840 --> 00:11:54,760 Speaker 1: that would put tighter restrictions on how big tech companies 191 00:11:54,800 --> 00:11:58,400 Speaker 1: in the state can collect and use children's personal information. 192 00:11:58,920 --> 00:12:01,840 Speaker 1: It's modeled after a similar code in the UK, 193 00:12:02,480 --> 00:12:05,040 Speaker 1: and the California proposal would mean that big tech companies 194 00:12:05,080 --> 00:12:07,760 Speaker 1: like Meta and Alphabet would have to limit how much 195 00:12:07,840 --> 00:12:12,000 Speaker 1: data they collect from young users, restrict their use 196 00:12:12,000 --> 00:12:15,440 Speaker 1: of targeted advertising to kids, prevent them from using 197 00:12:15,480 --> 00:12:19,760 Speaker 1: various tricks to tempt kids to sidestep privacy protections, and 198 00:12:19,800 --> 00:12:23,079 Speaker 1: also knock off location tracking for children within the state.
199 00:12:23,760 --> 00:12:26,640 Speaker 1: And we've seen kind of a general move toward reining 200 00:12:26,679 --> 00:12:29,080 Speaker 1: in big tech's influence over the last couple of years, 201 00:12:29,080 --> 00:12:32,120 Speaker 1: particularly when it comes to how tech can impact children. 202 00:12:32,480 --> 00:12:35,120 Speaker 1: In fact, that's the surface-level reason the EARN IT 203 00:12:35,320 --> 00:12:37,560 Speaker 1: Act is going through the U.S. Senate. I talked 204 00:12:37,559 --> 00:12:40,480 Speaker 1: about EARN IT earlier this week and how that legislation, 205 00:12:40,520 --> 00:12:42,880 Speaker 1: if passed into law, could be a death sentence for 206 00:12:43,040 --> 00:12:45,680 Speaker 1: end-to-end encrypted communication here in the United States. 207 00:12:46,280 --> 00:12:48,640 Speaker 1: I would say EARN IT is a bad approach to 208 00:12:48,679 --> 00:12:52,319 Speaker 1: trying to protect kids. But these restrictions, you know, limiting 209 00:12:52,360 --> 00:12:55,120 Speaker 1: how companies can collect information and what they can do 210 00:12:55,160 --> 00:12:57,640 Speaker 1: with it, that seems like a pretty good step to me. 211 00:12:59,120 --> 00:13:02,600 Speaker 1: On a national level, in the United States, senators have introduced 212 00:13:02,600 --> 00:13:06,680 Speaker 1: the Kids Online Safety Act. This legislation is meant to 213 00:13:06,800 --> 00:13:10,280 Speaker 1: counteract the harmful effects certain online platforms seem to have 214 00:13:10,360 --> 00:13:13,080 Speaker 1: on children.
You might remember the big story of the 215 00:13:13,080 --> 00:13:17,440 Speaker 1: Facebook leak last year, in that internal documents from Facebook 216 00:13:17,440 --> 00:13:20,760 Speaker 1: indicated that folks at Meta, you know, the parent company, 217 00:13:20,800 --> 00:13:23,840 Speaker 1: are aware of the harmful effects that platforms like Instagram 218 00:13:23,920 --> 00:13:27,320 Speaker 1: can have on some users, particularly young girls and young women. 219 00:13:28,040 --> 00:13:30,400 Speaker 1: That's the sort of thing this Act is meant to 220 00:13:30,440 --> 00:13:34,079 Speaker 1: target: the risk that platforms could contribute to serious mental 221 00:13:34,120 --> 00:13:38,800 Speaker 1: health issues like eating disorders, substance abuse, and even suicide. 222 00:13:39,280 --> 00:13:42,520 Speaker 1: The Act calls for platforms that could be, quote, reasonably 223 00:13:42,600 --> 00:13:45,840 Speaker 1: likely to be used, end quote, by people under the 224 00:13:45,840 --> 00:13:49,400 Speaker 1: age of sixteen to institute more protections so that those 225 00:13:49,520 --> 00:13:52,880 Speaker 1: users aren't presented with images and posts that promote things 226 00:13:52,880 --> 00:13:55,679 Speaker 1: that can be harmful to mental health. The Act calls 227 00:13:55,720 --> 00:13:58,520 Speaker 1: for platforms to institute limits that would prevent people from 228 00:13:58,559 --> 00:14:02,000 Speaker 1: specifically seeking out younger users, and it would also restrict 229 00:14:02,000 --> 00:14:04,760 Speaker 1: how much data the platforms could collect from younger users, 230 00:14:04,840 --> 00:14:08,280 Speaker 1: similar to the California proposal we just talked about. Also, 231 00:14:08,440 --> 00:14:12,000 Speaker 1: the Act would require platforms to make restricted settings the default, 232 00:14:12,360 --> 00:14:14,480 Speaker 1: and that is a big deal.
I think a lot 233 00:14:14,520 --> 00:14:17,720 Speaker 1: of platforms try to skirt issues with having their data 234 00:14:17,800 --> 00:14:22,960 Speaker 1: tracking abilities restricted by making those opt-out choices rather 235 00:14:23,000 --> 00:14:25,720 Speaker 1: than opt-in choices. Like, you opt out of having 236 00:14:25,720 --> 00:14:28,840 Speaker 1: your data tracked. And some go a step further and 237 00:14:28,840 --> 00:14:32,960 Speaker 1: they bury the opt-out settings deep in the menus, 238 00:14:33,080 --> 00:14:36,000 Speaker 1: so they make it really unlikely that the majority of 239 00:14:36,000 --> 00:14:37,960 Speaker 1: people are actually going to go hunting for the setting 240 00:14:38,000 --> 00:14:42,320 Speaker 1: to turn off data tracking. So the Act says you can't 241 00:14:42,320 --> 00:14:46,400 Speaker 1: do that here. The restricted setting has to be the default. 242 00:14:46,960 --> 00:14:50,040 Speaker 1: You have to opt into having things tracked. And 243 00:14:50,160 --> 00:14:53,440 Speaker 1: the Act also doesn't have a size requirement as far 244 00:14:53,480 --> 00:14:55,920 Speaker 1: as the companies go. It doesn't matter how big or 245 00:14:55,960 --> 00:14:59,320 Speaker 1: small the platform is. If it is reasonably likely to 246 00:14:59,360 --> 00:15:02,400 Speaker 1: be used by children, the law will apply to that platform. 247 00:15:02,760 --> 00:15:04,600 Speaker 1: So a lot of other laws tend to have, like, 248 00:15:04,640 --> 00:15:07,280 Speaker 1: a restriction that says you have to have a certain 249 00:15:07,360 --> 00:15:10,720 Speaker 1: number of users for you to be held accountable to 250 00:15:10,760 --> 00:15:13,480 Speaker 1: that law. That's not the case here. And of course, 251 00:15:13,520 --> 00:15:15,320 Speaker 1: this is not a law yet.
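[Editor's aside: the opt-in versus opt-out distinction can be made concrete in a few lines of code. This is a minimal sketch with hypothetical setting names, not anything from the actual bill: "restricted by default" means the privacy-protective value is what a user gets without touching a single menu, and tracking only turns on through an explicit action.]

```python
from dataclasses import dataclass

# Hypothetical privacy settings illustrating "restricted by default":
# every tracking-related flag starts off, and the only way any of them
# turns on is an explicit opt-in action by the user.

@dataclass
class PrivacySettings:
    data_tracking: bool = False   # off unless the user opts in
    targeted_ads: bool = False    # off unless the user opts in

    def opt_in_to_tracking(self) -> None:
        """Explicit user action; nothing else flips these flags."""
        self.data_tracking = True
        self.targeted_ads = True
```

In the opt-out model the defaults would be `True` and the burden would be on the user to find the switch; flipping the defaults moves that burden onto the platform.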
It will have to 252 00:15:15,360 --> 00:15:17,320 Speaker 1: pass a vote in both the Senate and the House 253 00:15:17,320 --> 00:15:20,280 Speaker 1: and get approved by the President before that happens, and 254 00:15:20,320 --> 00:15:23,080 Speaker 1: it is quite possible that it could be changed significantly 255 00:15:23,160 --> 00:15:25,240 Speaker 1: before it gets to that point, or it could just 256 00:15:25,480 --> 00:15:29,360 Speaker 1: die on the floor. A lot of legislation does. The 257 00:15:29,400 --> 00:15:32,880 Speaker 1: website Motherboard recently received a recording said to be from 258 00:15:32,920 --> 00:15:37,120 Speaker 1: a mandatory anti-union meeting held in Amazon's JFK8 259 00:15:37,200 --> 00:15:40,960 Speaker 1: warehouse up in New York. That's the warehouse that initially 260 00:15:41,000 --> 00:15:43,760 Speaker 1: failed to get enough signatures to force a union vote 261 00:15:43,880 --> 00:15:47,000 Speaker 1: last year, but now it's back on track: it got 262 00:15:47,040 --> 00:15:49,040 Speaker 1: the signatures and now it's going to hold a vote. 263 00:15:49,480 --> 00:15:53,360 Speaker 1: So at this mandatory meeting, a representative laid out the 264 00:15:53,440 --> 00:15:58,000 Speaker 1: supposed risks involved if workers were to sign to join 265 00:15:58,040 --> 00:16:01,280 Speaker 1: a union.
You might remember that Amazon employees at a 266 00:16:01,280 --> 00:16:04,920 Speaker 1: warehouse in Alabama held a union vote that failed, but 267 00:16:05,000 --> 00:16:08,320 Speaker 1: then the National Labor Relations Board, or NLRB, 268 00:16:09,040 --> 00:16:11,720 Speaker 1: gave them a do-over after the NLRB 269 00:16:11,880 --> 00:16:15,880 Speaker 1: concluded that Amazon, the company, had interfered with the union 270 00:16:15,920 --> 00:16:19,680 Speaker 1: election process in an effort to prevent a union from forming. 271 00:16:20,120 --> 00:16:25,400 Speaker 1: Well, actively discouraging employees by threatening them, in this case 272 00:16:25,440 --> 00:16:28,080 Speaker 1: by suggesting that they could see a reduction in wages 273 00:16:28,120 --> 00:16:31,480 Speaker 1: if they joined a union and implying that there could 274 00:16:31,480 --> 00:16:34,240 Speaker 1: be no limit to how high union dues would be, 275 00:16:35,040 --> 00:16:37,920 Speaker 1: that kind of falls into that category too. And if 276 00:16:37,960 --> 00:16:41,000 Speaker 1: this recording is in fact legitimate, and I have no 277 00:16:41,080 --> 00:16:44,480 Speaker 1: reason to believe otherwise, it could really be a massive 278 00:16:44,520 --> 00:16:47,520 Speaker 1: blow against Amazon. The vote on whether or not to 279 00:16:47,680 --> 00:16:51,040 Speaker 1: unionize should happen late next month, so we'll keep an 280 00:16:51,040 --> 00:16:53,880 Speaker 1: eye on this story to see where it goes. And finally, 281 00:16:54,240 --> 00:16:57,600 Speaker 1: there's a really cool article in Wired titled "DeepMind 282 00:16:57,760 --> 00:17:01,120 Speaker 1: Has Trained an AI to Control Nuclear Fusion" that 283 00:17:01,240 --> 00:17:03,880 Speaker 1: I recommend you check out if you're interested in stuff 284 00:17:03,880 --> 00:17:08,679 Speaker 1: like fusion, artificial intelligence, and machine learning.
I'll give the highlights, 285 00:17:08,720 --> 00:17:11,760 Speaker 1: but seriously, you should read the article if you can. So, first, 286 00:17:12,280 --> 00:17:15,439 Speaker 1: nuclear fusion is the process by which you release energy 287 00:17:15,640 --> 00:17:18,879 Speaker 1: by fusing two atoms together to make a heavier atom. 288 00:17:19,520 --> 00:17:22,640 Speaker 1: This is the nuclear process that takes place in stars 289 00:17:22,680 --> 00:17:25,880 Speaker 1: like the Sun. If we could create a nuclear reactor 290 00:17:26,040 --> 00:17:31,160 Speaker 1: capable of holding sustained fusion reactions, we could revolutionize how 291 00:17:31,640 --> 00:17:36,439 Speaker 1: we generate and access energy. Build a few of 292 00:17:36,480 --> 00:17:40,840 Speaker 1: those working fusion reactors, and you would solve energy crises 293 00:17:40,920 --> 00:17:43,639 Speaker 1: around the world. Plus, you wouldn't have to worry about 294 00:17:43,760 --> 00:17:46,399 Speaker 1: dangerous radioactive waste the way you have to do with 295 00:17:46,520 --> 00:17:50,320 Speaker 1: nuclear fission reactors. Fission is the type of nuclear power 296 00:17:50,440 --> 00:17:52,359 Speaker 1: we rely on now. That's when you take a heavy 297 00:17:52,400 --> 00:17:55,560 Speaker 1: atom and you split it up and you release energy 298 00:17:55,560 --> 00:17:58,000 Speaker 1: in the process. But I don't know if you've noticed, 299 00:17:58,680 --> 00:18:01,240 Speaker 1: things are a little different here on Earth than they 300 00:18:01,280 --> 00:18:04,920 Speaker 1: are on the Sun. We don't have the intense heat, 301 00:18:05,359 --> 00:18:08,560 Speaker 1: and we don't have the incredible gravitational forces that you 302 00:18:08,600 --> 00:18:11,560 Speaker 1: would find on the Sun. So it is very challenging 303 00:18:11,600 --> 00:18:16,280 Speaker 1: to push tiny atoms together with enough energy to get 304 00:18:16,320 --> 00:18:19,720 Speaker 1: them to fuse.
So a lot of our fusion work 305 00:18:19,760 --> 00:18:22,720 Speaker 1: has been a net loss, where we're pouring more energy 306 00:18:22,760 --> 00:18:26,359 Speaker 1: into making the reaction happen than we're getting out of 307 00:18:26,400 --> 00:18:29,919 Speaker 1: the reaction itself. The way we push atoms together, 308 00:18:30,119 --> 00:18:33,199 Speaker 1: since we don't have that incredible gravitational force of the 309 00:18:33,240 --> 00:18:36,800 Speaker 1: Sun, is we use powerful magnetic fields, and if we 310 00:18:36,800 --> 00:18:40,800 Speaker 1: could control these magnetic fields more precisely and efficiently, we 311 00:18:40,880 --> 00:18:45,320 Speaker 1: could potentially make massive improvements toward sustained fusion reactions. This 312 00:18:45,359 --> 00:18:47,960 Speaker 1: is where DeepMind comes in. DeepMind is an AI lab 313 00:18:48,000 --> 00:18:50,439 Speaker 1: that Google acquired years ago, and 314 00:18:50,480 --> 00:18:54,040 Speaker 1: it focuses primarily on machine learning. So the task at 315 00:18:54,040 --> 00:18:57,480 Speaker 1: hand was teaching an AI computer system how to control 316 00:18:57,560 --> 00:19:01,480 Speaker 1: a complicated series of magnets to manipulate a magnetic field 317 00:19:01,560 --> 00:19:04,119 Speaker 1: in such a way as to get the optimal result in 318 00:19:04,200 --> 00:19:08,000 Speaker 1: a plasma that contains fusing atoms. Like I said, this 319 00:19:08,080 --> 00:19:10,399 Speaker 1: is just a highlight of what is covered in the article. 320 00:19:11,200 --> 00:19:13,640 Speaker 1: It is fascinating stuff, and it could end up being 321 00:19:13,640 --> 00:19:16,600 Speaker 1: a significant step toward making fusion go from science fiction 322 00:19:17,040 --> 00:19:22,600 Speaker 1: to reality eventually. So check out that article. And that's 323 00:19:22,640 --> 00:19:26,879 Speaker 1: it for the news for Thursday, February twenty-two.
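[Editor's aside: the control problem described above can be caricatured in a few lines. This is emphatically not DeepMind's method, which used deep reinforcement learning driving many coils on a tokamak simulator; it's just a toy proportional-feedback loop, with made-up numbers, showing the basic idea of nudging a coil current so a measured field tracks a target.]

```python
# Toy proportional controller: repeatedly adjust a coil "current" so a
# measured field approaches a target value. A crude stand-in for the
# real problem, where a learned controller shapes a plasma via many coils.

def control_field(target: float, gain: float = 0.5, steps: int = 50) -> float:
    """Drive the field toward `target` with proportional feedback."""
    current = 0.0
    field = 0.0
    for _ in range(steps):
        field = current           # toy model: field tracks coil current
        error = target - field    # how far off we are
        current += gain * error   # proportional correction
    return field
```

Each pass shrinks the error by the gain factor, so the field converges geometrically toward the target; the real system has dozens of coupled actuators and a plasma whose response is nothing like this linear toy, which is exactly why it's a good fit for machine learning.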
If you 324 00:19:26,960 --> 00:19:29,160 Speaker 1: have any suggestions for topics I should cover in future 325 00:19:29,160 --> 00:19:31,679 Speaker 1: episodes of TechStuff, reach out to me on Twitter. The 326 00:19:31,920 --> 00:19:35,800 Speaker 1: handle is TechStuff HSW. I'll talk to you again 327 00:19:36,640 --> 00:19:44,920 Speaker 1: really soon. TechStuff is an iHeartRadio production. 328 00:19:45,160 --> 00:19:47,960 Speaker 1: For more podcasts from iHeartRadio, visit the 329 00:19:48,080 --> 00:19:51,320 Speaker 1: iHeartRadio app, Apple Podcasts, or wherever you listen to 330 00:19:51,359 --> 00:19:52,280 Speaker 1: your favorite shows.