Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, I love all things tech, and it is time for the tech news for Tuesday, October 5, 2021. Let's get at it.

Chances are you are aware that yesterday Facebook and all of its services went offline for several hours. That includes Facebook itself, Messenger, Instagram, WhatsApp, and Oculus VR. The outage began at approximately 11:40 a.m. Eastern time. That's when Facebook's records on the Domain Name System suddenly disappeared. The Domain Name System, or DNS, is like an address book or phone book for the Internet. You probably know about IP addresses, those numerical addresses: IPv4 addresses are purely numerical, while IPv6 addresses are a mix. This is how, say, a web browser connects to a specific web page; it actually connects through an IP address. But IP addresses are not easy for most humans to remember, because they just look like a jumble of numbers or, with IPv6, numbers and possibly letters.
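Since that lookup is the mechanism at the center of the outage, here is a toy sketch of the idea in Python. The table and addresses below are invented for illustration; a real resolver queries a distributed hierarchy of name servers, not a local dictionary.

```python
# Toy model of what DNS does: translate a human-readable name into an
# IP address. These entries are made up for illustration only.
dns_table = {
    "example.com": "93.184.216.34",
    "podcast.example": "203.0.113.7",
}

def resolve(name):
    """Look up a name, the way a browser asks DNS before connecting."""
    try:
        return dns_table[name]
    except KeyError:
        # A missing record is what users saw during the outage: the name
        # simply could not be turned into an address anymore.
        raise LookupError("no DNS record for " + name)
```

When a site's records vanish from the table, every lookup for that name fails the way the `LookupError` branch does here, even if the servers themselves are still running.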
So we use words for our URLs, like facebook.com for example, and the DNS translates those words, because words don't really mean anything to web browsers; they're just how we navigate the web. So DNS says, all right, let me look up which IP address that particular series of words relates to, and then serves that up. Now, in Facebook's case, the problem seemed to be that Facebook, probably through a misconfiguration, eliminated its Border Gateway Protocol, or BGP, routes. A BGP route is kind of like a map of how data can get to Facebook's servers. Facebook's servers publish this route to neighboring servers, and it propagates outward. It's kind of like if you're driving through an unfamiliar town and you stop to ask a local for directions. You say something like, "Hey, do you happen to know the way to get to Facebook?" and they say, "Well, yeah, go down this road a piece, and when you see the sign for Old Donigan's farm, you turn left. Now, if you pass an ambulatory scarecrow, you've done gone too far." Except, because Facebook withdrew its BGP routes,
if you were to stop and ask for directions and say, "Do you know the way to Facebook?" you would get the answer, "You can't get there from here." But wait, it gets worse. See, Facebook's systems are all internal ones built on computers that connect to those same servers, and they got effectively severed from the Internet. The loss of the BGP routes meant that not only could you, as a user, no longer check Facebook or Instagram, use WhatsApp, or play Oculus VR titles; if you were a Facebook employee, you couldn't access your internal systems either. You might not even have been able to access the building. I saw one person suggest that the security system itself is tied into those servers, and that your security badge wouldn't work to let you in in that case. Also, Facebook has Facebook Login. This is where various apps use your Facebook login to give you access to the app; that way, you don't create a username and password for that specific app, you just use your Facebook login. Well, with Facebook's servers down, there was no way to authenticate that.
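Since so many third-party apps delegate sign-in this way, here is a minimal sketch of that dependency. The class and method names are hypothetical, not Facebook's actual API, which uses OAuth flows; this just models the failure mode.

```python
# Toy model of "Log in with Facebook"-style delegated authentication.
# All names here are invented; real systems use OAuth token exchanges.

class IdentityProvider:
    """Stands in for the platform's login servers."""
    def __init__(self):
        self.online = True
        self.users = {"alice": "token-123"}  # hypothetical registered user

    def issue_token(self, username):
        if not self.online:
            # The outage: the provider is unreachable, so no one can be verified.
            raise ConnectionError("identity provider unreachable")
        return self.users.get(username)

class ThirdPartyApp:
    """An app with no passwords of its own; it defers to the provider."""
    def __init__(self, provider):
        self.provider = provider

    def sign_in(self, username):
        return self.provider.issue_token(username) is not None

provider = IdentityProvider()
app = ThirdPartyApp(provider)
assert app.sign_in("alice")   # works while the provider is up

provider.online = False       # the outage begins
try:
    app.sign_in("alice")
except ConnectionError:
    pass                      # every dependent app now fails to authenticate
```

The app itself is perfectly healthy; it simply has no independent way to verify who you are once the provider disappears.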
So if you were not actively signed into those services already, you might have found it impossible to connect to all those apps. It took several hours to reconfigure the BGP routes, and that was made more complicated because, obviously, the computers you would normally use to reconfigure them were the same ones that had, quote unquote, "disappeared" from the Internet. So it was a pretty rough day for Facebook.

Now, this was just part of the drama that Facebook is currently going through. This past Sunday, former Facebook employee Frances Haugen appeared on 60 Minutes, and it turns out that she is the person who leaked internal research documents to the Wall Street Journal. You know, the studies that showed Facebook researchers were concerned that Instagram was harmful to mental health, particularly for the teenage girls who were using the service, and that Facebook was well aware of how misinformation was spreading across the platform and how the algorithm was elevating misinformation campaigns.
In the interview, she alleged that Facebook, when faced with a decision to either go for making a profit or go with an option that would be better for users, would choose the profit option every single time. Today she is scheduled to appear before the United States Senate to answer questions about her former employer. And of course, there have been ongoing conversations within the US government, and in Europe as well, about whether or not Facebook represents a monopoly, and whether it might be a good idea to, you know, force the company to divest itself of properties like Instagram and WhatsApp.

Now, I've also seen some people suggest that somehow the outage yesterday was linked to the hearing happening today and the 60 Minutes exposé that aired on Sunday; that perhaps someone over at Facebook was trying to hide something and just took extreme measures. But that doesn't seem to hold up even under casual scrutiny.
I would argue that the outage hurt the company big time, and at the worst possible moment for Facebook. For one thing, because Facebook went down and all these various integrated services became unavailable, the people who say Facebook has too much influence have way more ammunition. Millions of people rely on, say, WhatsApp to stay in touch with family and friends around the world, including vulnerable populations who use WhatsApp to maintain contact with people back in, say, Africa or South America. But because WhatsApp runs on Facebook's systems, that whole service became unavailable globally for something like six hours. This, the critics argue, is what happens when we put all of our digital eggs in one social media platform's basket. If we have our logins tied to Facebook, and the services we rely upon to stay in touch with everybody are part of Facebook, then when Facebook goes down, all of that becomes unavailable to us.
That, as the argument goes, is a case against the kind of consolidation we've been seeing with platforms like Facebook, where they're not just building out their own tools, they're buying up tools that overlap with what Facebook already does. A lot of people have pointed out over the years that Facebook has acquired companies when it became clear that Facebook itself was not going to be able to compete against them, and when those companies potentially stood as a threat when it comes to getting people to spend more and more time on Facebook.

Now, beyond those pieces of evidence suggesting this is not something the company would have wanted, Bloomberg estimated that the outage, together with the dip in Facebook's stock price that has been going on since earlier in September, meant that CEO Mark Zuckerberg saw his wealth decrease by around seven billion dollars. Now, that is beyond a princely sum, obviously, and that's not to say that he won't regain that wealth now that Facebook is back up and running. Maybe he will.
Maybe the company's stock price will improve, although with the hearing in front of the US Senate, that's not certain. I would argue no one would authorize some sort of extreme pull-the-plug strategy at the cost of seven billion bucks. I'm pretty confident this was all coincidental: it was a misconfiguration, it was not done on purpose, and it was just supremely bad timing for Facebook for all of this to happen at once. Will the US government force Facebook to say bye-bye to Instagram and WhatsApp? I honestly don't know. There is precedent; we have seen the government break up big companies years ago, but it hasn't happened in a long time, and I'm not sure it will happen in this case. However, I figure there's never been a time with more obvious support for that argument than right now. Yesterday really brought that into the light.

Last week, I reported that Amazon would hold a hardware reveal event, but my recording session happened before the event occurred, so I couldn't actually say what they were going to show off.
I did mention that The Verge had predicted a wall-mounted Echo device, and we got one: it's called the Echo Show 15. But that site also predicted we wouldn't see anything about the robot that Amazon had had in development for a few years at that point. However, Amazon surprised us all and in fact brought out a robot that it now calls Astro. The robot will initially sell to a limited customer base by invitation only, so you actually have to request an invite, and you must hand over the princely sum of nine hundred ninety-nine dollars to buy it. Later on, that price is actually going to go up to around one thousand four hundred fifty dollars. The robot has a tablet-like screen for a face, like it's got these digital eyes; it essentially looks a little bit like WALL-E from the Pixar film. It moves around on two twelve-inch wheels, and it's got a little caster-like wheel in the back to provide more stability. It's got cameras, including a five-megapixel camera built into the screen, but it's also got a periscope camera it can extend and use to look around at its surroundings. It's got a speaker, it's got a microphone.
It has facial recognition technology built into it. It can learn a map of your home, so you can have it learn the layout of your home and assign specific names to specific rooms. Thus, you could tell your robot, "Hey, go to the kitchen," and it would navigate over to the kitchen from wherever it happened to be. It's got all the Alexa capabilities built into it, so it can respond to the requests you would typically make of Alexa, like asking what the weather is, or playing a certain song, all that kind of stuff. Not everyone is sold on this robot. The MIT Technology Review has called the robot stupid. The Verge ran a follow-up piece saying that developers who had worked on the robot had reportedly called it terrible and claimed it would make dumb mistakes, such as throwing itself down the stairs. So if you have stairs in your home, you might not want to have this thing on an upper level. I think I would advise folks to maybe pump the brakes a bit before shelling out a grand or more for this thing.
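That go-to-the-kitchen behavior is, at its core, path planning over a learned map. Here's a minimal sketch of the idea; the floor plan, room names, and breadth-first approach are mine for illustration, not Amazon's actual navigation stack.

```python
from collections import deque

# Hypothetical learned map: which rooms connect directly to which.
floor_plan = {
    "living room": ["kitchen", "hallway"],
    "hallway": ["living room", "bedroom"],
    "kitchen": ["living room"],
    "bedroom": ["hallway"],
}

def route(start, goal):
    """Breadth-first search for the shortest room-to-room path."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in floor_plan[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no way to reach that room

print(route("bedroom", "kitchen"))
# ['bedroom', 'hallway', 'living room', 'kitchen']
```

The hard part in a real robot isn't this search; it's building the map and localizing within it, which is where reviewers say Astro stumbles.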
While we have stuff like Roombas, and those are reasonably popular, we're really still waiting on the household robot that is a must-buy. I've seen a lot of people suggest that this robot doesn't do a whole lot, really, when you get down to it, and the stuff that it does do, it doesn't necessarily do super well, so it's not exactly a wise purchasing decision. I mean, it would potentially give Amazon a lot more opportunities to figure out how to sell stuff to you. If they've got, essentially, a computer that can move around your house and observe stuff, then you could argue this could just be another way for Amazon to position itself to sell more products to customers. And frankly, I suspect that is a large part of it. And you have to pay for that privilege, right? You have to pay like a thousand dollars or more for the privilege of having a surveillance device that is potentially giving more information to a company so it can sell you more stuff. Anyway.
Besides the robot and the wall-mounted Echo Show, which again is like a smart display you can attach to a wall, we also saw the Amazon Glow. This is a tabletop tablet for video conferencing for kids, and it's meant to appeal to children so that they can video conference with distant friends and family without getting bored and just wandering off. That, Amazon said, was the purpose behind the design. So in this case, the tablet actually has a projector incorporated inside it. The projector can display stuff on the tabletop in front of the tablet. It even comes with a white mat, kind of like a movie screen: you lay the mat down, you put the tablet behind the mat, and it can project a screen down onto the mat on the table. That can become an interactive storybook, it can become a game. And there are infrared sensors inside the tablet, so it can track things like hand motion; when you're moving stuff around on this projected screen, it acts almost like a touchscreen. It's kind of a neat idea.
I'm not entirely sold on it, but then, I also don't have any kids. Amazon also revealed a smart thermostat, fitness tracker devices, and a version of its Echo device specifically designed for Disney properties. So in the future, should you stay at, say, a Walt Disney World resort, you might find there is a Disney-themed Echo, and you can use it to ask questions like checking on dining reservations or asking about park hours, that kind of stuff. I was not super impressed with the Amazon presentation, but then I also don't want this to be an Amazon ad. So we're just gonna go to break, listen to other ads, and then come back with some more news.

And we're back. Researchers at Trinity College in Dublin, Ireland, which, by the way, is one of my favorite spots in Ireland, have released a research paper that looked into how much data iOS and Android devices gather from their respective users. The paper says that both operating systems send packets of data back to, you know, their respective HQs approximately every four and a half minutes.
That's whether you're using the phone or not, and it happens even if you've gone to the trouble of setting all data-sharing options to off. Keep in mind, some data sharing isn't optional; it's mandatory. And the study did find some differences between the two. Google, the researchers said, collects much larger volumes of data: the packets Google sends back are much bigger than what Apple is sending. However, Apple, while sending less data overall, sent a wider variety of data, including data about other devices connected to whichever network you connect to. Android doesn't do that, so Apple is interested in what other devices are on the same network you're on, I guess. Google reps have said this is really just how smartphones work; they compared it to how modern vehicles have components that send data back to car companies for safety purposes. The researchers said that if you really want to limit how much data your phone collects and shares, you should go with an Android phone, because despite the fact that it's sending more data, you have more options.
You can have network connections disabled when you activate your phone, and you should also disable Google Play, YouTube, and the Google Play Store. That cuts down on most of the data collection. You would still need to sideload apps, getting them from someplace other than the Google Play Store, to avoid that data collection, and even then you're just minimizing the amount of data that's being collected and shared. They did say that Apple users are just out of luck, because Apple requires a network connection to activate an iPhone, so Apple users are subjected to data collection no matter what they do.

Speaking of Apple, the Wall Street Journal reports that Apple made more operating profit from games than Microsoft, Sony, Nintendo, and Activision Blizzard combined. And Apple doesn't make games, but it profited more from games than all those other companies. All right, so in some ways this actually is not a big surprise. We're talking operating profit, we're not talking revenue. Companies like Nintendo, Microsoft, Sony, and Activision Blizzard actually make games.
That means there are costs associated with that business. You know, profit is what you've got after you've accounted for the costs of producing whatever it is you're producing. So if I decide I'm going to make a chair, and I sell that chair for fifty dollars, but it turns out the materials cost me sixty dollars, well, I lost money on that, right? I didn't make a profit; that's a loss. Apple, meanwhile, generates revenue by taking a cut of all digital sales, up to thirty percent in some cases. So Apple gets a slice of the pie, which is, I guess, Apple pie, every time someone purchases a game from the Apple App Store, or, and this is really important, it gets a slice of the pie if a person makes an in-app purchase in an app that came from the Apple App Store. Apple has been going through a series of court cases about those in-app purchases, having recently been ordered by a judge to offer, or allow for, alternatives to Apple's own in-app purchasing system. This would give users the option to go with something else and not go through Apple.
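To put numbers on the chair example and the platform cut, here's a quick back-of-the-envelope sketch. The ten-dollar purchase is invented for illustration; thirty percent is the commission rate discussed above, and the math is done in cents to keep it exact.

```python
def profit(revenue, costs):
    """Operating profit: what's left of revenue after production costs."""
    return revenue - costs

# The chair example: sell for $50, materials cost $60 -> a $10 loss.
chair = profit(50, 60)  # -10

# The platform side: a 30% cut of a hypothetical $10.00 in-app purchase.
purchase_cents = 1000
platform_cut_cents = purchase_cents * 30 // 100        # 300 cents to the platform
developer_cents = purchase_cents - platform_cut_cents  # 700 cents to the developer
```

The platform's 300 cents arrives with almost no production cost attached, which is how operating profit from games can outstrip the companies that actually make them.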
This is at the heart of the matter between Apple and Epic Games, the creator of Fortnite. So again, yes, Apple has made more profit on games than official game companies, but then Apple also doesn't bear the expense of developing those games.

A man named Owen Diaz, a former elevator operator at Tesla, sued the company, charging that Tesla is a hostile work environment and that he and other employees were the targets of things like racial slurs and intimidation. A court found in his favor, and the court has awarded him some one hundred thirty-seven million dollars in damages. Now, one thing that makes this story really important is the fact that Diaz was even able to sue at all. Tesla, like a lot of tech companies, has a policy that requires employees to sign an arbitration agreement. These agreements say that employees must work within company systems to resolve problems. They are not allowed to go outside the company and, say, address the media, or sue the company in a court and take it to a public trial.
However, Diaz never 322 00:19:42,960 --> 00:19:46,040 Speaker 1: signed that kind of an agreement, so he wasn't held 323 00:19:46,080 --> 00:19:50,080 Speaker 1: to those restrictions. He could sue without breaking an agreement 324 00:19:50,080 --> 00:19:52,639 Speaker 1: he had made with the company. And you might even say, hah, 325 00:19:53,720 --> 00:19:55,959 Speaker 1: kind of sounds to me like these sorts of agreements 326 00:19:55,960 --> 00:19:59,280 Speaker 1: would allow for a toxic work environment to flourish, and 327 00:19:59,359 --> 00:20:01,640 Speaker 1: that there would be no real way you could address 328 00:20:01,680 --> 00:20:04,800 Speaker 1: it because all the disputes have to be handled internally, 329 00:20:05,440 --> 00:20:09,680 Speaker 1: and if the system is rotten, then you'll never come 330 00:20:09,680 --> 00:20:12,840 Speaker 1: out on top because the game is stacked against you 331 00:20:12,920 --> 00:20:15,760 Speaker 1: from the beginning. You would not be the only person 332 00:20:15,840 --> 00:20:19,800 Speaker 1: to think that way. In fact, one of 333 00:20:19,840 --> 00:20:23,960 Speaker 1: the investors in Tesla, an organization called Nia Impact 334 00:20:24,080 --> 00:20:28,280 Speaker 1: Capital, has sent the board of directors a 335 00:20:28,359 --> 00:20:32,520 Speaker 1: request to review this arbitration policy, raising concerns that the 336 00:20:32,560 --> 00:20:36,560 Speaker 1: policy enables harassment. Now, I would go even further than that. 337 00:20:36,640 --> 00:20:40,800 Speaker 1: I would say it encourages harassment. Also, I mean, is 338 00:20:40,800 --> 00:20:42,760 Speaker 1: it just me or do you sense that there's like 339 00:20:42,800 --> 00:20:46,320 Speaker 1: a growing labor movement in the United States? You know, 340 00:20:46,440 --> 00:20:49,359 Speaker 1: I think we're still in the very early stages and 341 00:20:49,440 --> 00:20:52,199 Speaker 1: it could just fizzle out.
But I also feel like 342 00:20:52,280 --> 00:20:55,280 Speaker 1: younger generations in particular are getting a bit more het 343 00:20:55,359 --> 00:20:58,679 Speaker 1: up about the status quo, and they're demanding things change 344 00:20:58,840 --> 00:21:03,040 Speaker 1: or they won't play ball. And hey, young folks, I 345 00:21:03,080 --> 00:21:05,960 Speaker 1: am with you on this one. You know, I might 346 00:21:05,960 --> 00:21:09,840 Speaker 1: not understand your Squid Games and your TikToks, but I'm 347 00:21:09,880 --> 00:21:14,080 Speaker 1: all for overhauling the labor system. Also, here's hoping Tesla 348 00:21:14,240 --> 00:21:20,040 Speaker 1: cracks down on those internal problems and really does try 349 00:21:20,080 --> 00:21:22,600 Speaker 1: to change that toxic work environment. It would be really 350 00:21:22,680 --> 00:21:28,520 Speaker 1: nice to hear about companies actually making substantial, transformational changes 351 00:21:28,560 --> 00:21:32,520 Speaker 1: in workplaces and creating places where employees are all 352 00:21:32,560 --> 00:21:36,359 Speaker 1: treated with respect and dignity, which, I know, that's crazy talk, 353 00:21:36,800 --> 00:21:39,480 Speaker 1: but it would be a nice change of pace from 354 00:21:39,560 --> 00:21:43,720 Speaker 1: the stories we constantly get about companies that have terrible 355 00:21:43,800 --> 00:21:49,240 Speaker 1: work environments and really demoralized employees, and it just sounds 356 00:21:49,240 --> 00:21:53,320 Speaker 1: like the worst. So I am hoping that Tesla makes 357 00:21:53,320 --> 00:21:57,040 Speaker 1: those changes. Also, I hope that we start to see 358 00:21:57,080 --> 00:22:01,400 Speaker 1: more resistance to this trend of arbitration agreements being put 359 00:22:01,440 --> 00:22:04,119 Speaker 1: in place. It's just a way for companies to protect 360 00:22:04,200 --> 00:22:08,919 Speaker 1: themselves at the expense of their employees.
And I don't 361 00:22:09,040 --> 00:22:11,800 Speaker 1: think that's right. When we're talking about protection, I 362 00:22:11,840 --> 00:22:13,960 Speaker 1: don't think it's the companies that need to be protected. 363 00:22:14,400 --> 00:22:17,680 Speaker 1: I think it's the individual employees who need the protection. Again, 364 00:22:18,560 --> 00:22:21,840 Speaker 1: that's my own opinion. Sticking with cars, since we just 365 00:22:21,880 --> 00:22:27,480 Speaker 1: talked about Tesla, let's chat a bit about California, because 366 00:22:27,640 --> 00:22:30,760 Speaker 1: it's that state's Department of Motor Vehicles that just granted 367 00:22:30,800 --> 00:22:34,520 Speaker 1: permits to a pair of self-driving car companies, that 368 00:22:34,600 --> 00:22:38,200 Speaker 1: being Waymo and Cruise. They both have been given 369 00:22:38,240 --> 00:22:42,280 Speaker 1: permits to operate light-duty self-driving taxi cabs in 370 00:22:42,359 --> 00:22:47,439 Speaker 1: San Francisco and, in Waymo's case, beyond. But these 371 00:22:47,480 --> 00:22:50,879 Speaker 1: permits do come with some restrictions; they're not open ended. 372 00:22:51,160 --> 00:22:55,320 Speaker 1: So in Cruise's case, that company's cabs will only be 373 00:22:55,400 --> 00:22:58,720 Speaker 1: able to operate between ten pm and six am, so 374 00:22:58,880 --> 00:23:01,840 Speaker 1: they can only operate overnight. They are allowed a 375 00:23:01,880 --> 00:23:05,520 Speaker 1: maximum speed of thirty miles per hour, and they can't 376 00:23:05,520 --> 00:23:10,160 Speaker 1: operate during particularly foggy or rainy conditions. I actually find 377 00:23:10,200 --> 00:23:12,800 Speaker 1: that ten pm to six am really odd. I mean, 378 00:23:12,800 --> 00:23:16,800 Speaker 1: maybe it's to avoid traffic.
I was trying to 379 00:23:16,880 --> 00:23:20,800 Speaker 1: find more confirmation on this and make sure that it 380 00:23:20,880 --> 00:23:24,439 Speaker 1: wasn't just a transcription error, and that they 381 00:23:24,440 --> 00:23:27,000 Speaker 1: didn't mean between six am and ten pm, but rather 382 00:23:27,080 --> 00:23:30,480 Speaker 1: ten pm and six am. I don't know, so 383 00:23:30,520 --> 00:23:34,120 Speaker 1: I'm just going with what the press release said. So, 384 00:23:34,320 --> 00:23:38,560 Speaker 1: since we're talking San Francisco, and this does limit when 385 00:23:38,640 --> 00:23:42,040 Speaker 1: the vehicles can operate, and they can't operate in bad weather, 386 00:23:42,680 --> 00:23:45,040 Speaker 1: it really does restrict when they would be able to 387 00:23:45,800 --> 00:23:51,520 Speaker 1: autonomously drive the streets of San Francisco. Waymo, which 388 00:23:51,720 --> 00:23:54,400 Speaker 1: is part of Alphabet, that's the company that's also parent 389 00:23:54,440 --> 00:23:57,159 Speaker 1: to Google, gets a bit more leeway. The Waymo 390 00:23:57,400 --> 00:24:00,600 Speaker 1: cars will be authorized to travel up to sixty five 391 00:24:00,640 --> 00:24:02,760 Speaker 1: miles per hour, so they could go onto highways. 392 00:24:03,119 --> 00:24:06,240 Speaker 1: They can go beyond just San Francisco, but not the 393 00:24:06,400 --> 00:24:09,160 Speaker 1: entire state. They don't appear to have a limitation 394 00:24:09,200 --> 00:24:11,680 Speaker 1: on when they are allowed to operate, so we don't 395 00:24:11,720 --> 00:24:14,240 Speaker 1: have, like, restricted hours for Waymo, as far as I 396 00:24:14,280 --> 00:24:17,440 Speaker 1: can suss out. They also are not allowed to drive 397 00:24:17,520 --> 00:24:19,800 Speaker 1: if the weather is too nasty, so that restriction does 398 00:24:19,840 --> 00:24:23,800 Speaker 1: still apply.
But does this actually mean that robo taxis 399 00:24:23,880 --> 00:24:27,679 Speaker 1: are right now prowling the streets of San Francisco? Not 400 00:24:27,840 --> 00:24:30,160 Speaker 1: quite yet, because the companies will still need to get 401 00:24:30,160 --> 00:24:34,800 Speaker 1: approval from California's Public Utilities Commission, and that commission might 402 00:24:34,840 --> 00:24:38,520 Speaker 1: have more restrictions or more requirements. So we'll have 403 00:24:38,560 --> 00:24:42,080 Speaker 1: to just keep an eye out for it. Last week, 404 00:24:42,400 --> 00:24:47,120 Speaker 1: the website Lioness posted an essay written primarily by Alexandra Abrams, 405 00:24:47,119 --> 00:24:50,120 Speaker 1: who used to be the head of Blue Origin's employee communications. 406 00:24:50,800 --> 00:24:53,680 Speaker 1: Blue Origin, in case you don't recall, is Amazon founder 407 00:24:53,800 --> 00:24:58,520 Speaker 1: Jeff Bezos's space industry company. Anyway, Abrams and around twenty 408 00:24:58,560 --> 00:25:02,040 Speaker 1: other folks who either work or have worked for Blue Origin 409 00:25:02,480 --> 00:25:05,640 Speaker 1: detailed that the company has a truly toxic work culture. 410 00:25:05,680 --> 00:25:08,760 Speaker 1: Seems to be a common theme. The essay alleges that 411 00:25:08,880 --> 00:25:13,119 Speaker 1: numerous male senior leaders have acted inappropriately with women, including 412 00:25:13,160 --> 00:25:17,000 Speaker 1: one whom CEO Bob Smith would then appoint as part 413 00:25:17,040 --> 00:25:19,760 Speaker 1: of a hiring committee for a senior HR position.
Now, 414 00:25:19,760 --> 00:25:22,239 Speaker 1: in case that didn't quite register with you, I'll put 415 00:25:22,280 --> 00:25:25,479 Speaker 1: it another way: a guy who had numerous sexual 416 00:25:25,480 --> 00:25:28,560 Speaker 1: harassment claims against him was then put on a committee 417 00:25:28,800 --> 00:25:32,639 Speaker 1: in charge of finding a human resources executive. That's like 418 00:25:32,760 --> 00:25:36,320 Speaker 1: giving an inmate in solitary confinement the authority to choose 419 00:25:36,320 --> 00:25:39,600 Speaker 1: who the warden is for the prison. The essay lists 420 00:25:39,680 --> 00:25:42,480 Speaker 1: several incidents and examples and makes the case that Blue 421 00:25:42,520 --> 00:25:47,399 Speaker 1: Origin's work environment is harmful: it's harmful to productivity, 422 00:25:47,400 --> 00:25:50,399 Speaker 1: it's harmful to mental health. And Abrams also said that 423 00:25:50,440 --> 00:25:53,320 Speaker 1: many of the authors of the essay would never go 424 00:25:53,520 --> 00:25:56,600 Speaker 1: up on a Blue Origin rocket out of concerns for safety, 425 00:25:56,640 --> 00:25:59,840 Speaker 1: because they said that the work conditions were so intense 426 00:26:00,320 --> 00:26:04,240 Speaker 1: and so rushed that they feel that safety has been 427 00:26:05,040 --> 00:26:08,840 Speaker 1: neglected too much. Meanwhile, Blue Origin has announced that 428 00:26:08,840 --> 00:26:11,080 Speaker 1: William Shatner is scheduled to go up on the next trip, 429 00:26:11,560 --> 00:26:15,760 Speaker 1: so they're going with all systems go, I guess. Today, 430 00:26:15,960 --> 00:26:19,439 Speaker 1: representatives from Microsoft, Google, and Amazon will be meeting with 431 00:26:19,480 --> 00:26:22,560 Speaker 1: the US government's Office of Science and Technology Policy at 432 00:26:22,600 --> 00:26:26,119 Speaker 1: the White House to talk about quantum computing.
Now, for 433 00:26:26,160 --> 00:26:29,040 Speaker 1: those who are not familiar with that term, quantum computing 434 00:26:29,160 --> 00:26:33,120 Speaker 1: involves the strange world of quantum mechanics. So your classic 435 00:26:33,119 --> 00:26:36,680 Speaker 1: computers operate by processing information in the form of binary 436 00:26:36,760 --> 00:26:39,919 Speaker 1: digits or bits, and a bit can be a zero 437 00:26:40,080 --> 00:26:42,560 Speaker 1: or a one. It's kind of like a light switch. 438 00:26:42,840 --> 00:26:47,640 Speaker 1: It's either off or it's on. But quantum computers use 439 00:26:47,720 --> 00:26:51,280 Speaker 1: what we call qubits or quantum bits. Now, these can 440 00:26:51,320 --> 00:26:55,040 Speaker 1: be a zero, a one, anything in between, and all 441 00:26:55,080 --> 00:26:57,800 Speaker 1: of them at the same time. Quantum computing can involve 442 00:26:57,880 --> 00:27:00,679 Speaker 1: strange stuff like superposition. That's what we were just 443 00:27:00,720 --> 00:27:04,320 Speaker 1: talking about. Superposition is where you have a quantum element 444 00:27:04,400 --> 00:27:08,919 Speaker 1: that can occupy all possible states simultaneously. It would be 445 00:27:09,000 --> 00:27:11,040 Speaker 1: like if you could be awake and asleep at 446 00:27:11,040 --> 00:27:13,240 Speaker 1: the same time. You know, those are two different states. 447 00:27:13,920 --> 00:27:16,840 Speaker 1: That's kind of what superposition is, but it only applies 448 00:27:17,280 --> 00:27:20,359 Speaker 1: to the quantum world, alright. So anyone who's trying to 449 00:27:20,400 --> 00:27:24,439 Speaker 1: sell you anything about superposition that could be observed in 450 00:27:24,480 --> 00:27:27,399 Speaker 1: our classical universe, they're full of it. That is not 451 00:27:27,520 --> 00:27:32,360 Speaker 1: the way the world works. Quantum computers can also involve entanglement.
Now, 452 00:27:32,400 --> 00:27:35,520 Speaker 1: this is where you have the state of one quantum 453 00:27:35,560 --> 00:27:39,040 Speaker 1: element tied to the state of another, no matter how 454 00:27:39,359 --> 00:27:43,040 Speaker 1: far apart those two quantum elements are, and by observing 455 00:27:43,160 --> 00:27:45,640 Speaker 1: one of those two you can tell what the state 456 00:27:45,720 --> 00:27:48,600 Speaker 1: of the other one was at that exact moment. So 457 00:27:49,119 --> 00:27:51,760 Speaker 1: let's talk about electron spin. This is a pretty common one. 458 00:27:52,080 --> 00:27:54,639 Speaker 1: Let's say that you've got two entangled electrons. One of 459 00:27:54,680 --> 00:27:56,879 Speaker 1: them is spinning up and the other one is spinning down. 460 00:27:57,520 --> 00:28:01,720 Speaker 1: And let's say that they're a universe apart. It doesn't 461 00:28:01,720 --> 00:28:04,920 Speaker 1: matter as long as they're tied together. So while one is 462 00:28:04,960 --> 00:28:07,439 Speaker 1: spinning up, the other one is spinning down. You observe 463 00:28:07,640 --> 00:28:09,320 Speaker 1: one of them, you see that it's spinning up, and you know 464 00:28:09,400 --> 00:28:14,239 Speaker 1: the other one was spinning down. However, the crazy thing 465 00:28:14,240 --> 00:28:16,840 Speaker 1: about the quantum world is that when you start to 466 00:28:16,920 --> 00:28:21,760 Speaker 1: observe things, you actually change their behavior, and things like 467 00:28:21,880 --> 00:28:24,920 Speaker 1: entanglement break down. So while you might be able to say 468 00:28:25,080 --> 00:28:28,080 Speaker 1: at that specific moment that other electron was spinning down, 469 00:28:28,320 --> 00:28:31,040 Speaker 1: you don't know what it's doing now, because the entanglement 470 00:28:31,080 --> 00:28:35,919 Speaker 1: has been severed.
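One way to see that correlation is to simulate a two-qubit state vector classically. This is only a pencil-and-paper sketch in NumPy, not a real quantum computer, and it assumes the pair is prepared in the anti-correlated Bell state the discussion describes (one up, one down):

```python
import numpy as np

# Two entangled qubits in the Bell state (|01> + |10>) / sqrt(2):
# whichever one you look at, the other is guaranteed to be opposite.
state = np.zeros(4)
state[0b01] = 1 / np.sqrt(2)   # qubit A down, qubit B up
state[0b10] = 1 / np.sqrt(2)   # qubit A up, qubit B down

probs = np.abs(state) ** 2     # Born rule: measurement probabilities
rng = np.random.default_rng(seed=42)

# Measure the pair many times; the two results always disagree,
# so seeing one qubit tells you what the other one was.
outcomes = rng.choice(4, size=1000, p=probs)
for o in outcomes:
    qubit_a, qubit_b = (o >> 1) & 1, o & 1
    assert qubit_a != qubit_b  # observing one tells you the other

print(sorted({int(o) for o in outcomes}))  # only |01> and |10> ever occur
```

The simulation also hints at why this doesn't break physics: either outcome is equally likely, so you can't use the correlation alone to send a message.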
Now quantum computers can solve a subset 471 00:28:36,119 --> 00:28:41,680 Speaker 1: of computational problems with insane speed, assuming that the quantum 472 00:28:41,680 --> 00:28:44,080 Speaker 1: computer has enough qubits to do it, and that you 473 00:28:44,120 --> 00:28:48,840 Speaker 1: have developed an algorithm that can tackle that sort of problem. 474 00:28:48,880 --> 00:28:52,240 Speaker 1: And these are computational problems that would take classic computers 475 00:28:52,360 --> 00:28:56,239 Speaker 1: hundreds or thousands or millions of years to complete, and 476 00:28:56,280 --> 00:28:59,960 Speaker 1: you can do it relatively quickly with a quantum computer. Now, 477 00:29:00,160 --> 00:29:03,640 Speaker 1: it doesn't apply for all types of computational problems. So, 478 00:29:03,680 --> 00:29:05,360 Speaker 1: in other words, you wouldn't want to use a quantum 479 00:29:05,400 --> 00:29:08,480 Speaker 1: computer to play Call of Duty, because it wouldn't work 480 00:29:08,640 --> 00:29:12,120 Speaker 1: nearly as well as a classic computer would. Anyway, the 481 00:29:12,280 --> 00:29:17,000 Speaker 1: experts from Microsoft and Amazon and Google are meeting today 482 00:29:17,080 --> 00:29:21,120 Speaker 1: to talk about possible future applications of quantum computing, some 483 00:29:21,200 --> 00:29:24,680 Speaker 1: of which relate to security. Encryption is one of those 484 00:29:24,720 --> 00:29:28,200 Speaker 1: things that quantum computers could really disrupt, because with a 485 00:29:28,240 --> 00:29:31,640 Speaker 1: sufficiently powerful quantum computer and the right algorithm, you could 486 00:29:31,640 --> 00:29:35,600 Speaker 1: potentially break even the toughest encryption schemes in very 487 00:29:35,680 --> 00:29:39,760 Speaker 1: little time. It's amazing and also kind of scary.
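To put rough numbers on that "millions of years versus relatively quickly" claim: for brute-force key search, Grover's algorithm, a known quantum search algorithm, needs roughly the square root of the steps a classical search does. The guess rate below is an arbitrary assumption just to make the units concrete, and note that attacks on public-key encryption (Shor's algorithm) work differently and are faster still:

```python
# Back-of-the-envelope: brute-forcing a 128-bit key.
# A classical search tries about half the keyspace on average;
# Grover's quantum search needs roughly sqrt(keyspace) steps.
SECONDS_PER_YEAR = 60 * 60 * 24 * 365
guesses_per_second = 10**9          # assumed rate, illustration only

classical_steps = 2**128 // 2       # expected classical work
grover_steps = 2**64                # sqrt(2**128)

classical_years = classical_steps / guesses_per_second / SECONDS_PER_YEAR
grover_years = grover_steps / guesses_per_second / SECONDS_PER_YEAR

print(f"classical: {classical_years:.1e} years")  # on the order of 10**21 years
print(f"grover:    {grover_years:.0f} years")     # a few centuries
```

Even the "few centuries" figure is why cryptographers are already designing post-quantum encryption schemes rather than waiting.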
488 00:29:40,280 --> 00:29:42,280 Speaker 1: We have a couple more stories to get to. But 489 00:29:42,320 --> 00:29:52,400 Speaker 1: before we do that, let's take another quick break. So, 490 00:29:52,480 --> 00:29:56,800 Speaker 1: in the continuing battle against robocalls, the United States 491 00:29:56,880 --> 00:30:02,320 Speaker 1: Federal Communications Commission or FCC is considering new restrictions on 492 00:30:02,600 --> 00:30:05,800 Speaker 1: domestic gateway providers, and they would have to follow these 493 00:30:05,840 --> 00:30:09,920 Speaker 1: new rules to cut back on robocalls that originate outside 494 00:30:09,920 --> 00:30:12,360 Speaker 1: the United States. All right, this gets to how do 495 00:30:12,400 --> 00:30:16,280 Speaker 1: you deal with a problem that doesn't necessarily originate inside 496 00:30:16,280 --> 00:30:19,320 Speaker 1: your own country, right? We see this with the Internet, 497 00:30:19,400 --> 00:30:22,280 Speaker 1: and we see it with the telephone system, because the 498 00:30:22,320 --> 00:30:26,480 Speaker 1: FCC doesn't have jurisdiction outside the United States, right? 499 00:30:26,560 --> 00:30:28,959 Speaker 1: Like, the FCC can't go 500 00:30:29,080 --> 00:30:31,960 Speaker 1: to a German telephone company and say, hey, knock it off, 501 00:30:32,560 --> 00:30:37,720 Speaker 1: because the FCC has no authority over telecommunications companies that 502 00:30:37,760 --> 00:30:41,320 Speaker 1: are outside the United States, and yet illegal robocalls from 503 00:30:41,320 --> 00:30:45,240 Speaker 1: other nations are a real problem. So the FCC's proposal 504 00:30:46,040 --> 00:30:49,719 Speaker 1: is to require gateway phone companies.
These are US-based 505 00:30:49,760 --> 00:30:54,240 Speaker 1: companies that accept and facilitate calls that are coming from 506 00:30:54,240 --> 00:30:58,520 Speaker 1: outside the country and route them to the correct 507 00:30:58,600 --> 00:31:02,360 Speaker 1: phone line inside the country. They will be required 508 00:31:02,400 --> 00:31:04,960 Speaker 1: to verify the accuracy of caller ID and to 509 00:31:05,040 --> 00:31:09,120 Speaker 1: authenticate that a call is legitimate, or presumably they would 510 00:31:09,120 --> 00:31:13,400 Speaker 1: have to block it. The FCC has already created protocols 511 00:31:13,440 --> 00:31:16,360 Speaker 1: called STIR and SHAKEN, which are meant to crack down 512 00:31:16,360 --> 00:31:19,600 Speaker 1: on US-based robocalls. And here's hoping that these 513 00:31:19,640 --> 00:31:22,720 Speaker 1: proposals become actual rules and that we see a dramatic 514 00:31:22,800 --> 00:31:26,120 Speaker 1: decrease in robocalls as a result. I don't know 515 00:31:26,120 --> 00:31:29,480 Speaker 1: about any of you, but for me, I answer maybe 516 00:31:29,600 --> 00:31:32,080 Speaker 1: one call out of twenty that comes to my phone, 517 00:31:32,320 --> 00:31:35,000 Speaker 1: because the vast majority of calls I get are from numbers 518 00:31:35,000 --> 00:31:38,520 Speaker 1: I don't recognize. Most of them never bother to leave 519 00:31:38,560 --> 00:31:41,200 Speaker 1: a message once it goes to voicemail. The ones that do, 520 00:31:41,360 --> 00:31:46,560 Speaker 1: it's prerecorded nonsense; it's nothing important. 521 00:31:47,400 --> 00:31:50,640 Speaker 1: It's like, you need to renew this warranty, which I 522 00:31:50,680 --> 00:31:55,120 Speaker 1: don't have, often on something that I don't own, 523 00:31:55,920 --> 00:31:58,280 Speaker 1: so it's not something that I really want.
And 524 00:31:58,320 --> 00:32:00,400 Speaker 1: it would be really nice if, you know, when my 525 00:32:00,440 --> 00:32:02,880 Speaker 1: phone rings, I could be reasonably sure that it's from 526 00:32:02,920 --> 00:32:05,960 Speaker 1: someone I know and possibly someone I want to talk to, 527 00:32:06,160 --> 00:32:10,080 Speaker 1: or maybe, like, I know, oh, that call is coming 528 00:32:10,200 --> 00:32:12,880 Speaker 1: from, you know, the power company, and they're letting me 529 00:32:12,960 --> 00:32:16,920 Speaker 1: know about maybe a planned outage in order to do 530 00:32:17,040 --> 00:32:19,760 Speaker 1: some work in the neighborhood. Those are the kind of 531 00:32:19,800 --> 00:32:21,600 Speaker 1: calls where, yeah, I'd kind of like to be able 532 00:32:21,640 --> 00:32:24,400 Speaker 1: to get them, but I don't get anything, because I 533 00:32:24,440 --> 00:32:26,320 Speaker 1: just let them all go to voicemail, because there are 534 00:32:26,320 --> 00:32:30,000 Speaker 1: too many robocalls. All right, I know you know this 535 00:32:30,080 --> 00:32:32,160 Speaker 1: is a problem. If you have a phone, you're aware 536 00:32:32,200 --> 00:32:37,800 Speaker 1: of it. I'll be quiet now. Finally, the activist group Anonymous, 537 00:32:38,120 --> 00:32:40,600 Speaker 1: which honestly I kind of lost track of for a while. 538 00:32:40,680 --> 00:32:44,160 Speaker 1: I'm not sure if they went quiet or if 539 00:32:44,200 --> 00:32:48,400 Speaker 1: I just stopped, you know, hearing about their exploits. Anyway, 540 00:32:48,560 --> 00:32:53,280 Speaker 1: some group that's associated with Anonymous has been actively defacing 541 00:32:53,320 --> 00:32:57,880 Speaker 1: a Chinese government website that promotes tourism in China.
The 542 00:32:58,000 --> 00:33:02,680 Speaker 1: group has repeatedly altered that website, adding in images, a 543 00:33:02,720 --> 00:33:05,880 Speaker 1: lot of them images of various leaders that the Chinese 544 00:33:05,960 --> 00:33:10,560 Speaker 1: government has vilified, and also adding messages that 545 00:33:10,680 --> 00:33:14,160 Speaker 1: call for people who visit the website to reject communism 546 00:33:14,160 --> 00:33:18,640 Speaker 1: and authoritarianism. I'm not sure how effective those strategies are 547 00:33:18,720 --> 00:33:21,520 Speaker 1: when you're targeting a tourism page, but it does show 548 00:33:21,560 --> 00:33:24,880 Speaker 1: how Anonymous will use a combination of subversion and, you know, 549 00:33:24,960 --> 00:33:28,160 Speaker 1: kind of juvenile memes to poke at a target. The 550 00:33:28,200 --> 00:33:31,480 Speaker 1: group says it discovered that the server hosting the site 551 00:33:31,520 --> 00:33:35,400 Speaker 1: was using default password credentials. That speaks really poorly of 552 00:33:35,440 --> 00:33:38,440 Speaker 1: that website's approach to security. As a rule, you should 553 00:33:38,440 --> 00:33:42,840 Speaker 1: always change the default password settings on all your network devices; 554 00:33:42,920 --> 00:33:46,200 Speaker 1: otherwise you run the risk of someone using the default 555 00:33:46,320 --> 00:33:49,360 Speaker 1: password credentials to access your stuff and mess with it. 556 00:33:49,560 --> 00:33:52,240 Speaker 1: I don't have to guess your password if you've never 557 00:33:52,320 --> 00:33:55,320 Speaker 1: changed it.
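That advice is easy to automate. Below is a minimal audit sketch; the credential list and the device records are made up for illustration, since a real audit would pull default logins from vendor documentation:

```python
# Flag devices that still accept well-known factory-default logins.
# These pairs are illustrative; real audits use vendor default lists.
DEFAULT_CREDS = {
    ("admin", "password"),
    ("admin", "admin"),
    ("root", "root"),
}

def uses_default_creds(username: str, password: str) -> bool:
    """True if this username/password pair is a known factory default."""
    return (username, password) in DEFAULT_CREDS

# Hypothetical inventory of devices and their current credentials.
devices = [
    {"name": "office-router", "user": "admin", "pw": "password"},
    {"name": "file-server", "user": "deploy", "pw": "l0ng-unique-secret"},
]

flagged = [d["name"] for d in devices
           if uses_default_creds(d["user"], d["pw"])]
print(flagged)  # ['office-router'] still needs its password changed
```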
If I try to log into your router 558 00:33:55,560 --> 00:33:58,760 Speaker 1: and I use admin and password as the credentials and 559 00:33:58,800 --> 00:34:02,320 Speaker 1: I get in, that's really on you, because you didn't 560 00:34:02,360 --> 00:34:05,240 Speaker 1: take the steps to, I mean, it's on me too; 561 00:34:05,320 --> 00:34:08,000 Speaker 1: I'm the jerk who's trying to access your system, but 562 00:34:08,040 --> 00:34:10,480 Speaker 1: it's also on you for not taking the steps 563 00:34:10,520 --> 00:34:14,319 Speaker 1: to change those defaults. So changing defaults is really good. 564 00:34:14,920 --> 00:34:19,160 Speaker 1: Anonymous, in case you're not familiar with them, is a 565 00:34:19,360 --> 00:34:22,960 Speaker 1: very loose organization; actually, using the word organization is even 566 00:34:23,000 --> 00:34:29,319 Speaker 1: a little grandiose. It's a collective of hackers and activists 567 00:34:29,400 --> 00:34:35,279 Speaker 1: who will target various companies and individuals for different reasons. 568 00:34:35,320 --> 00:34:41,759 Speaker 1: Sometimes it's to fight against what are seen as authoritarian regimes. 569 00:34:41,800 --> 00:34:45,560 Speaker 1: Sometimes it's to fight against companies that are seen to 570 00:34:45,640 --> 00:34:49,640 Speaker 1: be hypocritical. It really just depends, and there's no 571 00:34:49,920 --> 00:34:53,799 Speaker 1: centralized leadership, really. You can have people who sort of 572 00:34:53,800 --> 00:34:57,160 Speaker 1: take the lead on specific initiatives, but it's a very 573 00:34:57,320 --> 00:35:02,160 Speaker 1: loose collective, which works to both the group's benefit 574 00:35:02,200 --> 00:35:05,120 Speaker 1: and detriment. It all depends on what anyone's trying to do. 575 00:35:05,880 --> 00:35:09,239 Speaker 1: All right, that's it for the tech news for Tuesday, 576 00:35:09,360 --> 00:35:12,400 Speaker 1: October five, twenty twenty one.
We'll be back later in the week with 577 00:35:12,480 --> 00:35:16,040 Speaker 1: more tech news and more episodes of tech Stuff. If 578 00:35:16,120 --> 00:35:18,880 Speaker 1: you have any suggestions for topics I should cover on 579 00:35:18,960 --> 00:35:21,240 Speaker 1: the show, reach out to me on Twitter. The handle 580 00:35:21,360 --> 00:35:24,120 Speaker 1: is tech stuff h s w, and I'll talk to 581 00:35:24,120 --> 00:35:32,400 Speaker 1: you again really soon. Tech Stuff is an I 582 00:35:32,520 --> 00:35:36,000 Speaker 1: Heart Radio production. For more podcasts from I Heart Radio, 583 00:35:36,360 --> 00:35:39,520 Speaker 1: visit the I Heart Radio app, Apple Podcasts, or wherever 584 00:35:39,600 --> 00:35:41,120 Speaker 1: you listen to your favorite shows.