Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? It's time for the tech news for December nineteenth, twenty twenty three. And first up, Elon Musk has had a pretty rocky twenty twenty three in general, I think it's safe to say, and these last few months have been particularly challenging. I will add that, in my own opinion, I feel like this is a rocky road that Elon Musk himself has paved. Anyway, the latest of those challenges is that the European Commission announced an investigation into X, the social platform formerly known as Twitter. This has to do with the new Digital Services Act in the EU, which came into effect this past August. This law sets out rules regarding stuff like disinformation, transparent advertising, and illegal content on the Internet.
So the goal was really to bring the member states of the EU, all of which have their own local sets of laws on these matters, closer into alignment, to have a sort of universal set of rules for the whole of the EU. For my fellow American listeners, think of it as the difference between, say, a federal law regarding a specific matter and a state law similar to that. Anyway, the commissioners say that it appears X has not been compliant with this new law, and that the platform has failed to do any significant work to prevent or respond to illegal content and disinformation posted to it. In particular, the focus is on information in the wake of the conflict between Israel and Hamas. On top of that, the commission argues that X has made use of a quote, suspected deceptive design of user interface, end quote. And you might think, well, what the heck do they mean by that? Well, Lisa O'Carroll of The Guardian has an explanation. It's due to how X markets and operates the blue check mark system. Now, I'll remind you that back in the Twitter days, a blue check mark was a verification mark, right?
It was meant to verify that an account belonged to the person it claimed to represent. That way, if you saw a celebrity's account and there was a blue check mark next to it, you could be assured that the celebrity, or more likely the celebrity's PR team, owned that account. But now a blue check mark merely indicates someone who has paid to have the blue check mark, which comes with a few little actual benefits in addition to having the blue check mark appear next to your name. The investigation is in its earliest stages and there is a lot to cover, like whether the blue check mark actually misleads people into thinking that an account that has one is somehow more reliable than one that doesn't. So there's a lot more that has to go on. It's just a preliminary investigation at this point, and I am sure this comes as unwelcome news to mister Musk. If the EU determines that X has violated the Digital Services Act, the consequences can be pretty severe. Punishments can include a fine of up to six percent of the company's annual turnover.
That's a UK term that essentially means gross revenue, so it's how much money you made before you deduct expenses. For X, that would be pretty awful, as the company has already had massive drops in revenue as it is. The EU could even potentially block X from operating within its borders. So this is a big deal, and we'll have to see how this story unfolds. If the recent past is anything to go by, it will probably not go well for X, unless Elon Musk decides to remove himself from that situation entirely, which I don't see happening, but I think that's really the only way out for the platform. Next up, Comcast has now confirmed it has been the target of a data breach, potentially affecting nearly thirty six million customers. TechCrunch's Carly Page has a piece on this. It's titled Comcast says hackers stole data of close to 36 million Xfinity customers. So the issue appears to be related to a weakness in enterprise hardware, namely some networking devices made by a company called Citrix.
So Comcast uses Citrix devices in its own network, and hackers discovered a vulnerability in certain Citrix gear last summer. Citrix responded by creating and then making available some security patches, but not everyone was right on top of the ball there and implemented the security patches fast enough, including, apparently, Comcast. That gave hackers the chance to exploit the vulnerability in Comcast's systems this past October and access customer data, and we're just hearing about it now. Like, Comcast found out about a month later, and now we're hearing about it. So what kind of data was accessed? Comcast hasn't fully explained that yet. Apparently they're still investigating it, but they said that, at least for some customers, it includes a lot: their name, their username, a hashed version of their password, in other words a scrambled, one-way version, with no telling how strong the hashing is.
The customer's address, their other contact information like email, phone number, that sort of stuff, their date of birth, the last four digits of their Social Security number, and any security questions and answers they may have provided, you know, the what-was-the-street-you-grew-up-on kind of questions. Comcast has been reaching out to customers and requiring them to do a password reset, while also recommending that they enable two factor authentication. This is where I recommend to all of y'all that you use multi factor authentication whenever it is available. Multi factor authentication does not guarantee that you're absolutely safe from attacks, but it does raise the bar enough that, I would say, most attackers won't bother putting forth the effort to get past it. Now, I did say most, not all. There is no such thing as a perfectly safe system. But you can at least improve your security and thus eliminate, like, ninety eight percent of the potential attacks you could face, and I think that is a worthwhile effort.
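As an aside on what a "hashed version of their password" actually means: a hash is a one-way scramble, so a leaked hash is not the password itself, though a weakly hashed one can still be cracked offline. Here's a minimal sketch of the idea using Python's standard library. The function names, salt size, and iteration count are illustrative choices for this example, not anything Comcast has described about its own systems.

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None, iterations=200_000):
    """Derive a one-way hash of a password using PBKDF2-HMAC-SHA256 with a random salt."""
    salt = salt or os.urandom(16)  # a fresh random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password, salt, expected, iterations=200_000):
    """Re-derive the hash from a login attempt and compare it to the stored digest."""
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(digest, expected)  # constant-time comparison

salt, stored = hash_password("hunter2")
print(verify_password("hunter2", salt, stored))  # prints True
print(verify_password("hunter3", salt, stored))  # prints False
```

The point of the salt and the high iteration count is to make bulk cracking of a stolen hash database expensive, which is exactly why "how strong the hashing is" matters in a breach like this one.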
If you're in the United States and you are a last minute shopper type, and the holidays are right here upon us, maybe you were planning on getting someone an Apple Watch, specifically a Series nine or an Ultra two Apple Watch. You better hop to it, because once three PM rolls around on the twenty first, which is this Thursday, those watches will be off store shelves in the US. And that's because Apple is facing a ban by the International Trade Commission, or ITC, all because of a little sensor that's present in those two models of the Apple Watch. So a company called Masimo, which makes medical devices, is in a patent dispute with Apple over a sensor called an SpO2 sensor. It's designed to measure blood oxygen levels. So Masimo, I don't know how you say the name of the company, essentially they're saying Apple infringed upon a patented technology. It's an allegation that Apple vehemently denies. But in the face of this ITC ban, Apple is preemptively making the move to pull all Series nine and Ultra two models off the shelves for the time being.
Within the US, the matter is up for review. President Biden will actually have the authority to either veto the ban or allow it to continue. If he does not veto the ban, Apple can still appeal that decision in federal courts, but it would mean that the tech would be off limits for US customers, at least within the United States. I mean, you could technically travel somewhere else and buy one of these, I guess. According to Victoria Song of The Verge, Apple is also working on a software solution that, if approved by the ITC, could serve as a workaround for the issue, so there wouldn't have to be any changes to the hardware and Apple could continue to sell the watches in stores in the United States. We'll have to see what happens with this. Google has agreed to a settlement with the US states, as in all fifty of them, in a massive antitrust lawsuit.
So this lawsuit focused on Google's Play Store, the Android app store essentially, and the states were accusing Google of operating an anti-competitive platform by controlling stuff like app distribution on Android devices and forcing app developers to pay fees for in-app transactions. Right, like Google would take a cut of all in-app transactions. That's something that Apple has also faced in courts in the United States recently. So if a judge approves this settlement, because it has not been approved as of the time I'm recording this, Google will have to pay seven hundred million dollars, and the company says it will introduce greater competition in the Play Store to boot. Of that seven hundred million, seventy million would go to a fund that the US states would divvy up. The other six hundred and thirty million would go to a fund that would pay out to Google consumers. In the process of the settlement, Google admits no wrongdoing, which is often the case with court settlements. Now, this is just the latest news in an ongoing year of scrutiny on Google from various regulators.
So I think we're going to have a lot more stories about various governmental organizations applying pressure to Google next year. But you'll have to listen to my predictions episode, that'll be coming up in a little more than a week and a half, to learn more about that. Yep, it's coming back. Speaking of coming back, we'll be back after we take this quick break to thank our sponsors. We're back. Sean Lyngaas of CNN has a story about how Microsoft believes operatives from China are making use of generative AI in an effort to mislead US voters. So the images are meant to sow discord among voters and to push the divide that already exists between the left and the right to even greater extremes. Really, it's about making extremists, and y'all, we do not need outside help to do that. We're really good at it ourselves. But admittedly, if you put your thumb on the scale, it does speed things up considerably.
So this campaign appears to largely center on creating kind of iconic images through AI, and these images are meant to highlight specific issues in the United States to a degree that I would actually call absurd. I looked at some of these images and I'm thinking, this feels like parody or satire, right, it's so over the top. However, according to at least some folks over at Microsoft, the efforts are proving to be effective because they are driving engagement online, at least on certain platforms. According to Lyngaas, other companies have reported a rise in Chinese-backed misinformation campaigns as well, where they've actually taken action to take down these campaigns, and some of them at least appear to not be particularly effective, but they are happening. So it's a reminder to not accept everything you encounter online, specifically on social networking sites, at face value. Rob Stumpf of InsideEVs has an article titled Volkswagen will bring back physical buttons in new cars, and yeah, that pretty much gives it away, right?
The title says that the company has decided to walk back recent design changes in their electric vehicle line, and those design changes saw a lot of touchscreen interfaces and haptic capacitive buttons replace physical buttons that were on things like control panels and steering wheels. So Volkswagen had really kind of made this declaration that they were going to go buttonless, but it turns out that the customers, the drivers, sent a very clear message to the company saying, don't do that. Customers reported that the touchscreen interfaces and those haptic capacitive surfaces on the steering wheel were distracting and frustrating and potentially even dangerous to use. As such, the company has started to revise the design on the ID two electric vehicle, which reintroduces some, but not all, of the physical control buttons. The whole brouhaha prompted Volkswagen's CEO to say the design change to touch interfaces quote, did a lot of damage, end quote, to the brand. Now keep in mind, Volkswagen is the same brand that was behind the Dieselgate scandal years ago.
I mean, arguably that company is still trying to recover from that brouhaha, so it knows a lot about damaging itself. Yeah, and this story mostly is focused in Europe, like, that's where a lot of these vehicles have been released and the objection to them has followed. I will personally say I'm not a driver, so it's not like I have any real expertise in this area, but I do find it off-putting to be in a vehicle where everything is a touch-based interface, because it requires me, at least, to spend more time looking at the interface to figure out what I need to do in order to achieve the outcome I want. And if I were a driver, that would mean my attention would be off the road, and that I think is pretty darn dangerous. I was gonna say reckless, but ironically that seems like it would be the wrong word to use. Anyway, Volkswagen has said, you know what, our bad, we'll go back to buttons, so we'll see how those look in future models.
Speaking of cars, this week Canada is likely to announce that all new cars will need to be zero emissions vehicles by the year twenty thirty five. And by that I mean, actually, some Canadian governmental agency will issue a statement, not that Canada itself will suddenly gain sentience and begin to speak. Anyway, the point is, it seems like Canada is following in the footsteps of several other countries around the world that are also setting requirements on new vehicles to be carbon free, as far as emissions go, within a little more than a decade. This doesn't cover used vehicles, obviously. You could still have an internal combustion engine vehicle in Canada. It's not like they're being outlawed, just that any new vehicle from twenty thirty five on has to be emissions free. I find it somewhat amazing that I could well live to see gas stations become largely obsolete. It's such an integrated thing in my life. I have childhood memories of going to gas stations and stuff.
It is wild to me to think that within ten, fifteen years those could pretty much become, you know, completely converted over into, like, charging stations and stuff. It just is odd. I don't know, maybe it's just because I'm old. And in space news, NASA demonstrated the enormous power of the Deep Space Optical Communications project, or DSOC, by beaming an ultra high definition video from space to Earth using laser beams. And I know what you're thinking: a video of what? What did NASA's video contain? Well, maybe you didn't ask that question. You probably can guess. I mean, it's a digital video. Yes, it was of a cat, because some memes just won't go away. Now, this particular cat is Taters, an orange tabby, and the video shows Taters joyfully chasing a little laser beam dot around, and the transmitter is aboard a spacecraft called Psyche. That spacecraft is nearly nineteen million miles away from us, but you'll be relieved to hear that little Taters has his paws firmly on the ground. Taters did not go on a ride up to space.
What they did was they took a video of Taters and saved it to the Psyche spacecraft's onboard storage, sent the spacecraft out into space, and then it beamed the video back to us at a speed that was incredible, faster than broadband speeds here on the ground. It took less than two minutes for the message to travel those nineteen million miles. Actually, they said it took less time for the spacecraft to beam the video down to Earth than it took to transmit the video from the receiver station to headquarters, which is pretty amazing. The experiment really helps pave the way for long distance transmissions in space. That's something that's going to be a key component in missions to places like Mars or beyond. If we want to set up really good colonies on places like Mars or the Moon, using this kind of laser communication technology will be key to bringing down the delay times as much as possible and being able to communicate seamlessly, or as close to it as we can.
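That "less than two minutes" figure checks out against basic physics: nineteen million miles is roughly a hundred light-seconds, so the trip time is dominated by the speed of light itself, no matter how fast the laser link's data rate is. A quick back-of-the-envelope check in Python, using the approximate distance from the story and standard physical constants:

```python
# Light travel time over the DSOC demo distance (~19 million miles).
SPEED_OF_LIGHT_M_PER_S = 299_792_458  # metres per second (exact, by definition)
METRES_PER_MILE = 1_609.344           # metres in one statute mile

distance_m = 19_000_000 * METRES_PER_MILE
travel_time_s = distance_m / SPEED_OF_LIGHT_M_PER_S

print(f"{travel_time_s:.0f} seconds")  # prints: 102 seconds
```

So even a perfect transmitter can't beat about a hundred seconds one way at that distance, which is why cutting every other source of delay matters so much for deep-space communication.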
Okay, I've got just one article recommendation for you all today. T.J. Thomson and Daniel Angus wrote a piece for The Conversation, although I actually found it on Tech Xplore, and the piece is titled Data poisoning: How artists are sabotaging AI to take revenge on image generators. I've talked about data poisoning before on this show, but it's a good article to get a grounding in what's going on. As you probably know, a lot of artists are upset that AI companies have been using artists' works to train up generative AI without getting the artists' consent to do that. So it's sort of like if you found out someone was using your work to train your replacement without ever telling you. Like, they might be saying, hey, you're putting in great work, and they really mean it, because they mean your great work is training up the person who's going to take over your job. Anyway, as this piece explains, some artists are experimenting with ways to trip up AI so that it produces really poor output, so check it out. And that's it for this episode of Tech Stuff.
I'll have one or two other tech news episodes before we close out the year. I've also got a bunch of special episodes kind of looking back over some of the big tech stories of twenty twenty three, so check those out. There will even be special ones publishing on weekends, so you can check those out too. And yeah, thanks to all of you. Thanks to the amazing production staff who have stepped up to help me out during the holiday season, I really appreciate it, including super producer Tari. I hope you are enjoying your holiday, and to all of you out there, I wish you happiness and health, and I'll talk to you again really soon. Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.