Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Thursday, August twenty two.

Here in the United States, we are headed toward another election. This one is a midterm election. That means the president is not up for re-election, but a lot of folks in Congress and in local positions around the US are. It also means that there is a heightened concern about the spread of misinformation online. And you know, that's been an issue for years, but it really came under the spotlight in twenty sixteen, and it has stayed in the spotlight ever since, hogging center stage. Now, to that end, in twenty twenty, Facebook (the platform, and actually the company itself, because the company had not yet changed its name to Meta) took the significant step of banning new political ads from being published in the week leading up to the election. Now Meta is going to do the same thing for the US midterms this year. And if you listened to Tuesday's episode this week, you know that there was a watchdog group that called out Meta for failing to follow its own content moderation policies on Facebook in Brazil. The group had submitted several ads purposefully containing misinformation, and Facebook accepted all of those ads. Now, it's true that in English-speaking countries, Facebook's content moderation is, you know, slightly less incompetent. But rather than deal with the flood of possible attack ads and misinformation campaigns, Facebook is gonna wash its hands of the whole business, at least for the week leading up to the election itself. It remains to be seen if Facebook will copy exactly what it did in twenty twenty, because while the plan was just to have a temporary ban that lasted a week, Facebook actually kept that ban on political ads in place for a full six months.
Pretty sure that's not gonna happen this year, but maybe it will.

TikTok is taking a similar approach when it comes to political content. Now, the company has a ban that restricts influencers from posting paid political content. That's pretty much been the case from day one; the company has never accepted political ads. But it does start to get tricky when you understand that TikTok influencers aren't necessarily looking to TikTok itself as their payday. They take sponsorship deals instead. And one of these days I really need to do a full episode on how various influencers make their money, because it varies from platform to platform, and it's never as straightforward as you might imagine. Anyway, TikTok isn't banning political content outright. You can still post a video where you express your political beliefs, or where you critique a politician or policy or whatever, so long as you don't break the other rules. For instance, if your video spreads misinformation, or it's meant to incite violence, or it turns out that you were paid to put that specific viewpoint up online, all of that is right out. The video is likely going to get deleted or at least heavily restricted, and there's a chance that the account holder, the person who made the video, might find themselves without an account before too long. TikTok says it will make information available to creators so that they understand the rules. You know, it's making sure that people know what they're getting into, especially people who might be new to TikTok and are taking sponsorship deals; maybe they aren't aware of the ban on political content when it comes to sponsorships and paid content.

I don't know if the steps that Facebook and TikTok are taking are going to have a significant impact on the spread of misinformation online. I do think they are steps in the right direction, but I imagine that all this will really do is push those who are determined to spread misinformation to find new ways to do it.
Amazon is reportedly testing a short-form video feature in its app that's sort of similar to TikTok. That's how I'm seeing it reported, though I don't draw as close a connection as others do. So instead of being a platform for dances and pranks and sketches and dubious life hacks and that kind of thing, the way TikTok is, Amazon's version is meant to let users post short videos detailing, you know, a review or a demonstration of a product that's found on Amazon's store. So let's say you pop onto Amazon in order to buy yourself a brand new sewing machine, and when it arrives, maybe you shoot a quick unboxing video, or maybe you show how to set it up for its first use, particularly if the one you bought has poor instructions. I've had that happen with stuff I've bought, where, you know, I spent way more time than necessary trying to figure out how to set it up because the instructions were terrible. Or maybe you shoot a video to demonstrate how the device works as you actually finish a project on it. That's the kind of stuff that Amazon is looking into. The feature right now is called Inspire, and Amazon is testing it out, but there's no guarantee that the company will actually integrate it into the app for everyone and roll it out. It might be that tests show users aren't crazy about it or just never use it. But it is interesting to see how other big companies are testing the waters and trying to tap into the appeal of TikTok-like presentations.

TechCrunch has made available a tool designed to see if your Android device is listed in a database indicating that spyware is installed on that device. All right, I'm going to back up a little bit.
So earlier this year, TechCrunch received a large amount of data that came from a spyware group's internal servers, and that information contains a list of all the Android devices that have been infected by the TheTruthSpy spyware and its network of related apps, and there are lots of other spyware apps in that network, tons of them. These apps sit quietly on Android phones and do stuff like log activity and track the device. And obviously there are a lot of nefarious reasons someone might be using spyware. I can't think of any legit reasons for using it, but there are a lot of bad ones. And one that TechCrunch pointed out that is particularly scary and of real concern is that a stalker might attempt to infect a target's phone with spyware in order to keep tabs on that target. That is terrifying, y'all, and it's all too real. We have seen instances of this. We've seen it not just with things like apps and spyware; we've seen it with things like AirTags, right? So TechCrunch has now created a tool that makes it possible for Android device owners to check and see if their own device shows up in the list that TechCrunch received, which covers all the devices that had been infected by these various kinds of spyware up to June of twenty twenty-two. The tool is located at techcrunch dot com slash pages slash thetruthspy dash investigation. Now, if you go there and you want to check your device to make sure it's not in this list, there will be a few steps you have to take, but the page walks you through all of that. So I suggest, if you have an Android device, that you go through the trouble of doing this, particularly if you suspect that there could be spyware on your device. And if you find that your device is a match in that system, which I would say is kind of a worst-case scenario, the page also has links to guide you through the steps of removing spyware from your device.
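TechCrunch hasn't published the internals of its lookup page, but conceptually a check like this is a membership test against the leaked records. Here's a minimal sketch of how such a lookup might work, purely as an illustration: the use of IMEI numbers, the SHA-256 hashing, and every name in this snippet are my assumptions, not TechCrunch's actual implementation.

```python
import hashlib

def normalize(imei: str) -> str:
    """Strip spaces and dashes so '3568 6610...' and '35686610...' compare equal."""
    return "".join(ch for ch in imei if ch.isdigit())

def device_is_listed(imei: str, leaked_hashes: set) -> bool:
    """Hash the identifier and test membership against the leaked list.

    Comparing hashes instead of raw identifiers means the checker never
    has to hold victims' device numbers in plain text.
    """
    digest = hashlib.sha256(normalize(imei).encode()).hexdigest()
    return digest in leaked_hashes

# Made-up example data, not real leaked records:
leaked = {hashlib.sha256(b"356866101234567").hexdigest()}
print(device_is_listed("3568 6610 1234 567", leaked))  # True
```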
Now, TechCrunch does issue a warning that I want to repeat. If you think that someone is using your device to spy on you and you're in a stalker situation, you need to be very careful moving forward, because removing the spyware is likely to alert whomever put it on your device that you're onto them. So that's something to keep in mind to try and remain safe. And this is also a reminder that you should always be careful with your devices. I'm not victim blaming here, because it is not always possible to avoid getting hit by stuff like this. We have seen really, really inventive attacks where the target of the attack had little to no chance of avoiding it. So this is not blaming, just saying that taking precautions and following best practices when it comes to browsing on your device or downloading and installing apps, you know, doing your homework to make sure you're being as careful as you can be, can really improve your chances of staying safe. But again, no shade thrown on people who are affected by this stuff. It can be easy to fall victim to it, and sometimes, like I said, it's unavoidable.

Reuters reports that major financial institutions like banks are cracking down on the types of messaging and communication apps that employees are allowed to use while they conduct business. And here's the thing: I often bristle at the thought of a company, organization, or boss restricting what employees can do and how they do it, but in this instance I actually have to side with the organizations. See, these financial institutions are responsible for moving huge amounts of money around. Moreover, they are often responsible for transactions that can indicate massive, important things going on in business that might not yet be public knowledge, and that opens up opportunities for stuff like insider trading, or just getting a head start on everybody else on something.
And because of that, the financial industry traditionally keeps a very close watch on employee activities in order to remain compliant. Failing to do so could bring regulators down on the financial institution, with hefty fines to follow, should it turn out that a lack of supervision led to employees using their position and knowledge to break rules. Then enter the pandemic. Suddenly these institutions needed to pivot, just like everybody else did, so that their employees could work remotely and business would not be disrupted, and some employees began using all sorts of different services to communicate with their clients. Some of those services include features meant to ensure secure and private communications. That's not a bad thing; secure and private communications are a good thing in most cases. I'm talking about stuff like end-to-end encryption, or communications services that don't keep a log of past conversations, so there's no record. But that is a legal nightmare for financial companies. They need to be able to present these communications in the wake of an investigation. They need to be able to know what happened, who knew about it, who was making decisions. And if you have these kinds of communications channels that aren't keeping track of that stuff, you have no proof and you're not compliant. So right now, financial institutions around the world are doing two things. One, they're setting aside large amounts of money to pay incoming fines because of these types of communications. The companies aren't even saying it's not their fault; they're saying, yeah, we're setting aside money because we know these are violations. And two, they're trying to require employees to only use tools that can record and archive communications in order to meet compliance with various laws.
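To make that compliance requirement concrete, here's a minimal sketch of the archive-before-you-send pattern those approved tools have to implement. The function names and the JSON-lines log format are assumptions for illustration, not any bank's real system.

```python
import json
import time
from pathlib import Path

ARCHIVE = Path("comms_archive.jsonl")  # append-only log a regulator could audit

def send_message(sender: str, recipient: str, body: str, deliver=print):
    """Archive first, then deliver.

    If the archive write fails, the message never goes out, so no
    communication can exist without a matching record.
    """
    record = {"ts": time.time(), "from": sender, "to": recipient, "body": body}
    with ARCHIVE.open("a") as f:
        f.write(json.dumps(record) + "\n")
    deliver(f"{sender} -> {recipient}: {body}")

# Hypothetical usage:
send_message("trader@bank.example", "client@fund.example", "Order confirmed.")
```

An ephemeral or end-to-end encrypted messenger inverts this: the message goes out and no readable record remains, which is exactly what the regulators object to.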
So this is an interesting example of a scenario where the very features I usually think of as being important for personal communication are absolutely inappropriate for this particular use case. All right, we've got some more news items to go over, but before we do that, let's take a quick break.

We're back. In Saudi Arabia, the courts have sentenced a woman named Salma al-Shehab to thirty-four years in prison. So what was Salma's crime? Well, she had been retweeting activists who were calling for women's rights in Saudi Arabia and for the release of certain political prisoners within the country. Salma had been attending university in the UK in pursuit of a PhD, and upon returning to Saudi Arabia for a vacation, she was detained and arrested, then tried and convicted on charges that she was aiming to, quote, "disturb public order and destabilize the security and stability of the state," end quote. This is absolutely horrifying and unjust, full stop. On top of that, the Saudi regime owns a significant chunk of Twitter itself through what is called the Public Investment Fund. That's an interesting name for a fund that represents sovereign wealth and not, you know, public wealth. So this puts Twitter in an awkward position too, right? Because this is the platform where the supposed infraction happened, and meanwhile, the entity that has incarcerated this woman has partial ownership of that platform. Not great. I am not sure what, if anything, can be done on behalf of Salma, who clearly doesn't deserve to be imprisoned at all, let alone for thirty-four years, just for retweeting social messages on Twitter.

In India, government officials are considering rules that would require Apple to abandon its proprietary charging ports on devices like AirPods and iPhones and switch to the universal standard of USB-C. Should these officials decide that, Apple might have to make the switch in India as early as twenty twenty-four.
This follows a similar decision, which we talked about in a previous TechStuff episode, that happened in the European Union, which also seeks to simplify the market and to make it easier for consumers to swap out charging cords without worrying whether this cord matches that device or not. I suspect Apple will migrate to USB-C for those devices, at least for certain models. It's already done so for things like Mac computers. And largely I think this is going to happen because, while Apple was once primarily known as a hardware company during the Steve Jobs era, we're now in the Tim Cook era, and Apple has been repositioning itself to be more of a services company. I think Cook is less resistant to conforming to industry standards than Jobs was. Jobs was kind of like, my way or no way at all, and Tim Cook is more like, let's do whatever it takes in order to continue to be able to provide services.

Bloomberg reports that the upcoming ad-supported tier of Netflix might have a few other restrictions for subscribers. The company had announced plans to introduce a lower-cost ad-supported tier in an effort to win back some of the subscribers it has previously lost, and presumably put the company back on track for growth, but we don't yet know the subscription price for that ad-supported tier. Anyway, Bloomberg reports that Netflix might turn off a feature that current subscribers have: the ability to download content to mobile devices in order to watch that content offline, for example, when you're on an airplane. And this actually makes sense to me. Netflix, you know, is in a position where it needs to prove to advertisers that the ads on its platform will have value, and the company needs to be able to show that the ads are actually being played and watched, and you can't really do that if the content is downloaded for offline viewing.
Also, Bloomberg reports that users will not be able to skip ads, and again, that makes sense from Netflix's perspective. If users could just skip the ads, then why wouldn't everyone downgrade to the cheapest ad-supported option and then just fast-forward through all the ads? Netflix has yet to comment on Bloomberg's report, and it's possible that these sorts of decisions haven't been finalized yet.

And hey, I know what you're thinking. Yeah, I'd love to scale Mount Kilimanjaro, but then I wouldn't be able to play today's Wordle, and I need an internet connection. Well, have I got good news for you, because the mountain now has high-speed WiFi on it. The Guardian has a pretty cheeky article about this news. The article is titled "Kilimanjaro gets high-speed internet so climbers can tweet or Instagram ascent." And sure, you could do that, but honestly, I feel like Tanzania has installed high-speed internet service on Kilimanjaro not so that influencers can post videos while they look for the most picturesque background, so it seems like their lives are perfect and without problems or limitations, but rather to make certain that there's a communications network in place to aid with navigation and to help in the event of an emergency. Having that safety net there could spell the difference between a successful rescue attempt and tragedy. I do think there will be plenty of people who will say, awesome, now I can live-tweet my climbing experience. But I'm okay with that if it also means someone who got in over their head has an opportunity to make it back to safety.

In yesterday's TechStuff Tidbits episode, which ended up being nearly fifty minutes long, I talked about how VR has some barriers to entry that could serve as hurdles that companies like Meta have to get over in order to bring the metaverse to life.
One thing I didn't mention is the unveiling of the latest version of Meta CEO Mark Zuckerberg's metaverse avatar, which subsequently got a ton of news coverage and jokes about the avatar looking like a soulless creature, with other people saying that's a fitting representation for Mark Zuckerberg. I am not here to make jokes like that. Instead, I'll say I think this is an indication that Zuckerberg is trying to address some of those hurdles that I mentioned in yesterday's episode, because a big one is the cost of VR gear. And if VR is to be an important component in the metaverse (which, I should add, is not a foregone conclusion; it's just that one of the more commonly referenced manifestations of the metaverse is as a VR experience), well, that would mean that companies like Meta will want to make sure the required gear isn't so expensive that it prices most people out. The metaverse is only going to be insanely profitable for Meta if the company can get a metric buttload of users to join it. So one way to ensure that the gear won't be too expensive is to make certain that the experience itself doesn't require top-of-the-line gear to render and process. That is, make it so that lower-cost, or in other words cheap, equipment can work just fine on the platform. To do that, you have to make some pretty big compromises in graphical quality. Enter the avatar, which to me looks like an Xbox 360-era avatar, or maybe something in the style of a Nintendo Mii. It's hardly a compelling virtual representation of a real person. This presents a really interesting dilemma, I think, because I don't know about you, but when I imagine the concept of the metaverse, my imagination conjures up a virtual environment where people can have any sort of avatar they want, and these avatars tend to look really cool and really impressive.
They're high resolution and well animated, and in fact, a well-designed avatar becomes something of a status symbol within the metaverse. But that only works if the environment supports that kind of presentation, right? You need to be able to actually render that. If instead the metaverse needs to aim for the lowest common denominator in order to attract the largest number of people, so that the cheapest equipment can run the darn thing, then, paradoxically, I think a lot of folks won't want to join, because it won't look cool enough. It'll look kind of cheap, so they won't feel that temptation to join; they'll think, that's not cool, I don't want to do that. So then you're stuck. You either design something that looks amazing but in turn requires equipment so expensive that most people can't afford to participate, or you go super simple, but then no one feels tempted to join in the first place. Now, I should add, we're still a very long way out from the true realization of the metaverse, and in the meantime, I expect we're going to continue to see advancements in technology that might address these problems sufficiently that they're a non-issue when the ding-dang-darn thing is actually ready for prime time.

All right, I have one more news item after this that I want to cover, but before we get to that, let's take another break.

Let's talk about the final news item for today, because it's a fun one, I think, anyway. It comes from a story from The Verge, and it's a story that really tickled me. But it's also a story that's long since become moot, because this is a story about the deep past of computer history, back in the Windows XP days. Arguably, some places are still in the Windows XP days; I find it insane that there are people who are still running Windows XP machines.
I get that there are certain conditions that mean that's all some people have access to. But you know, that's a truly defunct operating system. Although, I mean, I gotta admit I was a big Windows XP fan back in the day. Okay, getting back to the story. So the story goes that this unnamed laptop manufacturer began to get complaints about computers crashing under very specific circumstances. Those circumstances involved someone playing the music video for Janet Jackson's hit "Rhythm Nation." In fact, sometimes it wasn't even the laptop itself that was playing it when the crash happened. It just happened to be close to another device that was playing the music video, and then the laptop from this unnamed manufacturer would crash. And I don't know, maybe you'd just say the laptop hated "Rhythm Nation," because everybody's a critic, am I right? But what the heck was actually going on? Well, according to Raymond Chen, who is a software engineer with Microsoft, the problem was that the music video contained a sound effect with a specific frequency in it, and that frequency just happened to be the resonant frequency of the laptop's hard drive. So the sound effect would introduce vibrations into the hard drive, and that would lead to a system crash. And you might wonder, well, how the heck did the laptop manufacturer deal with that problem? Because yeah, this specific music video would set off that laptop, but so would any other source of sound that contained that frequency, right? It would introduce those vibrations into the hard disk drive, and then the laptop would crash. So, according to Chen, this manufacturer essentially issued a patch, an update that would prevent the computer's speakers from playing that specific frequency at all. In other words, if you were to play "Rhythm Nation," that sound effect would lack that specific frequency.
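Chen, for the record, never disclosed the actual frequency. But the fix he describes is what audio folks call a notch filter: carve one narrow frequency band out of everything the speakers play and pass the rest through. Here's a small sketch of that idea in Python with SciPy, where the eight-kilohertz value is a made-up stand-in for the undisclosed resonant frequency.

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 44_100       # sample rate, Hz
f_bad = 8_000.0   # hypothetical resonant frequency; the real one was never disclosed
Q = 30.0          # quality factor: higher Q means a narrower notch

b, a = iirnotch(f_bad, Q, fs)  # design the notch filter

# Test signal: the "dangerous" tone mixed with an ordinary 440 Hz tone.
t = np.arange(fs) / fs
audio = np.sin(2 * np.pi * f_bad * t) + 0.5 * np.sin(2 * np.pi * 440.0 * t)
filtered = filtfilt(b, a, audio)  # zero-phase filtering, no audible delay

# The 440 Hz content survives; the resonant component is attenuated to near zero.
spectrum = np.abs(np.fft.rfft(filtered))
freqs = np.fft.rfftfreq(len(filtered), 1 / fs)
print(spectrum[np.argmin(np.abs(freqs - 440.0))])  # large: ordinary tone kept
print(spectrum[np.argmin(np.abs(freqs - f_bad))])  # tiny: notched out
```

Because the notch is so narrow, the rest of the song passes through essentially untouched, which is why users might never notice the patch at all.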
I don't know if the entire sound effect would go silent, or if just that one frequency from the sound effect would no longer be audible. You might not even be able to tell the difference, you know, on a casual listen. I don't know. But I assume that if some other machine were playing that music video, the laptop could still potentially crash, though I also think things like speaker quality would matter, because you're talking about producing that frequency with enough amplitude for it to really be an issue for the affected laptops. Anyway, I thought the story was really amusing, but it's also a great reminder that it's cool to learn about physics. Resonant frequencies are important for a lot of different stuff. Nikola Tesla was obsessed with the idea of resonant frequencies. Musical instruments are obviously dependent upon things like resonant frequencies. But these things can also lead to dangerous situations. I mean, if you are able to resonate at just the right frequency, you could potentially destroy stuff as large as suspension bridges. The idea here is that a frequency resonates with whatever the material is and causes it to vibrate. It can be kind of like being on a swing set and having someone push you at just the right moment, sending you into higher and higher arcs. Just imagine that you get to a point where things begin to break. The classic example is where you hit the resonant frequency of a crystal container and shatter it as a result. Or you might use resonant frequencies purposefully in an attempt to conduct industrial sabotage at a nuclear facility, by manipulating centrifuges' rotational speeds to try and make them destroy themselves. Cough, Stuxnet, cough. Science is both cool and kind of scary.

All right, that's it for today's TechStuff news episode. Hope you enjoyed it. If you have suggestions for things I should cover in future episodes of TechStuff, please reach out to me.
There are a couple of different ways of doing that. You can download the iHeartRadio app and navigate to the TechStuff page. There is a little microphone icon there; if you click on that, you can leave a voice message up to thirty seconds in length. Let me know if you would like me to use the voice message in an episode, and maybe we'll have you lead into one. Or, if you prefer, you can always reach out to me on Twitter. The handle for the show is TechStuffHSW, and I'll talk to you again really soon.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.