Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? We're going to cover the tech news today on Tuesday, November eight, two thousand twenty two. But before we get going, I do want to let y'all know that tomorrow we're going to share with you an episode of The Restless Ones here in the TechStuff feed. The Restless Ones is a show I host where I interview leaders in tech positions across different industries to talk about things like technical challenges and innovations and leadership strategies and things along those lines. I think you'll dig it.

All right, but onto the news, and we're gonna start with Twitter. Of course, there's a lot of Twitter news, because a lot has been going on over there. So last week was kind of a dumpster fire over at Twitter, and this week has already proven to be chaotic, and we're barely into it. Last week I talked about how the word was that Musk was going to lay off approximately half of Twitter's staff, and that did happen. And Musk tweeted out, quote, regarding Twitter's reduction in force. Unfortunately, there is no choice when the company is losing over four million dollars a day, end quote. But the move to lay off staff was so rushed that, well, there were regrets. Apparently at least some of the folks fired at Twitter were not supposed to be fired at all. They got lumped in by accident. Their accounts were discontinued, they were suspended, and they were effectively laid off without meaning to be laid off. That's pretty gosh darn awful. That speaks to a certain level of incompetence on the part of management. But on the other hand, there really was a very short timeline to announce that kind of layoff. I mean, that was such an extensive move, and you had so little time. Mistakes were bound to happen.
It's just that when those mistakes are wrapped up in people's lives and their careers, it's really hard to get past, right? Then there were leaders who found out that some of the folks that they had laid off were, whoopsie daisy, folks who were knowledgeable about specific projects and who were responsible for a lot of work, and without those folks, Twitter would be set way back on those projects. And since Musk has been setting aggressive, in fact unrealistically aggressive, deadlines to get certain features built into Twitter or overhauled at Twitter, they still need some of those now former employees. And so, in a move that I think you can only describe as embarrassing, various Twitter folks began to reach out on various platforms to try and entice recently laid off employees to come back, asking could you please return to work? Now, keep in mind, these employees had already been locked out of company systems. So if they had a company computer in their possession, they got locked out of that. They were locked out of Twitter's Slack channel, they were locked out of their employee email. So yeah, not a good look. Some affected former employees are worried that if they refuse to come back after being laid off, then they will get fired.

Now you might think, if you've been laid off, how could you be fired? Well, if you've been laid off and then the company says, wait, we didn't mean to do that, come back to work, and you refuse to come back to work, then the company can say we never really laid you off, we're firing you now. And by being fired rather than laid off, the employees could potentially lose the sixty days of severance pay that otherwise they would have received if, you know, the layoffs had stuck in the first place. Which, you might imagine, creates really nasty morale issues, and you'd be right. In fact, honestly, I can't even begin to think of how I would feel if this were happening to me.
Meanwhile, for the folks who are still at Twitter, who didn't get laid off, you have employees who are unsure of what team they fit in, or who they report to, because the company has been gutted. And reportedly some employees, a lot of employees, are left wondering about health benefits, because Twitter's open enrollment period was to begin this week, but management has been pretty quiet on that front, with no real word on what health benefits are going to be now that Twitter is a private company. It's no longer the same company that it was, you know, a couple of weeks ago. So that's really concerning too. I mean, in this country, healthcare is at a point where if you don't have insurance, you're really, really in trouble, and insurance changes all the time. So having open enrollment is important, and I don't know if that's been resolved yet. It might have been, but the news as I'm reporting it now did not have any indication of that. My hope is that Twitter sorted that out, because that's incredibly important to employees.

Twitter also rolled out the eight dollar a month Twitter Blue subscription over the weekend. This is the version of the subscription that's going to come with a new verified check mark, so you don't have to go through the old verification process. You can subscribe to Twitter Blue for eight bucks a month and get that check mark on your account. However, the company has subsequently decided to hold off on actually instituting the check mark changes until the midterm elections are over here in the United States. Those elections happened today, so we should see the rollout happen later this week. But there was concern that people could rush to get verified accounts and then use that check mark to lend credibility to misinformation campaigns. Right, if you're sending out lies about the election, but you have that little blue check mark next to your name, people might think, oh, this person's legit.
So for that reason, Twitter held back on actually implementing this change so far. On a similar note, we learned that free speech absolutist Elon Musk actually does have limits to free speech. For example, if you make an account and you claim to be Elon Musk, and you're not clearly stating in your bio that it's a parody account, Elon will ban you permanently. One strike, and you're not just out of the game, you are out of the league, which is interesting. Musk said that now that the company is rolling out widespread verification, there will be no warning if someone chooses to impersonate someone else without clearly labeling the account as a parody account. Sarah Silverman and Kathy Griffin, both comedians, changed their verified names to Elon Musk, which illustrated how having a verification system that doesn't, you know, actually verify that someone is who they claim to be is just downright broken. Sort of like how Full Self-Driving on Tesla vehicles isn't really full self-driving. I'm sensing a little theme among Elon Musk companies here.

Another issue is that Musk has said Twitter Blue accounts will receive half the ad load of a regular Twitter user, and analysts say if that happens, Twitter will miss out on about six dollars of ad revenue per Twitter Blue subscriber in the United States. And once Google and Apple have taken a cut of the subscription transaction of eight dollars per month to be in Twitter Blue, it's actually possible that subscribing to Twitter Blue will cost Twitter money, even as people pay Twitter to use it. If, in fact, this all stays the way it is, then Twitter will lose money, because it will not be showing as many ads to those people and won't be able to monetize their activity. So yeah, there are some questions about that long term strategy, as it does not appear to be helping the problem of losing more than four million dollars per day.
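To put that math in one place, here's a quick back-of-the-envelope sketch. The eight dollar price and the analysts' six dollar ad revenue estimate come from the reporting above; the thirty percent app store cut is an assumed standard rate, not a figure confirmed in the story.

```python
# Back-of-the-envelope math on the Twitter Blue numbers above.
# Assumptions: app stores keep 30% of the subscription (standard rate,
# not confirmed here), and the analysts' ~$6/month estimate of forgone
# US ad revenue per subscriber holds.

PRICE = 8.00             # Twitter Blue subscription, per month
STORE_CUT = 0.30         # assumed Apple/Google share of in-app purchases
LOST_AD_REVENUE = 6.00   # analysts' estimate for the halved ad load

net_subscription = PRICE * (1 - STORE_CUT)        # what Twitter keeps
net_per_subscriber = net_subscription - LOST_AD_REVENUE

print(f"Twitter keeps ${net_subscription:.2f} of each $8 subscription")
print(f"Net change per US subscriber per month: ${net_per_subscriber:+.2f}")
# -> roughly -$0.40: under these assumptions, each US subscriber
#    could actually cost Twitter money.
```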
The Verge reports that at least some verified users are now finding it impossible to change their names on Twitter, leading some to wonder if perhaps the company is trying to head off a mass Musking, or just to avoid Twitter turning into that free-for-all hellscape that Musk was saying definitely would not happen. Elizabeth Lopatto of The Verge reported that she and her verified coworkers all encountered an error message while they were trying to change their names, but that her unverified coworker was able to change his name on the platform without a problem, which does sound curious. Also, Musk warned that verified users who changed their names would, quote, cause temporary loss of verified check mark, end quote. Elizabeth pointed out that if she changed her Twitter name to Liz, which is a name that some of her friends and colleagues use for her, that would technically mean she would be unverified, at least temporarily, even though she was just changing one form of her name to a different form of her name, not changing her name to something else entirely.

And still twittering over here: news outlet The State, not to be confused with the old MTV comedy sketch group with folks like Thomas Lennon and Joe Lo Truglio in it, reports that, quote, Twitter's rules under new CEO Elon Musk, published Monday, do not include policies about misinformation, end quote. That's a big old yikes. That is concerning. Now, there are rules in place. They did publish rules for content moderation. So, for example, it's against the rules to engage in targeted harassment. You cannot call for violence or glorify violence. You can't promote terrorism, that kind of thing. Also, they had rules against doxxing. You cannot reveal personal information about someone else on Twitter. That's against the rules. So there are content rules in place, but when it comes to things like misinformation, none of that is included in the rules.
The State reached out to Twitter asking if the company would allow clear misinformation campaigns to proliferate across the platform without moderation, and seemed to get the indication that Twitter will still look for that kind of stuff, but it's not codified in the rules. So who knows.

Platformer, a tech newsletter written by journalist Casey Newton, whose work is really good, by the way (if you're not familiar with Casey, you should check out his work), revealed that Musk is allegedly considering putting all of Twitter behind a paywall. Now, it sounds as though this plan is to allow for a certain amount of browsing on Twitter each month for free, but once you hit a limit, let's say, I don't know, five hours or something, then you would have to pay to continue using Twitter for the rest of that calendar month, or you'd have to wait for the next month before you could use Twitter again. Which sounds a lot like some mobile games I've played, where the game is designed to hook you into activity and then has a cooldown period, but you can skip the cooldown if you pay cash. As of this recording, there is no word on how seriously Musk and his team are considering this option. I would be really interested to see if this got implemented. I mean, I'm already kind of off Twitter myself, but yeah, I could definitely see that brands would want to pay in order to continue using Twitter. But if your average user isn't, obviously, that would not be a very sustainable model. I guess they're really banking on people being addicted to using Twitter.

We'll be back with some more news, including stuff that doesn't involve Twitter at all, after these messages.

One thing that some Twitter users have been doing is dumping Twitter entirely and migrating over to Mastodon. Then a bunch of people go to Mastodon and they're left saying, what the heck is this? How does this work?
So, Mastodon allows you to post messages, and it allows you to engage with posted messages, so you can comment on someone else's message, you can repost other people's messages, that kind of stuff. So it has a lot of features that are similar to what you can do on Twitter, but Mastodon is organized in a totally different way from Twitter. Twitter is a centralized service hosted on Twitter servers. Mastodon is decentralized. It is hosted on multiple servers that are owned and operated by different people and organizations all around the world. So when you start using Mastodon, the first thing you have to do is choose a server to join. Servers tend to have a focus to them. That focus could be topic based, it could be regionally based (it could be, like, this is a server for, you know, Southeast UK, that kind of thing), it could be demographically based. There are a lot of LGBTQ-friendly servers. And the server you choose ends up becoming part of your handle slash address. Folks who are on the same server that you're on can just look you up by your handle, right? It's like you're both in the same room in a house. They can just look across the room and see you there. But they can do that just by using your handle. If they're on a different server, they have to know what server you belong to in order to look you up, so they use your handle plus whatever server you belong to, and then they can send you messages and things of that nature. All that being said, anyone on any server can technically communicate with anyone else on any other server, but discovery is pretty challenging. It's a little more cumbersome than what you would find on Twitter. Also, if whatever entity or organization runs your server stops supporting it, you effectively lose your account. You would have to start over by establishing a new account on another server. So that is an issue with Mastodon.
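To make that addressing scheme concrete, here's a minimal sketch of how a full Mastodon address breaks down. The @user@server format is the real convention; the parsing helper itself is just illustrative, not part of any official Mastodon library.

```python
# Minimal sketch of Mastodon's federated addressing: a full address
# combines your handle with the server (instance) you signed up on.
# The @user@server format is real Mastodon convention; this helper
# is just an illustration, not an actual Mastodon API.

def parse_mastodon_address(address: str) -> tuple[str, str]:
    """Split '@user@server.example' into (user, server)."""
    parts = address.lstrip("@").split("@")
    if len(parts) != 2:
        raise ValueError(f"expected @user@server, got {address!r}")
    user, server = parts
    return user, server

# Same-server users can find you by handle alone; everyone else
# needs the full address, server and all. (Hypothetical example address.)
user, server = parse_mastodon_address("@techstuff@mastodon.example")
print(user)    # techstuff
print(server)  # mastodon.example
```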
As for that shutdown risk, there are people who are working to create kind of a buffer, so that if an organization plans to shut down a Mastodon server, people will have time to migrate their stuff over to a different server. But as I understand it, right now that's not the case. It's just kind of taken on faith that the server you've joined will still be around tomorrow. Anyway, Mastodon has reported an influx of users that's really pushed Mastodon to the limit on some servers. You know, certain servers got way more popular, and that increase in traffic has really strained the server load. And it's, you know, been a pretty quiet utility for the last several years, except within certain communities. If you are thinking about joining Mastodon and checking it out, I feel that I should add that Mastodon tends to lean left politically. Not exclusively, but it is a tendency. Now, I'm personally a left leaning kind of guy, which I'm sure everyone out there realizes based upon my commentary in episodes, but I realize not everyone is, and likely a lot of listeners will disagree with me politically, so I thought it's only fair I mention that there is a somewhat liberal leaning environment at Mastodon. As for whether I will set up a TechStuff space over on Mastodon, I'm still thinking about it, and I will let y'all know.

Several news outlets are reporting that Meta may be gearing up for its own round of layoffs this week. The Wall Street Journal reported we may hear about thousands of layoffs beginning tomorrow, which would be Wednesday, the ninth of November, two thousand twenty two, for those of y'all listening to old news episodes in the future. If these layoffs happen, they will mark the first broad scale downsizing across Meta in the company's history.
It's not exactly shocking, because CEO Mark Zuckerberg has indicated several times this year that he suspects Meta has more employees than there is work to do, and that some folks aren't necessarily putting in significant effort in their jobs, and that there are employees who may not need to be there. And this is all talk that indicates that there's a leader in place who feels it's time to clear out some office space. Plus, Zuckerberg continues to focus on his metaverse project, which is costing Meta billions of dollars each year. If you're going to keep up with that kind of cash drain, you may eventually have to look at making cuts elsewhere, like with your head count on projects that are not directly connected to your metaverse vision. We've already heard reports that there are folks in Meta who say they are on MMH projects, which stands for Make Mark Happy. That is, one way to stay in good graces with management is to tackle whatever pet project Zuckerberg has in mind at the time. But this gets tricky, because Zuckerberg also is known to change his mind about stuff, often due to the public reception of his ideas, which then necessitates pivoting those projects and being able to respond to new criteria.

Now, I'm no expert in any of this kind of stuff, but to me, it sounds like Meta is in a bit of a vicious cycle right now. Investors are generally unhappy that the company isn't performing nearly as well as it used to, which is putting it lightly, and they're really not happy that Meta is spending billions of dollars on a project that will not see fruition for years, if it succeeds at all. And there are still plenty of skeptics, myself included, who think that the vision of the metaverse, particularly as something that necessitates VR and AR headgear, is not likely to have broad acceptance.
You're gonna have probably a narrow band of extremely enthusiastic fans of it, but I just don't see it extending beyond that without, like, a lot of growing pains. Now, that being said, shares of Meta rose a bit after news broke that layoffs are coming, because that's cheerful. And as of this recording, shares of Meta are trading at around ninety six dollars per share. Now, this is up from its low of eighty eight dollars per share, which was what it was posting just a few days ago. But ninety six dollars a share is way, way down from the company's high this year of around three forty dollars a share back in January. So yeah, it was trading at three forty dollars at the beginning of the year, and now it's less than a hundred dollars per share. You know, towards the end of the year it did have a little bit of a recovery compared to its low, but yeah, that is not in a great place.

Windows eleven users might notice some new ads popping up as they go to shut down their computers. Bleeping Computer reports that users have noticed flyout ads popping up as they go to shut down their machines. So they go into the Start menu and they pull up the little menu that comes up when you hover over the power selection, and the ads are really simple. It's just an option that, if you were to click on it, would take you to a product page. So, for example, when you hover over that power button, you would get a little menu that gives you options like change account settings, lock, to lock your computer, or sign out. But now if you do it, you would also get a little advertised service at the top. It would just be listed as another option among those that are already there. So you might see an option that says back up your files, and if you click on it, it actually takes you to Microsoft's OneDrive product page. Or you might see one that says sign up for a Microsoft account.
So, in other words, the first option in the menu is trying to sell you something, specifically some sort of Microsoft service. Interesting. People are already upset about it, which I get. Like, they don't want to be confronted with sales messages when they're just trying to access basic functions of a computer, like turning it off. So yeah, interesting little report there. And you know, we already talked earlier about how Microsoft is apparently considering going down a strategy where they offer PCs that are subsidized by advertising, so that you could purchase one for very little money, relatively speaking, and the tradeoff would be that every time you're using your computer, you're getting ads served to you, and that offsets the purchase price of the computer. We heard about that last week. Still no word on whether or not they're moving forward with that, but we are starting to see Microsoft incorporate Microsoft specific advertising in basic functions of Windows eleven, so maybe that is where they're going.

Over on the Activision Blizzard side of things (just as a reminder, Microsoft is still attempting to acquire Activision Blizzard, but the deal is facing some scrutiny in different parts of the world, so there's no guarantee that it's going to happen, or on what timeline), let's talk about Call of Duty Modern Warfare two and the upcoming free to play companion title Call of Duty Warzone two point oh. Engadget reports that Activision is launching some content moderation tools meant to deal with toxicity in communications within the games. And I'm sure that if any of y'all have ever played any sort of competitive online game that features voice chat or text chat, you have run into situations where someone is being incredibly offensive and vulgar and is harassing other players. You can hear some of the worst stuff imaginable in these games.
And for folks who make, say, their living streaming, people who live stream games, this can be a real problem, because that kind of language can get the streamer in hot water with whatever platform they're using, like YouTube or Twitch, even if the streamer is not necessarily engaging with the toxic player. Like, if there's proximity chat on and another player is in the same general space as the streamer and then just starts letting loose with ethnic slurs, that affects the streamer too, right? Not to mention just normal players who don't necessarily want to encounter that kind of abuse online. Well, Call of Duty is going to have active moderation teams who, once they verify that a player has engaged in toxic voice chat or text messages within the game, will mute that player in all communication channels. That person will no longer be heard or be able to text. They'll still be able to play, unless they have violated some other rules, in which case they could get, you know, booted and banned, but they won't be heard by anybody, which I think is a blessing. This is a reactive system, however. In order for it to work, players will still need to flag offensive gamers to alert the moderation team to look into them. So if you're playing and you encounter someone who is just being the worst, you can flag them. Then, at least hypothetically, the content moderation team will take a closer look, listen in on chat, start looking at the text messages, and if they see that there are, in fact, instances of abuse, they will then mute the player. Still, considering all the types of stuff that can get shouted about in these games, I think that's a good choice. It's not the best, as there are more proactive approaches to this kind of thing, but at least it's a way of addressing it.
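Here's a rough sketch of that flag, then review, then mute flow as described above. All the names here are hypothetical; Activision hasn't published its actual implementation.

```python
# Hypothetical sketch of the reactive moderation flow described above:
# players flag someone, a moderation team reviews, and a confirmed
# offender is muted across all communication channels while still
# being allowed to play. Names are illustrative, not Activision's.

class Player:
    def __init__(self, name: str):
        self.name = name
        self.flags = 0       # reports from other players
        self.muted = False   # muted in voice AND text when True

    def flag(self) -> None:
        self.flags += 1

def review(player: Player, abuse_confirmed: bool) -> None:
    """Moderator step: only triggered after players have flagged someone."""
    if player.flags > 0 and abuse_confirmed:
        player.muted = True  # silenced everywhere, but can keep playing

troll = Player("worst_teammate")
troll.flag()                       # another player reports them
review(troll, abuse_confirmed=True)
print(troll.muted)                 # True: no more voice or text from them
```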
And that's going to be important, because Warzone, being a free to play game, is bound to attract a ton of players, and not everyone's going to be a hateful jerk, but some of them definitely will be. As for me, I'm not good at these kinds of first person shooter multiplayer games. I can't play at a competitive level. I would just be a bullet sponge. So you won't find me playing these online. I just can't. My reactions are not good, my aim is not good. Even with aim assist, I'm not gonna compete at a decent level. So I just go through single player first person shooters, and then I yell at computer controlled enemies because I know I can't hurt their feelings. But yeah, I'm glad to hear this news, because I do watch some streamers, and I never want them to have their career impacted by things that are outside their control.

The Verge reports that AMC theaters here in the US and Zoom are partnering to bring interactive meeting calls to some AMC theaters, allowing the theater to become kind of a giant meeting space, capable of holding more than a hundred people as they participate in, like, a remote meeting. And I can see that as being an attractive option for, say, a large company that has offices or stores that are all over the place. Like, if it's a nationwide company, for example, employees can get together in a physical space and watch an all hands meeting on a big old screen. The other thing this does is that it can reinforce that feeling that your boss literally towers over you, and that the power dynamic is incredibly unbalanced, and that helps you visualize that enormous gap between your position and that of your boss. Maybe I'm getting a bit too cynical about this. But honestly, that's the first thing that popped into my head, is like, gosh, what would make me feel like I had really been put in my place? Oh,
I know: what if my boss were five stories tall and looking down at me while telling me about the company? AMC says it's planning on installing the service in seventeen major US markets next year, so it's not coming to every AMC everywhere, but it should be rolling out throughout next year. If I'm being serious, I can see the value of this. Like, getting people together does have a certain value to it, and for a distributed company, one that has locations across lots of different regions, it could be a practical solution, as opposed to having every single person individually logging into a meeting from their own computer. But yeah, it's hard to get around that fact of you're looking at an enormous movie screen, and by necessity everything has to be enlarged so that the folks sitting in the back can have a view of what's going on, too. Also, the thought of my boss's voice coming over a surround sound system with, like, Dolby bass to it? Also terrifying. Maybe I'm thinking about this too much. You know what, we're gonna take another quick break and I'm gonna calm down, and then we'll cover the rest of the tech news.

Some engineers and researchers at MIT, the University of Minnesota, and Samsung have developed a new kind of terahertz camera that can operate at room temperature and room pressure, which opens up the possibility for this tech to be used in many different areas, including stuff like airport security. So the camera is capable of generating and detecting waves in the terahertz range of the electromagnetic spectrum. This puts this kind of radiation between microwaves and visible light. It inhabits that space, and these waves can penetrate lots of different kinds of materials. Non-metallic materials, not all of them, but a lot of them, and that's one reason why it could be useful in security operations.
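To put rough numbers on that slice of the spectrum: the terahertz band is commonly cited as roughly 0.1 to 10 terahertz, and converting frequency to wavelength shows it sitting between microwaves (millimeter wavelengths) and infrared/visible light. A quick sketch:

```python
# Quick sketch locating the terahertz band on the spectrum.
# The ~0.1-10 THz span is the commonly cited definition; wavelength
# follows from lambda = c / f.

C = 3.0e8  # speed of light, m/s

for thz in (0.1, 1.0, 10.0):
    f = thz * 1e12               # THz -> Hz
    wavelength_mm = C / f * 1e3  # meters -> millimeters
    print(f"{thz:>4} THz -> {wavelength_mm:.2f} mm")

#  0.1 THz -> 3.00 mm  (microwave territory)
#  1.0 THz -> 0.30 mm
# 10.0 THz -> 0.03 mm  (30 microns, heading toward infrared and visible)
```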
These waves can also detect signatures of certain molecules. So in an industrial setting, you could use this kind of technology to analyze materials for certain types of molecules, but you can do it in a noninvasive, nondestructive way, and that could be really valuable. The camera works by sending out light. It stimulates these little particles called quantum dots, which I've talked about in TechStuff episodes from a long time ago. I probably need to do a new one. But it stimulates quantum dots by using a terahertz wave. So it generates a terahertz wave, sends that to affect these quantum dots, which then emit light, and that light can then be recorded by a sensor similar to the kind we use in digital cameras. The team created a device that produces images that not only show what was imaged, but can show the polarization state of terahertz waves, which can give you ideas as to the nature of whatever was imaged. And I think of this tech as being similar to, kind of like, thermal imaging sensors. Thermal imaging goggles, for example, detect infrared radiation and they send that data to a system that converts the information into something we can see. So when you put on thermal goggles and you see, like, that heat signature, you know, like predator vision, that's thanks to a system that is taking the thermal information and then matching that to colors that we can see in order to indicate levels of heat. Otherwise it would just be invisible to us, so it has to be converted into the visible range for us to be able to see it, right? That makes sense. Same sort of thing here. The researchers have made big strides towards solving some massive engineering challenges when using terahertz cameras.
You know, being able to operate at room temperature and at room pressure is huge. But there are other obstacles that we have to overcome before this tech could go into widespread use. Namely, generating terahertz waves in the first place is a complicated process, and the current tech that we have to rely upon uses really expensive, large laser systems, so it's not something that would be practical right now. We would need to make greater strides in being able to create terahertz waves without relying on things like laser systems, which people are working on, but we have a ways to go.

Recently, representatives at the Atacama Large Millimeter Array observatory in Chile, which houses large, important radio antennas that are used in astronomical observations, revealed that the facility had to shut down starting in late October due to a cyber attack. The rep said the attackers did not affect scientific data, so they weren't able to compromise information from recent observations. Nor were they able to get actual access to the antennas themselves, so they weren't able to damage the array of radio antennas. But other network systems were affected, you know, things like administrative systems, email systems, that kind of stuff, so they had to shut it down to contain it. The statement indicated that the attack has been contained, but an investigation is ongoing, and as of now they are unsure when the observatory will be able to go back to normal operations.

I've done episodes about AI powered machines competing against humans in various games, as well as the related topic of solvable versus unsolvable games. So a solvable game is one where, if you assume both players are playing perfectly, you can predict the outcome from any position, even before a single move is made. You know, you have to have complete information about the game in order for this to work, but it is possible.
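To make "solvable" concrete, here's a tiny minimax-style sketch over a toy game, a simple version of Nim (take one or two stones, and whoever takes the last stone wins). Connect Four, which comes up next, is solved the same way in principle, though it's far too big for this naive approach.

```python
# A tiny sketch of what it means for a game to be "solved": with
# complete information, we can name the winner from any position
# before a single move is made. Toy game: Nim with 1-or-2-stone
# moves, where taking the last stone wins.

from functools import lru_cache

@lru_cache(maxsize=None)
def current_player_wins(stones: int) -> bool:
    """True if the player to move wins with perfect play."""
    if stones == 0:
        return False  # previous player took the last stone and won
    # A position is winning if ANY legal move leaves the opponent
    # in a losing position.
    return any(not current_player_wins(stones - take)
               for take in (1, 2) if take <= stones)

for n in range(1, 8):
    winner = "first" if current_player_wins(n) else "second"
    print(f"{n} stones: {winner} player wins")
# Every multiple of 3 is a loss for the player to move; the whole
# game is decided before anyone touches a stone.
```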
So, for example, in Connect Four, assuming that both players are playing perfectly, player one will always win, just from the beginning. Anyway, there are other games that have lots of different potential outcomes, and some of those are games that humans were for a very long time able to dominate. So you've likely heard about the various chess programs, like Deep Blue, that played at superhuman levels and were able to beat chess champions. But another board game, one that computers were for a long time good at, but not better than humans at, before eventually overtaking us, is Go. So back in two thousand sixteen, AlphaGo, an AI, emerged as a formidable computer opponent, defeating the best human Go players. And so, for the first time in history, a computer was able to play Go at a level better than expert humans. But recently, some researchers found that they could beat a Go playing AI called KataGo, which is also capable of beating human champions. They could beat KataGo using a very quirky strategy. So the researchers took a much weaker Go playing AI program. They got, like, a computer program that can play Go, but it plays at a level where even amateur human players can defeat this AI without much trouble. And then they took this weak AI Go playing system and put it up against KataGo, which has beaten human champions, and they found that if they trained the weak AI to go after some blind spots in KataGo's strategy, they could actually have this inferior AI defeat a much stronger one. Which sounds really weird, right? I mean, we humans can get pantsed by KataGo, but we can beat this weaker AI system. However, the weaker AI system can beat KataGo. It's kind of like rock paper scissors in that way. And you might think, well, how is this even possible? Well, during the training process for KataGo, KataGo would refine its approach by playing millions of games against itself.
But this refining process meant 565 00:34:11,680 --> 00:34:16,320 Speaker 1: that over time, KataGo would naturally concentrate on particular moves 566 00:34:16,320 --> 00:34:21,000 Speaker 1: and sequences within the game and migrate away from more rare, 567 00:34:21,160 --> 00:34:25,879 Speaker 1: outlying possibilities. And that meant that some legal moves were 568 00:34:25,960 --> 00:34:29,000 Speaker 1: largely ignored because they were just not likely to occur. 569 00:34:29,640 --> 00:34:33,440 Speaker 1: But this weaker AI, by focusing on those unlikely moves, 570 00:34:34,040 --> 00:34:38,040 Speaker 1: was able to make moves that KataGo wasn't able to anticipate. 571 00:34:38,480 --> 00:34:42,880 Speaker 1: And that's interesting because it meant that this system that 572 00:34:42,960 --> 00:34:45,839 Speaker 1: could potentially just beat the best Go players could be 573 00:34:45,840 --> 00:34:50,520 Speaker 1: defeated by something far inferior, and that should also raise 574 00:34:50,600 --> 00:34:53,960 Speaker 1: red flags for us, not for Go necessarily. This 575 00:34:54,040 --> 00:34:58,560 Speaker 1: is an entertaining version of a larger issue that we 576 00:34:58,560 --> 00:35:01,440 Speaker 1: should really pay attention to. When I hear a story 577 00:35:01,520 --> 00:35:05,520 Speaker 1: like this, I think about other uses of AI, especially 578 00:35:05,520 --> 00:35:09,759 Speaker 1: AI that has been trained through machine learning in processes 579 00:35:09,880 --> 00:35:15,560 Speaker 1: similar to KataGo's. An AI can sometimes, on the surface, 580 00:35:15,560 --> 00:35:18,359 Speaker 1: appear to be really capable, but knowing that it can 581 00:35:18,400 --> 00:35:20,759 Speaker 1: be thrown for a loop if it encounters something that's 582 00:35:20,840 --> 00:35:26,320 Speaker 1: outside its training parameters should make us wary. Take, for example, 583 00:35:26,400 --> 00:35:30,520 Speaker 1: self-driving cars. While we can train autonomous driving systems 584 00:35:30,560 --> 00:35:34,359 Speaker 1: in countless scenarios, and we can take information from real 585 00:35:34,400 --> 00:35:38,600 Speaker 1: world experiences and use that to continue to augment 586 00:35:39,560 --> 00:35:43,480 Speaker 1: a self-driving system's capabilities, the fact is the real 587 00:35:43,520 --> 00:35:46,919 Speaker 1: world could throw some really strange stuff at us occasionally. 588 00:35:47,400 --> 00:35:49,520 Speaker 1: And while humans might be able to make a split 589 00:35:49,560 --> 00:35:53,520 Speaker 1: second decision and avoid an accident even if they encounter 590 00:35:53,640 --> 00:35:57,520 Speaker 1: something they have never seen before, an AI system can 591 00:35:57,600 --> 00:36:02,279 Speaker 1: lack that capability and make a bad choice. This is 592 00:36:02,320 --> 00:36:05,000 Speaker 1: one of the biggest challenges when it comes to truly 593 00:36:05,080 --> 00:36:08,840 Speaker 1: refining AI for specific uses: we need to remember 594 00:36:09,880 --> 00:36:14,920 Speaker 1: that with these machine learning processes, we can create 595 00:36:15,000 --> 00:36:19,760 Speaker 1: blind spots, and if it's a possibility 596 00:36:19,800 --> 00:36:22,160 Speaker 1: that those blind spots could occur in the real world, 597 00:36:22,280 --> 00:36:24,600 Speaker 1: they will occur, and when they do, it's going to 598 00:36:24,960 --> 00:36:27,480 Speaker 1: throw these AI systems for a loop.
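As a rough illustration of how self-play can leave blind spots, here is a toy sketch. The abstract move set, the skewed visit distribution, and the visit threshold are all invented for this example and have nothing to do with KataGo's actual training pipeline.

```python
import random
from collections import Counter

random.seed(0)
MOVES = list(range(50))   # stand-in for the legal moves in some position

# "Self-play": the strong agent practices against itself, so the moves
# it studies follow its own skewed preferences. Rare-but-legal lines
# get almost no coverage, and the agent never learns to answer them.
preference = [0.5 ** m for m in MOVES]
visits = Counter(random.choices(MOVES, weights=preference, k=200_000))

def handled_in_training(move):
    # Learned value estimates are only trustworthy where there was
    # enough training signal; elsewhere the agent is effectively blind.
    return visits[move] >= 20

# A weak adversary doesn't need to play well overall; it just steers
# the game toward moves the strong agent effectively never saw.
blind_spots = [m for m in MOVES if not handled_in_training(m)]
print(f"{len(blind_spots)} of {len(MOVES)} legal moves were "
      f"barely explored in self-play, e.g. {blind_spots[:5]}")
```

The real attack targeted a full Go engine rather than a toy counter, but the failure mode has the same shape: the strong system's competence is conditional on staying inside the distribution it trained on.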
Whether the AI 599 00:36:27,560 --> 00:36:33,160 Speaker 1: system in question is used to detect things like, you know, financial 600 00:36:33,360 --> 00:36:38,480 Speaker 1: misappropriation of funds, or it's designed to control a car 601 00:36:38,560 --> 00:36:43,120 Speaker 1: as it goes down the highway or whatever, if there 602 00:36:43,160 --> 00:36:45,880 Speaker 1: are blind spots that are possible, then they will happen 603 00:36:45,920 --> 00:36:48,520 Speaker 1: at some point or another, and we need to be 604 00:36:48,600 --> 00:36:50,479 Speaker 1: on the lookout for that kind of thing. It's fun 605 00:36:50,520 --> 00:36:53,799 Speaker 1: with Go; not so much fun if you're in a 606 00:36:53,800 --> 00:36:57,319 Speaker 1: car that's being controlled by a robot and there are no 607 00:36:57,680 --> 00:37:02,480 Speaker 1: human controls available to you. That would be scary. Finally, 608 00:37:02,600 --> 00:37:06,120 Speaker 1: speaking of scary: Palmer Luckey, who's the guy who created 609 00:37:06,160 --> 00:37:09,239 Speaker 1: the Oculus and then subsequently went on to reveal that 610 00:37:09,280 --> 00:37:12,640 Speaker 1: he can be a pretty awful person, has created a 611 00:37:12,760 --> 00:37:17,759 Speaker 1: VR headset that can kill you. He posted about this 612 00:37:17,880 --> 00:37:21,719 Speaker 1: on his blog over the weekend, because November six, two 613 00:37:22,160 --> 00:37:25,760 Speaker 1: thousand twenty two is an important date in the fictional series 614 00:37:26,040 --> 00:37:30,319 Speaker 1: Sword Art Online. So in that series, a deranged tech 615 00:37:30,440 --> 00:37:35,240 Speaker 1: genius creates a really immersive game and a VR headset 616 00:37:35,440 --> 00:37:39,000 Speaker 1: used to access that game, and it's supposed to be 617 00:37:39,560 --> 00:37:43,399 Speaker 1: the most incredible experience ever. But what people don't know 618 00:37:44,040 --> 00:37:46,600 Speaker 1: is that he also built into this headset, which is 619 00:37:46,640 --> 00:37:50,799 Speaker 1: called the NerveGear, a system of powerful microwaves that 620 00:37:50,840 --> 00:37:53,560 Speaker 1: are designed to kill the user if they die in 621 00:37:53,600 --> 00:37:55,800 Speaker 1: the game. So if you die in the game, you 622 00:37:56,000 --> 00:37:59,240 Speaker 1: die in real life. It's an old sci-fi trope. 623 00:37:59,680 --> 00:38:03,600 Speaker 1: Similarly, you cannot remove the NerveGear without activating 624 00:38:03,680 --> 00:38:06,719 Speaker 1: the kill switch, which makes me think of movies like 625 00:38:06,760 --> 00:38:11,560 Speaker 1: Battle Royale, where you've got characters wearing explosive collars. 626 00:38:11,840 --> 00:38:14,960 Speaker 1: The Hunger Games is the same way with the explosive collars, 627 00:38:15,000 --> 00:38:17,440 Speaker 1: where they can activate the collar and kill someone if 628 00:38:17,480 --> 00:38:20,560 Speaker 1: they are breaking the rules of the game or they're 629 00:38:20,560 --> 00:38:23,360 Speaker 1: in the wrong region at the wrong time. Well, Palmer 630 00:38:23,600 --> 00:38:26,800 Speaker 1: thought this was such a cool idea he built something similar. 631 00:38:26,920 --> 00:38:30,279 Speaker 1: He didn't use microwaves, because he can't really miniaturize that 632 00:38:30,480 --> 00:38:32,520 Speaker 1: technology to a point where you could fit it into 633 00:38:32,600 --> 00:38:37,279 Speaker 1: a headset. So instead of using microwaves, he has used explosives.
634 00:38:37,719 --> 00:38:42,000 Speaker 1: He has positioned three small explosive devices on the headset, 635 00:38:42,560 --> 00:38:46,120 Speaker 1: and they are wired so that, should the headset indicate 636 00:38:46,160 --> 00:38:49,640 Speaker 1: that the player has died in the game by flashing 637 00:38:49,680 --> 00:38:53,400 Speaker 1: a red screen at a certain frequency, the explosives 638 00:38:53,400 --> 00:38:56,319 Speaker 1: will trigger and the player's head will get blown up 639 00:38:56,360 --> 00:38:59,160 Speaker 1: in real life. So you die in the game, this 640 00:38:59,280 --> 00:39:02,200 Speaker 1: headset kills you. Palmer then goes on to say he 641 00:39:02,239 --> 00:39:03,880 Speaker 1: has yet to build in a system that would make 642 00:39:03,920 --> 00:39:05,839 Speaker 1: the headset explode if you tried to take it off, 643 00:39:06,120 --> 00:39:09,600 Speaker 1: so it's not like it's tamper-proof, so, shucks, and 644 00:39:09,680 --> 00:39:13,080 Speaker 1: that he also is not sure that the explosives couldn't 645 00:39:13,120 --> 00:39:16,160 Speaker 1: be triggered by accident. So, in other words, you might 646 00:39:16,200 --> 00:39:19,200 Speaker 1: be playing the game, not die, but the explosives get 647 00:39:19,239 --> 00:39:22,040 Speaker 1: triggered and you get blown up. Anyway, he calls it 648 00:39:22,080 --> 00:39:25,960 Speaker 1: more of an office art piece. So yeah, this is 649 00:39:25,960 --> 00:39:28,560 Speaker 1: one of those cases where I feel that fandom has 650 00:39:28,600 --> 00:39:31,479 Speaker 1: gone a little too far, because sometimes people take away 651 00:39:31,520 --> 00:39:35,760 Speaker 1: the wrong lessons from fiction. Like, I can just imagine 652 00:39:35,800 --> 00:39:38,480 Speaker 1: Palmer Luckey reading or watching The Hunger Games, or 653 00:39:38,520 --> 00:39:41,480 Speaker 1: Battle Royale for that matter, and thinking, oh, that's a 654 00:39:41,480 --> 00:39:45,719 Speaker 1: good idea. That means you learned the wrong lesson, Palmer. 655 00:39:46,719 --> 00:39:52,680 Speaker 1: That's not what they're trying to teach. Whatever. Anyway, 656 00:39:53,080 --> 00:39:56,719 Speaker 1: you know, I don't assume that anyone's ever going 657 00:39:56,760 --> 00:39:58,960 Speaker 1: to actually be putting these things on unless we truly 658 00:39:59,040 --> 00:40:02,960 Speaker 1: enter a dystopian future, in which case I'll be roving 659 00:40:03,000 --> 00:40:06,920 Speaker 1: the deserts looking for gas, because I've already decided the 660 00:40:06,960 --> 00:40:09,640 Speaker 1: Mad Max approach is my way. I kind of dig 661 00:40:09,680 --> 00:40:12,600 Speaker 1: the punk aesthetic, so that's kind of my thing. I'm 662 00:40:12,600 --> 00:40:14,839 Speaker 1: not gonna be worried about the virtual reality stuff so much. 663 00:40:14,880 --> 00:40:18,160 Speaker 1: I will be going after petrol. I guess that's it 664 00:40:18,520 --> 00:40:21,600 Speaker 1: for this news episode of tech Stuff. We will be 665 00:40:21,640 --> 00:40:25,680 Speaker 1: back on Thursday with another news episode. As a reminder, tomorrow 666 00:40:25,719 --> 00:40:28,560 Speaker 1: we are publishing an episode of The Restless Ones in 667 00:40:28,640 --> 00:40:31,919 Speaker 1: the tech Stuff feed, and I encourage you to listen 668 00:40:31,920 --> 00:40:35,200 Speaker 1: to it.
It's a good show, and you might discover 669 00:40:35,320 --> 00:40:38,000 Speaker 1: that you want to dive into the entire series, because 670 00:40:38,160 --> 00:40:40,439 Speaker 1: I talked to a lot of interesting people this past year. 671 00:40:41,080 --> 00:40:44,759 Speaker 1: And yeah, check it out. If you have suggestions 672 00:40:44,800 --> 00:40:47,320 Speaker 1: for future topics I should cover on tech Stuff, please 673 00:40:47,360 --> 00:40:49,120 Speaker 1: reach out to me. There are a couple different ways to do that. 674 00:40:49,200 --> 00:40:52,640 Speaker 1: You can download the iHeartRadio app, which is 675 00:40:52,680 --> 00:40:55,520 Speaker 1: free to download, free to use. You can navigate over 676 00:40:55,520 --> 00:40:57,520 Speaker 1: to tech Stuff by putting tech Stuff in the search 677 00:40:57,560 --> 00:40:59,920 Speaker 1: field. You'll see on the tech Stuff page there's a 678 00:41:00,080 --> 00:41:02,279 Speaker 1: little microphone icon. If you click on that, you can 679 00:41:02,360 --> 00:41:05,160 Speaker 1: leave a voice message up to thirty seconds long. Let 680 00:41:05,160 --> 00:41:07,400 Speaker 1: me know what you would like to hear in future episodes, 681 00:41:07,960 --> 00:41:10,080 Speaker 1: or, if you prefer, you can get in touch with 682 00:41:10,120 --> 00:41:12,360 Speaker 1: me over on Twitter. The handle for the show is 683 00:41:12,680 --> 00:41:15,960 Speaker 1: tech Stuff H S W, and I'll talk to you 684 00:41:16,000 --> 00:41:25,680 Speaker 1: again really soon. Tech Stuff is an iHeart 685 00:41:25,760 --> 00:41:29,520 Speaker 1: Radio production. For more podcasts from iHeartRadio, visit 686 00:41:29,560 --> 00:41:32,640 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you 687 00:41:32,680 --> 00:41:34,040 Speaker 1: listen to your favorite shows.