Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Thursday, November 2022. On Tuesday I mentioned that folks were expecting Meta to hold a round of layoffs, and then yesterday that happened. All told, the company handed walking papers over to thirteen percent of its staff. That's more than eleven thousand people. CEO Mark Zuckerberg sent out a memo calling the layoffs a last resort, though I'm not entirely sure he knows what last resort means, because I suspect we haven't actually seen the last resort from Meta. Zuckerberg also said he wanted to take accountability for the decision and that he was especially sorry to those impacted. So this is the same CEO who back in June said there were people at Meta who probably didn't need to be there, implying the company had more staff than what was needed and that individual productivity was down because of that.
Zuckerberg further explained that closer to the beginning of the pandemic, there was a rush to spend more time online, and all of that makes sense. But then he goes on to say that he and his team all thought that that move to be online more frequently would be a permanent thing, that people wouldn't ease off once local restrictions lifted and people started feeling safe outside their homes. And whoo boy, were they wrong about that one. Anyway, because the prediction was that things were going to be all online all the time, which honestly makes me think of the metaverse concept, Meta went on a hiring spree. According to the company's earnings report, in the third quarter employee headcount was substantially larger than it was this time last year. So when you look at it that way, a thirteen percent reduction in workforce still leaves Meta significantly larger, from a headcount perspective, than it was before that hiring spree. Now, I'm being pretty judgy of Zuckerberg and his executive team, but if I really want to be fair, Meta did need to make some moves to correct course.
I'm not entirely convinced it has actually done that, because Zuckerberg still seems determined to bring his vision of the metaverse to realization. That's gonna take a lot more time and a lot more money. It's gonna be a really tough road to travel, and even if the company gets there, I'm not yet convinced that enough people actually want to use the metaverse to make it all worthwhile. But then, Meta is a company that is facing an existential crisis. It has struggled to attract younger users, and with its current user base steadily aging, it's kind of like a ticking time bomb situation. So I guess what I'm saying is I don't really believe in Zuckerberg's mission, and with that in mind, I do hold him accountable, because it was the pursuit of that mission that has really contributed heavily to Meta needing to cut costs and scale down in the first place. All that being said, you should keep in mind I am by no means an expert in this stuff. This is really just my opinion, and it's possible, I mean, it's probably even likely, that I'm totally off base.
Maybe we'll all be wearing headsets and spending countless hours working and playing in the metaverse in a few years. Well, look, I managed to put the Twitter stuff in the second position for today's news episode instead of leading with it. That's progress, I guess. Twitter launched its new blue check mark verification strategy yesterday, after holding off because, hey, folks pointed out that having a verification system that doesn't actually do any verification right before you hold an election could be a bad idea. Of course, I live in the state where we're going to have to have a runoff election later on in December, so I look forward to this broken approach having a massive impact on my local politics in a month. And yes, it is my opinion that the verification system as it now exists is largely worthless, or, as TechCrunch's Lance Ulanoff put it, it's quote "about to have as much value as a sticker someone hands a child for behaving at the dentist" end quote.
Okay, just in case you are not sure what I'm even ranting about here: previously, Twitter had a system to check on any Twitter account claiming to represent a known entity. You know, it might be a celebrity, it might be a politician, it could be the Twitter feed for an official brand, that kind of thing. They had to actually make sure that those accounts belonged to those entities, that this wasn't a case of someone impersonating a notable figure. If Twitter researchers determined that yep, that account is legitimate, then they would award a blue check mark to that account to show everyone that this was the real deal. It gave users the confidence to know that they were following an official account that really represented who they thought they were following. For reasons I still don't understand, I got one of those blue check marks on my personal Twitter handle back when I was still using Twitter. Technically I still have it, but my account is all protected now and I don't use it anymore. Anyway, the new system is subscription based.
If 94 00:05:34,400 --> 00:05:36,800 Speaker 1: you pay eight bucks a month, you have access to 95 00:05:36,920 --> 00:05:41,800 Speaker 1: Twitter Blue, and that includes the little blue checkmark. Twitter 96 00:05:41,880 --> 00:05:46,120 Speaker 1: briefly rolled out a gray badge that says official to 97 00:05:46,400 --> 00:05:49,599 Speaker 1: Twitter accounts. Uh. These were sort of like taking the 98 00:05:49,600 --> 00:05:53,480 Speaker 1: place of the old blue check marks for previously verified accounts, 99 00:05:53,520 --> 00:05:57,160 Speaker 1: at least for some of them originally. Ell It's reported 100 00:05:57,200 --> 00:06:00,000 Speaker 1: that the new official verification was kind of rolling out gradually. 101 00:06:00,040 --> 00:06:02,240 Speaker 1: I mean. Tech Crunch even pointed out that President Joe 102 00:06:02,320 --> 00:06:06,800 Speaker 1: Biden's personal account had the older blue check mark, but 103 00:06:06,960 --> 00:06:11,159 Speaker 1: not the official badge. But an hour after official badges 104 00:06:11,200 --> 00:06:15,159 Speaker 1: began to appear, they began to disappear. Meanwhile, we've already 105 00:06:15,200 --> 00:06:18,320 Speaker 1: seen folks exploit the new verification system to apparent sidate 106 00:06:18,560 --> 00:06:22,120 Speaker 1: high profile individuals and companies. For example, so In created 107 00:06:22,120 --> 00:06:26,400 Speaker 1: an account that was called Nintendo of Us you know 108 00:06:26,560 --> 00:06:31,000 Speaker 1: of us as a Nintendo America. That account has subsequently 109 00:06:31,000 --> 00:06:33,480 Speaker 1: been banned, but before it was banned and had the 110 00:06:33,600 --> 00:06:38,039 Speaker 1: verification check mark on there and uh it. 
The tweet that they posted showed an image of Mario making a famous rude hand gesture, not something Nintendo would be particularly pleased to see. As Jason Schreier observed, quote, "can't imagine why all the advertisers are pulling out of Twitter lmao," end quote. Considering that Twitter took a huge chunk out of its content moderation team during the recent layoffs, I imagine things have got to be really busy over there as folks attempt to exploit the verification system. Behind the scenes at Twitter, Elon Musk has made more changes to the work environment, having now slashed headcount by around fifty percent. Musk has eliminated Twitter's work-from-home policy and its "days of rest" policy. The work-from-home policy is self-explanatory. It's no surprise that Musk has shut that down, because he's had a similar stance at his other companies, so it stands to reason he would do the same at Twitter. So employees will have to come into the office. They are no longer allowed to work remotely.
The "days of rest" policy was a benefit that would give employees a little time off between major projects so that they could recharge their batteries and avoid burnout. But now there are half the people to do all the work, not to mention that Musk has added some projects with really aggressive deadlines, so time off really isn't something Musk is interested in. My guess is there are a lot of folks over at Twitter who are dealing with a ton more stress than they had to endure in recent years. So my heart goes out to y'all. Things are looking grim right now in the cryptocurrency world as numerous digital coins take a tumble in value yet again. Bitcoin is currently valued at around sixteen thousand, four hundred sixty bucks per coin. And y'all, yeah, that's a lot of money. Sixteen grand, almost sixteen and a half, per bitcoin. That's expensive. But when you think back a couple of years ago, when the currency hit around sixty thousand per coin, it is a huge drop.
Bitcoin had hit a low of fifteen thousand, seven hundred fifty bucks yesterday, so it actually dipped below sixteen thousand for the first time in ages. That was likely because of the next bit of news that we have to cover. Cryptocurrency exchange FTX is in the middle of a real crisis. This requires a little bit of background. So, an entrepreneur named Sam Bankman-Fried, a.k.a. SBF, that's what he goes by, and Gary Wang co-founded the FTX exchange back in twenty nineteen. An exchange is a business that allows customers to trade cryptocurrencies for other assets, which could include a different cryptocurrency or a fiat currency. Similarly, you could buy into a cryptocurrency, exchanging fiat currency for a digital coin, that kind of thing. That's what an exchange does: it handles those transactions.
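To make the exchange idea concrete, here's a minimal sketch of the core job an exchange performs: holding customer balances and swapping one asset for another at a quoted rate. The class, user, and rates are entirely made up for illustration and have nothing to do with FTX's actual systems.

```python
# Toy model of a crypto exchange: it holds customer balances and swaps
# one asset for another at a quoted rate. All names and rates are invented.
class ToyExchange:
    def __init__(self, rates):
        # rates maps (from_asset, to_asset) -> units of to_asset per unit sold
        self.rates = rates
        self.balances = {}

    def deposit(self, user, asset, amount):
        key = (user, asset)
        self.balances[key] = self.balances.get(key, 0) + amount

    def trade(self, user, from_asset, to_asset, amount):
        # Swap `amount` of from_asset into to_asset at the quoted rate.
        key = (user, from_asset)
        if self.balances.get(key, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[key] -= amount
        received = amount * self.rates[(from_asset, to_asset)]
        self.deposit(user, to_asset, received)
        return received

# Fiat -> crypto, using the roughly $16,460-per-bitcoin figure from above.
ex = ToyExchange({("USD", "BTC"): 1 / 16_460, ("BTC", "USD"): 16_460})
ex.deposit("alice", "USD", 32_920)
btc = ex.trade("alice", "USD", "BTC", 32_920)  # swaps all her dollars for BTC
```

A real exchange matches buyers against sellers rather than quoting fixed rates, but the balance-keeping and asset-swapping is the essential service.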
Binance, which is another cryptocurrency exchange and a huge competitor to FTX, had a stake in FTX, but last year Binance essentially converted its stake into FTT. FTT is the native cryptocurrency on the FTX exchange, so you could buy into FTT and then convert the FTT into other stuff like bitcoin or whatever. As crypto markets have encountered rough waters, SBF, through his trading firm Alameda Research, began to save, or at least attempt to save, struggling crypto firms that were on the verge of collapse, and that may have overextended SBF. Then you flash forward to this week, when Binance CEO, and sometime nemesis of SBF, Changpeng Zhao said that the company was going to cash out of its stock of FTT. Now, this would mean that the typically low-trading market for FTT would suddenly get flooded by Binance's vault of FTT currency once they unloaded their share. That in turn would drive down the value of FTT, right? If you flood a market with currency, then the currency becomes devalued.
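That supply-flood effect can be illustrated with a toy constant-product market model, the kind automated market makers use. The pool sizes below are invented and have nothing to do with FTT's real order books; the point is just how hard dumping a big position into a thin market hits the price.

```python
def sell_into_pool(token_reserve, cash_reserve, tokens_sold):
    """Toy constant-product market: token_reserve * cash_reserve stays fixed.

    Returns the cash the seller receives and the new price per token.
    """
    k = token_reserve * cash_reserve
    new_tokens = token_reserve + tokens_sold   # pool absorbs the sold tokens
    new_cash = k / new_tokens                  # cash leaves to keep k constant
    cash_received = cash_reserve - new_cash
    return cash_received, new_cash / new_tokens  # price = cash per token

# A thin market: 1M tokens against $25M, so the price starts at $25 per token.
price_before = 25_000_000 / 1_000_000
_, price_after = sell_into_pool(1_000_000, 25_000_000, 500_000)
```

Selling 500,000 tokens into that pool drags the price from $25 down to about $11. The exact numbers depend on the market model, but the direction is the point: more supply chasing the same cash means a lower price, which is exactly the fear that sparked what happened next.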
So then you had a bunch of folks wanting to cash out of their FTT, because they don't want to see their currency lose tons of value. That's their investment. And it essentially became a run on the bank, and suddenly FTX found itself in a liquidity crunch. Binance at one point indicated it would actually acquire FTX, which would at least help settle things down, but it would be a real power move, because SBF and Changpeng Zhao had been feuding extensively in the past. But then Binance backed out of the agreement, which was, you know, never formalized, so no problem there, and said that FTX's financial situation was way worse than first suspected. Now SBF and FTX are searching for another lifeline, and meanwhile the crypto market in general is taking another hit. Seeing a major exchange in trouble tends to have a ripple effect on people's confidence in the overall market, so crypto continues to have a rough time of it, as it has for a while now. MarketWatch reports that TikTok has secret scores assigned to influencers.
These scores reflect how good content creators are at promoting products or showing enthusiasm for sponsors, that kind of thing. So it's kind of like a secret system to evaluate how likely a given content creator is to play ball and to do it well, I suppose. Presumably creators who have higher scores will get more opportunities, perhaps with more prestigious brands. I'd like to think that the score indicates just how influential are you, oh influencer, but it does look at things like diligence and cooperation, which is just their way of saying how frequently will they post and do they put up a fuss about it. The scoring system apparently doesn't really apply here in the United States, because while TikTok is really looking to leverage the social network into a shopping experience, those efforts have not really taken off here in the US. It has in other parts of the world, so in places like Southeast Asia, the system appears to be fully deployed.
According to MarketWatch, TikTok has been holding discussions in the UK about using the scores to partner with creators. And I want to be clear, I don't necessarily think this is a bad system. Some brands might have a particular influencer in mind when they want to engage in a TikTok campaign, but others might not have any idea of who they would want to represent their brand, and having a scoring system could be a quick way to at least get eyeballs in front of particular influencers to get an idea of who's a good match. It is a little creepy, because TikTok's parent company, ByteDance, is in China, and China has been developing a social credit score policy that potentially could see citizens receive social scores that could impact everything from job prospects to travel bans. Now, so far, the social credit program has really applied to businesses, not individuals, and various regions in China have rolled out their own versions of this, so it's not like there's a unified national program in place.
So I don't want to paint a dystopian picture the way we often will hear when we talk about the social credit score, but, um, yeah, it's possible that that could change in the future. Okay, we have more to talk about, but before we get to that, let's take a quick break. Okay, we're back. So Variety, which admittedly is an outlet that I do not reference often on this show, reported that, according to a YouTube representative, eighty million people are now subscribed to YouTube's music and premium services, and that's up from fifty million last year. Subscribers are able to watch videos ad free, though obviously any baked-in ads, like a host-read sponsorship within a video, will still be there. They can also do stuff like play videos in the background and download for offline viewing, so you know, there are some neat things there.
The background video means that you could, say, launch a video on your phone and then set your phone screen to sleep mode and still be able to listen to the video, which comes in handy for me because I like listening to ASMR as I go to sleep, and I would prefer the screen not stay lit up so that it's not distracting me before I nod off. Anyway, while a thirty million subscriber increase is great, the amount of revenue those subscriptions generate, which is somewhere around eleven billion per year according to Android Headlines, is still less than half of what the company's reported revenue is. It's understandable why YouTube would push for folks to subscribe. A subscription is reliable revenue month over month. The advertising market fluctuates, particularly in times of economic uncertainty, as companies start to curb spending. Plus, YouTube doesn't have to worry so much about pairing ads with a video that the advertiser would object to if you're subscribed, right? So they don't have to worry about the kinds of situations that have happened in the past.
You know, there's nothing like a brand getting word that one of their ads has gone up on a video that, for example, includes hate speech or some other objectionable content. That causes a big problem. So if you can push people over to subscribe and then rely less and less on ads, then you remove that impediment. Anyway, it'll be interesting to see if YouTube can continue to convince people to subscribe for a service when they can watch the content for free otherwise. I mean, eighty million subscribers, that's a lot, but YouTube has more than two billion active users, so it's not even a blink of the eye compared to the overall number. Also, I feel like, in the interest of full disclosure, I should say, yeah, I am a subscriber to YouTube stuff. I subscribed actually to an earlier Google product that ultimately, surprise, surprise, got shut down, and Google kind of rolled my subscription over to YouTube, and I liked it, so I just stuck with it. Last month, Australian health insurance company Medibank was hit with a cyber attack that included a massive data breach.
Now a hacker, or a group of hackers, is demanding extortion payments, or else they're going to release private medical information belonging to Medibank's clients. The hacker demands that Medibank hand over a dollar for every customer affected, and that would be nine point seven million dollars. And the hacker has already released what it calls a "naughty list," which is beyond messed up when you hear about the kinds of folks that are on this list. It's a vile list, to say the least. The list includes names of customers who, according to the hacker, have sought treatments that range from mental health sessions, to drug addiction services, to HIV treatment, to abortions. Medibank so far has refused to pay the hacker, which I'm sure was a really difficult decision based upon the nature of this information, but in my opinion, it is also the right decision to make. Paying cybercriminals simply reinforces that their activities are profitable, and it also doesn't guarantee that you'll actually see an end to the problems. You might pay and still be tormented.
303 00:18:16,000 --> 00:18:18,160 Speaker 1: But Meta Bank has had to reach out to customers 304 00:18:18,160 --> 00:18:22,080 Speaker 1: to apologize for the issue. And it is a difficult 305 00:18:22,119 --> 00:18:25,880 Speaker 1: fact that innocent people are facing the prospect of public 306 00:18:25,960 --> 00:18:29,560 Speaker 1: shaming for stuff that frankly is no one's business. I 307 00:18:29,640 --> 00:18:32,920 Speaker 1: agree with Australia's Home Affairs Minister Claire O'Neill, who is 308 00:18:32,960 --> 00:18:38,159 Speaker 1: called the hackers quote scummy criminals end quote sounds legit 309 00:18:38,200 --> 00:18:41,960 Speaker 1: to me. If you're on Cloud nine, you better get off. 310 00:18:42,600 --> 00:18:45,800 Speaker 1: In this context, Cloud nine refers to a malicious Google 311 00:18:45,880 --> 00:18:49,000 Speaker 1: Chrome extension that does a lot of nasty stuff like 312 00:18:49,359 --> 00:18:53,359 Speaker 1: key logging. That's when a program that is on your 313 00:18:53,359 --> 00:18:57,800 Speaker 1: computer and it records or logs every key stroke you make. 314 00:18:58,520 --> 00:19:02,200 Speaker 1: Hackers use key loggers to get information like log in credentials, 315 00:19:02,200 --> 00:19:06,400 Speaker 1: credit card numbers, bank account information, etcetera. It's really really 316 00:19:06,440 --> 00:19:09,600 Speaker 1: bad news. Cloud nine can also access your copy and 317 00:19:09,640 --> 00:19:12,480 Speaker 1: paste data. So maybe you don't type in your password, 318 00:19:12,480 --> 00:19:16,040 Speaker 1: maybe you copy it over from like a password vault. Well, 319 00:19:16,080 --> 00:19:18,600 Speaker 1: this tool nullifies that approach as well, or at least 320 00:19:18,920 --> 00:19:21,399 Speaker 1: makes it just as risky. 
It can also direct your 321 00:19:21,440 --> 00:19:25,520 Speaker 1: computer to mine for cryptocurrencies, so an infected system could 322 00:19:25,560 --> 00:19:28,280 Speaker 1: join a larger botnet dedicated to mining for bitcoin 323 00:19:28,359 --> 00:19:31,800 Speaker 1: or whatever, or, you know, more likely some lower value 324 00:19:31,800 --> 00:19:36,200 Speaker 1: crypto, because bitcoin still requires some pretty hefty equipment. And 325 00:19:36,280 --> 00:19:40,520 Speaker 1: it can allow hackers to remotely execute code on your machine. 326 00:19:40,720 --> 00:19:43,680 Speaker 1: So they could potentially use Cloud nine to run some 327 00:19:43,800 --> 00:19:46,919 Speaker 1: other form of malicious software on an infected machine. So 328 00:19:46,960 --> 00:19:51,720 Speaker 1: that's bad news. Now here is the good news. According 329 00:19:51,720 --> 00:19:57,439 Speaker 1: to Zimperium zLabs, which published a report about this extension, 330 00:19:58,200 --> 00:20:03,600 Speaker 1: you cannot find Cloud nine in the official Chrome extension store. Instead, 331 00:20:04,000 --> 00:20:07,240 Speaker 1: folks who are trying to infect computers are using other means, 332 00:20:07,280 --> 00:20:10,280 Speaker 1: such as disguising Cloud nine as a supposed update to 333 00:20:10,440 --> 00:20:13,320 Speaker 1: Adobe Flash and then including the link to this quote 334 00:20:13,359 --> 00:20:16,920 Speaker 1: unquote update on sites that claim to have some form 335 00:20:16,960 --> 00:20:19,480 Speaker 1: of content on them that you might want to see.
336 00:20:19,760 --> 00:20:23,040 Speaker 1: So a person looking for this particular type of content, 337 00:20:23,119 --> 00:20:29,359 Speaker 1: which now is not exclusively the domain of adult video, 338 00:20:29,520 --> 00:20:33,520 Speaker 1: but frequently falls in that category, they end up seeing 339 00:20:33,520 --> 00:20:35,520 Speaker 1: a little window that says, hey, you can't view this 340 00:20:35,560 --> 00:20:37,760 Speaker 1: because your Flash player's out of date, so just 341 00:20:37,960 --> 00:20:40,320 Speaker 1: click here and download this update and we'll get you 342 00:20:40,359 --> 00:20:43,240 Speaker 1: all set. Except it's not really an update to a 343 00:20:43,320 --> 00:20:47,320 Speaker 1: video player. It is this malicious browser extension, which, once installed, 344 00:20:48,080 --> 00:20:50,280 Speaker 1: starts logging all those keystrokes. Now, this is a 345 00:20:50,280 --> 00:20:52,560 Speaker 1: big reason why folks need to be careful before they 346 00:20:52,600 --> 00:20:56,879 Speaker 1: install stuff outside of official stores. Even official 347 00:20:56,920 --> 00:21:01,040 Speaker 1: stores can be tricky. Occasionally stuff will slip in without 348 00:21:01,560 --> 00:21:04,600 Speaker 1: the powers that be noticing, but definitely if you're going 349 00:21:04,640 --> 00:21:08,320 Speaker 1: outside official stores, which is something Chrome allows, it's called 350 00:21:08,440 --> 00:21:12,880 Speaker 1: sideloading. With Chrome, essentially what Google is saying is we 351 00:21:13,280 --> 00:21:16,359 Speaker 1: expect you're an adult and you can make your own decisions, 352 00:21:16,760 --> 00:21:19,159 Speaker 1: so just make good ones or else you're gonna regret it, 353 00:21:19,320 --> 00:21:22,040 Speaker 1: whereas Apple says, we don't think you're adults, so we 354 00:21:22,080 --> 00:21:26,560 Speaker 1: don't let you make decisions.
That's putting words into Apple's mouth, 355 00:21:26,640 --> 00:21:28,720 Speaker 1: but it's kind of how it comes across to me 356 00:21:28,840 --> 00:21:31,840 Speaker 1: sometimes. Either way, you have to be really careful, and 357 00:21:31,880 --> 00:21:34,119 Speaker 1: if you're not, then you can end up with something 358 00:21:34,200 --> 00:21:37,560 Speaker 1: like Cloud nine installed on your browser, and then next 359 00:21:37,600 --> 00:21:41,360 Speaker 1: thing you know, you've got this enormous headache on your 360 00:21:41,359 --> 00:21:44,000 Speaker 1: hands, or I guess not on your hands, it's in 361 00:21:44,040 --> 00:21:47,480 Speaker 1: your head, about the fact that you've got this 362 00:21:47,680 --> 00:21:51,720 Speaker 1: browser that's logging all your keystrokes. So yeah, just 363 00:21:51,880 --> 00:21:55,560 Speaker 1: make sure you're being super careful when you're using your computer, 364 00:21:56,240 --> 00:21:58,400 Speaker 1: make sure you know what you're getting into when you're 365 00:21:58,440 --> 00:22:01,520 Speaker 1: sideloading. And it's just good to be aware that 366 00:22:01,560 --> 00:22:03,680 Speaker 1: these kinds of tools are out there, because it might 367 00:22:03,720 --> 00:22:07,080 Speaker 1: make you think twice before you download something and install 368 00:22:07,160 --> 00:22:11,320 Speaker 1: it without really looking into it further. IBM's Osprey 369 00:22:11,480 --> 00:22:15,560 Speaker 1: is ready to take flight, which is a pun 370 00:22:15,600 --> 00:22:18,320 Speaker 1: that doesn't really work because it's not an aircraft. Osprey 371 00:22:18,400 --> 00:22:22,480 Speaker 1: in this case is one of IBM's quantum processors. 372 00:22:22,920 --> 00:22:25,679 Speaker 1: The company has been using bird names for their quantum 373 00:22:25,720 --> 00:22:27,560 Speaker 1: processors for a while.
I think the one that came 374 00:22:27,560 --> 00:22:31,399 Speaker 1: out before this one was called Eagle, and Osprey is 375 00:22:31,440 --> 00:22:35,879 Speaker 1: a quantum processor that has four hundred thirty-three qubits, 376 00:22:36,560 --> 00:22:40,159 Speaker 1: q-u-b-i-t. So the qubit for quantum 377 00:22:40,200 --> 00:22:45,120 Speaker 1: computers is the basic unit of information. With a traditional computer, 378 00:22:45,520 --> 00:22:49,679 Speaker 1: your basic unit of information is the bit, the binary digit, 379 00:22:50,040 --> 00:22:52,600 Speaker 1: which can be either a zero or a one. 380 00:22:52,760 --> 00:22:54,960 Speaker 1: It cannot be both. It's one or the other, so 381 00:22:55,000 --> 00:22:58,120 Speaker 1: it's off or on, that kind of thing. But a qubit, 382 00:22:59,119 --> 00:23:03,960 Speaker 1: because of superposition, can essentially be both zero and one 383 00:23:04,000 --> 00:23:07,200 Speaker 1: at the same time, and technically all values in between. 384 00:23:07,960 --> 00:23:10,800 Speaker 1: This is because of a quantum effect, and it's not 385 00:23:10,880 --> 00:23:13,960 Speaker 1: something you observe in classical systems, but it's possible 386 00:23:13,960 --> 00:23:17,119 Speaker 1: in a quantum system. So if you have enough qubits 387 00:23:17,480 --> 00:23:20,199 Speaker 1: and you have the right kind of algorithm designed to 388 00:23:20,320 --> 00:23:24,440 Speaker 1: run on a quantum machine, you can potentially solve extremely 389 00:23:24,760 --> 00:23:29,600 Speaker 1: difficult computational problems, from a classical computing perspective, that is. 390 00:23:29,680 --> 00:23:33,880 Speaker 1: And it's only a subset of very difficult computational problems. 391 00:23:33,920 --> 00:23:38,600 Speaker 1: It's not every computational problem, but for a subset.
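[Editor's note: the superposition idea described above can be sketched as a tiny state-vector simulation. This is a minimal illustration, not how real quantum hardware is programmed; the two-amplitude representation and the `hadamard` helper are simplifications for a single qubit.]

```python
import math

# A single qubit is a pair of complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Measuring it yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.
def hadamard(state):
    """Apply a Hadamard gate, which puts a basis state into an
    equal superposition of 0 and 1."""
    alpha, beta = state
    s = 1 / math.sqrt(2)
    return (s * (alpha + beta), s * (alpha - beta))

def probabilities(state):
    """Probabilities of measuring 0 or 1 for a given state."""
    alpha, beta = state
    return (abs(alpha) ** 2, abs(beta) ** 2)

zero = (1 + 0j, 0 + 0j)        # the |0> basis state: a definite 0
superposed = hadamard(zero)    # now "both 0 and 1 at once"
p0, p1 = probabilities(superposed)
```

After the gate, the two measurement probabilities are each one half, which is the "both at once until you look" behavior a classical bit cannot have.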
For example, 392 00:23:38,920 --> 00:23:41,800 Speaker 1: if you wanted to try and decrypt something and you 393 00:23:41,840 --> 00:23:46,520 Speaker 1: didn't know the two extremely large prime numbers that were 394 00:23:46,600 --> 00:23:50,679 Speaker 1: multiplied together to get a certain product, well, 395 00:23:51,000 --> 00:23:55,680 Speaker 1: a quantum computer could, in theory, solve that problem much, much, 396 00:23:55,800 --> 00:23:59,000 Speaker 1: much more quickly than a classical computer. For a really 397 00:23:59,080 --> 00:24:02,199 Speaker 1: large number, a classical computer might take more time than the 398 00:24:02,320 --> 00:24:05,240 Speaker 1: universe has been around, whereas with a quantum computer you 399 00:24:05,320 --> 00:24:07,600 Speaker 1: might be talking about, I don't know, a few minutes, 400 00:24:07,640 --> 00:24:11,840 Speaker 1: depending upon, again, how sophisticated that quantum computer is. We 401 00:24:11,920 --> 00:24:15,719 Speaker 1: are not at that level yet. That's where we're headed towards, 402 00:24:16,000 --> 00:24:19,679 Speaker 1: but we're not there now. Four hundred thirty-three qubits is a 403 00:24:19,720 --> 00:24:22,960 Speaker 1: brand new record. That's an enormous number of qubits. When 404 00:24:23,000 --> 00:24:26,080 Speaker 1: I first started talking about quantum systems, when 405 00:24:26,119 --> 00:24:29,200 Speaker 1: I first learned that they were even a thing, we 406 00:24:29,200 --> 00:24:32,720 Speaker 1: were working our way towards building a system that had 407 00:24:32,760 --> 00:24:37,720 Speaker 1: just forty qubits. So we've already gone ten times that 408 00:24:37,840 --> 00:24:40,880 Speaker 1: number since I started talking about quantum computers a few 409 00:24:40,920 --> 00:24:44,879 Speaker 1: years ago. And IBM expects to release a quantum 410 00:24:44,920 --> 00:24:49,840 Speaker 1: processor with one thousand qubits on it next year.
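[Editor's note: the growth described above can be checked with quick arithmetic. The qubit counts below are IBM's published figures for its Eagle (2021) and Osprey (2022) processors; the Moore's-law baseline of doubling roughly every two years is the usual rule of thumb.]

```python
# Year-over-year scaling of IBM's flagship qubit counts versus a
# Moore's-law-style doubling every two years (about 1.41x per year).
eagle_2021 = 127           # IBM Eagle processor, announced 2021
osprey_2022 = 433          # IBM Osprey processor, announced 2022

qubit_growth_per_year = osprey_2022 / eagle_2021   # roughly 3.4x
moore_growth_per_year = 2 ** (1 / 2)               # roughly 1.41x

# Osprey is also more than ten times the ~40-qubit systems that
# were the near-term goal only a few years earlier.
ten_times_forty = osprey_2022 > 10 * 40
```

At this pace the qubit count more than triples in a year, well ahead of a Moore's-law doubling schedule.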
That's 411 00:24:49,880 --> 00:24:54,600 Speaker 1: a level of doubling that is astonishing. That 412 00:24:54,640 --> 00:24:59,760 Speaker 1: actually outpaces Moore's law, which is pretty incredible. Now, we 413 00:25:00,000 --> 00:25:01,720 Speaker 1: still have a lot of other things that have to 414 00:25:01,720 --> 00:25:05,000 Speaker 1: fall into place before quantum computers become the new 415 00:25:05,840 --> 00:25:10,840 Speaker 1: defining factor for how we encrypt things moving forward, 416 00:25:11,119 --> 00:25:13,520 Speaker 1: because obviously one of the things that we worry about 417 00:25:13,520 --> 00:25:16,840 Speaker 1: with quantum computers is that encryption, the current methods of 418 00:25:16,920 --> 00:25:20,560 Speaker 1: encryption that we rely upon, will be nullified, and that 419 00:25:21,080 --> 00:25:24,240 Speaker 1: everything from that point forward could be decrypted, assuming it 420 00:25:24,280 --> 00:25:28,440 Speaker 1: was using this traditional form of encryption. But to do that, 421 00:25:28,520 --> 00:25:31,280 Speaker 1: you have to have not just a powerful enough computer 422 00:25:31,359 --> 00:25:33,480 Speaker 1: system that can do it, but you also have to 423 00:25:33,560 --> 00:25:37,040 Speaker 1: have the algorithm, and you have to have a way 424 00:25:37,080 --> 00:25:41,760 Speaker 1: of reducing the error rate so that you're not uncertain of 425 00:25:41,800 --> 00:25:44,760 Speaker 1: the answer. And we haven't gotten there yet, but this 426 00:25:44,840 --> 00:25:47,880 Speaker 1: is an incredible achievement and I'm really excited to see 427 00:25:47,920 --> 00:25:51,560 Speaker 1: what happens next. Speaking of what happens next, let's take 428 00:25:51,560 --> 00:26:05,520 Speaker 1: another quick break.
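[Editor's note: the encryption worry in the segment above comes down to integer factoring. Here is a toy classical approach, trial division, whose cost grows with the square root of the number being factored; the function name and the tiny semiprime are illustrative only. Real RSA moduli are hundreds of digits long, which puts them far beyond this kind of search, and that is the problem Shor's algorithm on a large, error-corrected quantum computer would undermine.]

```python
def factor_semiprime(n):
    """Recover p and q from n = p * q by trial division.
    The loop runs up to sqrt(n), so adding digits to n blows up
    the work -- hopeless for RSA-sized numbers classically."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n was prime after all

# Easy at this scale, astronomically slow at hundreds of digits:
p, q = factor_semiprime(101 * 103)
```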
Okay, we're back. And, interesting full disclosure 429 00:26:05,560 --> 00:26:08,920 Speaker 1: for y'all, I was at a training session for most 430 00:26:08,920 --> 00:26:11,880 Speaker 1: of the day today, and so this bit of news 431 00:26:11,920 --> 00:26:15,520 Speaker 1: has actually come out since I started writing the episode, 432 00:26:15,560 --> 00:26:18,199 Speaker 1: which was early, early, early in the morning. And so 433 00:26:18,240 --> 00:26:20,160 Speaker 1: we've got some updates that I want to talk about, 434 00:26:20,200 --> 00:26:23,280 Speaker 1: which means, unfortunately, that I have to talk about Twitter 435 00:26:23,440 --> 00:26:26,560 Speaker 1: some more, which I didn't plan on doing. But there 436 00:26:26,600 --> 00:26:31,000 Speaker 1: are some updates. One thing is that as of right 437 00:26:31,040 --> 00:26:36,640 Speaker 1: now over on Twitter, any account that was made 438 00:26:36,760 --> 00:26:40,920 Speaker 1: after Wednesday, November ninth, will not be able to subscribe 439 00:26:40,960 --> 00:26:46,320 Speaker 1: to Twitter Blue for now, because there was this rash 440 00:26:46,560 --> 00:26:51,960 Speaker 1: of new accounts that were seeking verification and then impersonating 441 00:26:52,480 --> 00:26:56,239 Speaker 1: notable people and brands, and so Twitter had to do 442 00:26:56,320 --> 00:27:00,000 Speaker 1: something about that, and the reaction was, all right, well, 443 00:27:00,040 --> 00:27:04,040 Speaker 1: let's lock down any new accounts from being able 444 00:27:04,080 --> 00:27:07,879 Speaker 1: to do this right out of the gate. So the question 445 00:27:07,960 --> 00:27:11,159 Speaker 1: is when will those accounts be allowed to seek Twitter 446 00:27:11,320 --> 00:27:15,119 Speaker 1: Blue verification. We don't know yet. I don't think anyone 447 00:27:15,200 --> 00:27:17,600 Speaker 1: at Twitter knows yet.
I think that this is all 448 00:27:17,720 --> 00:27:25,800 Speaker 1: very reactionary, understandably so, because that flood of impersonations obviously 449 00:27:26,000 --> 00:27:30,880 Speaker 1: posed a real threat to Twitter's revenue, as advertisers do 450 00:27:31,000 --> 00:27:36,760 Speaker 1: not want to see fake accounts claiming to be them 451 00:27:36,800 --> 00:27:39,440 Speaker 1: with a little check mark next to them doing things 452 00:27:39,520 --> 00:27:43,000 Speaker 1: like posting a picture of Mario flipping off everybody on Twitter. 453 00:27:43,520 --> 00:27:47,560 Speaker 1: So yeah, not a surprise that this has happened. It 454 00:27:47,680 --> 00:27:51,720 Speaker 1: does really paint a picture of how quickly things are 455 00:27:51,760 --> 00:27:56,840 Speaker 1: developing at Twitter, how chaotic things must be over there. Also, 456 00:27:57,760 --> 00:27:59,919 Speaker 1: I'm pretty sure every single one of you 457 00:28:00,040 --> 00:28:03,560 Speaker 1: out there, I feel certain, I'll say this, every single 458 00:28:03,600 --> 00:28:06,560 Speaker 1: one of you out there would have predicted that this 459 00:28:06,600 --> 00:28:10,240 Speaker 1: would have happened, right? That if you have a verification 460 00:28:10,280 --> 00:28:13,760 Speaker 1: system where the only barrier to entry is having eight 461 00:28:13,760 --> 00:28:16,880 Speaker 1: dollars a month to spend to have it, you were 462 00:28:17,080 --> 00:28:19,400 Speaker 1: ultimately going to see a lot of people, a lot 463 00:28:19,400 --> 00:28:22,480 Speaker 1: of troublemakers, take advantage of that, whether it was going 464 00:28:22,520 --> 00:28:24,800 Speaker 1: to be a ton or, you know, just a few 465 00:28:24,840 --> 00:28:27,320 Speaker 1: thousand or whatever it may be. You knew that was 466 00:28:27,320 --> 00:28:29,520 Speaker 1: going to happen, right? Like, I knew that was going 467 00:28:29,560 --> 00:28:33,240 Speaker 1: to happen.
It's shocking to me that Elon Musk didn't 468 00:28:33,280 --> 00:28:36,520 Speaker 1: think that was going to happen, or perhaps didn't care, 469 00:28:36,640 --> 00:28:39,880 Speaker 1: or didn't understand what the consequences of that would be 470 00:28:40,320 --> 00:28:46,280 Speaker 1: on Twitter's reputation and ability to attract advertisers. So yeah, 471 00:28:46,320 --> 00:28:50,480 Speaker 1: this is at best a temporary solution to heading off 472 00:28:50,560 --> 00:28:56,560 Speaker 1: future rushes to create impersonation accounts. And I 473 00:28:56,600 --> 00:29:00,520 Speaker 1: think that's wise, but honestly, I think that the 474 00:29:00,560 --> 00:29:03,080 Speaker 1: whole eight-dollar verification thing was a bad idea to 475 00:29:03,120 --> 00:29:04,920 Speaker 1: start with. It may very well be that we'll see 476 00:29:04,960 --> 00:29:09,920 Speaker 1: them reverse that decision, that the Twitter Blue subscription will 477 00:29:10,000 --> 00:29:12,680 Speaker 1: no longer include the check mark, or maybe it will 478 00:29:12,720 --> 00:29:18,440 Speaker 1: include some other kind of badge to differentiate a Twitter 479 00:29:18,520 --> 00:29:22,600 Speaker 1: Blue subscriber from a normal Twitter user. But yeah, that 480 00:29:22,680 --> 00:29:26,360 Speaker 1: check mark, not a great idea, because it just set 481 00:29:26,440 --> 00:29:30,440 Speaker 1: the stage for that kind of abuse. So that's update 482 00:29:30,600 --> 00:29:35,840 Speaker 1: number one. In update number two, the US Treasury Department's 483 00:29:36,200 --> 00:29:41,720 Speaker 1: Financial Crimes Enforcement Network has posted a filing that 484 00:29:41,760 --> 00:29:47,440 Speaker 1: shows that Twitter has applied to become a money services business.
Now, 485 00:29:47,480 --> 00:29:49,960 Speaker 1: those of you all who know Elon Musk's history know 486 00:29:50,200 --> 00:29:54,040 Speaker 1: that he was part of X.com, which would eventually become PayPal, 487 00:29:54,840 --> 00:29:57,640 Speaker 1: and also that he has talked in the past about 488 00:29:57,920 --> 00:30:02,640 Speaker 1: aspirations to turn Twitter into a kind of catch-all 489 00:30:03,160 --> 00:30:06,600 Speaker 1: service where there would be multiple things you could do 490 00:30:06,640 --> 00:30:09,120 Speaker 1: on it, not just send out a tweet, you know, 491 00:30:09,200 --> 00:30:11,640 Speaker 1: talking about what you had for breakfast, like we used 492 00:30:11,640 --> 00:30:14,600 Speaker 1: to do in the old days, but to perhaps transfer 493 00:30:14,720 --> 00:30:20,080 Speaker 1: funds using Twitter. There's also been some speculation that this 494 00:30:20,200 --> 00:30:23,720 Speaker 1: might mean that Twitter will have some form of 495 00:30:23,920 --> 00:30:29,400 Speaker 1: relationship with cryptocurrency, but honestly, I don't know if that 496 00:30:29,640 --> 00:30:32,480 Speaker 1: is actually the case. We do know, obviously, Elon Musk 497 00:30:32,520 --> 00:30:35,200 Speaker 1: has been interested in cryptocurrency. There was a while where 498 00:30:35,480 --> 00:30:40,880 Speaker 1: Tesla would accept bitcoin as payment for a new 499 00:30:40,960 --> 00:30:49,560 Speaker 1: Tesla vehicle. Elon Musk famously promoted Dogecoin, which 500 00:30:49,760 --> 00:30:53,000 Speaker 1: did not last very long. 501 00:30:53,000 --> 00:30:57,160 Speaker 1: It pumped Dogecoin's value briefly, but Dogecoin 502 00:30:57,240 --> 00:31:00,320 Speaker 1: has since fallen back down to just a few cents 503 00:31:00,400 --> 00:31:04,880 Speaker 1: per coin.
And yeah, I don't know if this 504 00:31:04,920 --> 00:31:07,959 Speaker 1: means we're going to see Twitter get more involved in 505 00:31:08,000 --> 00:31:11,800 Speaker 1: the crypto world or not. I wouldn't be shocked if 506 00:31:11,840 --> 00:31:15,760 Speaker 1: that were the case. But certainly this is an example 507 00:31:16,000 --> 00:31:22,440 Speaker 1: of Twitter getting interested in becoming more of a 508 00:31:22,760 --> 00:31:26,520 Speaker 1: payment services platform, and this is the first step 509 00:31:26,520 --> 00:31:28,720 Speaker 1: to doing it, but obviously a lot more steps have 510 00:31:28,800 --> 00:31:33,400 Speaker 1: to happen before it actually comes to fruition. Next, Twitter 511 00:31:33,480 --> 00:31:39,560 Speaker 1: chief information security officer Lea Kissner leaves Twitter. Casey 512 00:31:39,600 --> 00:31:44,680 Speaker 1: Newton was the first to report this. Kissner said 513 00:31:44,760 --> 00:31:47,880 Speaker 1: that this was a hard decision, so it appears to 514 00:31:47,880 --> 00:31:52,520 Speaker 1: be their decision to leave Twitter, not the company's, but 515 00:31:52,760 --> 00:31:57,040 Speaker 1: did not elaborate, did not explain why that decision was 516 00:31:57,240 --> 00:32:02,480 Speaker 1: arrived at. I mean, obviously things are again really chaotic 517 00:32:02,480 --> 00:32:06,840 Speaker 1: at Twitter, and many of the departments have been 518 00:32:07,680 --> 00:32:10,920 Speaker 1: decimated by the layoffs, and I would not be surprised 519 00:32:10,920 --> 00:32:13,440 Speaker 1: to learn that cybersecurity was one of those. It does 520 00:32:13,480 --> 00:32:16,840 Speaker 1: sound like a lot of divisions within 521 00:32:16,880 --> 00:32:21,120 Speaker 1: Twitter are working under skeleton crew type situations at 522 00:32:21,120 --> 00:32:26,320 Speaker 1: this point.
But yes, seeing Kissner depart is 523 00:32:26,440 --> 00:32:32,640 Speaker 1: not something that I find very encouraging. It is 524 00:32:33,200 --> 00:32:37,240 Speaker 1: scary, because cybersecurity is a very serious matter, and obviously 525 00:32:38,280 --> 00:32:44,040 Speaker 1: social media in general has been a target for different 526 00:32:44,600 --> 00:32:49,760 Speaker 1: agencies out there attempting to spread misinformation and disinformation, to 527 00:32:50,480 --> 00:32:56,840 Speaker 1: sow discontent, to spread messages of hate. And to see 528 00:32:56,880 --> 00:33:02,600 Speaker 1: someone who had seniority in a cybersecurity department leave like 529 00:33:02,680 --> 00:33:10,600 Speaker 1: that is troubling. Hopefully Twitter is able to account for 530 00:33:10,640 --> 00:33:14,840 Speaker 1: this and adapt. And I also really 531 00:33:14,880 --> 00:33:19,160 Speaker 1: hope that Kissner is able to find a new opportunity, 532 00:33:19,200 --> 00:33:22,280 Speaker 1: a new place to fit in. They indicated that they 533 00:33:22,320 --> 00:33:25,360 Speaker 1: don't have anything lined up right now. In fact, they said, 534 00:33:25,440 --> 00:33:28,640 Speaker 1: I'm looking forward to figuring out what's next, starting with 535 00:33:28,680 --> 00:33:35,000 Speaker 1: my reviews for USENIX Security. That's 536 00:33:35,000 --> 00:33:38,400 Speaker 1: incredible. It's such a big 537 00:33:38,440 --> 00:33:43,680 Speaker 1: deal for anyone to leave without having a backup plan. 538 00:33:43,800 --> 00:33:46,120 Speaker 1: That was always something that when I was in my 539 00:33:46,200 --> 00:33:49,680 Speaker 1: earlier career, I just felt I couldn't leave something unless 540 00:33:49,680 --> 00:33:52,240 Speaker 1: I had another thing lined up.
So it's got to 541 00:33:52,320 --> 00:33:57,360 Speaker 1: be a pretty big reason to leave a high ranking, 542 00:33:58,120 --> 00:34:02,000 Speaker 1: you know, position within a company and not have something 543 00:34:02,040 --> 00:34:06,120 Speaker 1: else right behind it. Chances are Kissner is going to 544 00:34:06,160 --> 00:34:11,560 Speaker 1: find other opportunities pretty quickly. Cybersecurity is obviously important for 545 00:34:11,600 --> 00:34:16,480 Speaker 1: all companies that are in the digital age, so they're probably 546 00:34:16,480 --> 00:34:19,120 Speaker 1: going to be okay in the long run. But yeah, 547 00:34:19,400 --> 00:34:23,920 Speaker 1: another concerning bit of news for Twitter. What else do 548 00:34:24,000 --> 00:34:27,440 Speaker 1: we have now? Twitter, of course, is not the only company 549 00:34:27,560 --> 00:34:30,759 Speaker 1: that's going through rough times. We talked about Meta being 550 00:34:31,160 --> 00:34:34,279 Speaker 1: in that same situation. Really, all the big tech companies 551 00:34:34,640 --> 00:34:40,520 Speaker 1: are looking at making cuts and finding cost savings pretty 552 00:34:40,600 --> 00:34:45,799 Speaker 1: much across the board, and Amazon is no stranger to this. Reportedly, 553 00:34:46,719 --> 00:34:52,200 Speaker 1: Amazon is considering making some cuts in the division that 554 00:34:52,600 --> 00:34:56,319 Speaker 1: involves their voice assistant tool, whose name I won't say 555 00:34:56,360 --> 00:34:58,640 Speaker 1: in case you happen to be listening on one of 556 00:34:58,640 --> 00:35:02,640 Speaker 1: those devices, but it's a woman's name that 557 00:35:02,800 --> 00:35:06,400 Speaker 1: starts with A and it ends with A and it 558 00:35:06,520 --> 00:35:12,120 Speaker 1: has lex in the middle.
Anyway, apparently that division has 559 00:35:12,360 --> 00:35:16,400 Speaker 1: had an operating loss of more than five billion dollars 560 00:35:16,520 --> 00:35:20,920 Speaker 1: in recent years, according to the Wall Street Journal, and 561 00:35:21,400 --> 00:35:24,319 Speaker 1: that's a lot of money, obviously. 562 00:35:24,560 --> 00:35:29,239 Speaker 1: It's interesting to me because the voice assistant feature was 563 00:35:29,320 --> 00:35:33,279 Speaker 1: one that just a few years ago was really being 564 00:35:33,320 --> 00:35:39,279 Speaker 1: promoted across multiple companies, Amazon, Google, Apple, as being a 565 00:35:39,400 --> 00:35:43,440 Speaker 1: game-changing technology, a new way to interact with our 566 00:35:43,520 --> 00:35:48,560 Speaker 1: devices that removes the barriers that would otherwise exist. Being 567 00:35:48,600 --> 00:35:51,480 Speaker 1: able to speak to a computer system and have it 568 00:35:51,800 --> 00:35:54,120 Speaker 1: understand what you want and do whatever it is 569 00:35:54,160 --> 00:35:56,359 Speaker 1: you were asking of it, that was a big deal. 570 00:35:56,480 --> 00:36:00,560 Speaker 1: Like, that's Star Trek kind of stuff. But if 571 00:36:00,560 --> 00:36:05,280 Speaker 1: it's not driving revenue, it becomes harder and harder to justify, 572 00:36:05,520 --> 00:36:12,520 Speaker 1: especially in times of economic uncertainty.
So here's hoping 573 00:36:13,280 --> 00:36:16,960 Speaker 1: that there won't be too many people affected by layoffs 574 00:36:17,640 --> 00:36:21,719 Speaker 1: if there are going to be massive cuts across Amazon. 575 00:36:22,320 --> 00:36:26,680 Speaker 1: And yeah, I'll be really interested to see where 576 00:36:26,840 --> 00:36:30,839 Speaker 1: we go from here. Like, what else are we going 577 00:36:30,920 --> 00:36:34,799 Speaker 1: to see from voice assistants? Or will those 578 00:36:34,840 --> 00:36:38,120 Speaker 1: fade away, having never really made a 579 00:36:38,239 --> 00:36:41,480 Speaker 1: huge impact? I know I use a voice assistant pretty regularly, 580 00:36:41,560 --> 00:36:43,680 Speaker 1: but I do it to do like three different things, 581 00:36:44,239 --> 00:36:46,880 Speaker 1: and it's the same three things pretty much every day, 582 00:36:46,960 --> 00:36:52,279 Speaker 1: and that's it. So arguably I'm not really making good 583 00:36:52,400 --> 00:36:55,280 Speaker 1: use of the voice assistant feature. And maybe I wouldn't 584 00:36:55,320 --> 00:36:57,520 Speaker 1: even notice if it were gone, except then I'd have 585 00:36:57,600 --> 00:36:59,560 Speaker 1: to figure out how to turn my lights on and off. 586 00:37:02,280 --> 00:37:03,719 Speaker 1: There used to be a time where I knew how to 587 00:37:03,760 --> 00:37:08,280 Speaker 1: do that. I guess there's that switch on the wall. Anyway. 588 00:37:08,680 --> 00:37:11,560 Speaker 1: The last little bit of news that I have, this is 589 00:37:11,600 --> 00:37:14,600 Speaker 1: another update that came out while I was in training. 590 00:37:15,320 --> 00:37:18,160 Speaker 1: The bit of news is that Kaspersky is going to 591 00:37:18,239 --> 00:37:23,799 Speaker 1: have to stop the operation of its VPN product within Russia. Now, 592 00:37:23,880 --> 00:37:27,760 Speaker 1: Kaspersky itself is a Russian company.
It's known for making 593 00:37:27,880 --> 00:37:32,680 Speaker 1: security software, antivirus, that kind of stuff, as 594 00:37:32,719 --> 00:37:38,240 Speaker 1: well as VPNs, or virtual private networks. And Russia 595 00:37:38,280 --> 00:37:43,600 Speaker 1: had already cracked down on most VPNs last year. 596 00:37:44,120 --> 00:37:47,399 Speaker 1: Those would be VPNs that were 597 00:37:47,400 --> 00:37:50,720 Speaker 1: from companies that were not themselves based in Russia, 598 00:37:51,120 --> 00:37:56,080 Speaker 1: so things like NordVPN, ExpressVPN, Proton VPN, you've 599 00:37:56,120 --> 00:37:59,640 Speaker 1: probably heard of some of these. Those were kind of 600 00:38:00,560 --> 00:38:05,839 Speaker 1: banned last year, and a big reason for that is 601 00:38:05,880 --> 00:38:10,080 Speaker 1: that Russia was trying to force these companies to make 602 00:38:10,120 --> 00:38:15,040 Speaker 1: sure that the Russian government would be able to censor 603 00:38:15,520 --> 00:38:19,840 Speaker 1: information across these VPNs, which isn't really what VPNs 604 00:38:19,880 --> 00:38:24,040 Speaker 1: are for. So Russia is trying to control the 605 00:38:24,120 --> 00:38:28,839 Speaker 1: narrative right there, trying to control the info that their 606 00:38:28,920 --> 00:38:32,880 Speaker 1: citizens can easily access. Not quite to the same level 607 00:38:32,880 --> 00:38:36,440 Speaker 1: that we see in China, but there is some pretty 608 00:38:36,480 --> 00:38:40,080 Speaker 1: heavy censorship going on at the government level in Russia 609 00:38:40,200 --> 00:38:45,239 Speaker 1: regarding Internet information. And also they wanted 610 00:38:45,280 --> 00:38:47,240 Speaker 1: to be able to see who was seeing what. Again, 611 00:38:47,920 --> 00:38:51,560 Speaker 1: VPNs are supposed to be the cure for that.
612 00:38:51,880 --> 00:38:54,640 Speaker 1: You're supposed to use a VPN so that people who 613 00:38:54,640 --> 00:38:57,440 Speaker 1: are snooping on you don't know what sites you're going to. 614 00:38:57,560 --> 00:39:02,799 Speaker 1: It can be of absolutely critical importance in work, right? 615 00:39:02,880 --> 00:39:05,160 Speaker 1: You might be working for a company where you need 616 00:39:05,200 --> 00:39:09,280 Speaker 1: to log into a VPN first before you access company 617 00:39:10,200 --> 00:39:14,120 Speaker 1: assets, because the company wants to protect those assets from 618 00:39:14,120 --> 00:39:16,759 Speaker 1: any snoops. They don't want anyone to be able to 619 00:39:17,600 --> 00:39:22,080 Speaker 1: figure out proprietary secrets, for example. The Russian government 620 00:39:22,120 --> 00:39:25,799 Speaker 1: does not really like secrets that aren't theirs, so there 621 00:39:25,880 --> 00:39:28,520 Speaker 1: was a real push for this. Well, now Kaspersky itself, 622 00:39:28,560 --> 00:39:32,440 Speaker 1: a Russian company, has decided to end the VPN service 623 00:39:32,480 --> 00:39:36,600 Speaker 1: within Russia, because I guess it has found that it's 624 00:39:36,640 --> 00:39:41,400 Speaker 1: really difficult to provide VPN services and also meet the 625 00:39:41,480 --> 00:39:45,200 Speaker 1: expectations of the Russian government. These two things are 626 00:39:45,239 --> 00:39:49,359 Speaker 1: antithetical to one another, they counteract each other, and yeah, 627 00:39:49,520 --> 00:39:54,120 Speaker 1: that's where we're at there. It's pretty severe. I 628 00:39:54,200 --> 00:40:00,279 Speaker 1: don't know how many VPN services are legally available 629 00:40:00,320 --> 00:40:02,920 Speaker 1: at all in Russia at this point.
I'm sure that there is 630 00:40:02,960 --> 00:40:06,000 Speaker 1: no shortage of people who are accessing them illegally, but 631 00:40:06,120 --> 00:40:09,920 Speaker 1: that obviously brings with it some risk. Yeah, that's it. 632 00:40:10,480 --> 00:40:13,480 Speaker 1: That's the last little update that I came across before 633 00:40:13,600 --> 00:40:17,040 Speaker 1: I was ready to send us off to super producer Tari. 634 00:40:17,680 --> 00:40:20,759 Speaker 1: So I hope you are all well. Thank you so 635 00:40:20,840 --> 00:40:22,920 Speaker 1: much for listening. If you would like to reach out 636 00:40:23,000 --> 00:40:24,279 Speaker 1: to the show, there are a couple of ways to 637 00:40:24,320 --> 00:40:27,400 Speaker 1: do that. One is that you can download the I 638 00:40:27,520 --> 00:40:30,600 Speaker 1: Heart Radio app. You can navigate over to the tech 639 00:40:30,640 --> 00:40:32,640 Speaker 1: stuff page. Just put tech Stuff in the little search 640 00:40:32,680 --> 00:40:38,160 Speaker 1: field, and you can use the microphone icon, click on 641 00:40:38,239 --> 00:40:42,360 Speaker 1: that, and you can leave a message up to thirty 642 00:40:42,520 --> 00:40:46,960 Speaker 1: seconds in length, and that way, I will know 643 00:40:47,000 --> 00:40:48,400 Speaker 1: what it is you would like to hear. And if 644 00:40:48,400 --> 00:40:50,239 Speaker 1: you would like me to use that message in a 645 00:40:50,239 --> 00:40:53,880 Speaker 1: future episode, well, just go on ahead and let me 646 00:40:53,960 --> 00:40:57,600 Speaker 1: know in that message, because it's all about opt-in. Secondly, 647 00:40:58,040 --> 00:41:00,400 Speaker 1: if you would prefer to reach out in a 648 00:41:00,440 --> 00:41:02,960 Speaker 1: way that doesn't require you to speak, you can always 649 00:41:03,000 --> 00:41:05,960 Speaker 1: send me a message via Twitter. The handle for the 650 00:41:05,960 --> 00:41:09,879 Speaker 1: show is tech Stuff H S W.
Now I'll talk 651 00:41:09,880 --> 00:41:19,200 Speaker 1: to you again really soon. Tech Stuff is an 652 00:41:19,200 --> 00:41:22,880 Speaker 1: I Heart Radio production. For more podcasts from I Heart Radio, 653 00:41:23,239 --> 00:41:26,400 Speaker 1: visit the I Heart Radio app, Apple Podcasts, or wherever 654 00:41:26,480 --> 00:41:28,000 Speaker 1: you listen to your favorite shows.