Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Tuesday, March twenty-eighth, twenty twenty-three, and we start off with a little bit of sad news. Last Friday, Gordon Moore passed away at the age of ninety-four. That's a ripe old age. Now, I plan to do an episode about Moore to really go into detail about his life and contributions, but he is someone I have talked about a lot in past episodes of TechStuff, because he's a pretty important person in the world of tech. He's the man who first observed that semiconductor fabrication companies had followed a particular trajectory, namely that they were cramming twice as many components onto a square inch of silicon every two years or so. So if you were able to put ten thousand components on a square inch of silicon in year one, then two years later, by year three, you're cramming twenty thousand on there; by year five, that would go up to forty thousand, and so on, and it gets real big, real fast. Now, this observation came to be known as Moore's law, which is funny, because he was making an observation and saying you could predict what would happen based on it, but he wasn't saying there was some fundamental law that drove this. His initial observations were really more about the economic factors that would fund that kind of innovation and make it necessary in order to be competitive. But later interpretations of his observation would kind of boil down to a prediction that computing processing power doubles every two years or so. Moore's law was just one of his big contributions, though. Another huge one is that he was a co-founder of Intel. So we'll do a full episode about him and his contributions soon, and in the meantime: fair winds and following seas, Gordon Moore.
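A minimal sketch of the doubling arithmetic described above, using the hypothetical starting figure of ten thousand components from the example rather than Moore's actual data:

```python
# A toy illustration of the doubling described above: start from a hypothetical
# 10,000 components per square inch and double every two years.
def components_per_square_inch(start=10_000, period_years=2, total_years=10):
    """Return (year, component count) pairs under a simple doubling assumption."""
    return [(year + 1, start * 2 ** (year // period_years))
            for year in range(0, total_years + 1, period_years)]

for year, count in components_per_square_inch():
    print(f"Year {year}: {count:,} components")
# Year 1: 10,000; Year 3: 20,000; Year 5: 40,000; and so on.
```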
Speaker 1: Markets Insider reports that, according to analyst Dan Ives, TikTok has a ninety percent chance of being banned in the US if things continue as they are now. This follows Shou Zi Chew, TikTok's CEO, appearing before Congress. Now, as I mentioned last week, I was writing and recording during his testimony, so I had to follow up on what was going on, and I couldn't watch very much of it, y'all, because it was not pretty. There was some real ethnocentric, xenophobic behavior on display in Congress, some of it really, really ugly, and there were a lot of assumptions about Chew's heritage. It seemed like everybody was assuming he's Chinese. He's not. He comes from Singapore, not China. There was also a lot of railroading, where, like, no one was asking a question at one point; they were essentially making accusations and statements but not asking anything. And that's kind of counterproductive. I mean, if someone's there to provide testimony, you're supposed to ask them questions so that they can answer, and then you follow up. It looked like a lot of the people in Congress were kind of using it to grandstand, and a lot of TikTok folks, like a lot of users, took the ball and ran with it by creating a bunch of satirical videos pointing out how Congress was behaving. Of course, TikTok users have their own bias, obviously. So, according to Ives the analyst, TikTok has a couple of options if it wants to avoid being obliterated, and it's pretty much what we heard a few years back when former President Donald Trump started to go after TikTok. The company can either hold an initial public offering, or IPO, and go public, and in the process presumably break its connection with ByteDance, though you would have to take some real steps to make sure that happened; or it could be sold off to some other company, you know, not a Chinese company. Now, honestly, I don't know how to feel about all this.
Speaker 1: On the one hand, China very much is focused on digital espionage. Like, that's not a controversial thing to say. China's actively pursuing digital espionage in multiple venues. I don't know that TikTok's one of them necessarily; it wouldn't surprise me if it were. But, you know, there's no denying China's in that business. But then so is everybody else, including the United States. It's not like China's the outlier here. You can also argue that even the private and public companies here in the US in the tech sector are, when you get down to it, focused on digital espionage with regard to users. They are in the business of buying and selling personal information about people like us. No one is clean on this, in other words. You can't just single out TikTok and say "you are bad because you're collecting information," because that's the truth across the board. You could say everyone's bad, and that would be true. But yeah, singling out TikTok for that specific purpose doesn't hold a lot of water. Now, the national security component does complicate things further, but it also creates an opening for xenophobic ideologies to take hold, which gets pretty ugly. There are elements of the rhetoric that are starting to sound a bit like McCarthyism from back in the nineteen fifties. If you don't know what that is, you need to look up the Red Scare and educate yourself. And, you know, some of the concerns might be based in reality, but they can become exaggerated and also conflated with other things, and it just can get very, very ugly, very, very quickly. So the whole thing is a huge mess, and I sure as heck do not know what the best way forward is. I just feel pretty comfortable saying that whatever the best way happens to be, it ain't the way we're going right now.

Speaker 1: Joseph Cox of Motherboard has a piece about the FBI's contract with a company called Team Cymru, or however you say it; it's spelled C-Y-M-R-U, I'm not sure how to pronounce it.
Speaker 1: And the contract is to acquire what is called NetFlow data. This is kind of a big-picture view of the information that's moving across networks like the Internet. You can see which servers are connecting to one another, transmitting information from endpoint to endpoint. Team Cymru is able to get this data by doing deals with Internet service providers. Team Cymru offers ISPs threat detection services, and in return the ISPs trade this data, which Team Cymru then sells to clients like the FBI, apparently. Now, the reason this is important is that it marks a kind of loophole the FBI has leaned on in order to get an eye on data transmissions without first going through the legal process to do so lawfully. Right, instead of getting a warrant, a digital warrant, they just buy the data outright. Now, NetFlow is not quite the same thing as spying on the actual information being exchanged itself. It can, however, include a lot of stuff, like what URLs you're visiting, so, you know, which websites people are going to during a given Internet session. And NetFlow can definitely provide clues as to potential illegal activity, so you might be able to find which servers are associated with a hacker group, for example. But the issue is that metadata can tell us a lot about a person all by itself. You might think the information within a conversation is important, and it is, but the metadata about that conversation can give you a lot of clues on its own. Now, I've mentioned this before, but there have been a lot of demonstrations showing that with just a few data points you can really narrow down a specific Internet user's identity with incredible accuracy and precision. So this practice of just buying information from companies like Team Cymru gets into some really dangerous surveillance-state territory. If you'd like to learn more, you can visit vice dot com and read the article; it's titled "Here Is the FBI's Contract to Buy Mass Internet Data."
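To give a rough idea of what flow metadata looks like, here is a minimal, hypothetical sketch. The field names and addresses are made up for illustration and are not Team Cymru's actual schema; the point is that records like these describe who connected to whom and how much, not the contents of the traffic, and that alone can sketch a user's habits:

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical, simplified NetFlow-style record: connection metadata only.
@dataclass
class FlowRecord:
    src_ip: str        # subscriber side
    dst_ip: str        # remote server
    dst_port: int
    bytes_sent: int
    start_ts: float    # Unix timestamp

def profile_destinations(flows):
    """Count how often each destination appears; metadata alone reveals routine."""
    return Counter((f.dst_ip, f.dst_port) for f in flows)

# Three made-up flows from one subscriber already hint at a pattern.
flows = [
    FlowRecord("203.0.113.7", "198.51.100.10", 443, 52_000, 1_680_000_000.0),
    FlowRecord("203.0.113.7", "198.51.100.10", 443, 61_500, 1_680_086_400.0),
    FlowRecord("203.0.113.7", "192.0.2.44", 443, 8_200, 1_680_000_300.0),
]
print(profile_destinations(flows).most_common(1))  # most-visited endpoint, no payload needed
```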
Speaker 1: The facial recognition company Clearview, which built out an enormous database of images using questionable means, mainly scraping social network platforms to gather digital photos without first getting the consent of the users of those platforms, says that it has facilitated almost a million searches for US police. This is according to the company itself; it made that claim, revealing this startling information in a piece for the BBC. Clearview has received some high-profile criticism in the United States, and several cities here in the US have outright banned their police forces from using the service. They say, do not make use of Clearview's services. Now, as I've mentioned several times in the past, facial recognition technology is far from perfect and frequently encounters issues when identifying people of color in particular, and ultimately these are the same people who receive disproportionate scrutiny and abuse from law enforcement. Making matters worse is that it's very difficult to say which police departments have been frequent customers of Clearview, because there are very few laws and regulations that require police to disclose that kind of information. So this lack of transparency creates a further gap of distrust between police and citizens. And scraping social media for data like this typically goes against the terms of service for those social networks. Facebook has come down hard on companies that have tried to scrape Facebook in order to gather data. Also academic researchers; Facebook's really gone hard against academic researchers doing that. So essentially, companies like Meta slash Facebook don't like anyone else being able to do what they are very, very definitely doing themselves. I mean, I don't have information on it, I cannot say with evidence that they do, but it would shock me if they are not making use of that technology that they don't want anyone else to be able to use on their platforms.
Speaker 1: Okay, with that cheerful note, let's take a quick break. We'll come back with some more tech news.

Speaker 1: Okay, we're back. So this past weekend, an organization called the Military Cyber Professionals Association, or MCPA, petitioned lawmakers to create a new branch of the US military that would focus solely on digital and cyber threats. The group consists of folks who have worked in digital security within the military, some of them current military officials, some of them former ones. They say that one of the largest challenges the United States faces with digital warfare in mind is that, up to now, each branch of the military has had its own independent methodology to fight cyber threats. There is a unified office that can take some actions, but it's not a full branch of the military. So the MCPA is saying this creates gaps in US defenses, and that if we are to actually have a really strong national security approach to cybersecurity, there needs to be a military branch specifically dedicated to cyber warfare and defense. This branch could establish best practices, it could spearhead projects to protect the United States, and it would be a fully funded branch of the military, as opposed to something that could see its funding ebb and flow based upon the individual branches of the military that already exist. We've already established a new branch of the military here in the United States in the recent past, the Space Force being the most recent. So if this did happen, I mean, there is precedent, recent precedent. Before Space Force, it had been something like seventy-two years since the last time the US had created a branch of the military. But hey, we're on a roll now, right? I mean, two, that's a streak. We can start creating branches for everything. Whether anyone actually makes a move to turn this proposal into something more actionable remains to be seen.
Speaker 1: There does not currently appear to be a lot of support behind this in Congress or in the executive branch, so we may not see any movement on it in the near future. It may take a while. I do think it's not a bad idea to establish such a branch, to have something that is well funded and can attract talent, and maintain and keep that talent, in order to fight off things like the various hacker groups that are constantly poking at vulnerabilities within the United States to find a way to get a foothold. I think it's important to establish a defense against that, and the longer we wait, the harder it will be to do it effectively. So personally, I think it's not a bad idea. I'm not a huge fan of creating more military necessarily, but this is the way the future is going, and with the increase in things like AI use, it's going to be a necessity, and I would rather see us be proactive than reactive. I mean, we're already going to be at least a little reactive, but, you know, we could argue we're proactive.

Speaker 1: Zoe Schiffer over at Platformer has a great article titled "The secret list of Twitter VIPs getting boosted over everyone else," which obviously reveals how some of Twitter's internal documents show that the platform is being, let's say, inconsistent with how it applies the rules. I'm sure this comes as a massive shock to everyone. All right, so the heart of the matter comes from Elon Musk saying that to hold onto that blue verified check mark that some users have, they're going to have to pony up the eight-bucks-a-month subscription fee for Twitter Blue. Or actually it's more than that if you're talking about a brand, like a company, as opposed to an individual. Elon Musk says that this is really about treating everyone equally, that people shouldn't be elevated over others with that blue check mark.
Speaker 1: But already that's just a fricking lie, right? Because if it were about treating everyone equally, either you get rid of the blue check mark entirely so no one has it, or everybody gets the blue check mark, which is the same thing as nobody having it, but you wouldn't have to pay for it. That's not about treating people equally. You're talking about a paid-for version and a non-paid-for version. That's not equal. Besides, the blue check mark was never intended to be a status symbol; that's just what it turned into. What it was meant to be was a way for Twitter to show that it had verified the identity of that Twitter account, and that Twitter could vouch that the account actually did represent the person or brand it was connected to. So if you saw, I don't know, Nicholas Hoult on Twitter and there's a blue check mark, you know, oh, that actually does belong to the actor Nicholas Hoult, or probably to the PR personnel who handle his social media accounts, but you would know that that's the official one and not just someone claiming to be him. That was the reason for the verified check mark: it was to verify the person was who they said they were. But of course the check mark did become a sort of social capital kind of thing, like people were like, "I'm important enough to have a check mark," and that this somehow raised them in social status above everyone else on Twitter, where social status means really nothing, I would argue. I mean, I say that as someone who got a blue check mark anyway. Now the new version is weird, because what the blue check mark means is that the person's willing to pay a fee to keep it. I don't know when that actually comes into play. I haven't even seen a message saying, hey, you're gonna need to pay a subscription to keep your blue check mark. It may have come through and I just ignored it, because I don't pop on Twitter that much anymore.
Speaker 1: So it's entirely possible that's just on me, not on Twitter. Anyway, it turns out that this version, the new version that Elon Musk is talking about, is even less fair than what we had heard before, because while he's saying, hey, you're gonna have to pay if you want that blue check mark, there are around thirty-five VIP users of Twitter who not only are secure in their status, they actually get a boost from Twitter. They get higher visibility than everybody else does, so that they appear in more Twitter timelines. Now, it will come as no surprise that Elon Musk is in there. We've talked in the past about how Elon Musk has his Twitter feed boosted above other stuff so that he shows up in more feeds. But others are also on that list, and it ranges from ultra-conservative commentator Ben Shapiro to the progressive politician Alexandria Ocasio-Cortez. And then there are a few celebrities thrown in there, some athletes in there, other politicians in there. It's a real Animal Farm situation. You know, all Twitter users are equal, but some are more equal than others. Probably not a big surprise to most of you out there. It wasn't a big surprise to me. But again, it's just more evidence that things at Twitter are topsy-turvy. And hey, if your account doesn't have that paid-for blue check mark, it sounds like your tweets ain't ever gonna show up as a recommended tweet in the For You page of Twitter. So if you go to Twitter, you'll see there's like a For You thing, and the stuff in the For You section is curated for you. It's supposed to be stuff that the algorithm serves to you that you are likely to find interesting or engaging. That's the purpose. Actual practice varies, because, goodness knows, the last time I checked it out, I was like, none of this feels like it's for me, but whatever.
Speaker 1: So anyway, if you don't have a blue check mark, if you haven't paid that subscription fee, then your tweets aren't going to be showing up in anyone's For You page. So I guess this is an incentive to try and get more people to pay for that check mark so that they get increased visibility on the platform. This change will take effect on April fifteenth, according to Elon Musk. Not only that, but only the paid-for accounts will be able to vote in Twitter polls. Now, Musk says this is to fight off the issue with AI bots, which is funny, because I thought his whole plan was to eliminate bots from the platform. I remember distinctly him talking about this in the lead-up to him purchasing Twitter, that this was a big part of his plan. But I guess now he's saying, nope, you can't do that, it's impossible, so the bots are going to ruin everything, like the ding dang polls, unless we make it a paid-for service. This could also maybe have something to do with the fact that, you know, a few months ago he ran a poll asking if he should step down from Twitter, and the majority voted yes; and then one of his followers later suggested that maybe only people who are subscribed to Twitter Blue should be able to vote in those kinds of polls, and he was like, huh, interesting. World may never care. I mean, the world may never know. Whatever.

Speaker 1: CNBC reports that Twitter also applied for a subpoena against GitHub, the collaborative developer platform, because someone using the handle FreeSpeechEnthusiast shared what appeared to be segments of Twitter's own source code. GitHub complied with the court order and removed the code, so it's no longer up there. Interestingly, Musk himself had previously said he would make the source code for Twitter's recommendation algorithm open source in an attempt to suss out if there is any actual bias or preference.
Speaker 1: You know, there are a lot of allegations that Twitter downplays conservative messaging and prevents it from getting as large an audience as other messages, so making it open source would allow people to comb through the code and see if there is in fact bias there. Also to find any mistakes in the algorithm, or ways to correct the algorithm so that it is more effective at what it's supposed to do. The whole idea is that this is going to create better transparency and fairness moving forward. But, you know, it hasn't happened yet. And as we just said, Twitter moved to strike source code off of GitHub, though I don't know that the source code in question, the stuff that was on GitHub, necessarily related to the recommendation algorithm. And obviously it should be done on Twitter's own terms, I do believe that. I just don't know when we're going to see that code actually become open source. Okay, I've got a few more news stories in the tech world to cover before we wrap this up, so let's take another quick break, and we'll be right back.

Speaker 1: The chief technology officer of Nvidia, a guy named Michael Kagan, recently dismissed cryptocurrencies, saying they don't, quote, "bring anything useful for society," end quote. This is according to an article in The Guardian. This is interesting. It was not that long ago that Nvidia was selling truckloads of GPUs to the crypto mining community. It was really big business, so much so that it actually became very difficult for anyone else to find a new GPU, and if you did find one, chances are the price was marked up so high that it would be difficult to justify buying one. The aftermarket on these things was insane. You had people either buying them up to use in crypto mining rigs, or buying them up and then selling them at an insane markup to gamers. But then Ethereum, the cryptocurrency that had fueled this GPU frenzy, finally made the transition from proof-of-work to proof-of-stake operations.
Speaker 1: So with proof of work, you need a lot of computing processing power to be a successful miner in the system and to be awarded new crypto tokens by participating in that system. But with proof of stake, you just have to have a big old pile of crypto coins already, and then you stake some portion of those coins in the system, and the more you stake, the more likely you are to make even more money. Well, once Ethereum made that transition, you suddenly didn't need all those GPUs to just churn away at all hours of the day and night, so really overnight the demand for GPUs plummeted, because now crypto miners had no use for them. And not every crypto coin has moved off of proof of work. There are cryptocurrencies out there that still depend on proof of work, Bitcoin being the big one. But Bitcoin is such a big fish that GPUs don't have enough output; their processing power is nowhere close to being competitive for bitcoin mining, so you wouldn't buy a GPU if you wanted to try and be a bitcoin miner. And then for smaller crypto tokens, you know, less valuable ones, GPUs are overkill, because you'd spend more on electricity bills than you would make through mining, so it doesn't make sense to use GPUs for that. So, in other words, what I'm saying is it sounds like a little bit of suspicious timing for Kagan to say that cryptocurrency offers nothing useful to society. It certainly offers nothing really useful for Nvidia right now, but why are you saying it's nothing useful to society at this point? Now, I don't mean to say that I disagree with him. I actually am, you know, very hard on cryptocurrencies; if you've been listening to this show, you know that. But I do feel like if Ethereum were still proof of work, if it hadn't made that transition, I doubt Kagan would have said this. That's what I'm saying. Maybe he would have.
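A toy sketch of the proof-of-work versus proof-of-stake distinction just described; this is not how Bitcoin or Ethereum actually implement consensus, just the two selection ideas side by side, with made-up validator names and stakes:

```python
import hashlib
import random

def proof_of_work(block_data: str, difficulty_prefix: str = "0000") -> int:
    """Grind nonces until the hash starts with the target prefix: compute-bound."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(difficulty_prefix):
            return nonce  # more leading zeros required means exponentially more hashing
        nonce += 1

def proof_of_stake(stakes: dict) -> str:
    """Pick a validator with probability proportional to coins staked: capital-bound."""
    validators = list(stakes)
    weights = [stakes[v] for v in validators]
    return random.choices(validators, weights=weights, k=1)[0]

print("PoW nonce found:", proof_of_work("block 42"))
print("PoS validator chosen:", proof_of_stake({"alice": 32.0, "bob": 320.0, "carol": 3.2}))
```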
Speaker 1: Maybe I'm being too cynical, but it seems to be one of those things where, like, you know, there's a nasty breakup, one person breaks up with the other, and the other person's like, well, I never loved you anyway. That's kind of what it feels like to me. Maybe I'm being too harsh here. But what does Kagan say is useful to society? Well, that would be AI. And here's a big shock: AI, and a lot of those applications, leans hard on parallel processing, which happens to be something that GPUs are really good at. So the thing that no longer needs GPUs is useless to society, but the thing that really does need GPUs is going to be super useful. Got it. Doesn't seem like there's any bias going on there at all, huh?

Speaker 1: Reuters reports that the United States Commodity Futures Trading Commission, or CFTC, has sued Binance, the world's largest cryptocurrency exchange. They also sued Binance's founder, Changpeng Zhao, a.k.a. CZ. According to Reuters, the charge is that Binance illegally, quote, "offered and executed commodity derivatives transactions on behalf of US persons," end quote. Essentially, the CFTC says that Binance was operating in such a way as to avoid compliance with US laws, securities laws in particular, and Binance says that it has worked hard to make sure that it doesn't have US users on its platform. So even if it were doing the things that the CFTC says it's doing, it doesn't apply, because it didn't involve any US transactions in the first place. There is a US-based version of Binance, and the relationship between that entity and the main Binance is kind of like one of those Facebook relationship statuses, you know: it's complicated.

Speaker 1: CNBC reports that the giant Chinese e-commerce company Alibaba is splitting into six business groups, and that these groups, with the exception of one of them, could potentially spin off to become their own independent companies.
Speaker 1: Each group will have its own CEO and its own board of directors, and the groups include a Cloud Intelligence Group, which current Alibaba CEO Daniel Zhang will oversee; a Taobao Tmall Commerce Group, which will focus on online shopping platforms (this one will continue to operate as a subsidiary of Alibaba even if the other businesses spin off successfully); a smart logistics business group; a global digital commerce business group; a digital media and entertainment group; and a local services group. Now, the reason for this is probably due to the fact that Alibaba lost around six hundred billion dollars in value over recent months, and so this is an attempt to reorganize and allow each business group to focus on what it does without having to rely on the company as a whole to support each business decision. So we'll see whether or not this ends up having a net positive effect on the company.

Speaker 1: Lyft is getting a new CEO. According to CNBC, Logan Green, the current CEO and a co-founder of the company, will step down on April seventeenth. David Risher, who formerly worked at Amazon as an executive, will take over as CEO of Lyft. In addition, another co-founder, John Zimmer, who was serving as president of Lyft, will be stepping down, and the chairman of the board, Sean Aggarwal, will step down as chairman but remain on the board of directors. Logan Green, the outgoing CEO, will take his place as chairman. So why the change? Well, kind of like Alibaba, Lyft has been in a really rough spot over the last year. Stock prices dropped by around seventy percent year over year, so my guess is that this is a big reorg push to try and turn the company around, get those share prices up, and make stockholders happy.

Speaker 1: Amazon, meanwhile, failed to get a massive lawsuit thrown out of court last week.
Speaker 1: A class action lawsuit alleges that Amazon engaged in anti-competitive practices that artificially drove prices higher through other retailers who depend upon the Amazon platform; namely, that Amazon had a policy that prevented retailers from offering lower prices for goods sold outside of Amazon if they also wanted to sell those products on the Amazon platform. So it's that kind of price fixing, is what the argument is here. The judge hearing the arguments rejected Amazon's objections to the case, so it can move forward through the court system. If Amazon ultimately loses this case, it could face damages in the billions of dollars, up to a potential one hundred seventy-two billion. Not that I expect it would ever get to that. I think if Amazon's lawyers suspected they would be unable to win the case, it would go to a settlement; it would shock me if it were otherwise. But still, a pretty massive deal there.

Speaker 1: And finally, we've seen dozens of big companies announce massive layoffs over the past few months, and Disney is no exception. Bob Iger, who quickly returned to Disney to replace Bob Chapek as CEO (and keep in mind, Chapek was Bob Iger's replacement a few years ago), said that Disney would be downsizing to the tune of around seven thousand employees. Now it sounds like the company's metaverse team, which had around fifty people in it, is among those who saw their jobs eliminated. Now, I'm not sure that this is a commentary on the metaverse as a whole. You could see this as Disney looking at something that, at best, is going to take several years to become a thing and saying, you know what, we need to focus our resources on the here and now, and we can worry about the future once we have stabilized; or you could look at it as Disney losing confidence that the metaverse is ever going to be anything more than a curiosity. Either way, it really stinks for so many people to get hit with job cuts.
Speaker 1: When it's happening across so many companies, particularly in the tech sector, you really start to worry for folks and hope that they can find a job that makes good use of their skills and their knowledge and experience. Plus, you know, it would be nice if it was a job that they also liked. I guess you can't ask for too much, I suppose. But yeah, I don't know if this means that the metaverse is on hold. I suspect that what we're seeing right now means it's going to take even longer for whatever the metaverse is going to be to emerge, and it may end up having a big effect on how the metaverse emerges. It may mean that we get a very different kind of metaverse at some point in the future than we would have if the economy hadn't gone into this economic uncertainty that we're in right now. So, hard to say, I know. Again, just like with cryptocurrency, I've been very skeptical about the metaverse, but that's largely because I see it as this poorly defined concept that a lot of companies are trying to rush into, to either claim a space when there's no real space there, or to do their darnedest to be the entity that defines what the metaverse is, which rarely ends up benefiting everybody. It largely benefits whoever planted their flag in the first place. So I'm still very skeptical about the whole thing. Also, I'm getting older. Like, we've got to establish that every time, right? I'm getting older, and as a person who's getting older, I'm running the risk of being unable to see the potential benefits of emerging technologies. So I fully admit that that's entirely possible. It could very well be that, you know, in twenty years' time, everyone will be looking back on the metaverse as being the truly revolutionary wave that transformed the world for the better, and everybody's super happy about it, and I'll just be the grouchy old man in the shack in the woods who yells at passing clouds. Entirely possible. I admit it. All right.
Speaker 1: That's it for this episode of TechStuff. Hope you are all well. If you would like to reach out to me, do so on Twitter. The handle for the show is TechStuffHSW. Believe it, even though there's no blue check mark next to it; that's the brand for the show. Or you can download the iHeartRadio app, free to download, free to use. Navigate over to TechStuff by putting that into the little search field. It'll take you to the TechStuff page. You'll see there's a little microphone icon there. If you click on that, you can leave me a voice message up to thirty seconds in length. Let me know what you would like to hear, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.