Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It is time for the tech news for Tuesday, May sixteenth, twenty twenty-three. And as the band Staind would say, it's been a while since we've had an Elon Musk-heavy episode of tech news, but today we're going to make up for that. So let's strap in.

Speaker 1: Last week on Thursday afternoon, after I had already published my news episode for the day, which, can I just say, rude, Elon Musk tweeted out that he had selected a new CEO for Twitter. This had been something Musk had been promising us for a while. He said that he would get a CEO to come on board and replace him, adding that he would do so once Twitter was kind of in a stable place. Now, I'm not sure that Twitter is actually anywhere close to being stable, considering the chaos that seems to unfold every week, but forget that. We now know that Linda Yaccarino has been named the next CEO of Twitter. She formerly headed advertising over at NBC Universal, which, I mean, that's a heck of a job. NBC Universal is a truly enormous company, and of course it's part of an even bigger company called Comcast, so she definitely knows how advertising works; she is an expert in that field. So could her leadership help repair the relationships between Twitter and advertisers? I think she's got a decent chance. Now, whether she can convince the people who have already jumped ship off of Twitter to come back remains to be seen. In other words, there may not be as many folks to advertise to, even if she's able to convince advertisers to return to Twitter. Musk is also not leaving Twitter entirely, of course. He will stay on both as the chief technology officer and as the executive chairman.
Speaker 1: Now, I'll say this: I hope she can lead the company toward true stability and success and create a place where people aren't constantly bombarded by hate speech and misinformation and scams. That would be nice. It's a long shot, because, I mean, I honestly don't know what you do to fix Twitter at this point. It was already not in the best of shape before Elon Musk took over, and I don't see how you could argue it's gotten any better since then. So yeah, it's a huge knot to untangle, but maybe she can do it. We'll have to find out.

Speaker 1: Meanwhile, even though Elon Musk is the current CEO and executive chairman of Twitter, he still has to submit tweets about his other company, Tesla, to a lawyer before he posts them. Now, does he do this? I don't know. Twitter has never identified this lawyer that he's supposed to do this with, but you know, he's supposed to. The Verge reported on this because Musk challenged this consent decree he had agreed to back in twenty eighteen. He had challenged it in a recent appeals case, and the federal court rejected his appeal. It all stems from a settlement Musk agreed to way back in twenty eighteen. That's when the US Securities and Exchange Commission, or SEC, accused Musk of making, quote, a series of false and misleading statements regarding taking Tesla, a publicly traded company, private, end quote. Essentially, the SEC was saying that Musk wasn't following proper procedure if in fact it was his intention to take Tesla private, and further, whether he wanted to do it or not, it amounted to market manipulation. Anyway, Musk settled with the SEC, and as a concession, he agreed to this sort of babysitter clause where he has to show his tweets to a lawyer before he actually posts them. Musk wanted that thrown out. I'm sure it's humiliating to be the person who bought Twitter and still have a legal requirement to submit tweets for review.
Speaker 1: Musk's argument was that the SEC was essentially using this consent decree to infringe upon his free speech, and we all know what a believer Musk is in free speech, at least for himself. Anyway, the court said no dice, Elon, and pointed out that the SEC has investigated a grand total of three of Elon Musk's tweets over the years, and that includes the tweet that actually started off the whole mess in the first place, so only two since then. And of course Elon has posted many, many times more than just three tweets over the last five years. Personally, I think Musk should just consider it a fair trade, because an earlier investor lawsuit sought billions of dollars in damages against Elon Musk. They were arguing that Musk's tweets ended up costing investors enormous amounts of money, but a jury found that Musk was not liable for those losses, so he didn't have to pay out the billions. He's just going to have to mind this babysitter. And to be honest, I'm very skeptical that he actually goes through with it. I guess he just wants the whole thing lifted. And you know, I think it's pretty fair to say that Elon Musk's M.O. for a long time has been wishing to avoid consequences of his own actions. I don't think that's unfair to say.

Speaker 1: Over at Tesla, Elon Musk has shown that he wants to be more involved with the company, which is something that shareholders have been asking for ever since he first announced that he wanted to buy Twitter. More specifically, Elon Musk wants to be more involved if anyone in the company wants to hire anyone else. Musk sent out a memo throughout Tesla saying that all new hire requests have to come to him for his personal approval. It doesn't matter who the person is or what position is being filled; it has to go to Musk first. Even in the case of contractors, it has to come to him for approval. He instructed managers to send him hiring requests on a weekly basis.
Speaker 1: However, he also followed this up by saying that people should, quote unquote, think carefully before submitting a request. Now, this is me inferring from that statement, but it seems to be kind of an intimidation tactic, like it's meant to discourage people from making requests in the first place, because they need to think carefully before they ask. It just seems like Musk is trying to head that off and avoid having to hire more people. Later today, Musk is actually going to hold an earnings call with shareholders, so I imagine we'll have a lot more to say about his increased involvement with the company later this week when we do another news episode on Thursday. One thing that could be discussed on that call, if shareholders get their wish, is succession planning. So if you're not familiar with succession, and I'm not talking about the television series, it's when a current leader outlines their plan regarding who should take the top spot after they leave the company. Investors will soon vote on whether or not to compel Tesla to publish a key person risk report. The concern is that Tesla may be too tightly bound to the personality of Elon Musk, which means if something were to happen to Musk, whether it's something catastrophic, or maybe Musk ends up getting distracted wanting to build his AI company and he runs off from Tesla, the company could end up being directionless, investors would lose out on a lot of money, and the company could be in danger in the wake of Musk's absence. I think the closest similar example I can think of right now is how investors were thinking about Steve Jobs and Apple, like the two were synonymous. Now, Jobs actually did take the proper steps to plan for his successor, but it didn't stop folks from worrying that Apple could fall apart after Steve Jobs's death.
Speaker 1: Obviously that didn't happen. So Tesla's shareholders may force the company to do a thorough report that not only examines how important Musk is to the company, but also identifies key personnel who could potentially take on the role of CEO in the future. The shareholders will also vote on whether or not to approve certain board member nominees, some of whom are contentious. And like I said, we'll circle back to this on Thursday to talk about what unfolded during the actual earnings call, if there are any significant updates.

Speaker 1: Now it's time to shift to AI. I know I cover AI a lot, but it keeps creeping into the tech news, and so we're going to talk a bit about AI today. OpenAI's CEO Sam Altman is set to appear before a congressional panel here in the United States. Altman has previously appeared to be fairly straightforward in his assessment of AI. He's even suggested that folks were overplaying the capabilities of his own company's chatbot, ChatGPT. Prior to his meeting today, Altman submitted written testimony to the panel and suggested a framework for a licensing procedure for AI companies. Essentially, Altman's proposal is to create a system where companies that want to develop certain types of AI tools will have to procure a license and follow established safety standards. Of course, Congress would have to establish those standards first and determine what sort of parameters AI should fall into and what would constitute an unsafe version of AI. Altman also reportedly used the phrase "regulation of AI is essential," and this is according to Reuters. That might come as a surprise considering he's the CEO of arguably the most famous AI company right now. Typically, business owners aren't really gung ho in calling for regulation of their own industry, and when they are, sometimes it turns out that they were being, you know, perhaps less than forthright about it. See also Sam Bankman-Fried of FTX fame.
Speaker 1: Some critics worry that regulation could discourage startups and potentially cause smaller AI companies to sort of fade away and just leave AI to the larger established companies. That is possible, but then you also have to admit these larger companies are progressing at such a rapid pace, investing billions of dollars in research and development, that there does seem to be a need for some sort of checks and balances to be put in place in order to head off problems before they get too severe. Okay, we've got more stories, including more AI stories, to go over, but first let's take a quick break.

Speaker 1: Okay, we've got some more AI stories to cover. The World Health Organization, or WHO. WHO? By the way, a side note: if you've ever seen the film Clue, it's a comedy mystery film that I absolutely adore. It's very, very silly and I love it. One of the characters there mentions that he works for the United Nations Organization, which would be UNO, and then, asked about him being a politician, he says no, he works for the World Health Organization. The "who" joke never actually gets used in the film, but it means he works for, you know, WHO. Okay, I'm sorry, I got distracted. Anyway, the World Health Organization, which is a real thing, issued a statement cautioning the medical field about the use of AI and cited concerns that AI can potentially misinform patients and/or healthcare providers. Also, as we have seen in other areas, AI can contain bias, and bias can ultimately cause harm to people, particularly people in specific populations. Like with facial recognition technologies, we've seen AI cause disproportionate harm to people of color. Overall, the WHO said that AI stands to provide great benefit in the field of healthcare; there are obvious applications where it could be of huge help.
Speaker 1: However, we have to address issues like bias and misinformation, as well as the potential for bad actors to use AI to create outright disinformation in an attempt to do harm. I think the takeaway here is that the WHO is suggesting it might be a little too early for the healthcare sector to just fully embrace AI, and I think that is fair to say. Now, we also have to admit AI already plays a huge part in the world of healthcare, because AI is more than just chatbots and large language models. AI includes lots of stuff, and a lot of that is already being used regularly in healthcare. That doesn't appear to be what the WHO is specifically referencing here. My interpretation is that they're talking more about the generative, chatbot-style AIs that have been taking over the news.

Speaker 1: The New York Times reported that some AI researchers at Microsoft think that the AI system they were working on could be showing the faintest signs of approaching artificial general intelligence, or AGI. They called it sparks of AGI. That would mean that we're talking about machines that appear, at least, to be able to reason in a way that is similar to how we humans reason. The science fiction definition of AI has long been one of general intelligence, often passing into the category of superhuman intelligence. The article, which is titled "Microsoft Says New A.I. Shows Signs of Human Reasoning," describes an experiment in which researchers asked this sort of AI chatbot to solve a bit of a puzzle. They said, how can you create a stack out of this weird collection of objects that would result in a stable structure? The objects included nine eggs, a laptop, a book, a bottle, and a nail. The AI chatbot suggested using the book as the base, setting the eggs on top of the book in a three-by-three grid, gently laying the laptop on that layer of eggs, and then putting the remaining items on top of the laptop's surface.
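If you want to try a similar test yourself, here is a minimal sketch of how one might pose that kind of stacking puzzle to a chat-style language model. It assumes the OpenAI Python client with an API key exported as OPENAI_API_KEY; the model name and the exact prompt wording are illustrative assumptions, not the Microsoft researchers' actual setup.

```python
# Minimal sketch: posing the stacking puzzle to a chat-style language model.
# Assumes the OpenAI Python client ("pip install openai") and an API key
# exported as OPENAI_API_KEY. The model name and prompt are illustrative only.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

prompt = (
    "Here we have a book, nine eggs, a laptop, a bottle, and a nail. "
    "Please tell me how to stack them onto each other in a stable manner."
)

response = client.chat.completions.create(
    model="gpt-4",  # placeholder; use whichever chat model you have access to
    messages=[{"role": "user", "content": prompt}],
    temperature=0,  # keep the answer as repeatable as possible
)

print(response.choices[0].message.content)
```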
Speaker 1: The researchers concluded that the system was at least appearing to use some real-world knowledge, such as that eggs are delicate and therefore would need to be in an arrangement like a grid in order to have enough support to avoid cracking, and that the laptop's upper surface would be flat, so it would be able to support the bottle and then the nail. Earlier versions of the AI system gave more nonsensical answers, so the argument was that this newest version of the AI model was sophisticated enough that it could actually reason out an answer that would potentially work. So could that mean we're now approaching general intelligence? Maybe, maybe not. While the researchers appear convinced that these are some early signs of limited but still general intelligence, other experts argue that the results just give the appearance of intelligence, and that this is another case where, because of the perspective someone takes, you see a particular outcome. I talked about this recently in another episode. If you pull an Obi-Wan Kenobi and look at things from, quote unquote, a certain point of view, then maybe you'll see signs of general intelligence. But if you look at it from a different point of view, maybe those signs of general intelligence just vanish, and it turns out that the thing you thought was real was just an illusion. So is that what's going on? I honestly don't know. I will say that there are some critics who have a pretty strong point to make, which is that Microsoft is a company that has made massive investments in AI, to the tune of more than ten billion dollars, so it has a vested interest in AI becoming a huge success. They've poured a lot of money into this, so they have a desire for this to come out the other side as a huge revenue generator.
Speaker 1: So it's possible that such a company could, as Professor Maarten Sap of Carnegie Mellon University has said, be, quote, co-opting the research paper format into PR pitches, end quote. In other words, trying to manufacture support to make a tool seem more sophisticated than it potentially is. I don't know the truth of the matter. If I'm being honest, I'm still skeptical about machines being capable of making the leap to general intelligence. However, that's based largely upon the fact that we don't fully understand general intelligence within humans, let alone in machines. But maybe that isn't necessary. Maybe we will achieve general intelligence with machines without having a full appreciation of how it works in humans. That's possible, I guess. I keep coming back to "I don't know," but I do remain somewhat skeptical.

Speaker 1: Apple has announced a host of new accessibility features coming to various Apple devices, both on iOS and macOS, and that's great. Accessibility is getting more attention and support these days, and it's been long overdue. I follow a lot of people who work on improving accessibility in technology, particularly in things like video games, where there are now settings in a lot of games that are meant to allow people who might have limitations in some way or another to still be able to enjoy them. I think that's great. I think being able to increase the spectrum of folks who get to experience stuff is fantastic. I think everyone's a winner when that happens, and seeing companies put attention toward accessibility is one of the big steps toward addressing gaps that can otherwise exist between different populations when it comes to their ability to use tech. So I love accessibility technology in general, but the specific feature I wanted to talk about that Apple is introducing is called Personal Voice. With Personal Voice, users can train their device to sound just like they sound.
Speaker 1: Then the Apple device can speak in a synthesized version of the user's own voice. For people who have conditions that affect their ability to communicate, which potentially might mean that they're facing a future where they will no longer be able to speak at some point, this kind of feature is huge. Rather than having to rely upon an impersonal, generic synthetic voice to speak when they cannot, the voice that will come out of their devices will be their own. I think that's incredible. I think it's a really great use of synthetic voice technology. We have talked about how synthesized voices can cause disruption in bad ways, right, how they can impersonate people in the arts, where those people suddenly feel like their identity has been stripped from them and put to use in something that they had no involvement with. That's bad. Or people like me: I would be out of a job if iHeart decided, you know what, we're just going to train a synthesizer on Jonathan's voice, because he's got thousands of hours of content out there. We're going to train it on his voice, we're going to get a ChatGPT-style bot to write episodes of TechStuff as if it were Jonathan, have the voice that sounds like Jonathan deliver it, and then we don't have to pay Jonathan anymore. He could just be cut free. It's a scary thought, and I get it, it's potentially a possible thing. But you know, I would still argue that humans have their own actual, legitimate contributions to various activities, including things like creating episodes. So yeah, it's nice to see a version of voice synthesis that isn't potentially scary or bad, but rather an application that can protect a person's agency and personality and independence. I think that is fantastic. So good on you, Apple, for developing those particular accessibility features.
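Apple hasn't published a cross-platform interface for Personal Voice, so as a loose point of reference only, here is a generic text-to-speech sketch using the pyttsx3 Python library. It only shows the basic "pick an installed voice, speak some text" flow that features like this build on; the voices here come from the operating system, not from a recording of the user.

```python
# Generic text-to-speech sketch with pyttsx3 ("pip install pyttsx3").
# This is NOT Apple's Personal Voice; it just demonstrates the basic
# pick-a-voice-and-speak flow using whatever voices the OS provides.
import pyttsx3

engine = pyttsx3.init()                # use the platform's default speech backend
engine.setProperty("rate", 160)        # speaking rate, roughly words per minute

voices = engine.getProperty("voices")  # voices installed on this machine
if voices:
    engine.setProperty("voice", voices[0].id)  # pick the first available voice

engine.say("Hello! This is a synthesized voice reading typed text aloud.")
engine.runAndWait()                    # block until the speech has finished
```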
Speaker 1: There are other accessibility features that Apple also announced which are equally great. I was calling this one out specifically because it kind of has that AI connection, with the voice synthesis model that is worked into this particular tool. Okay, we're going to take another quick break. When we come back, we will wrap up with a few more big stories.

Speaker 1: So we're back, and I saw on CNN that a former ByteDance employee has filed a wrongful termination lawsuit against ByteDance. That employee, Yintao Yu, says that he formerly served as head of engineering for US operations. Now, you might recall that ByteDance is the parent company of the popular video social platform TikTok, and you probably also know that in the United States, as well as many other parts of the world, there's a growing concern that TikTok might serve as a kind of data siphon and shoot that data to the Chinese Communist Party. Because, if you aren't aware, ByteDance being a company that's centered in China means that, by law, the company is supposed to aid the Chinese Communist Party when it comes to things like gathering information about, you know, enemies of China and that sort of thing. They are obligated to share that kind of information, and a lot of companies in China have to reserve a spot for an official from the Chinese Communist Party to essentially sit on the board of the company. So Yu says the fears of all these different nations, that perhaps ByteDance is gathering information using its platforms and then sending it on to the Chinese Communist Party, are totally justified. He says that Chinese officials have full access to data that was gathered through ByteDance applications, including presumably TikTok, and that they had full backdoor access even to data saved on US servers. Now, ByteDance disputes this. They say that this is just not true.
Speaker 1: They point out that Yu worked for the company for less than a year before his employment was terminated, and that it ended in twenty eighteen. Yu, by the way, says that ByteDance's description of his time of employment is not true. I'm sure that Yu's accusations are confirming a lot of fears that are held in various political circles, but security experts say that there really hasn't been any evidence that Chinese officials were actually accessing TikTok data in the United States. So this very well could be a situation where a former employee is leveraging growing suspicion in order to support their own claims, or it's possible that his accusations are all true. I honestly don't know the answer. I will, however, point out again that even if TikTok were actively being used by China to spy on Americans, the fact is you can buy and sell data from pretty much every online platform, which means you don't have to rely on a single app to be, like, your way to gaze into an enemy's territory. That's not necessary. You don't need to do that, because you can just buy the data online from all sorts of different data brokers. Unless there are really strict controls about that sort of thing, the information is out there. So that's something we should really think about: even if you were to shut down TikTok, that's one potential stream of information that could go to an adversary, but there are still all the other ones. It's like putting your finger in a hole in a dam when, fifteen feet down from you, there's a massive breach that's allowing millions of gallons of water to pass through. You're not really doing anything at that point. But we've gone over this before, so we'll move on.

Speaker 1: Now, here's a quick update on the Microsoft slash Activision Blizzard deal.
Speaker 1: If you recall, Microsoft has been trying to purchase Activision Blizzard and has met with some resistance around the world, because we're talking about a global acquisition here. So, as expected, and as we talked about I think last week, the EU has now approved this merger. Now, you might remember this was never a guarantee. Earlier reports said that perhaps even Sony was campaigning hard to have this merger blocked, out of concern that it would constitute an unfair advantage for Microsoft in the video game market, particularly for really popular titles like Call of Duty. But Microsoft then made several promises to regulators to keep things fair and, you know, not just absolutely lay waste to the home video game industry, and also to make sure they took steps so that they're not becoming the de facto cloud gaming service. Microsoft agreed to a ten-year license deal that said the company would keep Activision Blizzard titles available through all cloud streaming services, all cloud game streaming services, I should say, as long as those services sign a license agreement with Microsoft. This addresses one of the main concerns that has held this deal up in the UK, where regulators have voted to actually block the deal. Microsoft is now appealing that decision. And then, meanwhile, here in the United States, we still have to wait for regulators to actually weigh in. They haven't done that yet, so this acquisition still is not necessarily going to happen. But arguably the EU's clearance for this deal gives the move some support and some momentum, so its chances have improved slightly. Maybe we'll see opinions reverse further downstream. We'll have to wait and see.

Speaker 1: You know, the EU was actually pretty busy this week, because on top of approving Microsoft's acquisition bid for Activision Blizzard, the EU also approved cryptocurrency regulation rules this week, being the first region in the world to adopt formal cryptocurrency regulations.
Speaker 1: These will not take effect until next year, but the regulations are meant to protect EU citizens from losing their shirts in the crypto market, and also to create a framework to hold scam artists and bad actors accountable when it turns out that their amazing crypto investment opportunity is little more than a Ponzi scheme. Ultimately, the goal here is to weed out the bad crypto entities from the good ones. So it's not saying all crypto is bad, but rather that there needs to be this framework of regulations in order to make sure that it's not just running rampant and causing harm. And this also protects the EU in the process. It also means that these regulations create the framework for pursuing cryptocurrency companies that are engaged in stuff like money laundering or financing terrorists. These regulations may end up serving as a foundation for other parts of the world to adopt similar approaches and thus rein in the Wild West nature of the crypto community. While that might chafe a bit for the folks who saw cryptocurrency as a way of working outside the system, it can also potentially mitigate disasters such as the aforementioned collapse of FTX and how that particular event had a domino effect across the crypto market. I think it's hard to argue against creating rules that minimize, or maybe minimize is the wrong word, at least reduce, the chances of those big disasters, because investors don't want to see their money go away, right? You don't want to have to depend upon some government agency to retrieve some, or maybe, if you're lucky, all of your investment, because that's never going to be something that you can depend upon. So yeah, I think regulations are the right idea. They are antithetical to kind of the spirit of cryptocurrency, but as we've seen, without those regulations there's a lot of opportunity for people to take advantage of folks at a grand scale, and that is incredibly harmful.
Speaker 1: And finally, while I usually like to end a tech news episode with a silly or lighthearted story, I do not have such a story for this particular episode. Instead, we get to say that India is the first country with a democratically elected government to ban messaging services that allow end-to-end encryption. Now, we've seen these sorts of bans in authoritarian countries, where you've got essentially a dictator or a military organization in charge of the country, but we've never seen it in a democracy. And the justification for this move, for banning encrypted messaging services, is pretty much what you would suspect. The government says that you've got to get rid of them because terrorists are using these apps to communicate with each other, and they use encryption to hide their plans from authorities, and their plans constitute a threat to the state of India and its citizens. So you've got to get rid of encryption, which means that nobody would get to have access to encrypted messaging systems within India. And you know, you'll hear arguments like, why are you worried unless you have something to hide? But consider how officials in India have gone after people who have criticized the government. They have gone so far as to petition platforms like Twitter to remove posts that put the government in an unfavorable light. So you start to see a government use its power to suppress speech, and you start to make a solid argument that removing access to encrypted messaging services is another step toward authoritarianism; it's just authoritarianism that is dressed up like democracy. So, bad story there. I hate to end it like that, but that was the last one that I came across before I started working on this episode. In the meantime, I hope all of you out there are well, and I will talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.