Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It is time for the tech news for Thursday, December 22. And you know, I do news episodes on Tuesdays and Thursdays, and every time a Thursday rolls around, I worry that I'm not gonna have enough material to really do a decent episode. But that is definitely not a problem today. We have a ton to talk about, including some updates on stories that we've chatted about in previous news episodes, including Tuesday's.

So let's begin with FTX. Now, I'm not gonna do a full catch-up on FTX, because I actually did that on Tuesday's episode. So if you don't know what FTX is and you want to know, I recommend just listening to the first five minutes or so of Tuesday's show, and that'll get you up to speed. So first up is the matter of Sam Bankman-Fried, or SBF. He's a co-founder of the cryptocurrency exchange FTX. He was arrested on Monday evening, which I talked about on Tuesday's episode. But now we've learned that if he's convicted of the crimes he stands accused of, he could face a maximum of 115 years in prison. He could spend the rest of his life in prison. He has eight federal counts against him. Those charges range from wire fraud to securities fraud to money laundering. Now, all that being said, I think it is highly unlikely he'll face anything close to 115 years. He's still likely to end up with a prison sentence, perhaps a considerable one, like ten years or so. But one challenge for the prosecution is proving beyond a reasonable doubt that SBF was committing fraud. So here in the United States, we have a saying that goes: ignorance of the law is no excuse.
Now, generally, what that means is that if you unknowingly break a law, it still means you broke a law, and you can still be held accountable for it. Being ignorant of the law does not excuse you from accountability. However, there are a few crimes that require things like intent and understanding to be considered a crime. So fraud is an example. At least in some jurisdictions, fraud is a crime that carries with it the concept of intent: that you intended to defraud someone. If you did not intend to do it and it happened by accident, then it's not really fraud, at least in those jurisdictions. It could still be some other crime, but it would not be fraud. There are other examples of crimes where intent makes a difference; that's what separates something like voluntary manslaughter from involuntary manslaughter: intent. Anyway, I think the odds are at least some charges will stick to SBF and he's going to be looking at a prison sentence of at least a few years. I just don't expect the 115-year max to fall on his shoulders.

Meanwhile, FTX is in the process of going through bankruptcy, and really what they're going through is liquidating assets and trying to recapture as much value as they can to return to creditors and investors. So Tuesday I mentioned how the government of the Bahamas wanted to get involved in how certain FTX assets are being handled, because FTX established headquarters in the Bahamas a while back, and SBF and one of his buddies bought thirty-five properties around New Providence, Bahamas. In total, those properties are valued at around $250 million. You've got a couple of branches of FTX that are involved in this as well. There's FTX, the American company. And there's FTX Digital Markets; that's the Bahamian branch of FTX.
And what the American branch of FTX is essentially saying is: we don't want to share any information with FTX Digital Markets, because we don't trust the government of the Bahamas, and we worry that the government will steal stuff that we're trying to liquidate, that they will lean on FTX Digital Markets and get that information, and then our investors will not get the value that they deserve. Meanwhile, FTX Digital is saying: this is nonsense. We were appointed by a court order, and our job is to liquidate assets to recover money for investors and creditors too, but we can't do that unless we actually have access to data. So our job is the same as your job, is essentially what they're saying. Then you have the government of the Bahamas, and it's like: hey, it's against the law for matters involving Bahamian land to be arbitrated outside of the Bahamas. So it's a big old mess, in other words. Ultimately, the only job FTX really has right now is to get as much value from whatever assets it has and to then hand that over to the creditors and the investors. So it's no surprise that we're seeing this kind of tug of war between the branches as well as the Bahamian government. So yeah, everybody feels like they're owed a piece of this, and so there are probably gonna be some pretty nasty arguments about how it needs to be divvied up, because there are a lot of people who are holding the bag right now, and they don't want to be holding the bag. If anything, they want that bag to be very, very small, if they have to hold a bag at all. So yeah, this isn't over with.

Then, over at Binance, which is the largest cryptocurrency exchange, it's also an entity that had, let's say, a contentious relationship with FTX. The CEO of Binance is trying to calm matters internally. His name's Changpeng Zhao; he's also known as CZ, because we just love initialisms in crypto.
And he sent a memo to employees and acknowledged that folks have recently withdrawn a lot of money from Binance, more than a billion dollars in one day. Some estimates say that in the week leading up to yesterday, customers removed more than three and a half billion dollars from the exchange. But he says, you know, this happens. There are days where we might have a billion in deposits and other days where you might have a billion in withdrawals. It's not a cause for alarm. He did also warn that crypto in general is in for a rough time, but, quote, "this organization was built to last. As long as we continue to offer users the best product, user experience, and frictionless trading environment, Binance will survive any crypto winter," end quote. Now, as I mentioned on Tuesday, Binance is also the subject of a US Department of Justice investigation. There are concerns that some of the shenanigans that were going on at FTX are also happening over at Binance, including like ten billion dollars of illegal payments processing. While Binance did commission a report from a third party to be more transparent about the company's reserves, that report has not satisfied everyone out there. So this, too, is a developing story.

Another developing story revolves around the US government's stance on TikTok. So on Tuesday, I talked about how there are some proposals developing in Congress that would ban TikTok in the United States unless the company becomes a truly US-based business, essentially cutting all ties to China, because right now TikTok is a subsidiary of a Chinese company called ByteDance. Well, now the US Senate has voted to ban federal employees from installing TikTok on government-owned devices. Now, this is obviously not like a nationwide ban on TikTok for everybody.
It's rather another display of how government officials are worried that TikTok, the video platform, could be used by the Chinese Communist Party as a way to gather information about people in the US, something that TikTok reps have denied repeatedly in the past. A few state governments have already passed similar laws for state government employees. This federal vote will also have to pass the House of Representatives before it can then be sent to President Biden's office for his signature, which would be interesting to see, because Biden and his administration have been in active negotiations with TikTok in an effort to find a way where TikTok can operate within the US without fear of it serving as spyware for China. Now, personally, I actually think it's okay to tell federal government employees: hey, if the phone belongs to the federal government, don't put TikTok on the phone. You can save TikTok for your own personal device. Now, maybe that's because I also think it would be weird for a lot of companies to say, oh, sure, go ahead and put TikTok on your corporate device. I think it's totally fair for a corporation to say: hey, you know, you can do it on your own personal one, but for your work phone, don't put it on there. I think the same thing should be true for government employees. But then, as I have said many times, I am old and grouchy, and I'm only getting more so with every passing day.

Now, speaking of TikTok, the Center for Countering Digital Hate released a report that claims TikTok's algorithm is regularly serving up harmful content to young users. So researchers with the organization created fake accounts on TikTok. The accounts claimed to represent a thirteen-year-old user; that's the minimum age that you can be on TikTok. They then chose mental health and body image as areas of interest.
With one of these accounts they used a female username, and for the other one they used a username that had "lose weight" as part of the name, just to see if that would have any effect on the content they received. And according to the researchers, it took less than three minutes before they started to encounter content related to suicide, and within eight minutes they started seeing content about eating disorders. And when I say content about, I mean content promoting these things, not content raising awareness about them, or trying to counsel people or prevent problems, but rather content that could incite them. So the researchers said it was clear that TikTok's algorithm was amplifying these harmful messages and that the effect on young users could be really dangerous. It could contribute to mental health issues. Now, behind the scenes, what appears to be going on is that TikTok's algorithm selects from content that the algorithm estimates will be of interest to the user. When something is a hit, the algorithm looks for similar content. And by a hit, I mean: did a person watch the video all the way through? So as you spend more time paying attention to certain types of content, what the algorithm is doing is just trying to serve up similar content to you, in the thought that this will keep you on TikTok longer. So at best you could say that the algorithm is ultimately content agnostic. It's not necessarily trying to harm anyone, but it's not trying to save anyone either. Instead, it's just trying to keep people on the app for as long as possible. And if terrible, traumatic content is what's keeping someone on there, well, it's just gonna keep on sending that content on. So if people are watching videos that contain harmful messages, they'll keep getting more of those. And again, the issue here is one of amplification.
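To make that mechanism concrete, here's a minimal sketch, in Python, of an engagement-driven feed of the kind described above. Everything in it is a hypothetical illustration, not TikTok's actual system: the Video type, the 0.95 watch-through threshold, and the per-topic score are all stand-ins.

```python
# A toy engagement-driven recommender: a fully watched video counts as a
# "hit", and the feed keeps serving more of whatever topic gets hits.
# Names and thresholds here are hypothetical, purely for illustration.
from dataclasses import dataclass, field


@dataclass
class Video:
    vid: str
    topic: str


@dataclass
class FeedState:
    # Simple per-topic engagement scores for one user.
    topic_scores: dict = field(default_factory=dict)


def record_watch(state: FeedState, video: Video, watched_fraction: float) -> None:
    """Treat a video watched essentially all the way through as a 'hit'."""
    if watched_fraction >= 0.95:
        state.topic_scores[video.topic] = state.topic_scores.get(video.topic, 0) + 1


def next_video(state: FeedState, candidates: list[Video]) -> Video:
    """Serve whichever candidate matches the user's most-engaged topic."""
    return max(candidates, key=lambda v: state.topic_scores.get(v.topic, 0))
```

Notice that nothing in the sketch ever inspects what a video actually says; it only optimizes for watch-through. That content-blindness is exactly how harmful material can end up amplified for a user who lingers on it.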
It's not so much that you, you know, shouldn't be allowed to say bad things, although you should always know that consequences can come from saying bad things, but that it's the artificial amplification of those messages and the potential harm that can cause to users. So it's similar to problems that we've seen on other platforms like Facebook, where recommendation algorithms have played a big part in, say, the proliferation of misinformation. I'm sure these kinds of studies will fuel more efforts on the regulatory and government side as well. A lot of that focus has been on the potential use of TikTok as spyware, essentially, but some of it is also on a concern about how it could be affecting the mental health of users, particularly young users. There's a lot of armchair psychology, too, about whether or not TikTok is absolutely ruining attention spans and making it impossible for younger people to pay attention to anything that lasts longer than a few seconds. Again, I don't have any actual hard data on that. I don't know if that's really the case. If it is, I'm doomed, because all these episodes are long. All right, with that, we're gonna take a quick break. When we come back, we've got some more news.

Okay, we're back. So Tesla's stock price hit its lowest point in two years yesterday. Reuters cites investors who are concerned that CEO Elon Musk is spending too much of his time and energy over at Twitter, which, you know, at least from the surface, appears to be a valid concern. According to filings with the US Securities and Exchange Commission, or the SEC, Elon Musk himself offloaded millions of his own personal shares of Tesla. Now, keep in mind, Musk is still the largest shareholder of Tesla. He still has the most shares of anybody. So while he sold off millions, he's got millions more. But the value of the stocks he sold off, according to those filings, was three point five billion dollars. Yowza.
Now, generally speaking, when a CEO sells off large chunks of their own shares of the company they're heading, investors can get a little worried, because sometimes they'll take it as a sign that the CEO foresees troubled waters ahead and has lost confidence in their own business. However, in this case, Musk's ongoing Twitter issues may be playing a part in his decision to sell off stocks, because Twitter has billions of dollars of debt and massive interest payments coming due on that debt. The banks that helped Musk secure financing for his purchase of Twitter have reportedly been looking at margin loans backed by Musk's Tesla stock, and the declining price of that stock is likely not a welcome sight to those banks. But with all that being said, I just checked the Tesla stock as I'm recording this episode, and it's currently trading at $157, so it's up slightly from its low of around $156. So it could be that Musk's cashing out of those stocks won't have as big a ripple as some might suspect. We'll have to wait and see.

And across town, over at Twitter, the New York Times reports that Twitter has kind of stopped paying rent on its offices, you know, the offices that Elon Musk has demanded everyone who still works at Twitter return to. In fact, at least in the San Francisco office, employees had set up beds so they could work increasingly long hours without actually leaving, because Elon Musk is determined to be visited by three spirits in, you know, nine days or so. Now, apparently employees have been instructed to not pay vendors, which, you know, is another great sign of a company doing A-okay. It's also a totally cool business practice to not pay the people who provided goods and services to you. It's not at all scummy. This also includes a nearly two-hundred-thousand-dollar bill for private charter flights.
At least according to a lawsuit, these would be flights that Twitter arranged when Musk was taking over the company and flying back and forth to visit various offices. There have allegedly been discussions at the executive level about what consequences the company might face if it were to, you know, not pay severance to all those employees who received a severance package upon being laid off from the company, the question being: would it be cheaper to pay severance, or to just fight the lawsuits that would come in? Would we rather just fight lawsuits? Which, gross, right? And if anyone in Twitter were to go blabbing to the press about stuff happening within the company, Musk says he would bring the wrath of the gods down on them, though not in so many words. He would just say: you signed a non-disclosure agreement, and we will take you to court and sue you to the fullest extent of the law. In fact, he might need six ghosts.

Then there's the matter of Ahmad Abouammo. He's a former Twitter employee who is now facing three and a half years in prison. Why? Well, he was convicted earlier this year of spying on behalf of the Saudi Arabian government while he was working for Twitter; he worked at Twitter until 2015. During that time, he apparently used his access as a media partnerships manager to gather data about people who had been critical of the Saudi Arabian government. Then he sent that data to said government. And just a reminder: this is the same government that allegedly ordered the murder of a journalist named Jamal Khashoggi. I say allegedly because until it's completely proven, I guess I need to. But everyone essentially agrees that Jamal Khashoggi was assassinated on behalf of the Saudi Arabian government. This is also the government that has sentenced people to more than ten years in jail for criticizing the government on social media in the past.
And it's also the same government that has a significant ownership stake in the current version of Twitter. In fact, they're, I believe, the second-largest stakeholder, behind Elon Musk. Anyway, prosecutors showed how Abouammo received large payments from the Saudi government, as well as the gift of a watch valued at more than forty grand, which, oh, come on. Anyway, he has already been convicted, as I said, and now he has been sentenced to three and a half years in prison on counts of acting as a foreign agent, money laundering, falsification of records, and other charges as well. By the way, the fact that he's just looking at three and a half years is kind of why I said at the beginning of this episode that I don't think SBF is going to have that full 115-year sentence thrown at him if he should be convicted.

In Kenya, Amnesty International is backing a lawsuit against Meta that claims the company allowed hate speech and calls inciting violence to spread without moderation on Facebook, and that it exacerbated the war in Tigray, which is a northern province in Ethiopia. Now, the conflict in Ethiopia is a really complicated one. It involves a very long history of different ethnic groups and factions within the country vying for power. Honestly, the country has been in conflict way more frequently than it has enjoyed peace in modern history. The recent conflicts have largely revolved around Tigray, which in recent Ethiopian history had been a dominant political power before the balance shifted to other groups within Ethiopia.
In early November, the various parties involved in the war agreed to a ceasefire. But this lawsuit is arguing that Meta was complicit by allowing messages that encouraged violence and abuse throughout Ethiopia, that it made it harder for these opposing parties to arrive at any sort of ceasefire, and that as a result, many people suffered and died: some indirectly, from the fact that this kind of language was spreading like wildfire on Facebook in Ethiopia, and some directly, in that there have been arguments that some people were targeted as a result of hate speech that was spreading across Facebook. The lawsuit's aim is to force Meta to create a compensation fund, valued at around 1.3 billion dollars, for the purposes of paying restitution to people who were victims of hate and violence on Facebook. Now, if you've listened to TechStuff long enough, or paid attention to world news, you know that while Facebook has been in the spotlight for allowing harmful misinformation to spread here in the United States, it's way more of a problem in non-English-speaking countries. The company has fewer resources dedicated to preventing that kind of abuse on the platform. They have, frankly, not made it a priority to really tackle those problems in non-English-speaking countries, particularly in the developing world. So far, Facebook's response has been fairly boilerplate: the company has rules and policies about the types of stuff that's allowed on the platform, and it works hard to remove any material that violates those rules. So we'll have to keep an eye on this and see where this lawsuit develops from here.

Wired reports that several Russian cities have experienced disruption in GPS signals over the last week. This sort of thing can sometimes be the result of attacks on infrastructure. I mean, it's possible to actually jam signals, or to otherwise inhibit them, or to spoof them so that you get incorrect information. But in this case, the Russian government may be the reason for the disruption itself.
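As a quick aside on what spoofing can look like from a receiver's point of view, here's a minimal sketch of one common-sense plausibility check: flag consecutive position fixes whose implied speed is physically impossible. The 200 km/h ceiling and the fix format are hypothetical choices for illustration, not part of any real navigation stack.

```python
# A toy spoofing heuristic: if two consecutive GPS fixes imply an
# impossible ground speed, something is wrong with the signal.
import math


def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def looks_spoofed(fix_a, fix_b, max_kmh=200.0):
    """Each fix is (lat, lon, unix_seconds). True if the jump between
    them implies a speed above max_kmh (a hypothetical ceiling here)."""
    (lat1, lon1, t1), (lat2, lon2, t2) = fix_a, fix_b
    dt_hours = max(t2 - t1, 1) / 3600.0
    return haversine_km(lat1, lon1, lat2, lon2) / dt_hours > max_kmh
```

For example, two fixes one second apart that are fifty kilometers apart would trip this check instantly; jamming, by contrast, tends to show up as no fix at all rather than an implausible one.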
Ukrainian forces have been relying on drones to strike important cities within Russia as Ukraine and Russia continue their war, and so it's possible that Russian authorities have ordered the disruption of GPS in an attempt to foil drone navigation systems and thus protect potential targets. According to analysts, the zones affected by GPS disruption measure hundreds or sometimes even thousands of kilometers in diameter around important cities. Now, I imagine that it must be pretty challenging to navigate in those cities for a lot of folks right now. Just as a side note, this also demonstrates that there's real value in learning how to read a paper roadmap, and having a physical roadmap in your vehicle is a good idea. Overdependence on technology can become a problem if that technology should fail for whatever reason. So maybe one of your stocking stuffers this holiday season, or your Hanukkah gifts, or, you know, just something you want to buy for yourself, should be a roadmap. And just familiarize yourself with it, so that in case you should ever need it, you know how to read it. Okay, that's enough of that. We're gonna come back after this break and finish up with a few more news items. But first, let's listen to these messages.

We're back. Okay, I've got a positive story here. It's a short one. A couple of motorists are really thankful for the iPhone 14's satellite connectivity, which is available on the iPhone 14 and the iPhone 14 Pro. These two motorists were in a car accident. They were driving down a road in California through some canyons, and cell service is pretty hard to come by in this particular region of Los Angeles County (not the city, the county), and their vehicle slid off the road and went down the side of a mountain. Fortunately, the two were able to get out of the car. They were able to connect their iPhone 14 to a satellite, and they were able to contact emergency services.
Emergency services were then able to use the location data to direct a rescue helicopter to their location, where they were rescued and then taken to a hospital for observation. And I wanted to include the story because it has a happy ending, and it shows how tech really can make a huge and positive difference. Like, I know in my news items I frequently am focusing on some pretty dark stuff that involves tech. But, you know, tech is a tool. It's just like Hamlet said: there's nothing either good or bad but thinking makes it so. A tool is neither good nor bad; it's all in how you use it, and you can use tools in really good, positive ways. This is one of them. And honestly, I'm very thankful that Apple included satellite connectivity in the most recent iPhones, because it means that people like this have another lifeline in situations where otherwise they might be completely helpless.

In other Apple news, the company is reportedly getting ready to allow third-party app stores on iOS for the first time ever. So this would include stuff like Amazon's app store, for example, which typically you would not be able to install on an iOS device, or the Epic Games Store; again, you wouldn't be able to do that. You would have to do everything through Apple itself. Now, the reason for this change is not because Apple suddenly became really hospitable. It's because the EU passed the Digital Markets Act, and it cracks down on policies that would otherwise give a platform an unfair advantage over would-be competitors. So the act makes it illegal for a company to become kind of a gatekeeper to its own ecosystem, if that ecosystem is, like, a significantly large and important one, like iOS. Now, whether this is ultimately going to lead to a catastrophic downturn for user safety, which Apple has repeatedly warned about (they've said that's the reason they haven't allowed it in the past), that remains to be seen.
Now, I will say, while I'm skeptical that it's going to lead to the downfall of civilization, it is always a good idea to do research before you sideload anything onto a device. When you sidestep the official, you know, app store, make sure that whatever it is you're about to download and install is on the up and up, because otherwise you can run into problems.

A hacker claims to have used a bit of social engineering to get access to an important FBI database. The database is called InfraGard, and it is essentially a collection of like eighty thousand people in various organizations, including government agencies as well as corporations, and these are people who are considered to be important when it comes to protecting US infrastructure. The hacker claims they got access to this by posing as the CEO of a company, and said that they were actually surprised at how poor the vetting process was when they essentially applied for access and were then granted access to this database. So this is kind of like a directory of really important people who are related to US infrastructure. The hacker has been on cybercriminal forums asking for fifty thousand bucks in return for this database. Now, the info in this database appears to be somewhat limited. Most entries consist of, like, a name and an email address, and that's about it. But this could still serve as a valuable list for someone who wants to engage in spear phishing. That's a targeted version of phishing to get, you know, sensitive information; an attempt to compromise someone and trick them into handing over access to more important systems. Bad form, FBI. That's the oldest trick in the book.

YouTube announced that soon bots will be able to remove comments deemed to be abusive, and will also be able to issue temporary bans, at least for a short while, to users who are being jerks in the comments sections of the platform; a rough sketch of that kind of escalation logic follows.
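Here's a minimal sketch, in Python, of a strike-then-timeout escalation policy like the one described. The three-strike threshold and the abuse classifier are hypothetical placeholders (YouTube hasn't published those details); only the 24-hour timeout comes from the announcement itself.

```python
# A toy comment-moderation escalation policy: remove abusive comments,
# count warnings per user, and issue a 24-hour timeout after repeated
# offenses. Thresholds here are hypothetical, not YouTube's.
import time

TIMEOUT_SECONDS = 24 * 60 * 60   # the 24-hour ban the announcement describes
STRIKES_BEFORE_TIMEOUT = 3       # hypothetical threshold

strikes: dict[str, int] = {}
banned_until: dict[str, float] = {}


def handle_comment(user: str, is_abusive: bool) -> str:
    now = time.time()
    if banned_until.get(user, 0) > now:
        return "rejected: user is in a timeout"
    if not is_abusive:
        return "published"
    # Remove the comment and record a warning against the user.
    strikes[user] = strikes.get(user, 0) + 1
    if strikes[user] >= STRIKES_BEFORE_TIMEOUT:
        banned_until[user] = now + TIMEOUT_SECONDS
        strikes[user] = 0
        return "removed: warning limit hit, 24-hour timeout issued"
    return "removed: warning issued"
```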
So if you get multiple warnings from a bot that, you know, your comments are being removed because you're being, um, a jerkface, then the bot can ultimately give you a twenty-four-hour timeout, where you can't comment on anything in that twenty-four hours. It does not sound like channel owners are going to have the option to opt out of this particular policy; it's just kind of there by default. And as Ron Amadeo of Ars Technica points out, this approach is probably the only one that YouTube can feasibly take. You have hundreds of hours of content uploaded to YouTube every single minute of the day, plus you have countless live streams active at any given time. So there's just no way that the company could dedicate a large enough staff of human content moderators to oversee everything; it would be impossible. So really the question is going to end up being: is automated content moderation better than no content moderation? We'll have to find out.

Executives at Google apparently discussed the phenomenon of ChatGPT recently. That's an AI chatbot, and it's proven to be really compelling. I've talked about it previously on TechStuff. The chatbot is capable of putting together responses to various queries in a way that seems authoritative and trustworthy. Now, I say seems authoritative and trustworthy because folks have pointed out that, at least in some cases, the responses generated can include questionable or outright wrong information, but the style of presentation seems reliable and structured, and that can give the reader the incorrect feeling that they can count on the information being given to them, that it's a good answer, when in reality it may not be.
As such, the Google executives said that while Google has similar capabilities with their AI chatbots, they are not putting them out there, out of a concern that they pose a quote-unquote "reputational risk" due to issues with factuality and bias, and as such, a model similar to ChatGPT is not likely to replace our current method of search anytime soon. Now, over time, maybe we will see search gravitate toward a more semantic-web kind of presentation in the future. That's where you would log into the web, you would ask whatever it was you wanted, and the web would quote-unquote "understand" what you were asking, based upon who you were, your situation in life, you know, all those sorts of factors that your browser can't account for right now. But that's the concept of the semantic web, and a lot of people have been kind of comparing ChatGPT with that idea. But ChatGPT is far more limited than the concept of the semantic web would require. So it's tough right now. Like, it's really tough when you essentially have a black box that generates answers when you ask questions, but it doesn't tell you how it came to those answers. That means you probably shouldn't put all your eggs in that mysterious basket.

Finally, a British man named Ammaar Reshi used AI to create a children's book, and now he says people have been sending him abusive and threatening messages. So Reshi used ChatGPT to generate a story about a character named Alice and her friendly robot Sparkle, as Alice tries to learn about technology. He had to refine the story; he had to come at this in different ways and ask different questions of ChatGPT, but ultimately ChatGPT generated the basic story. Now, to illustrate the story, he used an app called Midjourney, which makes images based off simple textual prompts, he said.
Likewise, he had to refine his results many, many, many times, because Midjourney would often generate stuff that was, you know, not really fitting for a children's book. He said in some cases, if he had been writing a horror novel, it would have been the right way to go. But over time he was able to get those refined, and then once he was happy with the results, he collected them and published a children's book using Amazon's Kindle Direct Publishing tool. And his close friends all thought this was a super cute idea; like, he was doing it because, you know, he wanted to give a gift to friends who had little kids. And then he shared it with a slightly larger group, and then he went on Twitter and talked about it, and that's when he started getting pushback. You had authors and illustrators who would criticize Reshi for outsourcing human creativity to machines, and they argued that his actions cheapened the artistry that goes into these pursuits. They're also worried that these systems, which use machine learning, are using their own works as source material in order to learn, and that these machines are going to, over time, get better and better at copying specific styles. So you could say, like, "I want a poem in the style of Dr. Seuss about such and such," and ultimately you could get to a point where you would get a poem that seemed to have been written by Dr. Seuss but wasn't. That's something that people are really concerned about. Creatives are really concerned about that, and I don't blame them. I think that's a legitimate concern. They're also, you know, really worried that you'll have entities out there that would usually do, you know, work-for-hire kinds of jobs turn to AI and just be satisfied that it's quote-unquote "good enough," that the work produced will not be great work, not nearly as great as what a human would produce,
but that the people doing the hiring might not care, because they're able to get that work for free, as opposed to having to pay someone for it. Now, Reshi says he was surprised at receiving abuse and threats, but he does understand how artists and authors feel threatened and concerned, and he even went on to say that those concerns could be quite valid, that there are some questions that need to be answered, like: how are these AI art programs being trained? Are they being trained on copyrighted works, on things that are being developed by illustrators and artists today? And is it unfair for those artists to go uncompensated while, meanwhile, these AI programs are potentially copying their styles? These are tough questions, and honestly, I think it's great to start asking questions like this and to really dive down into the ethics of how we use AI. I think that these are conversations we have to really start jumping into. We thought we probably wouldn't need to worry about them for years. No, that time is now. We have to have these conversations about the ethics of AI now, in order to kind of figure out best practices and ways that we're not harming people or exploiting them or stealing from them. Like, all these things are very important. So I think it's a fascinating story. I think it was a fascinating experiment. And if Reshi had not, you know, published it more widely, if he had not publicized it more widely, I guess I should say, probably nothing would have come of it. But because of that, it just created this storm of controversy and, unfortunately, of threats. I think that anyone threatening somebody for doing something like this is definitively in the wrong. They should not do that. That is not the way to go, ever. So shame on those folks; like, they need to get some perspective here.
But it does mean that we need to start having these conversations, in order to come to a common understanding. Okay, that's it. That's it for the news for this week. I will be coming to you next week, hopefully with some episodes that will kind of wrap up some of the, or maybe not wrap up, but at least touch on some of the big stories that unfolded in tech this year. So next week will probably be a lot of episodes about that instead of the normal tech news episodes. I think it'll probably just be a continuation of the big stories of 2022, just based upon the list of headlines I have written down. I haven't actually started writing it yet, so I suspect it's going to be a very long series of episodes. But a lot of stuff happened this year. If you have suggestions for things I should cover on TechStuff, please reach out to me. There are a couple of ways to do that. One is to download the iHeartRadio app, which is free to download and free to use. You can just navigate to TechStuff through the little search field, and there's a little microphone icon. If you click on that, you can leave a message up to thirty seconds in length. Let me know if you would like me to use it in a future episode; I will not use it unless you tell me to. But that's one way to suggest topics. Another is to use Twitter. The handle that we use is TechStuffHSW, and I will talk to you again really soon.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.