Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Tuesday, May twenty-third, twenty twenty-three. First up, we've got a couple of stories about how social network platforms facilitated the rapid spread of misinformation, and how that in turn created more chaos. Also, this first one has a dash of AI in it, so that's a bonus.

Speaker 1: So, first up: yesterday an AI-generated image went viral. It showed a plume of dark smoke, apparently the result of an explosion, supposedly near the Pentagon here in the United States. But as I said, that image was AI-generated. There was no such explosion anywhere close to the Pentagon. In fact, if you zoomed in a bit on that image, you would see some hinkiness to it. There were details in the photo that looked a bit off, kind of like how, if you generate an AI image of a person, the AI just doesn't seem to get fingers right. Often the fingers in AI-generated images are the stuff of nightmares; I guess AI just thinks we have spaghetti at the end of our hands. Anyway, at the time of this recording, I haven't seen anything about who might have generated the image. Reportedly, it initially appeared on Facebook before it really took off on Twitter. We do know that several Russian-based news sites, or propaganda sites, depending upon your point of view, ran with the story and published it as a breaking news item, and it even caused a small dip in the stock market. But that stumble corrected itself once word got out that the whole thing was a hoax. So this is a case where the misinformation really was more of an inconvenience than a real threat, thanks to the rapid response and debunking of the image.
But it does show, again, that social networks facilitate the incredibly rapid spread of misinformation.

Speaker 1: Now let's go to story number two, and this one takes place in the UK. This one doesn't involve AI, but it does involve social networks and a very real tragedy. Yesterday, a couple of teenagers were riding an off-road bike or scooter in Cardiff, Wales. They were involved in a traffic accident, and both teenagers died from their injuries. That is undeniably terrible, a horrible loss. Police then arrived on the scene of the accident, but on social media a narrative began to form accusing the police of actually causing the accident. The narrative said the police had been in a pursuit, and that in turn caused the accident in which the two teenagers lost their lives. But that was just not true. The police weren't involved in a pursuit; there was no police presence until after the accident happened and police were called to the scene. However, this didn't stop the story from spreading online rapidly, and people in the community began to assemble. What started off as kind of a demonstration of anger toward police escalated into a full-blown riot, with the crowd throwing things at police officers. Some of those officers suffered injuries, although from what I understand, none of them were really serious. And again, it turned out that the story that the police had contributed to this accident was just a lie. But by the time that message was getting out there, things were already out of hand. The crowd continued to roam the streets until the early hours of this morning, and then they dispersed.

Speaker 1: Now, to be clear, I do not think it's fair to blame social networks for the actual misinformation. Rather, social networks facilitated the spread of misinformation. They didn't make it; they just made it way easier for it to spread around. Now, I do not know if any recommendation algorithms played a part in that.
It's possible, because an algorithm could promote stories that seem to be driving a lot of engagement among people in a specific region. Right? It might decide, oh, people around you are really interested in this particular story, and then you get served that story, and it perpetuates itself. That's a possibility, but I don't know for a fact that it happened. At the very least, the story definitely did spread across social media. Of course, misinformation was a thing long before social networks ever existed, and rumors got passed around well and truly without them. So it's not as though getting rid of social networks would make this problem go away. It's just that spreading misinformation is extremely efficient at this point, much more so than it was in the past.

Speaker 1: And now let's talk about end-to-end encryption, which continues to face challenges from political leaders around the world. I've talked about how many nations, including the United States, have looked for ways to work around end-to-end encryption or perhaps even ban it outright, usually out of a desire to scan messages for signs of illegal content. It could be an attempt to look for communication between would-be terrorists, or to search for evidence of people trafficking in illegal materials like child pornography. Spain's government has joined the list of governments that are very much taking aim at end-to-end encryption. This is not a unique view even within the European Union; lots of EU countries have proposed rules that would allow a government to scan and monitor communications, which means you would have to get rid of end-to-end encryption. The very nature of end-to-end encryption is that only the people at either end can access the encrypted information. Anyone who tries to intercept the information somewhere in the middle is just left with encrypted nonsense. They can't read it.
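To make that property concrete, here is a minimal sketch of end-to-end encryption using the PyNaCl library. The library choice, the names, and the message are illustrative assumptions rather than anything discussed in the episode; the point is simply that only the holders of the right private keys can recover the plaintext, while anyone intercepting the traffic sees only ciphertext.

```python
# Minimal end-to-end encryption sketch (PyNaCl / libsodium bindings).
# Assumes PyNaCl is installed: pip install pynacl. Names are illustrative.
from nacl.public import PrivateKey, Box

# Each party generates a keypair; only the public halves are ever shared.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts a message so that only Bob can read it.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"meet at the usual place")

# Anyone in the middle only ever sees unreadable bytes like these.
print(ciphertext.hex())

# Bob, holding his private key, decrypts the message at his end.
receiving_box = Box(bob_private, alice_private.public_key)
print(receiving_box.decrypt(ciphertext))  # b'meet at the usual place'
```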
Speaker 1: Now, this is a really complicated problem. On the one hand, you have the legitimate concern that more needs to be done, for example, to protect children from becoming victims. I think that's hard to deny; we need to be better at protecting children from predators. On the other hand, this kind of measure means an end to private communication, and there are situations where such communication is absolutely critical. Authoritarian governments could abuse, and have abused, this kind of process to crack down on perceived threats, and those threats might just be someone like a journalist, or an activist, or a political rival. Sure, it's all done in the name of protecting the state, but it really comes down to an authoritarian display of power and denying other people the right to privacy. I can't pretend to have the answers here, but I don't think getting rid of end-to-end encryption is a good idea.

Speaker 1: Now, one agency that would probably love to see end-to-end encryption go away is the FBI. I say that with some level of confidence because the US Foreign Intelligence Surveillance Court released an April twenty twenty-two opinion detailing more than two hundred seventy-five thousand instances of the FBI conducting warrantless searches of citizens' communications between twenty twenty and early twenty twenty-one. I should add that the court opinion is highly redacted, so there are a lot of blacked-out spots in that report. But essentially it says the FBI searched through communications, presumably American citizens' communications, more than a quarter of a million times in about a year without securing a warrant to do so. The FBI was relying on the Foreign Intelligence Surveillance Act, or FISA, which allows for warrantless digital searches and monitoring of communications, but only for communications between foreign individuals outside of America. It's not supposed to be used to spy on the communications of American citizens.
However, the law allows officials to play a sort of three-degrees-of-separation-style game. They can look at the target's communications: they've identified somebody they want to surveil for whatever reason, this person is a foreign individual outside the United States, and thus no warrant is needed. So they look at the target's communications, but then they can also look at whom the target has been in contact with. Then they can go one step further and look at the contacts that those contacts had. So let's say you've got an old fishing buddy, and your old fishing buddy happens to be friends with a shady person who turns out to have been on the FBI's radar. Well, by extension, you could be on the FBI's radar too, because you're connected to your friend and your friend is connected to this other person.
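As a rough illustration of how that kind of contact chaining fans out, here is a small sketch. The graph, the names, and the two-hop rule as written here are invented purely for the example; they are not drawn from the court opinion or from FISA itself.

```python
# Hypothetical sketch of contact chaining: the target, the target's contacts,
# and those contacts' contacts. All names and lists are invented.
contacts = {
    "foreign_target": ["fishing_buddy", "associate"],
    "fishing_buddy": ["you", "neighbor"],
    "associate": ["colleague"],
    "you": [],
    "neighbor": [],
    "colleague": [],
}

def within_two_hops(start):
    """Return everyone reachable from `start` in at most two hops."""
    first_hop = set(contacts.get(start, []))
    second_hop = set()
    for person in first_hop:
        second_hop.update(contacts.get(person, []))
    return first_hop | second_hop

# Even this tiny graph sweeps in people who never spoke to the target at all.
print(within_two_hops("foreign_target"))
# {'fishing_buddy', 'associate', 'you', 'neighbor', 'colleague'} (order varies)
```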
Speaker 1: Now, FISA isn't supposed to let agents investigate American citizens, but because of this degrees-of-separation approach, it can happen a lot, like two hundred seventy-eight thousand times in the span of about a year. And the court opinion shows that the FBI was using these techniques to run searches on people who most assuredly were not foreign agents communicating overseas, such as protesters during the Black Lives Matter protests in the wake of George Floyd's death. So here we have what appears to be a pretty clear series of offenses against American citizens perpetrated by the FBI. And this is just one reason why end-to-end encryption is important: if the quote-unquote good guys are breaking the rules, that's not good. All right, we're going to take a quick break. When we come back, we've got some more tech news to cover.

Speaker 1: Okay, now we've got a double-whammy section for Meta. First up, the EU has leveled a one-point-three-billion-dollar fine, a princely sum, against Meta, saying that the company failed to keep EU citizens' data safe and private. Essentially, the violation here involves transmitting EU citizens' data to US-based servers, something the EU is very much against, without further protections in place for that information. You might remember that in past episodes I've talked about how the fear about TikTok largely centers on the belief that the company could be sending personal data belonging to American citizens to China. That fear exists despite the fact that we don't actually have evidence of it having happened. It could have happened, I'm not saying it didn't, I'm just saying we haven't seen evidence of it yet. Here, however, we have a case of an American company essentially doing the same thing, channeling EU citizens' data from the European Union to US-based computer servers, and you could say, wow, how the turns have tabled. And now Meta has been ordered to pay more than a billion dollars in fines relating to this offense. Meta, of course, plans to appeal the ruling and the fine; I think that's obvious. But regulators say that Meta's offenses are systemic and continuous violations of the GDPR. I think Meta is hoping to wait this out so that the US and the EU come to an agreement on how, and under what circumstances, a US-based company can transmit EU-based data back to the United States. Part of the holdup is the concern that the US government could potentially spy on European Union citizens' data, using it as a surveillance tool, which you might say sounds far-fetched, but we just got done talking about the FBI doing that to American citizens. So there you go.

Speaker 1: The other big punch to Meta's stomach this week comes in the form of Giphy, or if you prefer, Jiffy, not the peanut butter, but the animated GIF database and search engine.
Now, Meta purchased Giphy a few years ago for four hundred million dollars, but then regulators in the UK determined that Meta's ownership of Giphy constituted an anticompetitive business practice and ordered Meta to divest itself of the company, which Meta has now done. Meta sold Giphy off to another company called Shutterstock; you might be familiar with them. But Meta did not recapture the four hundred million dollars it had spent on Giphy just a few years ago. Instead, Shutterstock purchased Giphy for the equivalent of fifty-three million dollars. That's still a healthy chunk of change, don't get me wrong, but it's a far cry from four hundred million.

Speaker 1: The US Surgeon General has issued an advisory stating that there's not enough evidence to say social media is safe for kids to use. I suppose you could flip that and ask: is there evidence showing that the use of social media is harmful to kids? I know that's the belief, but what does the actual evidence say? I think the problem is that there's not enough research to draw conclusions. There is a concern that social media could contribute to mental health problems among young people, which at least seems plausible, but we don't have all the data yet, so I don't know that we've been able to determine whether social media use among kids is good, bad, or indifferent. I think one problem with a lot of studies is that they run into a chicken-or-egg kind of problem. What I mean by that is: are people developing mental health problems because they spend too much time on social media, or are people who already have mental health challenges more likely to spend more time on social media? It could be a correlation-but-not-causation situation. It's a similar challenge to determining whether violent video games have a negative impact on mental health.
Do violent video games make people violent, or do violent people tend to like violent video games, which could also be enjoyed by people who aren't violent at all? So this advisory is really meant to encourage families to think seriously about social media use and to encourage healthy family behaviors, and I think that's a good message no matter how the research ultimately shakes out.

Speaker 1: Rapper Ice Cube has a few things to say about AI, and they are not complimentary. He actually called AI demonic and referenced the recent songs featuring AI-generated voices mimicking people like Drake. Ice Cube said, quote, somebody can't take your original voice and manipulate it without having to pay, end quote. That's not necessarily the case. As we've said on the show before, existing law does not really cover synthesized voices. You can't copyright a voice; you can't trademark it either. But Ice Cube's concerns are understandable. If someone is replicating a specific person's voice in order to make something new, that sort of proves the original voice has value. Otherwise, why copy it? Why wouldn't you just make a new synthesized voice that doesn't sound like anyone in particular? If you're using AI to copy the style and sound of someone specific, that kind of confirms the original has value, and that, to me, suggests we do need to develop laws to protect those things. Some states do have laws that protect that, but it's not the case across the United States, and other parts of the world need to think about this too. It's a brave new world that has such AI people in it.

Speaker 1: In space news, NASA has awarded the private space company Blue Origin a contract to land astronauts on the Moon. The lunar lander will be named Blue Moon, which makes me want to launch right into doo-wop music. But I'll spare you, as well as my super producer Tari, from having to endure that. Dig de don ding, Blue Moon. Sorry, slipped out.
This won't be the lunar lander used in the upcoming planned missions to the Moon that make up the early part of Project Artemis. Those are actually going to use a lunar lander created by SpaceX, the Human Landing System, or HLS. It's interesting that NASA is using both companies for this purpose, but eventually the plan is to establish a permanent facility on the Moon. As for when Blue Moon will see us standing alone, without a dream in our heart, well, it's going to be around twenty twenty-nine or so.

Speaker 1: Finally, IBM is investing a huge amount of money, like one hundred million bucks, and will be partnering with the University of Chicago and the University of Tokyo to build a quantum supercomputer that aims to have one hundred thousand qubits, which I guess would be a hundred kiloqubits. Currently, I think the largest quantum computer is IBM's Osprey. I could be wrong about that, but I think it's the Osprey, and as I recall, the Osprey has four hundred thirty-three qubits. IBM is also planning on launching the Condor computer sometime this year; I believe that one is going to have slightly more than one thousand qubits. But one hundred thousand is tremendous. It is enormous. Now, just a reminder: a qubit is a quantum bit, and unlike a classical bit, which can be either a zero or a one, a qubit can be placed into superposition, meaning it can be both zero and one simultaneously, and technically every value in between as well. It all gets very, very quantum, so I recommend looking through the TechStuff archives for episodes about quantum computing if you want to learn more. As for a deadline, IBM is looking a decade out, with a goal of this quantum supercomputer doing science and stuff by twenty thirty-three.
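For anyone who wants the notation behind that superposition description, a qubit's state is conventionally written as a weighted combination of the two basis states, where the weights are complex amplitudes whose squared magnitudes add up to one; a classical bit is the special case where one of those amplitudes is exactly zero.

```latex
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
\qquad \alpha, \beta \in \mathbb{C},
\qquad |\alpha|^2 + |\beta|^2 = 1
```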
Speaker 1: And that's it for the tech news for today, May twenty-third, twenty twenty-three. I hope you are all well, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.