Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio.

Speaker 1: Hey there. Welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio. And how the tech are you? I am still in Las Vegas, Nevada. You can probably tell that my voice is going. I did a lot of talking yesterday. I had to do some recordings of the Restless Ones podcast, and I did a lot of chatting with other folks, and yeah, I don't know if I'm coming down with anything. I sure hope I'm not. I've been masking every time I can, whenever I'm going out in public, and I recommend that y'all who have to go out in public do the same if you can, because obviously COVID is having another surge, and I want folks to be safe. Hopefully all I have is just, like, wear and tear on the old vocal cords and that's it. We'll obviously be very careful and keep checking. But in the meantime, we have some tech news to get to, so let's talk about that.

Speaker 1: Officials here in the United States, as well as some in Japan, have announced the discovery of a Chinese hacker attack that is infiltrating organizations by compromising older routers. You can read up on this in Ars Technica. There's an article by Dan Goodin titled "Backdoored firmware lets China state hackers control routers with 'magic packets'." The article goes into a lot of technical detail. I'm not going to bother doing that here because this is really just a news item. But essentially this hacker group, which has lots of different names (I'll just choose one of them, which is BlackTech), first has to get access to a system. Now, how they do that doesn't really matter. It's just that they need to get access first: administrative access, like administrator-level access to be specific. They might get that through some compromised login credentials, perhaps from another hacker group; maybe they use social engineering. But whatever. First they get that access.
Speaker 1: With that administrator-level access to the system, they then push an older version of firmware to certain routers, older routers that don't have a protection against this kind of attack. And typically they're targeting routers that are on the edge of networks. So, for example, they might target a branch office of a larger company, partly because they're less likely to be noticed there, and the security won't be quite as robust as it would be if it were at HQ. Now, because these routers have a trusted relationship with all the other computer systems that are connected to that company, the hackers are then able to get widespread access to the full organization. They target the edge routers, they infect those, and then they snoop around. Goodin explains in his piece that the more recent router hardware that has come out over the last couple of years includes protections against this type of attack, but obviously most organizations aren't updating their routers regularly; they do so when they need to. And determining whether your system has been affected seems like it's not very easy or straightforward, particularly if the hackers are being really careful. So it's another example of how cyberwar is a real, ongoing thing and that state-backed hackers continue to infiltrate both companies and public organizations.

Speaker 1: As for the magic packets, those are small packets of data that the hackers are using essentially to open or close backdoor access into these systems, and typically the magic packets kind of blend in with overall network traffic, so that's kind of why they were called that.
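To make the "magic packet" idea a little more concrete, here is a minimal, hypothetical sketch of how a trigger like that can work. Everything in it (the marker bytes, the port, the toggle) is invented for illustration; it is not BlackTech's actual implant.

    # Hypothetical sketch of a "magic packet" trigger, for illustration only.
    # A process quietly watches UDP traffic on an ordinary-looking port and
    # flips a flag only when a payload starts with a secret marker.
    import socket

    MAGIC = b"\xde\xad\xbe\xef"  # invented marker bytes for this sketch
    PORT = 5353                  # arbitrary example port, chosen so the
                                 # packets blend in with routine traffic

    def listen_for_trigger() -> None:
        backdoor_open = False
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.bind(("0.0.0.0", PORT))
            while True:
                payload, addr = sock.recvfrom(4096)
                if payload.startswith(MAGIC):
                    # Toggle the backdoor on each trigger packet
                    backdoor_open = not backdoor_open
                    print(f"trigger from {addr}, open={backdoor_open}")
                # Every other packet is ignored, so to a casual observer
                # this traffic looks like ordinary datagrams.

A real implant would hide far better than this, but the shape is the same: the trigger looks like unremarkable traffic until you know the marker.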
Speaker 1: Back in July, word got out that hackers had leveraged a stolen Microsoft certificate to gain access to the email systems belonging to really large organizations, including the United States State Department. Now, The New York Times reported that the hackers stole around sixty thousand emails, primarily from ten email accounts. Further, nine of those ten accounts belong to members of the State Department who are working on projects that involve East Asian affairs. Now, the State Department has not formally accused China of being behind the attacks, but that seems to be the implication here. The hack has sparked conversations within the US government on how to harden systems against hacker attacks, including the possibility of moving away from single-vendor solutions like using Microsoft. And while there's an element of closing the barn door after the horse has already escaped going on here, I do think putting all your trust in a single vendor can lead to massive problems down the line. But honestly, there aren't really any easy, fully reliable solutions here.

Speaker 1: Security researchers revealed that all modern GPUs, that is, graphics processing units, have their own way to compress data, and even though each manufacturer uses a different method, those methods can all be exploited to steal pixels from a website. Which sounds weird, right? But essentially here's how it works. You get hackers, and they create a malicious website that has the same name as their target but a different domain. So let's say they've identified a target website that they don't have access to directly; maybe it's behind some sort of protective measure. So they create a different website that has the same name but a different domain: instead of dot com, maybe it's dot biz. Now, in this malicious version of the website, they include an iframe. An iframe serves as a spot where you can embed other content into your website. One common use for iframes is to embed ads in them, which allows the site to swap ads out dynamically while the iframe holds its place. Here, the iframe embeds the target website itself, and the hackers then rely on the GPU's data compression scheme to pull pixels from that embedded target into a side channel, reconstructing what appears on the target site.
Speaker 1: So you might have a corporate website that's usually behind protection, and the hackers create a malicious site named the same thing but on a different domain, and by stealing pixels, they can recreate what was on that target website, potentially gaining access to sensitive data, including things like, you know, login credentials. The researchers were mostly showing that this is possible. They didn't indicate that it's an active concern; like, they didn't say that this is something they're seeing actively deployed out there, but rather that the hardware across all GPU manufacturers allows for this, and since it is possible, sooner or later someone will do it, right? So there are ways to block pixel stealing. Most of those involve website administrators being proactive on the matter. They can build stuff into the headers of a web page that will protect against that kind of thing, but that's a lot of work. It's also likely, I think, that the companies that make the web browsers that facilitate this kind of attack will address it in the future.
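For what that header-level protection can look like, here is a minimal sketch using only Python's standard library. The headers shown are the standard anti-framing signals; this is my illustration of the general defense, not the researchers' exact recommendation, but since the attack depends on loading the victim page in a cross-origin iframe, refusing to be framed cuts off its foothold.

    # A minimal sketch of the header-based defense: refuse to let other
    # origins embed this page in an iframe.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class HardenedHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = b"<html><body>Sensitive page</body></html>"
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            # Legacy header: forbid framing entirely
            self.send_header("X-Frame-Options", "DENY")
            # Modern equivalent: no origin may embed this page
            self.send_header("Content-Security-Policy",
                             "frame-ancestors 'none'")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), HardenedHandler).serve_forever()

Pages that never need to be embedded by other sites lose nothing by sending these headers, which is why this is usually the cheap part; auditing every page of a large site is the "lot of work" part.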
Speaker 1: The European Union issued a report stating that, after analyzing various social network platforms out there, one in particular stands out when it comes to the proliferation of disinformation. Do you have any guesses? If you said X marks the spot, meaning the platform formerly known as Twitter, you win a prize: a virtual donut. Enjoy. Anyway, the EU previously established a voluntary Code of Practice on Disinformation, and tons of platforms signed a pledge that they would follow this voluntary code. That includes Google and YouTube, Meta (primarily Facebook), and LinkedIn. They all signed it, along with like forty other platforms. It also used to include X, but last May Elon Musk decided that X was going to peace out of that voluntary code. Now, researchers in the EU say that disinformation is spreading like wildfire on X and that the problem is only getting worse. Also, X has recently disabled a feature that used to allow users to report cases of misinformation about elections. So this means, assuming those EU researchers are right, that X is not just the worst when it comes to preventing misinformation and disinformation from propagating across their system; they are actively turning off features that would do anything about it, which is a big old yikes.

Speaker 1: An article in LGBTQ Nation reveals that TikTok is working with the government of Kenya to restrict and remove LGBTQ+ content from the platform within Kenya's borders. Now, this is because Kenya's government has banned same-sex sexual relationships, and currently the government is also considering an additional law that would punish homosexuality with a lifetime prison sentence. TikTok has already deplatformed and demonetized TikTok users in Kenya who, according to the country's laws, have posted restricted content. Apparently, TikTok's CEO has committed to not only being more active in removing LGBTQ+ content from the platform within Kenya, but to also launch a campaign urging users to instead post quote unquote positive content, which sounds disturbingly close to conversion therapy, in my opinion. It's really ugly stuff, and my heart goes out to the people of Kenya.

Speaker 1: This week, Meta announced an update to its line of smart glasses, previously known as Stories but, to be really formal about it, now known as Ray-Ban Meta smart glasses. The glasses have a pair of cameras in them, one at the top of each corner of the frames, and they contain five microphones that are meant to provide better sound pickup when you use your glasses to, you know, take a phone call or something. They also have speakers that the company says will make it easy for the wearer to hear what's playing on their glasses, but won't, you know, irritate the living heck out of everybody else who happens to be nearby.
Speaker 1: The glasses can take about five hundred photos at twelve-megapixel resolution, or around one hundred video clips, thirty seconds long each, at 1080p, before you end up using up all the storage on the device and it's time to offload stuff. The battery reportedly is good for like four to six hours of use, and they come in a case that can also serve as a charger; the case itself can hold enough battery juice to recharge the glasses several times before the case itself needs to be recharged. You can also livestream while wearing these glasses. The controls include both a touch system on the stems of the eyeglasses and voice commands. Essentially, these things do some of, but not all of, the stuff that Google Glass did years ago, but they do look a lot better. They do not, as far as I can tell, create any sort of AR view of the world around you, which Google Glass could sort of do. And already folks are worried about privacy issues with these glasses. They're concerned that you could be on video or have someone snapping pictures of you without your knowledge or consent. Now, I get that concern. It's a legit concern, don't get me wrong. But y'all, that's already happening. There are so many cameras out there, whether they're in phones or in security systems like in doorbells or in cars, so I pretty much guarantee that you're on camera frequently throughout an average day. But I do understand how it gets creepy when you're talking about someone wearing the camera on their face. Anyway, Meta is taking preorders on these things, which start at around two hundred ninety-nine US dollars, and they'll start shipping in mid-October. So maybe your Halloween costume can be invasive surveillance. That's a fun one. All right, we're gonna take a quick break, and we're gonna come back with more news in just a moment.

Speaker 1: We're back.
Speaker 1: Reddit indicated this week that it will remove the opt-out feature for personalized ads, for at least some Reddit users around the world. So the new policy says that Reddit will take your activity on the platform and then use that to decide which ads to display to you. So let's say you spend a lot of time on subreddits that are dedicated to gaming; well, then you're more likely to see ads related to gaming while on Reddit, right? Reddit says this is for your benefit, but there are a lot of users who are upset, because the message that they're picking up on is that Reddit is determined to track their behavior across the site and there's no way to tell Reddit to knock it off. And while Reddit says users all around the world will be able to opt out of personalized advertising based off of, quote unquote, information and activity from our partners, only users in select locations will actually be able to opt out of personalized ads based off their activity on Reddit itself. I'm guessing those select locations will be places like the European Union, which has some pretty strict rules in place when it comes to user privacy, and in any place that doesn't have those kinds of rules, Reddit's going to be tracking you like crazy.

Speaker 1: And now, the latest in the long, arduous story of Microsoft's plan to acquire Activision Blizzard. When last we left our tale, Microsoft was working hard to convince regulators in the UK that the deal would not result in an anti-competitive situation within the world of console gaming in general and cloud-based streaming gaming in particular. Well, last week that regulatory agency gave provisional approval to the acquisition, so that roadblock was removed. But now the United States Federal Trade Commission has once again objected to this deal.
Speaker 1: Now, the FTC previously attempted to secure an injunction against the deal closing, but a US District Court judge denied that request and said that the FTC had failed to produce evidence that this deal would actually be harmful toward competition. The FTC then appealed that ruling, but withdrew the complaint not that long afterward. And it turns out that was temporary, because now, once the regulators in the UK gave their provisional approval, the FTC submitted the appeal again to the Ninth Circuit Court of Appeals. And once the court actually gives a decision one way or the other, the FTC plans on an evidentiary hearing on the matter. Now, this on the surface sounds bad for Microsoft and Activision Blizzard, but they could still go through with their deal, because there's no injunction against it, and that deal is scheduled to close on October eighteenth. Then, once the deal is closed, they can worry about any ongoing legal issues they might face after the fact, but by then the deal will already be done. So what I'm seeing is that most analysts think this deal will finally close next month.

Speaker 1: And now for a few AI stories; it's always going to be that in a Tech Stuff news episode. An SEO consultant named Gagan Ghotra brought to light some concerning information about Google Bard conversations. So it turns out that if you have a conversation with Google Bard, which, I'll just remind you, is Google's AI-powered chatbot, similar to OpenAI's ChatGPT, specifically as it is integrated with Bing, and you then share a link to that conversation with someone else, then Google will actually index that conversation, which means that conversation with Google Bard can pop up in future search results. So let's say that you and a coworker are using Google Bard to help develop a business plan, and this is something that you do not want to share outside of your organization.
Speaker 1: When you share that conversation link with your coworker, the link itself becomes indexable, and then Google's web crawler will index the conversation, and if someone else uses the right search query, that conversation can potentially pop up in those search results, and the information that you intended to be private has now been made public. Now, obviously there are all sorts of situations where indexing the Bard conversation could turn out to be a really bad thing. It's not that different from when OpenAI had an issue with ChatGPT where the chatbot would occasionally give users access to other users' chat histories with the chatbot. Now, in the Google case, we're not talking about a bug; we're talking about a feature. Because Ghotra had an exchange with Google research scientist Peter Liu, and Peter pointed out that the search engine will only index conversations if someone has clicked on the share button. And Ghotra's reply was the same one I would have made, namely: I think most users would just assume that share means you've elected to share that conversation with someone in particular, not the whole world in general. But that appears to be the case.
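As an aside, the standard way a site keeps pages like these out of search results is a "noindex" signal, either an X-Robots-Tag response header or a robots meta tag. Here is a small, hypothetical checker for whether a given shared-conversation URL carries either signal; the URL in the comment is a made-up placeholder, not a real Bard share link.

    # Hypothetical checker: fetch a URL and look for either standard
    # "keep me out of search results" signal (the X-Robots-Tag header
    # or a robots meta tag) in the first chunk of the response.
    import urllib.request

    def has_noindex_signal(url: str) -> bool:
        req = urllib.request.Request(
            url, headers={"User-Agent": "index-check/0.1"}
        )
        with urllib.request.urlopen(req) as resp:
            header = resp.headers.get("X-Robots-Tag") or ""
            body = resp.read(65536).decode("utf-8", errors="replace").lower()
        return "noindex" in header.lower() or (
            'name="robots"' in body and "noindex" in body
        )

    # Example with a placeholder URL, not a real share link:
    # print(has_noindex_signal("https://example.com/share/abc123"))

If a share page sends neither signal, a crawler that finds the link anywhere public is free to index it, which is the behavior Ghotra flagged.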
Speaker 1: Once upon a time, when ChatGPT first splashed on the scene, the chatbot could only draw information from before September twenty twenty-one. It didn't have access to the real-time web. It could not crawl the web for current information, so you could not ask it about breaking news or anything like that, because it literally just could not access those information sources. Now, back in July of this year, OpenAI created a subscription-based tier of service that would let users rely on ChatGPT, through Bing searches, to access current information. But that feature soon went away. Why? Because users figured out that they could use this particular feature to bypass paywalls and get to content that normally you would have to pay a subscription to access. That's not exactly something that either OpenAI or Microsoft wants to deal with, so the feature went offline. But now it's come back. OpenAI announced yesterday that, through Bing, it will let users access real-time information using the chatbot, and it calls this authoritative. Now, I personally find that designation questionable, given how chatbots like ChatGPT are prone to producing hallucinations, or confabulations, which just means sometimes they make stuff up. So I'm not sure how authoritative you can actually claim to be.

Speaker 1: If you've been following the union strikes in Hollywood, you likely know that the Writers Guild of America, or WGA, has reached a provisional agreement with the Alliance of Motion Picture and Television Producers, aka the AMPTP. A lot of the WGA's concerns relate to tech, ranging from how streaming companies determine payouts like residuals to what role artificial intelligence should play in film and television. The agreement, which WGA members will have to ratify in a vote, essentially says that the studios will not be able to compel writers to use AI, that any content that does involve generative AI has to be labeled as such, and that AI is not eligible to get a writing credit. Now, that's important when it comes to things like payments and residuals and so on, considering US courts have recently found that AI-generated material is not eligible for copyright. I think these changes are ultimately of benefit to both the writers and the producers. Meanwhile, the actors' strike still goes on.

Speaker 1: Okay, that's it for the news for today, Thursday, September twenty-eighth, twenty twenty-three. Next week, I'll be back in Atlanta and episodes will return as normal. They'll probably sound more like the older episodes and not weird because I'm recording in some hotel room in Las Vegas. So I hope you are all well, and I'll talk to you again really soon.
Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.