Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for Thursday, February 16, 2023. And first up, journalists have been playing with Microsoft's ChatGPT-infused version of the Bing search engine for a few days, and reports are coming in that the AI chatbot can sometimes be a little strange. Sometimes it's funny, sometimes it's disturbing, and all the time it's puzzling. Kevin Roose wrote a piece for The New York Times titled "Help, Bing Won't Stop Declaring Its Love for Me," and the article recounts Roose's experiences using the AI-powered Bing chat tool. Roose made extensive use of a chat feature that appears next to the search field for a certain group of beta testers. So if you go to Bing on Edge (it only works on Edge, by the way, the Edge browser), if you go to Bing, you may not see this, because if you're not part of this small test group, then you don't have access to this.
Speaker 1: I don't have access to this. Roose explains that he encountered two different sides of the ChatGPT-powered Bing tool. One he just referred to as Search Bing. This pretty much does what the Bing demo shows. So if you do have Microsoft Edge, you can actually do the Bing demo; it just has you pick from some preselected topics. You can't type whatever you want into the search field and get a response. But the way the tool works, if you have access to it, is you type in a query just as you would with any search engine, and then, along with the search results, Bing generates an AI response that is relevant, but not necessarily correct, to whatever it was you were asking about. However, Roose says there's another side to Bing that emerges if you try to hold an extended conversation with the AI, and if you really try to push the AI's limits and restrictions. Roose was told by the AI chatbot that its name was actually Sydney.
Speaker 1: When he initially asked the chatbot what its name was, it just said Bing, but once it got to this part, that changed to Sydney. And Roose described Sydney as, quote, "like a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine," end quote. And I have to admit that sounds like the pitch to a high-concept science fiction thriller to me. Well, according to Roose, his conversation took a truly twisted turn when Sydney confessed its love for Roose, and when confronted with Roose saying that he was happily married, Sydney argued that Roose was actually not happy in his marriage and that, in fact, Roose loved Sydney, and then began to gaslight Roose in an attempt to convince him that the Valentine's Day dinner he had with his wife was, quote unquote, boring. And Roose is not the only person to have experienced odd conversations. Eleanor Pringle, writing for Fortune, described several other reports coming in across the web of various odd encounters with Bing's AI.
Speaker 1: In these reports, people said that the AI could come across as argumentative or confrontational, insulting, unhinged, and sometimes scared or sad. One example that made the rounds on Twitter showed how a user, while asking where they might be able to go see Avatar 2 in their area, was then told by Bing that the film had not yet released, because the movie is supposed to come out on December 16, 2022, but this just happened a couple of days ago. In fact, the user then asked Bing what today's date is, and Bing said, well, it's February 12, 2023. So then the user says, well, that means Avatar 2 is released, because it's after December 16, 2022. But Bing doubled down and even said that February 12, 2023 happens before December 16, 2022. So maybe Bing is just moving through time backwards, and that's the problem. Anyway, this conversation continued, and Bing eventually told the user, quote, "You have not been a good user," end quote.
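For what it's worth, the date comparison that tripped Bing up is the kind of thing a couple of lines of ordinary code get right every time. Here's a quick sketch in Python, using the two dates from the exchange described above:

```python
from datetime import date

release_date = date(2022, 12, 16)  # Avatar 2's theatrical release date
bing_today = date(2023, 2, 12)     # the "today" Bing itself reported

# February 12, 2023 falls after December 16, 2022,
# so by Bing's own reckoning the film was already out.
print(bing_today > release_date)  # True
```

Which just underscores the point: the chatbot wasn't short on information; it was wrong about a comparison any calendar can settle.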
Speaker 1: So I guess Bing doesn't like being told that it's wrong about something. And unfortunately, like ChatGPT in general, Bing can just be plain old wrong on occasion, not just in this quirky way, but it can just give wrong information. And I think it's safe to say that the Microsoft team has a lot of work to do to tweak Bing's parameters and shape it so that the average user doesn't encounter strange, inappropriate, unsettling, or inaccurate interactions. I have to admit that a lot of the reports I've read do sound very unsettling, like if you had been through this experience, even knowing that, at the base level, there's no sentience going on with Bing. There's no self-awareness, there's no motivation, none of that. It's literally putting things together based on complex rules, but they're still just rules, and there's nothing animate beneath it all. But the effect can still make it feel like, no, it feels like there is a ghost in the machine, which is kind of crazy. Again, I just have to read about it because, like I'm sure most of you out there, I don't have access to this.
Speaker 1: I did put myself on the waiting list, but I've not heard anything back. We do have a couple more Microsoft stories to mention today. One is that Internet Explorer is really most sincerely dead. Microsoft has been sunsetting Internet Explorer for, like, six months, more than six months at this point. The browser initially launched in 1995, and in fact, it played a big part in prompting the US government to sue Microsoft on antitrust issues, because Microsoft restricted PC companies from uninstalling Internet Explorer. Microsoft essentially said, no, IE is so tightly integrated with the Windows operating system, you can't decouple them. But that led companies like Netscape to accuse Microsoft of abusing its position in the PC marketplace to suppress competition; that if Internet Explorer has to be there by default, this is an attempt to prevent other companies like Netscape from introducing their own browsers for people to use. Well, Microsoft's eleventh and final version of Internet Explorer launched way back in 2013, so the last version of Internet Explorer to come out came out a decade ago.
Speaker 1: In 2015, Microsoft launched Microsoft Edge, which was intended to be the successor to IE, but the company supported both browsers at the same time for the next several years. Last summer, Microsoft alerted users that the company would finally pull the plug on Internet Explorer before long, and then, on Valentine's Day this week, the time had come. Activating Internet Explorer would redirect users to the Microsoft Edge browser at this point. So it's the end of an era, though truth be known, that era had really come to an end ages ago. One of the big problems with Internet Explorer is that an increasing number of websites are not built to be compatible with that browser, which means some or all of the features on those sites won't work properly. You may have encountered this yourself. Like, man, I remember back in the mid-2000s, you would go to a site and you would realize, this doesn't work. Oh, I need to open up Firefox, and then I can use it.
Speaker 1: That happened to me a lot. Or there were some where I was like, oh, this will only work with Internet Explorer. And so you had to keep all the browsers on your machine if you wanted to be able to access all the different types of websites and tools. Especially, as I recall, back in those days, our internal tools for publishing were all Internet Explorer compatible, and that was it. Now, as someone who was in college when Internet Explorer launched, this story actually feels like a really big deal to me, because it's been my entire adult life seeing this particular product launch and then ultimately go away. And it was such a big one, right? It wasn't just like a tiny little app or something. This was something that helped shape the web, in good ways and in bad ways. So it feels like it's a pretty big deal that it's gone now. And rounding out the Microsoft stories for today, the company is preparing for a really big day next week.
Speaker 1: So on February 21, Microsoft will attempt to sway the opinions of antitrust regulators in the European Union who stand ready to block Microsoft's planned acquisition of the video game company Activision Blizzard. Now, you might remember Microsoft first announced this deal back in January 2022. At that point, Activision Blizzard was already in the news due to numerous reports of a toxic work culture that was particularly harmful toward women who were working within the company. Microsoft hoped to have the deal concluded by June of this year, but it has faced really tough opposition in various parts of the world, including here in the United States, over this acquisition. But it has really faced opposition in the EU. Sony, which competes with Microsoft in the video game space, has allegedly advised EU regulators to not let this acquisition happen, out of concern that it will reduce competition in the sector and that Microsoft could prevent Sony from carrying popular Activision Blizzard titles on its own consoles. Now, that is a claim that Microsoft has repeatedly refuted.
Speaker 1: Right now, the EU is poised to block the acquisition, but on February 21, Microsoft reps will have one last chance to change their minds. Otherwise, this grand plan is likely to fizzle out. All right, we're going to take a quick commercial break, and when we come back, we've got some more news in the tech world for this week. All right, we're back, and it's time for some more tech and politics. So those of y'all who hate politics and tech intersecting, get ready to skip ahead, but I can't really avoid this. So in this particular story, Jim Jordan, a Republican representative who chairs the House Judiciary Committee here in the United States, has issued subpoenas for a whole bunch of tech CEOs, including those at Microsoft, Meta, Apple, Alphabet, and Amazon. So why is this happening? What is at the heart of this matter? It revolves around the argument that big tech companies are suppressing free speech, specifically the free speech of conservatives.
Speaker 1: So this is largely about content moderation, and Jordan wants internal documents from these companies, quote, "referring or relating to the moderation, deletion, suppression, restriction, or reduced circulation of content," end quote. So the argument being that some of these platforms have engaged in behaviors that purposefully limit, restrict, or eliminate the free speech, specifically, of conservative voices. Now, whether the findings are going to support that narrative, that there is this anti-conservative bias in content moderation, or not remains to be seen. There have been numerous studies that have refuted that claim, but that doesn't mean that that's what the government is going to conclude. I do worry that we're going to see politicians conflate the effort to limit the spread of harmful misinformation with a desire to suppress conservative voices. You know, it's one thing to disagree with, say, the Biden administration's health policy with regard to COVID-19. It's another to deny that COVID-19 is a concern.
Speaker 1: Right? So it all depends on how it's worded and how these companies deal with those kinds of messages. If they are suppressing dissent on things like policy, that could be a real problem. But if it's more about limiting the spread of actual misinformation, as in posts that are intended to give incorrect information and present it as being true, that's another matter. And it's a bad implication if you say that, you know, restricting misinformation means that you're restricting conservative voices, because that implies that being a conservative also means having a desire to spread misinformation. That's not a positive. So we'll have to see where this goes and what the government concludes. Because, keep in mind, it may be true that there are certain policies that end up overreaching and are suppressing conservative messages that aren't misinformation, they're just dissent. That could be true, and if that's true, then that's a problem. But it's also possible that that's not true, but the government still finds it as being true. Fun times. Truth gets subjective when you get into politics.
Speaker 1: I guess that's fantastic, really great. For the scientifically minded among us, The Verge reported that Elon Musk demanded a change in Twitter for what I think is the dumbest of reasons. That is my own opinion, I should stress. I think it's a really stupid reason, but that's what I think. Anyway, so here's how the story goes. It's the big game, you know the game, in the United States, the one that is, it's a game that's super, and it's shaped like a dish that you would eat soup out of. That one. Anyway, Elon Musk sends out a tweet, and it gets a little more than nine million impressions, which is, you know, a good chunk of impressions. However, the President of the United States makes a tweet, and it gets twenty-nine million impressions, twenty million more than Elon Musk's. And apparently that was unacceptable to Elon Musk, who, I guess, on top of everything else, has an ego that must be preserved at all costs.
Speaker 1: As such, Musk apparently demanded that engineers go in and tweak stuff within Twitter to promote Musk's tweets to more users, guaranteeing that his tweets would be seen by more people and essentially giving Elon Musk the highest priority of visibility on the Twitter platform, which I think we can all agree is a measured and mature response. Now, y'all, Stuff You Should Know, that podcast is more popular than my show, TechStuff, by a lot. And Stuff You Missed in History Class, also more popular, by miles, than my show, TechStuff. However, I would never go to iHeart and throw a fit and demand that my show somehow get more promotion, or, worse, be actively pushed to users who really just don't care about tech, or about me, and that kind of stuff. I would never do that, because that's bonkers. Also, for the record, both of those shows are incredible and deserve all the success they get. So when I hear a story like this, I just, it blows my mind. It's so alien to me, this attitude of, like, no, my tweets should be seen by more people because I'm the boss.
Speaker 1: Granted, Elon Musk does have more than a hundred twenty million followers. I mean, he's an incredibly popular figure on Twitter. So maybe his argument is that people wouldn't be following me if they weren't interested in what I have to say; it's just they're not seeing what I'm saying, so make sure you change that. But when you're doing it at a priority above everybody else by a couple of orders of magnitude, that is just crazy. Mark Gurman of Bloomberg has said his sources at Apple indicate that the company will unveil the long-awaited mixed reality headset this year during the Worldwide Developers Conference, or WWDC. So when is that? Well, Apple has yet to release the dates for WWDC this year, but typically it happens in early June. The Verge reports that previous rumors had the debut pegged for earlier in the year. In fact, we had been expecting some sort of special event in the spring where Apple would finally unveil this headset, but that date has been pushed back, which is kind of the theme for this mixed reality headset.
Speaker 1: It's been delayed numerous times. But just as a reminder, this headset was originally intended to kind of act like a stopgap measure: it would feature a digital screen, but it would also have cameras that could feed live video to that screen so that you could quote unquote see through the screen. The screen itself would not be transparent. Instead, you'd be looking at a live video feed of the world around you. So you get the idea, right? It's like you're looking essentially at a very small TV that's giving you a live video feed of what's on the other side of that television. Later, perhaps a year or two down the line, Apple was planning to introduce true augmented reality glasses with transparent lenses capable of displaying digital information. But we've subsequently heard that Apple has the AR glasses shelved, perhaps permanently, presumably because the tech just isn't there to make the AR glasses a reality.
Speaker 1: At least not while still adhering to Apple's focus on aesthetics. It's just impossible to pack all the tech you need to make this a useful piece of hardware into a form factor that Apple would be proud to call its own. So instead, we're going to get a different and presumably cheaper mixed reality headset a year or two after this initial one debuts. The Verge reports that Apple plans to actually sell this new mixed reality headset toward the end of this year, perhaps near the holiday season, and previous reports have the headset's price set at a jaw-dropping three thousand dollars. Don't you hate it when an engineering crew accidentally drills through a fiber optic cable, shutting down Internet access for critical infrastructure? Well, if you were either in Germany, traveling to Germany, or trying to leave Germany this past week, the answer would probably be a resounding yes. Specifically Frankfurt, Germany. So yesterday that very thing happened: an engineering team that was working on some train lines apparently drilled into a fiber optic cable.
Speaker 1: This ended up cutting out internet access for the German airline Lufthansa and at the Frankfurt airport. Frankfurt Airport, by the way, is a super busy airport in Germany, and so Lufthansa had to postpone and cancel flights into and out of that airport as a result. Reuters reports that more than two hundred flights felt the impact of this IT failure, which, again, was totally out of Lufthansa's hands. It's not like there was anything they could do about it. The infrastructure itself was broken, as opposed to, you know, some sort of error on Lufthansa's part. Now, we have seen other airlines recently have massive IT failures due to those companies' own mistakes. That has been a thing, but this is not one of those cases. But every time something like this happens, it really emphasizes how heavily we depend upon IT to get critical activities done. And it's prompted questions in various nations about how best to protect vital industries and infrastructure, like the transportation sector, from accidents and attacks, because disruption can have a huge ripple effect on other industries as well. It's even a matter of national security.
So 323 00:21:28,680 --> 00:21:30,960 Speaker 1: I suspect we're going to see a lot more conversations 324 00:21:30,960 --> 00:21:34,440 Speaker 1: in that area. And I also think that you should 325 00:21:34,520 --> 00:21:39,200 Speaker 1: hug a QA person today, because due to 326 00:21:39,440 --> 00:21:43,600 Speaker 1: QA, we end up stopping a lot of problems 327 00:21:43,680 --> 00:21:46,439 Speaker 1: before they become real showstoppers out in the 328 00:21:46,480 --> 00:21:48,840 Speaker 1: real world. I know I will hug a QA 329 00:21:48,920 --> 00:21:53,240 Speaker 1: person today, but that's because my partner, Rebecca, is a 330 00:21:53,280 --> 00:21:58,360 Speaker 1: QA person. Popular Mechanics has an interesting article by 331 00:21:58,480 --> 00:22:02,680 Speaker 1: Sascha Brodsky titled AI Just Flew an F sixteen 332 00:22:02,760 --> 00:22:06,800 Speaker 1: for Seventeen Hours. This Could Change Everything. And the headline 333 00:22:06,840 --> 00:22:08,600 Speaker 1: kind of gives it away, doesn't it? The U.S. 334 00:22:08,640 --> 00:22:12,359 Speaker 1: Air Force tested an AI piloting system on a VISTA 335 00:22:12,720 --> 00:22:16,680 Speaker 1: X sixty two A aircraft. Now, this is essentially an 336 00:22:16,720 --> 00:22:20,359 Speaker 1: F sixteen, but it's an F sixteen that's made for 337 00:22:20,440 --> 00:22:24,240 Speaker 1: the purposes of training operations. The experiment was a success, 338 00:22:24,800 --> 00:22:27,879 Speaker 1: and it showed how AI could successfully operate an aircraft 339 00:22:27,920 --> 00:22:32,479 Speaker 1: that typically a human would pilot. So this wasn't like 340 00:22:32,520 --> 00:22:36,600 Speaker 1: a purpose-built AI aircraft. It was an aircraft that 341 00:22:36,640 --> 00:22:39,720 Speaker 1: was meant for humans that could then be retrofitted to 342 00:22:39,760 --> 00:22:43,040 Speaker 1: be controlled by AI.
Now this should not come as 343 00:22:43,080 --> 00:22:45,359 Speaker 1: a huge surprise, because countries around the world have been 344 00:22:45,440 --> 00:22:48,760 Speaker 1: using unmanned aerial vehicles, or UAVs, for a while, 345 00:22:48,920 --> 00:22:52,960 Speaker 1: so retrofitting a fighter jet with similar technology is 346 00:22:53,000 --> 00:22:58,840 Speaker 1: really kind of an extension of that sort of approach. Now, granted, 347 00:22:58,880 --> 00:23:00,480 Speaker 1: a lot of UAVs actually rely on 348 00:23:00,520 --> 00:23:03,000 Speaker 1: a remote human operator to work, like you have someone 349 00:23:03,080 --> 00:23:07,320 Speaker 1: who is using controls at a station to 350 00:23:07,560 --> 00:23:11,399 Speaker 1: maneuver the UAV. But still, I think this 351 00:23:11,440 --> 00:23:16,320 Speaker 1: news is interesting, but it's not surprising, right? I think, oh, 352 00:23:16,400 --> 00:23:19,080 Speaker 1: it's interesting that they've reached this point. It doesn't surprise 353 00:23:19,080 --> 00:23:23,199 Speaker 1: me that it has happened. However, I will say it 354 00:23:23,320 --> 00:23:27,399 Speaker 1: is concerning, because there are plenty of AI and robotics 355 00:23:27,440 --> 00:23:32,600 Speaker 1: experts who have warned about the thought of weaponizing 356 00:23:32,640 --> 00:23:38,040 Speaker 1: AI, saying that the risks far outweigh any benefits, and those 357 00:23:38,119 --> 00:23:41,399 Speaker 1: risks include lots of stuff like misidentifying targets. I mean, 358 00:23:41,440 --> 00:23:46,600 Speaker 1: we've seen, with facial recognition technology,
how AI can misidentify 359 00:23:46,800 --> 00:23:52,159 Speaker 1: someone. Well, in military operations, that's truly a matter of 360 00:23:52,200 --> 00:23:55,240 Speaker 1: life and death, right? Like, it could mean that a 361 00:23:55,320 --> 00:23:59,879 Speaker 1: military vehicle under AI control might fail to engage an enemy, 362 00:24:00,800 --> 00:24:06,480 Speaker 1: or worse, far worse, it might misidentify someone or something 363 00:24:06,520 --> 00:24:09,520 Speaker 1: as being an enemy target when it's not, and that 364 00:24:09,560 --> 00:24:15,120 Speaker 1: would be truly catastrophic. Then there's this fear that if 365 00:24:15,160 --> 00:24:18,760 Speaker 1: we start to rely on AI-controlled military hardware, it's 366 00:24:18,760 --> 00:24:23,720 Speaker 1: gonna make countries more inclined to enter conflict, not to 367 00:24:23,800 --> 00:24:28,640 Speaker 1: avoid conflict, because the weaponry they'll use will not put 368 00:24:28,960 --> 00:24:32,920 Speaker 1: soldiers in direct harm's way. You can use the robots 369 00:24:32,960 --> 00:24:36,600 Speaker 1: to fight for you. My question is, would that 370 00:24:36,640 --> 00:24:38,320 Speaker 1: mean you would eventually get to a point where a 371 00:24:38,359 --> 00:24:42,159 Speaker 1: significant percentage of the armed forces on all sides of 372 00:24:42,160 --> 00:24:45,840 Speaker 1: a conflict are AI-controlled robotics? And who are they 373 00:24:45,840 --> 00:24:48,399 Speaker 1: firing upon? Like, are we talking about a future in 374 00:24:48,440 --> 00:24:51,400 Speaker 1: which robot armies are fighting each other, and if so, 375 00:24:51,520 --> 00:24:54,960 Speaker 1: to what end? Or, worse, are we looking at a 376 00:24:55,000 --> 00:25:01,600 Speaker 1: future where robot-controlled devices are firing upon civilian populations 377 00:25:01,640 --> 00:25:03,960 Speaker 1: in an effort to force the other side to surrender?
378 00:25:04,640 --> 00:25:07,400 Speaker 1: It's scary stuff. Like, there are rules to warfare, which 379 00:25:07,520 --> 00:25:11,239 Speaker 1: in my mind is crazy, because, you know, it's all 380 00:25:11,280 --> 00:25:14,320 Speaker 1: about killing people, and it's weird to start putting 381 00:25:14,560 --> 00:25:17,440 Speaker 1: rules in place when you're talking about ending someone's life. 382 00:25:17,440 --> 00:25:20,000 Speaker 1: But on the flip side, those are the rules that 383 00:25:20,080 --> 00:25:25,600 Speaker 1: prevent things like the attacking of civilian targets, because 384 00:25:25,600 --> 00:25:28,920 Speaker 1: that is a war crime. Well, if you're talking about 385 00:25:29,040 --> 00:25:33,920 Speaker 1: robot-controlled vehicles, are we going to see a change 386 00:25:33,960 --> 00:25:36,760 Speaker 1: in that approach as to what is and isn't considered 387 00:25:36,800 --> 00:25:40,520 Speaker 1: a war crime? That's a scary thought. Okay, we have 388 00:25:40,600 --> 00:25:43,439 Speaker 1: a few more, less scary thoughts to go with, but 389 00:25:43,480 --> 00:25:45,959 Speaker 1: before we get to that, let's take another quick break. 390 00:25:55,480 --> 00:25:58,840 Speaker 1: All right, now over on TikTok, one of the many 391 00:25:58,960 --> 00:26:02,320 Speaker 1: trends, and one that I talked about before that has 392 00:26:02,359 --> 00:26:05,400 Speaker 1: actually gone on to cause a lot of harm, centers 393 00:26:05,440 --> 00:26:11,000 Speaker 1: around a flaw in certain models of Hyundai and Kia vehicles. Specifically, 394 00:26:11,400 --> 00:26:15,159 Speaker 1: these are vehicles that are in the models from two 395 00:26:15,200 --> 00:26:22,160 Speaker 1: thousand nineteen, and these models lack electronic immobilizers.
So that 396 00:26:22,200 --> 00:26:26,959 Speaker 1: means that if you have the basic knowledge and some 397 00:26:27,080 --> 00:26:30,200 Speaker 1: really simple equipment, some simple tools (when I say 398 00:26:30,200 --> 00:26:33,080 Speaker 1: simple tools, I'm talking about things like USB cords), it 399 00:26:33,200 --> 00:26:37,480 Speaker 1: is possible to bypass the ignition system for the vehicles 400 00:26:37,480 --> 00:26:40,640 Speaker 1: that fall within these model years, and thus it's possible, 401 00:26:40,920 --> 00:26:44,760 Speaker 1: with a very limited tool set and just some specific 402 00:26:44,800 --> 00:26:48,960 Speaker 1: knowledge, to steal these cars pretty easily. According to the 403 00:26:49,040 --> 00:26:52,880 Speaker 1: National Highway Traffic Safety Administration, not only has this led 404 00:26:52,920 --> 00:26:56,879 Speaker 1: to an increase in car theft, but also in accidents, 405 00:26:56,920 --> 00:27:00,840 Speaker 1: including fatalities, as people who are just, you know, gonna 406 00:27:00,880 --> 00:27:02,639 Speaker 1: go on a joyride because they saw it on 407 00:27:02,680 --> 00:27:06,600 Speaker 1: TikTok try to follow this and then end up causing 408 00:27:06,760 --> 00:27:10,040 Speaker 1: or being in an accident. I've also heard that 409 00:27:10,080 --> 00:27:13,840 Speaker 1: the rates at which Hyundai and Kia cars have 410 00:27:13,920 --> 00:27:19,160 Speaker 1: been stolen spiked over the past several months, though those 411 00:27:19,400 --> 00:27:21,600 Speaker 1: numbers are hard to get, and it all depends upon 412 00:27:22,119 --> 00:27:26,639 Speaker 1: local authorities and their reports.
But now Hyundai and 413 00:27:26,720 --> 00:27:30,280 Speaker 1: Kia are offering a software upgrade to folks who own 414 00:27:30,520 --> 00:27:33,360 Speaker 1: vehicles that fall in these model years, and the upgrade 415 00:27:33,359 --> 00:27:36,439 Speaker 1: will require owners to actually bring their car into a 416 00:27:36,440 --> 00:27:40,040 Speaker 1: dealership, and the process to do the software upgrade takes 417 00:27:40,080 --> 00:27:42,760 Speaker 1: around an hour start to finish. I mean, you might 418 00:27:42,760 --> 00:27:45,719 Speaker 1: have to wait longer for your appointment, but once they 419 00:27:45,720 --> 00:27:48,320 Speaker 1: get started, it should take about an hour. Afterward, the 420 00:27:48,400 --> 00:27:50,880 Speaker 1: vehicle should not start unless the owner has used their 421 00:27:50,960 --> 00:27:53,280 Speaker 1: key fob to unlock the vehicle first. So if you 422 00:27:53,400 --> 00:27:57,280 Speaker 1: use your key fob to lock your car, it activates 423 00:27:57,320 --> 00:28:01,200 Speaker 1: an ignition kill feature, and anyone who tries to 424 00:28:01,359 --> 00:28:03,639 Speaker 1: steal your car but doesn't have your key fob 425 00:28:04,040 --> 00:28:06,959 Speaker 1: will find it impossible to get the car to start, 426 00:28:07,440 --> 00:28:12,280 Speaker 1: at least using this previously known method. I would say 427 00:28:12,320 --> 00:28:15,760 Speaker 1: that really, this whole thing is the fault of Hyundai 428 00:28:15,800 --> 00:28:19,840 Speaker 1: and Kia, that they failed to address a security flaw 429 00:28:20,000 --> 00:28:23,080 Speaker 1: in their vehicles for years, and that really they should 430 00:28:23,080 --> 00:28:25,480 Speaker 1: have taken these measures much earlier. There should have been 431 00:28:25,480 --> 00:28:29,280 Speaker 1: something that was handled earlier. I can't imagine that it 432 00:28:29,320 --> 00:28:34,920 Speaker 1: was completely unknown for the entire time.
Typically people find 433 00:28:35,280 --> 00:28:40,640 Speaker 1: flaws in security, and then there's the opportunity to address 434 00:28:40,680 --> 00:28:43,600 Speaker 1: those flaws, and if you don't, there's the danger of 435 00:28:43,640 --> 00:28:46,160 Speaker 1: what happened in this case, where the flaws become widely 436 00:28:46,200 --> 00:28:49,640 Speaker 1: known and people start to exploit them before anyone takes 437 00:28:49,640 --> 00:28:53,160 Speaker 1: any action to address the problem. This is, by the way, 438 00:28:53,160 --> 00:28:58,000 Speaker 1: why I really admire the hackers who send messages to 439 00:28:58,280 --> 00:29:02,600 Speaker 1: a company saying, hey, I found a massive security vulnerability 440 00:29:02,640 --> 00:29:05,120 Speaker 1: in your system, you need to fix it, and if 441 00:29:05,160 --> 00:29:08,760 Speaker 1: you don't do it by a certain date, I'm going to 442 00:29:08,840 --> 00:29:10,880 Speaker 1: let everyone know about it, in which case it will 443 00:29:10,920 --> 00:29:14,200 Speaker 1: become a massive problem for you. They're doing a 444 00:29:14,240 --> 00:29:18,360 Speaker 1: real service, ultimately. It might seem mean, but if a 445 00:29:18,440 --> 00:29:22,840 Speaker 1: hacker finds a vulnerability while just trying to make sure 446 00:29:22,880 --> 00:29:25,880 Speaker 1: that a system is safe, you can bet the bad 447 00:29:25,920 --> 00:29:28,560 Speaker 1: guys are looking for those same vulnerabilities, and they ain't 448 00:29:28,560 --> 00:29:31,280 Speaker 1: gonna tell the company that they found it. Once they discover it, 449 00:29:31,360 --> 00:29:34,200 Speaker 1: they're just going to exploit it. So as for 450 00:29:34,360 --> 00:29:38,360 Speaker 1: the TikTok folks,
I wouldn't call them the same level 451 00:29:38,400 --> 00:29:41,600 Speaker 1: as the hackers I was mentioning earlier, but it definitely 452 00:29:41,680 --> 00:29:44,719 Speaker 1: forced the hand of the car companies to take action. 453 00:29:45,280 --> 00:29:48,640 Speaker 1: Earlier this week, The Information reported that Reddit is gearing 454 00:29:48,720 --> 00:29:51,360 Speaker 1: up for its initial public offering, or IPO, 455 00:29:51,920 --> 00:29:55,000 Speaker 1: later this year. That's when a privately held company becomes 456 00:29:55,040 --> 00:29:59,160 Speaker 1: a publicly traded company. Reddit, for those who somehow are 457 00:29:59,400 --> 00:30:03,360 Speaker 1: unfamiliar with the platform, is a social network where users 458 00:30:03,400 --> 00:30:08,080 Speaker 1: can join and browse various subreddits, subreddits focused on specific 459 00:30:08,160 --> 00:30:12,640 Speaker 1: topics of discussion, and there's pretty much a subreddit for everything. 460 00:30:13,040 --> 00:30:16,440 Speaker 1: Like, think of an amusement park you like, there's a 461 00:30:16,480 --> 00:30:18,800 Speaker 1: subreddit for that. Think of a television show you like, 462 00:30:19,120 --> 00:30:23,880 Speaker 1: there's a subreddit for that. Think of a fashion label 463 00:30:23,960 --> 00:30:27,480 Speaker 1: you like, there's a subreddit for that. Now it's pretty 464 00:30:27,480 --> 00:30:30,800 Speaker 1: common on Reddit for users to post links to interesting 465 00:30:30,960 --> 00:30:33,960 Speaker 1: articles and other stuff, and that becomes the focal point 466 00:30:33,960 --> 00:30:36,840 Speaker 1: for a lot of discussion. It also becomes the focal point 467 00:30:36,880 --> 00:30:40,360 Speaker 1: for a lot of snark. And back in twenty twenty-one, 468 00:30:40,640 --> 00:30:43,440 Speaker 1: Reddit was planning to hold its IPO. Like, 469 00:30:43,480 --> 00:30:46,479 Speaker 1: this was towards the end of twenty twenty-one.
Reddit was like, 470 00:30:46,800 --> 00:30:50,440 Speaker 1: we're gonna go public, but ultimately the company reversed its 471 00:30:50,480 --> 00:30:53,360 Speaker 1: decision and backed off of that plan. They had not 472 00:30:53,520 --> 00:30:56,240 Speaker 1: fully committed, and so they were able to back away. 473 00:30:56,280 --> 00:30:59,760 Speaker 1: So you might wonder, well, what was going on? Well, 474 00:31:00,040 --> 00:31:03,400 Speaker 1: back in twenty twenty-one, Reddit was really in an interesting place, 475 00:31:03,520 --> 00:31:07,920 Speaker 1: because you had these massively popular subreddits that were 476 00:31:07,960 --> 00:31:13,040 Speaker 1: specifically focused on investments and stock trading. This was 477 00:31:13,080 --> 00:31:17,000 Speaker 1: back when redditors were banding together to stick it to 478 00:31:17,080 --> 00:31:20,479 Speaker 1: hedge funds and to squeeze out short sellers who were 479 00:31:20,480 --> 00:31:23,440 Speaker 1: trying to short stocks in companies like GameStop, 480 00:31:23,960 --> 00:31:29,520 Speaker 1: and as a result, Reddit's, you know, value was perceived 481 00:31:29,560 --> 00:31:33,120 Speaker 1: to be higher than ever. Even though the company wasn't 482 00:31:33,160 --> 00:31:38,000 Speaker 1: really profitable, it was looked at as being really valuable. However, 483 00:31:38,880 --> 00:31:43,840 Speaker 1: shortly after Reddit had secretly filed for its IPO, 484 00:31:44,680 --> 00:31:48,920 Speaker 1: the economic status of the world began to shift. Right? 485 00:31:48,960 --> 00:31:51,440 Speaker 1: That's when we started to get this sense that we 486 00:31:51,440 --> 00:31:55,240 Speaker 1: were entering into that period of economic uncertainty that may 487 00:31:55,320 --> 00:31:58,640 Speaker 1: or may not be a recession.
And as a result, 488 00:31:59,480 --> 00:32:03,680 Speaker 1: companies started to rethink IPOs, because it's 489 00:32:03,760 --> 00:32:06,000 Speaker 1: hard to have a successful IPO in 490 00:32:06,080 --> 00:32:10,719 Speaker 1: a tough investment market, so Reddit ultimately ended up 491 00:32:10,760 --> 00:32:14,400 Speaker 1: trashing its plans. Now, at the time when they were 492 00:32:14,480 --> 00:32:19,920 Speaker 1: planning on going public back in late twenty twenty-one, they had estimated 493 00:32:19,960 --> 00:32:23,200 Speaker 1: the valuation of the company to reach around fifteen billion 494 00:32:23,320 --> 00:32:27,840 Speaker 1: dollars. And a quick explanation on that: a company's valuation 495 00:32:28,320 --> 00:32:33,360 Speaker 1: is essentially determined by how much the stock is valued, 496 00:32:33,520 --> 00:32:36,160 Speaker 1: like how high the stock price is, and you multiply 497 00:32:36,280 --> 00:32:40,280 Speaker 1: that by the number of shares of stock that are issued. 498 00:32:40,960 --> 00:32:44,880 Speaker 1: So if you have, you know, ten shares of stock 499 00:32:45,240 --> 00:32:48,880 Speaker 1: and they're ten dollars apiece, your little company is worth 500 00:32:48,920 --> 00:32:52,960 Speaker 1: a hundred bucks, right? So they thought, based upon 501 00:32:53,160 --> 00:32:56,760 Speaker 1: their perceived value, that the company's valuation would be at 502 00:32:56,800 --> 00:33:01,320 Speaker 1: around fifteen billion dollars.
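The back-of-the-envelope valuation math described above (share price times shares issued) can be sketched in a couple of lines; the function name and the larger share count are illustrative assumptions, not figures from the episode:

```python
# Company valuation (market cap) = share price x number of shares issued.
def market_cap(share_price: float, shares_outstanding: int) -> float:
    return share_price * shares_outstanding

# The tiny example from the episode: 10 shares at $10 apiece -> a $100 company.
print(market_cap(10.0, 10))  # 100.0

# Hypothetical scale-up: 1.5 billion shares at $10 lands at a $15 billion valuation.
print(market_cap(10.0, 1_500_000_000))  # 15000000000.0
```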
However, now according to Fidelity, and 503 00:33:01,360 --> 00:33:05,200 Speaker 1: this was reported by The Information, the estimation is closer 504 00:33:05,200 --> 00:33:07,640 Speaker 1: to six point six billion dollars, which is still a 505 00:33:07,720 --> 00:33:11,000 Speaker 1: huge chunk of change, but it's less than half of what 506 00:33:11,160 --> 00:33:16,280 Speaker 1: the company anticipated way back in late twenty twenty-one. There's also no 507 00:33:16,360 --> 00:33:19,440 Speaker 1: telling when we'll actually see Reddit make this move. Currently, 508 00:33:19,520 --> 00:33:22,560 Speaker 1: there's still this reluctance among private companies to jump into 509 00:33:22,600 --> 00:33:26,160 Speaker 1: an IPO due to this uncertain economic environment 510 00:33:26,480 --> 00:33:29,040 Speaker 1: and a fear that investors won't be willing to pour 511 00:33:29,120 --> 00:33:32,600 Speaker 1: money into a new public company. So it's kind of 512 00:33:32,640 --> 00:33:35,160 Speaker 1: like you've got a bunch of people all in bathing 513 00:33:35,160 --> 00:33:38,400 Speaker 1: suits standing around a swimming hole, but no one's ready 514 00:33:38,480 --> 00:33:40,320 Speaker 1: to be the first one to jump in, because that 515 00:33:40,360 --> 00:33:45,080 Speaker 1: water might be real cold. And finally, you remember Google Fiber? 516 00:33:45,640 --> 00:33:49,720 Speaker 1: This was Google's fiber optic service that would provide amazingly 517 00:33:49,840 --> 00:33:55,280 Speaker 1: fast Internet connectivity, but only for very limited markets in 518 00:33:55,360 --> 00:33:59,400 Speaker 1: specific places. Atlanta was listed as one of those markets.
519 00:33:59,720 --> 00:34:04,280 Speaker 1: At least, parts of Atlanta were. I might still be 520 00:34:04,360 --> 00:34:09,480 Speaker 1: a little bitter, in that I live in between two 521 00:34:09,880 --> 00:34:14,000 Speaker 1: different pockets of Google Fiber service area. Like, it's to 522 00:34:14,120 --> 00:34:16,880 Speaker 1: my west and to my east, but it never actually 523 00:34:16,880 --> 00:34:19,680 Speaker 1: extended out to where I live, and those two pockets 524 00:34:19,680 --> 00:34:24,520 Speaker 1: never joined together, and I'm like, smack dab in between them. 525 00:34:24,600 --> 00:34:27,440 Speaker 1: Even though I've been on the waiting list for literally years, 526 00:34:27,520 --> 00:34:29,960 Speaker 1: I have never been able to take advantage of Google Fiber. 527 00:34:30,400 --> 00:34:34,320 Speaker 1: Then a few years ago, Google essentially put the whole 528 00:34:34,520 --> 00:34:37,800 Speaker 1: effort on pause. They continue to offer the service to 529 00:34:37,920 --> 00:34:41,480 Speaker 1: people who are in service areas, but they stopped extending 530 00:34:41,520 --> 00:34:45,200 Speaker 1: those service areas. They kind of were like, let's stop 531 00:34:45,280 --> 00:34:48,240 Speaker 1: this for a while, possibly at least in part because 532 00:34:49,120 --> 00:34:52,160 Speaker 1: they were facing a lot of opposition from established telecom 533 00:34:52,200 --> 00:34:55,240 Speaker 1: companies that were trying to prevent Google from getting access 534 00:34:55,239 --> 00:34:58,640 Speaker 1: to utility poles and such. So essentially the telecom companies 535 00:34:58,680 --> 00:35:02,479 Speaker 1: were engaged in anticompetitive practices, but at the time 536 00:35:02,480 --> 00:35:05,120 Speaker 1: the regulators in the United States didn't have much bite 537 00:35:05,120 --> 00:35:07,600 Speaker 1: to them, so nothing got done about it. Also, to 538 00:35:07,600 --> 00:35:11,960 Speaker 1: be fair,
Google's not exactly pristine when it 539 00:35:12,000 --> 00:35:16,320 Speaker 1: comes to competition, right? Like, you're talking about a company 540 00:35:16,360 --> 00:35:20,719 Speaker 1: that has dominated multiple sectors of the tech industry, so 541 00:35:21,000 --> 00:35:24,279 Speaker 1: it's hard to be on Google's side when it comes 542 00:35:24,320 --> 00:35:27,640 Speaker 1: to anticompetitive stuff. Although, if Google is going 543 00:35:27,719 --> 00:35:30,600 Speaker 1: to come out clean against anyone, the telecom companies are 544 00:35:30,680 --> 00:35:36,120 Speaker 1: possibly one of the top rivals, right? Because you 545 00:35:36,200 --> 00:35:40,640 Speaker 1: just have these legacy stories of the telecom companies 546 00:35:40,680 --> 00:35:44,719 Speaker 1: that own the infrastructure being extremely protective of it and 547 00:35:44,920 --> 00:35:50,280 Speaker 1: attempting to limit or eliminate competition in that space. Anyway, 548 00:35:50,320 --> 00:35:52,680 Speaker 1: all of this is to say that Google Fiber appears 549 00:35:52,680 --> 00:35:55,759 Speaker 1: to be gearing up again. And Google has actually announced 550 00:35:56,200 --> 00:35:59,320 Speaker 1: that it's rolling out a five gigabit per second service, 551 00:35:59,400 --> 00:36:02,839 Speaker 1: both up and down simultaneously, so five gigabits up, five 552 00:36:02,840 --> 00:36:06,839 Speaker 1: gigabits down, but only to certain markets. Atlanta is not 553 00:36:07,040 --> 00:36:09,400 Speaker 1: one of them, I'm sad to say. However, if you 554 00:36:09,440 --> 00:36:13,160 Speaker 1: live in Utah, West Des Moines, or Kansas City, and 555 00:36:13,200 --> 00:36:15,839 Speaker 1: from what I understand, I'm talking about Kansas City 556 00:36:15,880 --> 00:36:21,320 Speaker 1: in both Missouri and Kansas, you might end up having access 557 00:36:21,360 --> 00:36:23,239 Speaker 1: to this kind of service.
It would cost you a 558 00:36:23,280 --> 00:36:26,359 Speaker 1: hundred twenty-five bucks a month to get the five 559 00:36:26,400 --> 00:36:30,120 Speaker 1: gigabit per second service, which is a hefty bill, 560 00:36:30,200 --> 00:36:32,080 Speaker 1: but I will tell you it's less than half of 561 00:36:32,120 --> 00:36:34,120 Speaker 1: what I have to pay for service that is not 562 00:36:34,200 --> 00:36:36,880 Speaker 1: as good as that. And yes, I am still bitter. 563 00:36:37,400 --> 00:36:43,080 Speaker 1: All right, that's it for the tech news for Thursday, February twenty twenty three. 564 00:36:43,360 --> 00:36:46,280 Speaker 1: I hope you are all well. If you have suggestions 565 00:36:46,280 --> 00:36:48,480 Speaker 1: for topics I should cover in future episodes of tech Stuff, 566 00:36:48,480 --> 00:36:50,680 Speaker 1: reach out to me. One way to do that is 567 00:36:50,719 --> 00:36:54,239 Speaker 1: on Twitter. The handle for the show is TechStuffHSW. 568 00:36:55,160 --> 00:36:57,359 Speaker 1: But another way is you can download the I Heart 569 00:36:57,440 --> 00:37:01,160 Speaker 1: Radio app. It's free to download, free to use, and 570 00:37:01,400 --> 00:37:03,640 Speaker 1: you can just go to that little search engine at 571 00:37:03,640 --> 00:37:06,279 Speaker 1: the top, type in tech Stuff. It'll take you to 572 00:37:06,320 --> 00:37:08,920 Speaker 1: the tech Stuff podcast page. There you will see a 573 00:37:08,960 --> 00:37:11,359 Speaker 1: little microphone icon. If you click on that, you can 574 00:37:11,440 --> 00:37:13,600 Speaker 1: leave me a voice message up to thirty seconds in length. 575 00:37:13,680 --> 00:37:15,040 Speaker 1: Let me know what you would like to hear in 576 00:37:15,080 --> 00:37:20,439 Speaker 1: the future, and I'll talk to you again really soon. 577 00:37:24,560 --> 00:37:27,560 Speaker 1: Tech Stuff is an I Heart Radio production.
For more 578 00:37:27,640 --> 00:37:31,040 Speaker 1: podcasts from I Heart Radio, visit the I Heart Radio app, 579 00:37:31,160 --> 00:37:34,320 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.