Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? Y'all, it's time for the tech news for January thirty-first, twenty twenty-three. Saying goodbye to January already. And first off, this week is actually a really big one for several tech companies, because it's when they report their earnings for the previous quarter. Snap has their call today. By the time you hear this episode, that news may already be out there, but at the time I'm recording this, it hasn't yet happened. Meta has their earnings call tomorrow; everyone's gonna pay a lot of attention to that. And then Thursday is what I like to call a triple-A day, and by that I mean Amazon, Apple, and Alphabet all have their earnings calls this Thursday. There's always increased scrutiny of these companies around the earnings calls, and I admit that can get a little frustrating if you're really just interested in the tech. But honestly, it's a really good indicator of how these companies are doing, like how the overall tech economy is doing, and what we can expect going forward because of that. So if investors are unhappy with the results, we can see stock prices take a hit, and that in turn can affect a company's plans moving forward. For example, Meta has been under some serious fire from investors for a while now because of some obstacles the company has faced as far as getting, you know, more people on the platforms and increasing engagement. Also, Zuckerberg's continued commitment to building out the metaverse hasn't won him a lot of fans in the investor community.
Speaker 1: We'll probably also get a feel for how digital ad spending is going across the tech industry, because a lot of these big tech companies really depend on it for revenue. Like Alphabet, you know, the parent company of Google, and Meta both really depend on advertising for the vast majority of the revenue they generate. Of course, these are also companies that have recently engaged in massive layoffs, which we'll talk about a bit more in this episode. And layoffs obviously have a terrible impact on the people who are directly affected, but the investment world often reacts positively, because, I mean, it means the company has reduced its costs by eliminating workforce, which is a pretty clinical way to say a lot of people found themselves without jobs. It doesn't mean that a slimmed-down company is going to perform better, but at least in the short term you end up seeing a bigger return on investment because costs drop. And once again I get all sour on the concept of capitalism. Now, don't get me wrong, I don't have a better alternative up my sleeve, so I'm just complaining about something I can't change. It just really feels gross to me to have a system where investors are encouraged when people are losing their jobs. But that's enough commentary from me. I'm sure on Thursday I'll talk a little bit more about what happened, at least during Meta's call, but it probably won't be until next week before I can talk about Amazon, Apple, and Alphabet.

Speaker 1: Now, one thing I imagine Apple execs will not be addressing on Thursday is how the United States National Labor Relations Board, or NLRB, has accused the company of engaging in practices meant to discourage employees from organizing and unionizing. In fact, the NLRB says that various official Apple rules quote "tend to interfere with, restrain, or coerce employees" end quote from organizing. This was reported in Bloomberg.
And y'all, this is not great, but it's also not a surprise, because tech companies in general, and big tech companies in particular, have a pretty well documented history of forming policies intended to protect the company above the rights of the employees. You hear about it all the time. It comes in the form of arbitration agreements. This is where employees have to agree to settle issues within the company, by going through, like, HR, as opposed to going outside the company to resolve sometimes really serious problems, things like sexual harassment or cases of discrimination. So the company is compelling employees to sign an agreement that says you won't go to, say, a law firm in order to pursue your case; you have to go through the company. There's also the case of being compelled to sign non-disclosure agreements, which frequently do way more to restrict an employee's freedom than anything else. Anyway, Apple is not unique in this situation. It's not like they're the bad guy and the only one doing this, but it is currently facing these accusations from the NLRB, and there are a couple of options the company could take. One is that Apple could settle with the NLRB; presumably that settlement would require a commitment to change these various rules and policies. Or Apple could end up facing a formal complaint from the NLRB. I'd say this is another recent example of how we're seeing a shift in philosophy when it comes to big companies and employee rights here in the United States, although whether this ultimately results in massive changes throughout business remains to be seen. I'm somewhat pessimistic, simply because for a very long time companies essentially had the advantage in this area, and it's very hard to fight that sort of inertia.

Speaker 1: While we wait for Alphabet's earnings call, we actually have a couple of Google stories to talk about. So again, a reminder: Alphabet is the big umbrella company that is parent to Google and YouTube and all that.
Speaker 1: Anyway, Google recently sent out an email to Google Fi customers alerting them of a data breach. Now, Google Fi, in case you're not aware, is Google's cellular service, so it's like a telecommunications service, and Google piggybacks on top of other companies' infrastructure to provide it. So it's not like Google has its own cell towers out there; instead, they essentially lease that from other companies, companies like T-Mobile, which had its own data breach recently. And it is possible that this data breach for Google Fi is a direct result of the T-Mobile breach, but we don't have confirmation on that. According to Google, hackers accessed a database that included information on customers, but the info was limited to stuff like phone numbers, account status (so whether they're active or inactive), SIM card serial numbers, and details about the service plan each customer subscribed to. Now, what was not included were names and other personal information. Also, Google says no passwords or payment data or text message content was included in the data, so that's good, at least. Google did warn customers to be on the lookout for phishing scams, which the company stressed is a good idea anyway, but it's particularly good advice when you have a data breach like this. And according to TechCrunch, at least one Google Fi customer has reported being the victim of SIM swapping. That's when a hacker hijacks a SIM card to take control of a phone number, and that allows them to send and receive phone calls as if they were that phone, as well as access voicemail, which is a big old yikes.

Speaker 1: Speaking of Google, as I mentioned earlier, the company recently laid off some employees; it has plans to lay off twelve thousand total. And one of those employees was Kristen Maxo, um, or Maxko; I apologize, Kristen, I do not know how to pronounce your last name, and that's my fault.
But Kristen joined Google back in two thousand and eight and had served as the company's head of mental health and well-being, and apparently her team was hit significantly by these layoffs. And I think that sends a pretty bleak message to the employees who remain at Google, because I imagine hearing that the mental health and well-being department has been largely cleared out is not great news when you start thinking about what that implies. It suggests that Google does not view the mental health and well-being of its employees as a critical component of its business, and that's just not a good sign. I've worked for companies where I ultimately felt like the employees were unappreciated or taken advantage of, and it is the worst. Right? Even if you love what you do, if you feel like you're working for an organization that doesn't have an appreciation for its own employees, that's demoralizing, and that can be enough to convince someone to leave, even if they love their work. Trust me, I know, because I did exactly that.

Speaker 1: Anyway, according to Insider, some employees hit by the layoffs received kind of a terse, impersonal email that essentially just told them they no longer have a role at the company. And in at least some of the cases, a few employees said their managers didn't even know that this had happened, which suggests that the managers weren't consulted before these layoffs started to roll across the company. So the impression I get from the article in Insider is that these layoffs may have lacked forethought and consideration. Now, maybe I'm reading too much into it, but that's what it says to me. And again, that could really hurt the company in the long run if it turns out that some of these layoffs were for people they really needed in order to keep certain projects going and to ensure success.
But again, I don't know how I would go about telling twelve thousand people they don't work for me anymore, because I think singing telegrams would get too expensive. All right, we've got a lot more news to cover. Before we get to that, let's take a quick break.

Speaker 1: Okay, we're back, and I got a little bit more Google news. One person who had some words of warning for Google is Paul Buchheit, or "book-hite"; I don't know how to say his last name either. I'm batting a thousand when it comes to names today, y'all. But anyway, Paul, as I will call him, led development on Gmail, so he's credited as the person who created Gmail. And he predicts that ChatGPT will completely replace Google search within a year or two, as people start to lean on ChatGPT to get their questions answered. Instead of going to search for the right link and then searching for their answer there, they'll just ask the question to ChatGPT. They'll get the answer and then they'll just skip Google entirely, and this in turn will destroy Google, because Google depends so heavily on ads that are served within search results. Paul points out that even if Google deploys its own AI tool, which we know Google has been developing (that's not a secret; the company has been very forthright on that, saying yes, they're working on AI, but it's not something they're ready to release yet because it could be potentially harmful), the effect would still be the same if Google launches its own version of this, because again, it bypasses search, which means it bypasses where you would normally serve up ads. But y'all, I want to just point out that it would do way more than just that. It would hurt way more than just Google, because think about what shows up in search results in the first place. It's a list of links to external websites. That's where all this data lives.
Right? When you're asking a question of Google, the data lives on these websites, and, you know, the goal is to have you click on the links so that you visit the website, and then chances are that website is generating revenue from ads too. But if ChatGPT disrupts web search to that extent, it's not just gonna hit Google. It's going to hit all the sites that Google links out to. And, you know, ChatGPT is drawing information from these sites, because it's not like ChatGPT just magically knows the answers to questions. It's pulling data from sources in a way that's not entirely, or even particularly, transparent. So we don't actually know where ChatGPT is pulling data from in any given moment. But think about it for just a moment. If search essentially dies, then how do websites attract users to their sites? Now, some sites are kind of like landing pages. Folks might bookmark them, they might go to them regularly and not rely on search. Right? You might have certain news sites that you go to on a regular basis. In those cases, you know, they get a lot of traffic that's coming in just from folks jumping straight to the website. But a lot of sites depend heavily on traffic from search engines. People search for specific terms, they get results, they go to those websites, and that's where most of the traffic coming to those websites comes from: the search engines. So if the search engine goes down, that traffic disappears. Then eventually these sites start to go out of business, because no one's visiting them, so they're not getting any ad revenue, and, you know, it costs money to keep those sites up. It's eventually all just gonna die. Now, ultimately that means that the information sources that ChatGPT depends upon will start dying off, and that would mean the tool would become less reliable, or maybe more biased, depending upon what sites are sticking around.
Speaker 1: So, in other words, I don't think a future where ChatGPT takes over for search is bad just for Google. I think it's bad for everyone, including ChatGPT, in the long run. So that's something to think about. I don't know that there's any way to avoid it, but I definitely see it as being a bad thing, like it would require some careful work in order to either avoid or fix that problem.

Speaker 1: Okay, now let me paint a scenario for you. This is a news story. Let's say that you are an AI company that specializes in creating synthetic human voices, including voices that could sound really close, or maybe even identical, to the voices of specific people. Like, you could replicate the voices of famous people, and you've got this tool that can learn how to reproduce the sound and vocal quality of a real, specific human being's voice. Now, would it shock you to learn that some people on the Internet would use this kind of tool to do awful things? Oh, you wouldn't? Ah, I see, you've been on the internet before. You already realized this would be a problem right out of the gate, right? Like, yeah, of course, it's obvious. Well, that means you're ahead of ElevenLabs. That's a company that makes a voice synthesis tool. It launched a beta for its AI-generated voice tool, and then even more recently than that announced that it had detected a quote "increased number of voice cloning misuse cases" end quote. So this is all reported in Vice, which has a very long article detailing how members of 4chan had enrolled in this beta and then used the tools to create synthesized AI replications of voices belonging to folks like the actress Emma Watson or podcaster Joe Rogan and others. And then, and I'm sure this is gonna shock you, they used these synthetic voices to make them say all sorts of truly awful stuff. I'm talking like homophobic and transphobic comments, racial slurs, violent threats, all sorts of stuff like that.
Again, this is only shocking for someone who maybe was born yesterday, but for anyone who actually is aware that 4chan is a thing, this is not surprising even a little bit. And considering we've already seen plenty of people use deepfake technology to create content without the consent of the folks who were replicated in those videos, you can only imagine the sort of stuff we could see in the future. And I think I'm gonna need to do a full episode about this problem. I probably want to talk with someone who has either experienced this kind of violation of their identity themselves, or someone who has worked closely with people who have experienced it, because I honestly think it's really important to realize the extent of the harm this abuse causes. And I want it to come from a genuine place, so I'll probably be on the lookout to see if I can find someone who would be willing to talk about that, because it's a really despicable practice to replicate someone without their consent. It is a true violation. Anyway, ElevenLabs has chosen to lock down the beta and revisit their approach to safeguards, which, hey, is a good idea. It sounds like they're going to require identity verification before you're allowed to replicate a specific voice, and to show that you have the consent to do that. I think that's the bare minimum you can do. And obviously you have to keep the tool under lock and key. You can't just release it, because otherwise people would just exploit it. So, interesting story, disturbing look at the future. But we knew this was coming, I mean, because deepfakes have been a thing for a while now.

Speaker 1: The Wall Street Journal reports that despite US export restrictions that should prevent certain Chinese companies and organizations from being able to purchase stuff like cutting-edge computer chips, the China Academy of Engineering Physics seems to be able to do that without any real problem.
This is one of the organizations that the United States has already put on a blacklist for semiconductor companies and chip manufacturers. In fact, the US has told companies like Nvidia and Intel expressly: do not sell hardware to the China Academy of Engineering Physics, because the organization's work includes nuclear weapons research. So if that ban is in place, how is this organization acquiring the chips? Well, largely through resellers in China. You have other entities in China buying up these components and then reselling those to organizations like the China Academy of Engineering Physics. It is very hard, almost impossible, to prevent all sophisticated chips from being shipped to China, because some of these components were clearly originally part of a larger system, like a PC or a gaming rig. The Wall Street Journal reports that the chips in question are typically a couple of years old, so they aren't the most powerful, most recent ones on the market. It's hard to imagine how companies can make certain that all the retailers they work with comply with these rules and that none of them are shipping to China. Meanwhile, China has to purchase these chips because it lacks the manufacturing facilities to make the most sophisticated processors, particularly at scale. China lags behind everybody else by a few generations, and so these blacklists do slow down China's military use of such technology. But the article really does point out how hard it is to enforce a total ban. Okay, I've got a few more tech stories I want to talk about. Before I get to that, let's take another quick break.

Speaker 1: We're back. Techdirt has an informative piece by Mike Masnick about how a paralegal named Kathryn Tewson is pulling at the threads holding together Joshua Browder's story and image. So Browder, in case you don't know, is the CEO of a company called DoNotPay. I have mentioned DoNotPay a few times already this year.
It's best known for being an AI-focused company that aims to use artificial intelligence to help people do stuff ranging from canceling subscription services they're not using anymore to fighting parking tickets and other legal-related work. Well, Tewson decided to put some of DoNotPay's services to the test. She reported that her attempts to have the AI generate certain legal documents were largely a failure. In one case, it was pretty much like the service was just kind of filling in the blanks on a form letter kind of approach. And in another, she put in for a more complicated legal document and received a response that it would take a few hours to process. She suggests that maybe that indicates there is little to no AI involvement in that process at all; that if it were computer generated, you wouldn't expect it to be something that takes a long time, and that perhaps it means there are actual humans putting together those kinds of requests, or at least having a heavy involvement in editing an AI-generated piece. In other words, it seems to indicate the AI is not quite as sophisticated and effective as the company might otherwise suggest. She has also investigated some of Browder's claims, including one where he tweeted he would buy up ten dollars of medical debt for every retweet and follow that this particular tweet got. Tewson suspected that Browder had not followed through on that promise. At that point, Browder produced a receipt showing a five-hundred-dollar donation, which bought up around fifty grand of medical debt, because debt is sold for pennies on the dollar. But Tewson found this receipt somewhat suspicious and hypothesized that Browder had not made the donation when he claimed he did.
He claimed he had made it back in early December last year, but Tewson looked closer at the receipt and saw that the date next to the line item failed to sit on the exact same horizontal line as the rest of the receipt. She said, oh, it's misaligned. She even made her own donation to buy up medical debt and showed that on her receipt everything was in perfect alignment, but his receipt showed a little misalignment, which suggested the possibility of a Photoshop job to change the date. And then Tewson decided to look into it further by actually contacting the organization that sold the medical debt in the first place, and found that Browder's purchase happened four minutes after she called his follow-through into question. So she says, hey, whatever happened to this guarantee? Four minutes later, he apparently makes this purchase, and then subsequently says, yeah, I did it way back in December. So this piece really paints Browder in a critical light, suggesting that he is leaning perhaps a little too hard on publicity stunts and hype. We've been hearing these kinds of thoughts about Browder for a while now, particularly in the wake of his offer to pay a million dollars to a lawyer who would use DoNotPay's AI to argue a case before the US Supreme Court. So I guess the lesson here is to rely on critical thinking and investigation like Tewson did, and do your best to try and hear through the hype.

Speaker 1: Speaking of hype, Mercedes-Benz announced last week that the company will introduce a level three autonomous driving feature in vehicles here in the United States later this year. So the various levels of autonomous driving come out of the Society of Automotive Engineers, which defined the levels. They identified six levels, from zero to five. A level zero vehicle essentially has no autonomous features and is strictly manual.
A level five vehicle would be totally autonomous and would probably even lack manual controls entirely; it would just be computer controlled and automated in every situation. Levels one and two include driver-assist functions, but ultimately a human is driving the vehicle. Even if their feet can be off the pedals or they're not actively steering, they are technically still driving; they're considered the driver. A level two vehicle can provide steering and braking and acceleration support, but again, ultimately a human is doing the driving. And this is the level that stuff like Tesla's Autopilot and even its Full Self-Driving features fall into; they're level two. Once you hit level three, then you say the vehicle is doing the driving when the autonomous mode is active, even if someone is sitting in the driver's seat. The vehicle is doing the driving as long as the mode is in operation. Now, a level three does have a feature where the autonomous mode may request to hand off control to a human driver, so you do still need someone in that driver's seat who is ready at any time to take over control of the vehicle. That is one of the features of a level three. A level three, and even a level four, autonomous vehicle can only operate in autonomous mode under specific conditions, so it's not in every scenario. Conditions typically include things like weather, right? So if the vehicle is in really foggy weather or it's really storming, maybe you can't engage autonomous mode under those conditions. Or it could include things like geofencing; in other words, the vehicle is only going to operate in autonomous mode within a certain driving range, and outside of that it has to be manually driven. So in this specific case, the system is only going to be available to customers in Nevada here in the US, because that's where it's legal, and other states have not created laws that match with this yet, so it will not operate outside of Nevada.
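To keep those six levels straight, here is a minimal sketch in Python summarizing the SAE J3016 levels as described above; the enum names and helper functions are illustrative shorthand, not the SAE's or any automaker's actual API.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving automation levels, zero through five."""
    NO_AUTOMATION = 0           # strictly manual, no autonomous features
    DRIVER_ASSISTANCE = 1       # driver-assist features; the human is driving
    PARTIAL_AUTOMATION = 2      # steering plus braking/acceleration support
                                # (where Tesla's Autopilot and FSD sit today)
    CONDITIONAL_AUTOMATION = 3  # the car drives while the mode is engaged,
                                # but may hand control back to the human
    HIGH_AUTOMATION = 4         # the car drives, but only under specific
                                # conditions (weather, geofenced areas)
    FULL_AUTOMATION = 5         # the car drives everywhere; may lack
                                # manual controls entirely

def human_is_driving(level: SAELevel) -> bool:
    # At levels zero through two the human is always considered the driver;
    # from level three up, the system drives whenever its mode is active.
    return level <= SAELevel.PARTIAL_AUTOMATION

def handoff_request_possible(level: SAELevel) -> bool:
    # A level three system can ask the human to take back control, which is
    # why someone still has to sit in the driver's seat, ready to drive.
    return level == SAELevel.CONDITIONAL_AUTOMATION
```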
Speaker 1: And the system is gonna include sensors as well that will ensure the driver doesn't obstruct their face from view of the road, so you can't, like, hold up a magazine or a book or something like that and block your face. If you do, then the car detects that and switches back to human-controlled mode. So it does still have some big limitations on it, which makes sense, right? You want that there for emergencies and for safety's sake. But it's very interesting to see a level three vehicle coming onto roads in the United States, even if it's just in Nevada.

Speaker 1: Lowe's, the hardware store company, has innovated a theft prevention system they're calling Project Unlock that I think is actually kind of nifty. So theft is a big problem for stores in general, and organized theft is cited as a real cost for retail companies like Lowe's. You have to figure out a way to prevent theft, but you also want to avoid frustrating customers. In other words, locking everything away in special cases, where you have to go and hunt down a customer service rep to come and unlock one so that you can buy your power drill or whatever it is, is not a great customer experience. So what Lowe's has done is they've made special use of RFID chips that keep the tool inert until it is activated at a point of sale. The way this works is, let's say you go and you pick up a box, and it's got a drill in it, an electric drill. You bring the box to a cash register, they scan the barcode on the box, you pay for your purchase, and at that point the point-of-sale system sends a signal to the specific RFID chip inside your drill, the one that's tied to the barcode on the box, and then the tool activates. Without this step, the tool will not work. You can charge up the battery, you can plug it in, it won't work, because it is inactive until it has gone through the point of sale.
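As a rough sketch of that activation flow: Lowe's hasn't published implementation details, so the names, the barcode-to-tag mapping, and the activation logic here are all illustrative assumptions, not the actual Project Unlock design.

```python
# Hypothetical sketch of a Project Unlock-style activation flow.
# The store's database pairs each box barcode with the RFID tag ID
# of the specific tool sealed inside that box.
BARCODE_TO_RFID_TAG = {
    "0123456789012": "rfid-tag-7f3a",
}

class PowerTool:
    """A tool that ships inert and only runs after its chip is activated."""

    def __init__(self, rfid_tag: str):
        self.rfid_tag = rfid_tag
        self.activated = False

    def pull_trigger(self) -> str:
        if not self.activated:
            return "inert: no point-of-sale activation on record"
        return "motor spins up"

def point_of_sale_checkout(barcode: str, tool: PowerTool) -> None:
    # Scanning the barcode at the register looks up the paired RFID tag
    # and transmits an activation signal to that one specific chip.
    if BARCODE_TO_RFID_TAG.get(barcode) == tool.rfid_tag:
        tool.activated = True

# A lifted tool never passes through checkout, so it stays a paperweight.
drill = PowerTool("rfid-tag-7f3a")
print(drill.pull_trigger())                     # inert
point_of_sale_checkout("0123456789012", drill)  # legitimate purchase
print(drill.pull_trigger())                     # motor spins up
```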
So the idea here is that only legitimate purchases will be active, and anything else that was, say, lifted off the store shelf will be a big old paperweight without that activation. Now, I'm not gonna say that this approach is foolproof, because nothing ever is. Hackers will likely figure out a way to replicate the activation eventually, but, you know, it might be enough of a hassle, a high enough barrier, to take a serious chunk out of organized theft at stores that use Project Unlock. Also, you know how they say never read the comments? Well, the article where I was reading about this was on Fox Business, and I gotta say, take that advice, because I did read some of the comments and I wish I had not. There was one person who suggested that this could be a way for tool companies to create a tools-as-a-service subscription fee. In other words, if you don't pay a yearly fee, then they could turn off your power tools. That would be really tricky to do, because you would have to be within range of an RFID transmitter that is specifically going to connect with the chip that's in your tool, and you'd have to have it be specific, because you can't just do a blanket deactivation. If someone had enrolled in that subscription service, then you would have to make sure their tools would not be affected, and it would mean you would have to send people out into the real world with transmitters, zapping neighborhoods and rural homes and all this kind of stuff to deactivate tools. It's just not a practical thing. So it's not something that I think would ever actually happen.
I understand the concern. Like, if you think, oh, there's a chip in this that could theoretically turn my tool into an inert piece of plastic and metal, obviously that would be concerning. But that's just not how this technology would work in the real world.

Speaker 1: Bloomberg reports that Sony is cutting back on producing PlayStation VR2 headsets for the latest PlayStation due to low preorder figures. So presales have been slow for the VR2 headsets, apparently so much lower than the company anticipated that they're scaling back on production in anticipation of very slow sales. Part of this could be due to the fact that people in general are just being a bit more cautious with their money than they have been in recent years due to economic factors. So, in other words, it may have nothing to do with the perception of VR; it may have more to do with people just being careful about their spending. However, another part could be an indication that VR is still occupying a fairly niche market, and you could think of that as being really bad news for stuff like the metaverse, let alone other VR, AR, or XR gear, moving forward. According to Bloomberg, it sounds like Sony is cutting back by half for the launch quarter, which again is later this year. So originally they had planned to produce around two million units for that first quarter at launch, but now it sounds like it's going to be a million units. And further, Sony expects to only sell around one point five million headsets between the launch and March of next year. Now, I think it's still too early to say that VR is sinking, or even just to say that VR is treading water. I think what we need to do is see if, when the economy starts to recover, people start to spend more on VR hardware, and then we can get a better idea. Part of what we need are more developers to create really compelling VR experiences.
But that becomes like a chicken-and-egg problem, because if you're a developer, you only want to pour your money and effort and energy and time into producing something that's going to have a return on investment, and if there aren't enough people out there who own VR headsets, it doesn't make sense to make that investment, because you'll never make your money back, right? It's just a money-losing exercise. So that's part of the problem: a lot of people say "I don't want VR because there's nothing on VR that I want to do," and meanwhile developers are saying "well, we don't want to make anything for VR because no one owns a VR headset." And I'm using exaggerations here; obviously there are people who own them, but very few in the grand scheme of things. So that's really where we are right now. It's a problem that's been this way for a few years. I personally don't really care for VR, but that's because I get real urpy when I put on a headset; like, within a few minutes, I start to not feel so good. But I would still like to see the technology evolve. I think it has a really interesting place, but that's not going to happen unless the money is there.

Speaker 1: Finally, it seems like every year we wait to see if E3 is going to be done for reals. It is scheduled to take place in June this year after not happening last year, but it sounds like it's gonna be a pretty rough go for E3, and that's because, according to IGN, Microsoft, Nintendo, and Sony are all reportedly skipping the event and instead opting to hold their own events individually, whenever they feel like it. Now, this should not come as a big surprise, because we've seen these companies do similar things before. I mean, Sony and Nintendo in particular have skipped E3 in the past.
Sony hasn't been involved in E3 since two thousand nineteen, and Nintendo instead chooses to hold Nintendo Direct events and its own online events throughout the year. So it's a pretty tough blow when the companies behind the major gaming platforms all choose to skip a video game conference, particularly when those three companies also own video game developers and publishers. If the developers and publishers they own also skip E3, that's huge. Now, it doesn't necessarily mean E3 is really most sincerely dead, because there are other video game developers and publishers that are not owned by Sony or Nintendo or Microsoft, and they might still attend. But it does make the event a tougher sell, both to industry professionals, to the media, and to the general public. So we'll have to keep our eyes open and see how things go. You know, E3 has had to pull the plug a couple of times in the past. It may very well be that we're at the end of the line for E3, at least as a trade show. It may become more of a fan experience, although it has not been doing particularly well in that regard, at least from the actual perspective of a fan. Like, it's frustrating to go to E3 as a fan. It is not as much fun as you think it might be, because you're spending the vast majority of your time waiting in a line to play a game for five minutes. So, not the best experience for a fan. We'll have to see if E3 makes some massive changes to kind of forge a future for the event. Otherwise, I think we may finally be getting to the end. Of course, people have been saying that for like more than a decade, so maybe it'll be like Moore's Law; it'll just stick around beyond comprehension. Okay, that's it for this episode of TechStuff. Hope you are well.
Speaker 1: If you would like to reach out to me and let me know of something I should cover in a future episode, please do so. You can do that on Twitter; the handle for the show is TechStuffHSW. Or you can download the iHeartRadio app. It's free to download, it's free to use, and it's got a little search toolbar built right into it. You type TechStuff in there, it'll take you to the TechStuff page in the app, and there you'll find a little microphone icon. If you click on that, you can leave a voice message up to thirty seconds in length. Let me know what you would like to hear in the future, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.