Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It is time for the tech news for Tuesday, February 7, 2023, and I'm late to the game on talking about this, but here in the United States, one of the big news stories at the end of last week was the discovery, and subsequent destruction, of a Chinese-made balloon that drifted across the United States, west coast to east coast. US officials say the balloon was a spy vehicle. Chinese officials claim it was for science rather than surveillance. It's all a moot point now, because late last week the US military shot down the balloon off the coast of South Carolina. It's turned into a giant political matter here in the United States, because conservatives are arguing that the President was indecisive and delayed too long before taking action.
Speaker 1: The President says he gave the order to shoot down the balloon as soon as the military deemed it would be safe to do so and would have the best chance of retrieving the balloon's payload, to make sure, you know, that in fact this is, like, an espionage device. And I'm sure we're gonna see both major parties here in the United States try to use this event to earn political points with their respective bases, and it's all very exhausting, so I ain't gonna give it any more air time. Like, I don't object to incorporating politics into tech stories, because the two do have overlap. I mean, one affects the other and vice versa, but this is getting to a point where it gets so petty, it's just exhausting. However, what I will do is plan a future episode about spy balloons and their history, because that's actually really neat stuff. So we will do that and avoid the kind of weird argumentative politics that we're hearing right now in the States. Last week, I also talked about how a lot of publicly traded tech companies here in the US were holding earnings calls with their investors.
Speaker 1: On Thursday, we got the Triple A calls. That would be the Amazon, Alphabet, and Apple earnings calls, and as I suspected, the calls showed that these companies are facing some challenges that have investors a little concerned in some cases. Apple reported that it had a revenue drop, so not a loss. It didn't operate at a loss, just a drop in revenue. But this would be the first revenue drop for Apple in more than three years, and investors really don't like it when winning streaks are interrupted. Sales were down five percent quarter over, well, year over year for quarter four of 2022, and that was a larger drop than what analysts had predicted. And the fault was largely laid at the feet of China, specifically having manufacturing facilities in China shut down during COVID-19 outbreaks. And considering the reported conditions of what it's like to work at one of these places, that all feels really gross to me. I don't think China has taken the right approach to handling COVID-19.
Speaker 1: But at the same time, it's really hard for me to say, ah man, we weren't able to make as many doodads because the government was trying to stop the spread of a highly infectious disease. So I guess what I'm saying is that, ultimately, I'm glad I'm not the one who has to make those kinds of decisions. Anyway, on top of the falling sales, Apple's profit fell by thirteen percent. Now again, Apple was still profitable. It just wasn't as profitable as it was this time last year. Apple did not indicate that it would be following the big tech trend of laying off thousands of employees, however, so that's good news for Apple employees. Amazon's news was even worse, in that the company reported that Q1 2023 operating profit might turn out to be a big old goose egg, a zero in other words. So no operating profit in the beginning of 2023, according to current estimations. Amazon executives have said that customers have been cutting way back on spending, and that was causing a massive hit to Amazon's revenue, and that even the layoffs and downsizing at Amazon are not counteracting that drop in customer purchases.
Speaker 1: And you know, we're talking about eighteen thousand jobs cut at this point at Amazon. One area where Amazon is doing okay is in its cloud computing services, where the company actually saw higher revenues than analysts had predicted. But yeah, the pandemic days of rampant online shopping have all really slowed down as people are being more frugal with their spending during this undefined economic downturn. But I call it a recession. Anyway, we have Alphabet, the parent company of Google and YouTube, and some other companies as well. Overall, the company saw declines in both revenue and profit, which again is not a huge surprise, because Alphabet draws the vast majority of its revenue from digital ads, and digital ad spending is down. In fact, according to Alphabet, the ad revenue fell by four percent in general and eight percent on platforms like YouTube. And like Amazon, Alphabet has also slashed a lot of jobs, twelve thousand in the case of Alphabet. So the call did not present a lot of surprises. None of these did, I would say. But yeah, it kind of confirms stuff that we already either knew or suspected.
Speaker 1: Other tech companies are also cutting back by eliminating more jobs. Dell Technologies recently revealed it will be laying off around five percent of its workforce. That would be more than six thousand five hundred employees. So why? Well, we all know why. It's the economic situation that, again, we don't have a word for. We just call it challenging. And while I hate seeing more tech employees face the possibility of being out of a job, I can't say that I'm really surprised by Dell's move. I mean, we all know that consumers have cut way back on spending in general, and that includes buying new computers. People are sticking with their devices longer rather than updating or replacing them, so the hardware cycle has changed a bit. This impacts companies like Dell big time. On top of that, Dell has seen some other competitors in the consumer PC space gain ground on them and get more competitive. So it's not just that Dell is selling fewer PCs. It's also that other companies are starting to catch up to Dell, or to really be a competitive threat. So it's a double whammy.
Speaker 1: And Boeing is yet another company in tech that has announced it will be holding layoffs. The company said it plans to eliminate around two thousand jobs, primarily in departments like human resources and finance, and then other departments like engineering and manufacturing are going to be protected. Boeing plans to outsource HR and finance roles to a company called Tata Consultancy Services, which is part of a larger conglomerate that is located in India. But unlike the other companies I mentioned so far, Boeing actually plans to grow its workforce this year. So last year Boeing hired on fifteen thousand new employees, so still a net gain even when you consider the two thousand layoffs coming up, and this year they plan on adding another ten thousand by the end of the year. So really, this sounds more like Boeing is putting its eggs in the products and services basket with engineering and manufacturing, and that this move is more about shifting finance and human resources focused parts of the business to outsourced jobs. So it's really more like a reorg than anything else, at least as of right now.
Speaker 1: Now we're about to enter into the AI slash chatbot part of our episode, because that's been the big news so far in 2023. So we all know about ChatGPT and the various concerns and criticisms and wild experiments that are related to it, and you likely know that Microsoft had previously invested a billion dollars in the company behind ChatGPT, called OpenAI. At least you would know that if you listened to my episode about OpenAI. Also, Microsoft is going to invest a lot more in OpenAI, up to around ten billion dollars according to at least some reports. Well, later today, Microsoft is holding a press event. In fact, it might have already happened by the time you are listening to this podcast. But since it hasn't happened yet as of when I'm writing the darn thing, the best I can do is talk about what people think is going to happen. That is that Microsoft is going to make an announcement that it's incorporating ChatGPT into Bing search in some way. Now, Google yesterday, in a kind of panicked and rushed way, unveiled its own chatbot called Bard.
Speaker 1: We had already heard about Bard, but at the time I was talking about it, it was referenced as Apprentice Bard and it was only an internal tool. So Google has now already launched a closed and private beta for a select group of users to experiment with Bard, and plans to open this up further to a lot more people in the near future, in the coming weeks, for example. So Google also plans a press event that will now happen tomorrow, on Wednesday. So it looks like Microsoft and Google are facing off in the AI chatbot space, particularly with regard to how it could be interwoven with search. Now, I've already talked about how this approach could ultimately cause a whole lot of problems online, mainly through cutting off vital Internet traffic to various websites that depend upon search results to get visitors. But yeah, it looks like this fight is coming to us pretty quickly. The two chatbots will be based off different language models. OpenAI's GPT, or Generative Pre-trained Transformer, is based off the transformer model that Google pioneered.
Speaker 1: Now, in this case, transformer doesn't refer to either a robot in disguise or a device that can step voltage up or down. Instead, a transformer in this sense is a machine learning model. Google introduced it in 2017, and it's a very popular model in machine learning applications. Google's language model is called Language Model for Dialogue Applications, or LaMDA. So you have GPT on one side and LaMDA on the other. Things do get a little more muddled, however. Google has invested several hundred million dollars in an AI company called Anthropic, which was founded by some folks who had formerly been researchers at OpenAI. So both Microsoft and Google are launching AI-powered tools that share some common DNA. I have not been invited to the beta for Google, so I haven't seen what that looks like yet, and the Microsoft event hasn't happened as of the time I'm recording this, so I can't really comment on what these look like or how they perform. And I am sure that people at Google and Microsoft are already aware of the potential dangers this AI could pose to business practices.
Speaker 1: So I'm not talking about, you know, doomsday prophecies about AI destroying the world or anything. Instead, I'm talking about AI eliminating an advertising platform that Google has depended upon for most of its existence. I'm sure they have contingencies in mind. I just don't know what those happen to be, but it should be fascinating to see how this unfolds. Also, if it gets good enough, it'll put me out of a job. I mean, why would you listen to me when you can just have your tech questions answered by a super fast robot, unless it's, you know, delivered with Southern charm? Okay, I gotta take a break from being charming. We will be back after these messages.

Speaker 1: Okay, we're back, and we're actually back with another AI chatbot story. So the United States is not the only country that's obsessed with this idea. Baidu, the enormous company in China, announced today that it has been testing a chatbot that it calls Ernie Bot.
Speaker 1: There's no word on whether this chatbot has a roommate that's called Bert Bot, but Ernie has been tested internally for some time, it seems, and these tests are scheduled to conclude sometime in March. Presumably, at that point Baidu will unveil the chatbot into some sort of public-facing phase where people will get a chance to use it outside of Baidu itself. Ernie actually is an acronym. It stands for Enhanced Representation through Knowledge Integration, and it sounds like it has a lot of the same sort of features that we've seen in stuff like ChatGPT as well as DALL-E, in that it is said to be able to generate text based on user prompts, so you can ask it questions and it will generate a response by drawing information from around, you know, its databases. And it can also do things like create images based on prompts, so you could tell it to create a serene landscape with a windmill next to a river or something, and it would do it.
Speaker 1: I think it will only be a matter of time before someone creates a roundtable discussion session with these various chatbots all talking to each other and has them discuss some sort of topic, like, you know, which Nintendo game is the best, or how to achieve world peace or something like that. It's just a matter of time. Even if it's done as a parody, I expect we'll see it. In fact, I'm sure we'll see lots of parodies where people will purposefully create fake conversations between these different chatbots for comedic purposes, and yet they will not reveal that it's all fake, because that's the world we live in, where you can't tell that something is satire because there's no indicator to let you know that, in fact, it is satire. It's just passed off as the truth. So it is, in other words, a lie. Yeah, I have thoughts about this. Now, I've got a couple of gaming stories to cover, both of which are a little rough. First up, Meta announced last week that it's going to sunset the VR game Echo VR coming this August.
Speaker 1: This game, while having a relatively small player base, has received critical and popular acclaim, and the folks who do play it are said to be really loyal fans, so this announcement came as a disappointment. Meta explained that the reason behind the decision is that the company wants to create other games and VR experiences, and maintaining a game requires resources that the company would rather focus on new opportunities. Now, that does make sense. Any online-focused game is going to require ongoing support or it will eventually fail, and considering that the player base is down to the low ten thousands, according to The Verge, you can understand why Meta would want to shift to something else that could potentially have a bigger impact. Even if revenue isn't a concern right now, it's bound to be at some point. Unfortunately, this kind of decision is also the sort of thing that can discourage adoption, because buying into VR is expensive. The equipment is costly, and that's before you even start buying games for it.
261 00:15:38,840 --> 00:15:41,040 Speaker 1: So if you're thinking about getting into VR, but then 262 00:15:41,120 --> 00:15:43,920 Speaker 1: you hear that one of the more praised titles is 263 00:15:43,920 --> 00:15:47,600 Speaker 1: getting shut down later this year, you might find yourself, 264 00:15:47,680 --> 00:15:53,000 Speaker 1: you know, reviewing the situation, as Vegan would say. John Carmack, 265 00:15:53,080 --> 00:15:55,720 Speaker 1: who until recently was a consultant for Meta on All 266 00:15:55,760 --> 00:15:59,400 Speaker 1: Things VR, voiced his disappointment with this decision. He said 267 00:15:59,440 --> 00:16:01,920 Speaker 1: that Meta is taking something away from the community and 268 00:16:01,960 --> 00:16:05,200 Speaker 1: that never really goes over very well. And I agree 269 00:16:05,240 --> 00:16:07,360 Speaker 1: with that. If you want to learn more about this, 270 00:16:07,440 --> 00:16:10,520 Speaker 1: I recommend the article in The Verge titled Here's why 271 00:16:10,640 --> 00:16:14,640 Speaker 1: Meta is shutting down Echo VR. The ViRGE also reports 272 00:16:14,640 --> 00:16:18,520 Speaker 1: that Amazon is phasing out fifty three games and it's 273 00:16:18,760 --> 00:16:23,760 Speaker 1: Luna cloud streaming service this month. So it's got this 274 00:16:23,920 --> 00:16:27,560 Speaker 1: video game streaming service called Luna, and fifty three titles 275 00:16:27,560 --> 00:16:31,520 Speaker 1: are going to leave in February over different dates. Back 276 00:16:31,560 --> 00:16:35,600 Speaker 1: in December, Luna rotated out another forty titles. Amazon says 277 00:16:35,640 --> 00:16:38,920 Speaker 1: the game strategy is to consistently switch out titles to 278 00:16:39,000 --> 00:16:42,440 Speaker 1: keep the service fresh. 
Speaker 1: That makes sense to a degree to me, but it does mean that gamers might find themselves focusing more on titles that are good for short sessions and don't involve long, protracted storylines, because you never know when that title is gonna go away. There's nothing like being halfway through saving the world and then finding out you can't access your game anymore. And what happens to that world? Y'all, think of the world! Well, according to Game Rant, Luna Plus has around a hundred ninety titles on its service, so fifty-three games means we're looking at a little more than a quarter of all the game titles leaving the service this month. I'm curious how popular the service is. I'll try to look into it. I'm thinking about doing an episode about the different cloud-based gaming services out there and any information we have about how well they're performing. In some cases, we don't really have any information at all, but we've definitely come a long way since the OUYA. Finally, last week I talked about Nothing, Forever. That's the AI-generated Seinfeld episode simulator that was broadcasting on Twitch.
Speaker 1: It uses GPT to create the dialogue. Well, they had a real problem recently, in that the character Larry, the stand-in for Jerry Seinfeld, was doing his stand-up and started spouting off transphobic and homophobic thoughts as part of his stand-up routine. This should not have ever happened, and Twitch banned the channel for fourteen days as a result. It turns out the issue was that the GPT model they had been using, which was codenamed Davinci, had problems at some point, and the model just stopped creating content, so you just had these empty rooms. So the team decided to roll back to an earlier version of GPT called Curie, and this one lacked the content moderation blinders that the Davinci model had. And so Curie was more lax in allowing problematic content to come through, and that's when we got the transphobic and homophobic content. So the creators say they do plan on coming back.
Speaker 1: They're going to put more content moderation features in place to prevent this sort of thing from happening, because obviously they don't want harmful content to come out from their little fun diversion, and that this was just a mistake based upon going to an earlier version of this language model. So it was really unfortunate. I'm glad that they're addressing it, and hopefully we don't see anything like this in the future. But it is one of the downsides to AI that's pulling from across the Internet to generate its content. All right, that's it. That's the news for Tuesday, February 7, 2023. Hope you're all well. I'm doing pretty well myself. If you would like to contact me with some future ideas for shows that I should do, you can do so on Twitter. The handle for the show is TechStuffHSW. Or, if you prefer, you can download the iHeartRadio app. It's free to download, free to use. You can type TechStuff in the little search bar. It will navigate over to the TechStuff page, and there you'll see a little microphone icon.
Speaker 1: If you click on that, you can leave a voice message up to thirty seconds in length and request your topic of choice, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.