Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It is time for the tech news for the week ending on November first, twenty twenty four. And here in the United States, we are rapidly approaching election day, which is Tuesday, November fifth. Now, for those of y'all eligible to vote in the United States but who have not yet done so, I urge you to get out there and vote. Do your research. Look into candidates and the issues in your area so that you can make an informed decision, and go deeper than just sites that are run by politically aligned outlets. Look for information that's as objective as possible. It's very easy to fall into a trap of misinformation or bias, and that goes for any side. I'm not picking specific sides here; it's true across the board. But for goodness sakes, get out and vote. It's one of the most important components in maintaining a democracy, and the very act of voting is under assault these days. And I don't know about you, but I would rather continue to have a say, even if it's just a tiny say, in how the country I live in operates.

Now that we have that out of the way, let's get to a whole bunch of tech stories that relate to the election, because there are tons of those this week, which is no big surprise.

First up, some voters in Pennsylvania received a strange text message this past weekend stating that they had voted in the November fifth election, even though they hadn't done that yet. Now, obviously that could stir up some pretty intense confusion and negative emotions. If you received a message saying that you had already voted and you had not already voted, clearly you would suspect hanky-panky going on. Now, in this case, it wasn't any sort of hanky-panky at the actual polling locations. It wasn't like the Pennsylvania Department of State messed up.
Instead, it was with an organization called All Vote, and All Vote had previously gotten into hot water for texting registered voters to alert them that they were not actually registered to vote, when in fact they were. County officials in different regions of Pennsylvania have warned that that particular approach was a hoax designed to gather personal information from people that would later be exploited against those people. Max Marin and Katie Bernard reported on this issue in the Philadelphia Inquirer in a piece titled "Pennsylvania residents got phony texts claiming they had voted already. Ignore them, officials say." They also quoted a contracted spokesperson for All Vote named Charlotte Clymer. Clymer has claimed that this is all just an innocent but incredibly dumb and careless mistake, made when All Vote employees entered some sort of typo into their system. So apparently the original idea was that the organization would text folks who had voted in the twenty twenty two election and direct them to voting resources for the twenty twenty four election. Now, why All Vote felt the need to inform people that the vote they had cast two years ago had in fact been counted is a bit beyond me. That ship has sailed. Like, I guess the message would be, you voted in the election, meaning the twenty twenty two election; here are some resources for you. But because it left off the twenty twenty two part, it seemed to indicate that people had already voted in this election. That messaging seems unnecessary to me in the first place. And if that messaging had been left out, you know, the whole "you had voted in the election" bit, there'd be nothing to talk about with this story. It would just be a text message saying, hey, here are some resources for voting in this election.
Clymer has said the founders of All Vote are progressives, but declined to give more information about them, saying that there's a concern for their safety and that hard-right-leaning voters might do them harm. This is where I say that while I consider myself a progressive, so I would consider myself aligned with the people who are at least alleged to be behind All Vote, I also think All Vote's operations are not great. Like, they're not done properly. They were poorly thought out and executed, and if the owners of All Vote are actually sincere, they're likely to cause more harm to their cause than to help it through these kinds of operations. Unless this was an actual attempt to confuse and discourage people in, say, hard-right counties from voting, in which case that's just despicable. Like, I don't care what side you're for. At least according to this spokesperson, the people who own All Vote believe in the same things I generally believe in politically, but I don't believe in discouraging people through deception so that they don't vote. I think that's reprehensible. Whether or not that's what was going on, I don't know. It could be that this really was an innocent mistake and just a poorly thought out campaign in the first place. Either way, a great way to use technology to discourage participation in the democratic process.

Speaking of unintentional tech issues affecting the election, let's talk about Colorado. It seems that the Colorado Department of State committed a major booboo, or, as Todd Feathers of Gizmodo called it, a bit of an uh oh. This comes from an article written by Todd Feathers titled "Colorado agency improperly posted passwords for its election system online." Yeah, that's not good. So what actually happened? Well, the Department of State had a link to a spreadsheet on its website, and the spreadsheet included some hidden tabs, so upon casual glance there was nothing hinky going on. However, you can just choose an option to unhide tabs, and one of those tabs contained passwords for the state's voting machines. Yikes, I hear you say.
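To show just how thin "hidden" is as a protection, here's a minimal sketch, assuming Python with the openpyxl library; the file name is hypothetical, not the actual Colorado spreadsheet. It lists and reveals every hidden sheet in a workbook:

```python
# Minimal sketch: hidden spreadsheet tabs are metadata, not security.
# Assumes openpyxl is installed; "election_stats.xlsx" is a
# hypothetical file name, not the actual Colorado spreadsheet.
from openpyxl import load_workbook

wb = load_workbook("election_stats.xlsx")

for ws in wb.worksheets:
    # sheet_state is "visible", "hidden", or "veryHidden" --
    # all three states are stored in plain sight inside the file.
    if ws.sheet_state != "visible":
        print(f"Revealing hidden tab: {ws.title}")
        ws.sheet_state = "visible"

wb.save("election_stats_unhidden.xlsx")
```

The point being, anyone with a copy of the file can do the equivalent with a couple of clicks in any spreadsheet app. Hiding a tab changes presentation, not access.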
Now, it's bad. There's no sugarcoating it. That's bad. That's a terrible breach of security etiquette. It is not, however, catastrophic. Why? Well, for one thing, these voting machines actually require two passwords. It's kind of like one of those things you see in movies where you have to have two keys to activate the nuclear arsenal or whatever, and you have to have two people across the room from each other insert and turn the keys simultaneously to activate it. Well, in this case, the voting machines require two separate passwords, and those passwords are not stored together. So while one set of passwords was available in the spreadsheet, the other set was not. Even if you had accessed the spreadsheet, revealed the tabs, and written down all the passwords, you would still only have one half of what you'd need.
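As a rough illustration of that split-knowledge idea, here's a minimal sketch; the store names, machine ID, and passwords are all hypothetical, and this is the general pattern rather than Colorado's actual system:

```python
# Minimal sketch of a split-knowledge ("two-person rule") check.
# Stores, machine IDs, and passwords are hypothetical -- this is
# the general pattern, not Colorado's actual implementation.
import hmac

# Each password set is held by a different custodian, so leaking
# one store reveals only half of what an attacker would need.
STORE_A = {"machine-17": "correct-horse"}   # e.g., the leaked set
STORE_B = {"machine-17": "battery-staple"}  # e.g., kept separately

def unlock(machine_id: str, pw_a: str, pw_b: str) -> bool:
    """Both passwords must match; compare in constant time."""
    expected_a = STORE_A.get(machine_id, "")
    expected_b = STORE_B.get(machine_id, "")
    return (hmac.compare_digest(pw_a, expected_a)
            and hmac.compare_digest(pw_b, expected_b))

# Knowing the leaked spreadsheet is like knowing STORE_A only:
print(unlock("machine-17", "correct-horse", "a-guess"))  # False
```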
And that's not all. In order to actually use these passwords, you would need to have physical access to the voting machines themselves. In order to do that, you would need credentials to get access to the areas where the voting machines are stored. They're also under constant video surveillance, so you'd have to deal with that as well. Plus, Colorado uses paper ballots, so there's a physical paper trail of every vote cast. Voters verify that their ballots are correct before they cast their vote, so even if someone did access the machines somehow and messed with them, the paper trail would remain a resource for verifying the votes that had been cast. So while the mistake is a giant whoopsie and absolutely should not have happened, the voting process itself is still reliable. Of course, that's not stopping some people from using this to bring into question the results of any votes cast in the state. In other words, this is an opportunity to undermine confidence in the results of the election ahead of time, so that no matter who wins, you have some wiggle room in case your side didn't win. If your side did win, you say, hey, look, the system worked. And if your side didn't win, you're like, this is because the system was compromised. Now, I'm here to remind y'all that for democracy to fail, you don't actually have to prove that the system is broken. All you have to do is convince enough people that the system is broken to reduce their confidence in the system, and then democracy dies. That's how that works.

Reporter Barbara Ortutay has a piece in AP News that's titled "Report says crowdsourced fact checks on X failed to address flood of US election misinformation." Now, I think that comes as a surprise to absolutely no one. Before Elon Musk carried a sink into the HQ of what was then Twitter, I maintained that was foreshadowing that he would eventually sink the company, which hasn't happened yet. But goodness gracious, I mean, I think the band is playing while the Titanic is sinking. Anyway, the site had content moderators, and they did stuff like act on tweets that were spreading misinformation; they could label tweets as saying this is inaccurate or this is misinformation, whatever it may be. Those days, however, are pretty much gone. X relies very heavily on the user community to flag posts that contain misinformation. You know, these are folks who are not necessarily trained to do that, or it might include people who are actually interested in spreading misinformation to start with. Now, to be clear, this community-based tool existed before Elon Musk took over Twitter. This is not like Twitter did this instead of content moderation. Originally this tool was called Birdwatch.
But while it used to be a supplemental way for Twitter to monitor and moderate content, it's now taking a more prominent role, because content moderation is not really a big thing over at X these days. There aren't that many people left to actually do it. The Center for Countering Digital Hate took a look at the Community Notes tool on X and found that seventy-four percent of the tweets, the posts, that they sampled had community notes that were inaccurate or inadequate. The sample size was two hundred and eighty-three posts that had community notes attached to them, which means that around two hundred and nine of those posts had notes that fell short. Now, X maintains that the Community Notes feature is a useful tool and has pointed to other academic research saying that it's a trustworthy source of information. So this is not to say that definitively the tool is broken or not performing as intended, but that this one nonprofit group says that's the case. For more information and a more thorough look at all this, I recommend reading an article by Ashley Belanger on Ars Technica. It's titled "Toxic X users sabotage Community Notes that could derail disinfo, report says."

Okay, while we're on the subject of Elon Musk, which is tangential to X, I suppose: NPR's Bobby Allyn has a piece titled "An Elon Musk-backed political group is posting fake Kamala Harris ads on Facebook." So yeah, we've got more misinformation and election news headed your way. My apologies, but we knew this was going to happen. I mean, I've been talking about it for months. In this case, Elon Musk has provided significant funds to a group called Building America's Future.
This group, in turn, has operated a Facebook account called Progress twenty twenty eight, and this account has published campaign ads that look like they're from Kamala Harris's campaign but are in fact purposefully misleading, both by tricking the viewer into thinking that the ads are actually from Kamala Harris's campaign and by misrepresenting Harris's position on various topics. Now, under the First Amendment, there's nothing illegal going on here. There's nothing that says you have to be truthful when it comes to political speech in advertising. Sure, you've got to be truthful when it comes to advertising products and services. For example, if I were to endorse something but it turned out I never actually used whatever it was I was endorsing, I could potentially face a pretty massive fine for that. Like, I could get hit hard for doing that; my company could get hit hard for doing that. But when it comes to informing people about where politicians stand on specific topics, truth just isn't a requirement here in the US. Using the name Progress twenty twenty eight for the group further obfuscates who's actually behind the campaign, so it would be easy for someone to believe this is a real ad from the Harris side. And it's interesting, because there are limitations for this kind of thing. I mean, the use of deepfakes, you know, where you're making it seem like someone is saying or doing something that they actually did not do, that could be a violation of policies. But an ad that ultimately does a similar thing without the use of deepfakes? That's fine. Which just makes me curious where the line is.

Okay, we've got tons more information about technology, including within the elections, but also other stuff. But first, let's take a quick break.

Okay, we're back. Meta held its third-quarter earnings call this week, and CNBC's Jonathan Vanian has a piece all about the Reality Labs part of that call, which was just one part.
There were more things to talk about than just Reality Labs. But you might remember that Reality Labs focuses on all things mixed reality, so that includes virtual reality and augmented reality as well as the metaverse, you know, this supposed future of our computational approach. And this has really served as a great way for Meta to spend tons of money while having to assure investors that the metaverse is very much going to happen and it will be the future of computing, something that stakeholders remain skeptical about, I think for good reason. That's my personal opinion. I remain unconvinced that the metaverse is going to be a significant thing. I really doubt it's going to be the new means of interacting with computing. I just don't think that's true, but I could be wrong.

So anyway, the call revealed that Reality Labs had a twenty-nine percent increase in revenue, reaching two hundred and seventy million dollars this past quarter. That's no small shakes; two hundred seventy million dollars is a lot of money. Keep in mind, most of the money Reality Labs makes is through products like Meta's VR headsets or the smart glasses that Meta has co-branded with Ray-Ban. Those are the source of revenue, for the most part, for the Reality Labs division. And two hundred and seventy million dollars is a lot of money, but it's not as much as analysts had been hoping for. They had estimated that Meta would hit around three hundred and ten million dollars in revenue from the Reality Labs unit. Further, the division as a whole chalked up an overall operating loss. So yeah, it made two hundred and seventy million dollars, but it spent way, way more than that. How much more? Well, the operating loss sits at four point four billion, with a B, dollars. So yeah, you make two hundred and seventy million, but you spend more than four point four billion. Yikes.
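Just to make the arithmetic explicit: an operating loss is expenses minus revenue, so a two hundred and seventy million dollar quarter alongside a four point four billion dollar loss implies roughly four point seven billion dollars in spending. A quick sanity check using the figures as reported:

```python
# Quick sanity check on the Reality Labs figures as reported:
# operating loss = expenses - revenue, so expenses = revenue + loss.
revenue = 270e6          # ~$270 million in quarterly revenue
operating_loss = 4.4e9   # ~$4.4 billion operating loss

expenses = revenue + operating_loss
print(f"Implied spending: ${expenses / 1e9:.2f} billion")  # ~$4.67 billion

# Analysts had hoped for roughly $310 million in revenue:
shortfall = 310e6 - revenue
print(f"Revenue miss: ${shortfall / 1e6:.0f} million")     # ~$40 million
```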
Zuckerberg continues to insist that VR, AR, and the metaverse are the future, and he might be correct. Right? Just because it doesn't appeal to me and doesn't really seem to fit with my concept of what computing is all about, that doesn't mean anything at all. I'm just one person. But I've yet to see a demonstration of this technology that convinces me that the pros actually outweigh the cons. I mean, feeling immersed is amazing, don't get me wrong. Like, if you can experience a really immersive presentation of some sort, that can be really fun and entertaining. But wearing a headset for more than, you know, like half an hour is a pretty hard sell for me, as well as for a lot of other people. And for people who, like me, are prone to motion sickness, if the VR is not done really, really well, it's a super hard sell. It doesn't matter how good the experience is, right? Like, sometimes our tolerance is pretty low. And I cannot imagine using this kind of technology to do work. Maybe leisure activities, but work is hard to imagine in this particular context. Now, it could very well be that I'm just suffering under the problems of a very limited imagination and this really is the future. I just remain very skeptical.

Jason Koebler of 404 Media has a piece that's got a great headline: "Zuckerberg: The AI Slop Will Continue Until Morale Improves." That's a great play on a popular joke that's typically about floggings. Anyway, as you might suspect, Koebler is not a huge fan of AI-generated content flooding various social platforms, but he quotes Zuckerberg, who said during that earnings call, quote, "I think we're going to add a whole new category of content, which is AI generated or AI summarized content, or kind of existing content pulled together by AI in some way. And I think that's going to be just very exciting for Facebook and Instagram and maybe Threads, or other kind of feed experiences over time," end quote.
So that's all a direct quote, including, like, the verbal stumbling. Now, in full disclosure, I actually don't mind using AI to summarize something that I have already read in order to, like, refresh my memory of the most important points, you know, kind of like using it as a note-taking app. I don't mind doing that, but I do think it's still vital to actually read the content first and not just rely on an AI-generated summary, because sometimes those are not accurate or they misidentify what the most important points are. I have less than zero interest in having AI-generated content populate my social media feeds. As it stands, I hate that a recommendation algorithm decides which posts I see and when I see them, because I often miss out on important, timely stuff because Meta thinks that I should see that Facebook post later on, sometimes weeks after it's been posted. Like, I posted a picture of me having voted. I believe that was on October seventeenth or so, not long after my area of Georgia had started doing early voting, and I was still getting people seeing that picture for the first time and liking it this week, more than a week after I had posted it. So that tells me that the recommendation algorithm is not serving this up in a timely manner. So I don't want the stuff that my friends post to be buried even further under AI-generated crap. You know, I'm tired of not being able to see the things that my friends post. I mean, there are times where someone appears to disappear from my life entirely. But it's not that they're gone and they're not posting or anything. It's that Meta has just decided that I don't need to see their updates unless I actually go to their page, and even then I'm not necessarily going to see it in chronological order. And to think that even this limited engagement could be limited even more with AI-generated junk is really upsetting.
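For what it's worth, what I'm describing is roughly the difference between a chronological feed and an engagement-ranked one. Here's a minimal sketch of why an old but popular post can keep resurfacing weeks later; the scoring formula and weights are hypothetical, not Meta's actual algorithm:

```python
# Minimal sketch: chronological vs engagement-ranked feeds.
# The scoring formula is hypothetical -- not Meta's actual algorithm.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    age_days: float
    engagement: int  # likes and comments so far

posts = [
    Post("close friend", age_days=0.2, engagement=3),
    Post("acquaintance", age_days=14.0, engagement=450),  # e.g., a voting photo
]

# A chronological feed: newest first, no surprises.
chronological = sorted(posts, key=lambda p: p.age_days)

def score(p: Post) -> float:
    # Engagement-ranked: popularity can outweigh recency, which is
    # how a two-week-old post can land at the top of today's feed.
    return p.engagement / (1.0 + p.age_days)

ranked = sorted(posts, key=score, reverse=True)
print([p.author for p in chronological])  # ['close friend', 'acquaintance']
print([p.author for p in ranked])         # ['acquaintance', 'close friend']
```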
So is Meta just trying to sever relationships between real-life friends? No, I don't think that's true. I don't think that's what Meta's goal is. What the company is actually trying to do is what it's been doing for years: trying to keep you on its various platforms for as long as possible in order to serve you more ads and make more money. And if your friends just aren't enough to keep you glued to Facebook or, you know, whatever, well, by golly, we'll just have the bots slave away in the content mines to create stuff that does keep you there. Which is pretty gross.

Oh, and while Meta is looking at a more than four billion dollar operating loss in the Reality Labs division, it's also looking to spend even more money, largely on the AI side of things. According to CNBC, Meta plans to raise the low end of its capital expenditures guidance for twenty twenty four up to thirty-eight billion dollars, so that's up a full billion dollars from what it was before. Earlier, the low end for expenditures for this year was going to be thirty-seven billion. The top end is forty billion, which remains unchanged from earlier in the year. So really the margin has just decreased, right? It's going to be somewhere between thirty-eight and forty billion dollars, but it's definitely not thirty-seven. That's too little. Zuckerberg says next year will be even more dramatic, which is kind of in line with what he's been saying for the last couple of years. And as I have said repeatedly on TechStuff, AI is expensive, and running AI operations in the cloud costs a whole, whole lot of money. You need a lot of data centers, you need specialized processors. This racks up the cost of operation very quickly.
It's one of the reasons that I actually favor having AI processing capabilities built into the chips in consumer devices and moving at least some, or hopefully all, of that AI processing to the end device. Because, one, that speeds things up considerably, since the chip is just handling your calculations, not everybody's. Two, it vastly improves security and privacy, because you're not constantly beaming your information to some server farm somewhere. As you engage with an AI application, your data is not being held in some server farm, and it's not being used to train the next generation of large language models. It's just you interacting with your device. So I much prefer that approach to AI interactions over the cloud-based ones. But that's really a topic for another time.
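As a small illustration of what on-device inference looks like in practice, here's a minimal sketch using the Hugging Face transformers library with a small open model as a stand-in; after the one-time model download, the prompt never leaves your machine:

```python
# Minimal sketch of local, on-device text generation.
# Uses Hugging Face's transformers with a small open model
# (distilgpt2) as a stand-in -- illustrative, not any particular
# vendor's on-device stack.
from transformers import pipeline

# Model weights are fetched once and cached locally; the
# generation below then runs entirely on this machine.
generator = pipeline("text-generation", model="distilgpt2")

result = generator(
    "On-device AI keeps your prompts private because",
    max_new_tokens=30,
)
print(result[0]["generated_text"])
# Unlike a cloud API call, nothing here is sent off to a server
# farm or retained to train someone's next model.
```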
Now, like Reality Labs, the AI costs are something Zuckerberg doesn't really shy away from. He just wades right into it. He says it's a necessary cost in order to establish the infrastructure for future platforms, and again, he might be right. Like, maybe this is the future and he's just getting the work done early, and it's very painful because it's very expensive and it's not paying off right away, but in the long run it sets the groundwork that allows Meta to prosper. But in the meantime, the market has been a little critical. Shortly after the earnings call, in after-market trading, the stock price for Meta took a bit of a dip. So it seems to me that investors are getting a little anxious and tired of these reports about the company spending so much money on Reality Labs and AI when the payoff for those investments is pretty far down the road. Personally, I think in general that preparing for the long term is a better approach than just focusing on the short term.

So, you know, I feel conflicted about this, because I do think that if you're going to really invest in a company's future, being willing to do that and know that you're not going to see returns right away takes a lot of bold decision making, and ultimately it can be the best thing for the company. On the flip side, I remain unconvinced that the metaverse and AI are really going to give Meta the ability to stand apart from other competitors in the space, or that the company has established that there's a future in that particular type of computing in the first place. So, double-edged-sword kind of thing. But I would actually like to see more companies take a firm stance of, yes, this is a painful decision in the sense that we're investing money in something that's not going to pay off immediately, but down the road it's going to be the best thing for our company. I am really tired of seeing companies that almost sacrifice everything for the purpose of getting short-term gains, because ultimately I think that ends up corrupting companies from the inside, and they ultimately crumble under their own weight once they reach a point where you just can't do that anymore.

Okay, enough about that. But one other thing that Zuckerberg brought up during that earnings call was that Threads, the Meta-operated alternative to Twitter, is up to nearly two hundred and seventy-five million monthly visitors. That is a one hundred and seventy-five percent increase from last year, and he also said the app is seeing more than one million new users sign up every single day. Now that, along with a general migration away from X, formerly known as Twitter, means the gap between the two services is beginning to narrow. Analysts from Sensor Tower estimate that X has somewhere in the neighborhood of three hundred and eighteen million users. I say estimate because X is a private company, and as a private company, they don't have to share that kind of information.
Threads is somewhere around two hundred and seventy-five million monthly users. So the growth of Threads, along with folks who are ditching X, has changed the landscape quite a bit. Meta's chief financial officer, Susan Li, has said that the company doesn't expect Threads to be a significant revenue driver in twenty twenty five. Rather, the focus will be on growing the platform further before really exploiting the heck out of it. Now, if I were a betting man, I would say that Meta will eventually sprinkle some of its targeted advertising fairy dust on Threads, it will probably happen in the not too distant future, and it will increase Meta's already significant hold on the online advertising space. Meanwhile, I expect X's decline to continue, simply because Elon Musk cannot help himself and consistently makes decisions that drive many users of X away to other platforms, including Threads and Bluesky and Mastodon. I think the one advantage X has is that no other singular microblogging platform has the dominance that Twitter once had. So either users are adopting multiple platforms and kind of spreading their time across them, or they're having to choose one of several different offerings and just be satisfied with the fact that they're not going to see everything they used to see, because other people have gone to competing services. So that's kind of my view of where things are right now.

Okay, we're going to take another quick break, but I still have plenty of tech news. This week was chock-full, y'all, so let's take a quick break. We'll be back with some more headlines.

Kylie Robison of The Verge has a piece titled "OpenAI's search engine is now live in ChatGPT," which I'm sure is cause for concern for some folks over at Google. Now, to be clear, this feature is currently only available to paid subscribers of ChatGPT. It will later roll out to other users, like, you know, enterprise users or people who are working in educational institutions.
As Robison explains, the feature is integrated directly into the ChatGPT experience, so it's not like a separate tool or tab or something. Users, when they're putting in a query and conversing with the ChatGPT bot, will sometimes receive suggestions for various websites that relate to whatever the topic of discussion is, so they'll get suggestions for resources that they can use to further answer their questions. Robison points out that ChatGPT's search results do not contain promoted results, so that might be a big advantage over Google for folks who had previously used it. You can also, by the way, directly ask ChatGPT to give you web results, so you can ask it for those websites from the beginning and just treat it like it's a search engine. And by not including promoted results, that's a shot across the bow at Google, because that's Google's bread and butter. I'm guessing anyone who has used Google for, like, the last decade has noticed that sponsored results appear at the top of search results lists now. In the very old days, the general expectation was that the most relevant, highest-quality search results would appear at the very top of Google's results page. But for the last several years, the very top of that list has frequently been dominated by paid ads that lock in those spots. And even though Google does tag those search results as being promoted, some folks still bristle at how this feels like preferential treatment on what is often thought of as a scholarly resource. It'd be as if you went to the library and you wanted to look up books about a specific topic, and you found out that all the books in the library that relate to that topic have been ordered not by author name or whatever, but by who has paid the library to have their book promoted the most. That's not cool.
OpenAI claims it does not intend to sell advertising the way Google does, but that does raise questions as to how OpenAI will generate revenue from free users who use this tool, because presumably free users will be able to interact with this in the not too distant future. Another question that remains is how OpenAI will navigate the choppy waters when it comes to referencing or pulling from various resources online. Most media companies would very much prefer that you go to their website and read stuff on their own web pages. That's one of the many reasons I actually recommend folks check out the articles that I reference in these episodes.

Benj Edwards of Ars Technica has an article titled "Google CEO says over 25% of new Google code is generated by AI." It's really incredible that more than one quarter of the code generated within Google is coming from AI tools. Now, according to Google executives, all these operations are still ultimately overseen by actual human beings. AI might generate some of that code, but humans review and then approve that code before it can be deployed. The purpose of this structure is to streamline coding and speed things up, presumably while also cutting down on the need for quite so many people coding for Google in the first place. And as Edwards explains, not everyone is super happy about AI taking such a prominent role in coding. There are critics who worry that relying so heavily on AI means we could end up with apps and code that have more errors in them, more vulnerabilities introduced into the code that are difficult to detect, meaning that anyone deploying that code could be creating opportunities for hackers. And maybe the hackers are using their own AI tools to identify and then exploit those vulnerabilities. In fact, I think that's just a guarantee. I think AI coding and AI hacking, that's the reality we face now. It's going to be a seesaw-like relationship moving forward.
I recommend reading Benj Edwards' article in full for a deeper look into the story, including how some studies have shown that folks who have used AI to help code will end up believing that their code is superior to other people's code because they used AI tools to make it, while simultaneously introducing more bugs into their own work. So it's kind of like creating this false sense of superiority.

Jonathan M. Gitlin, also of Ars Technica, has another Google piece. This one is titled "Generative AI is coming to Google Maps, Google Earth, Waze." And yeah, it's all about how Google is incorporating more AI tools into its various map apps to give users more features. Some of this was news to me, in fact quite a bit of it was news to me, even though apparently some of these features have been around for a while. So, for example, Gitlin quotes Google's VP and general manager of Geo, that's Chris Phillips, who said, quote, "Think of features like Lens in Maps. When you're on a street corner, you can lift up your phone and look, and through your camera view you can actually see we've laid places on top of your view, so you can see a business: Is it open? What are the ratings for it? Is it busy? You can even see businesses that are out of your line of sight," end quote. I didn't know about that, although I have talked about that very use case in augmented reality in the past. But I think it's pretty cool, particularly if you're out and about somewhere that's new to you and you want to find a place that's, like, open and not totally slammed. I actually could have used that. Last weekend, me and my party were in Pennsylvania for the Pennsylvania Renaissance Faire, which now is over, but if you ever get a chance, it's a phenomenal renaissance faire. I say that as someone who used to work at a renaissance faire. It's amazing.
563 00:34:16,719 --> 00:34:19,560 Speaker 1: But after the fair was done on Sunday, we wanted 564 00:34:19,600 --> 00:34:21,680 Speaker 1: to go and eat at a restaurant, and we had 565 00:34:21,719 --> 00:34:23,839 Speaker 1: identified which one we wanted to go to, but by 566 00:34:23,880 --> 00:34:25,600 Speaker 1: the time we got there, we found out the kitchen 567 00:34:25,600 --> 00:34:28,880 Speaker 1: had already closed, and so we were forced to change 568 00:34:28,880 --> 00:34:31,560 Speaker 1: our plans and find somewhere else to grab dinner. Having 569 00:34:31,640 --> 00:34:34,440 Speaker 1: this tool would have told us, like, oh, by the 570 00:34:34,440 --> 00:34:36,239 Speaker 1: time you get there, it's going to be too late, 571 00:34:36,320 --> 00:34:39,440 Speaker 1: so don't even bother, and that would have saved us 572 00:34:39,440 --> 00:34:42,960 Speaker 1: some trouble. Anyway, Google is bringing AI-enabled search results 573 00:34:43,000 --> 00:34:45,480 Speaker 1: to Maps, so if you just have a general hankering 574 00:34:45,520 --> 00:34:47,680 Speaker 1: for something but you don't know any more details, 575 00:34:47,920 --> 00:34:50,000 Speaker 1: you can ask Maps and have a little conversation with 576 00:34:50,040 --> 00:34:53,000 Speaker 1: an AI bot to whittle down your options. Check out 577 00:34:53,000 --> 00:34:56,320 Speaker 1: Gitlin's article to learn more about the upcoming features headed 578 00:34:56,400 --> 00:34:59,960 Speaker 1: to Google Maps and these other tools. Joshua Tyler of 579 00:35:00,080 --> 00:35:04,400 Speaker 1: Giant Freakin Robot has an article titled I Attended Google's 580 00:35:04,440 --> 00:35:09,480 Speaker 1: Creator Conversation Event and It Turned Into a Funeral. Yikes. 581 00:35:09,600 --> 00:35:11,800 Speaker 1: All right, so this is about an event that Google 582 00:35:11,880 --> 00:35:14,960 Speaker 1: held this week, and Tyler's description of it is pretty 583 00:35:15,080 --> 00:35:18,400 Speaker 1: darn grim. He talks about receiving a tour of the 584 00:35:18,440 --> 00:35:21,240 Speaker 1: Google campus, and it was a tour of a campus 585 00:35:21,239 --> 00:35:24,760 Speaker 1: that was almost completely empty, even though the event was happening 586 00:35:24,800 --> 00:35:27,480 Speaker 1: in the middle of a workday. He describes getting 587 00:35:27,520 --> 00:35:31,120 Speaker 1: not one but two security badges before being allowed into 588 00:35:31,160 --> 00:35:33,919 Speaker 1: a secure building, only to have no one actually look 589 00:35:33,960 --> 00:35:37,520 Speaker 1: at or check his credentials once that time came. According 590 00:35:37,520 --> 00:35:40,880 Speaker 1: to Tyler, the attendees of this event consisted of folks 591 00:35:40,920 --> 00:35:44,640 Speaker 1: who worked on independent websites that have been shadow banned 592 00:35:45,280 --> 00:35:49,160 Speaker 1: due to Google's changes in its approach to generating search results. Now, 593 00:35:49,200 --> 00:35:51,800 Speaker 1: in case you're not familiar with the term shadow ban, 594 00:35:52,080 --> 00:35:56,000 Speaker 1: it means that you're blocking someone or something from your 595 00:35:56,040 --> 00:35:59,920 Speaker 1: online service, like a forum or, in this case, Google 596 00:36:00,200 --> 00:36:02,919 Speaker 1: search results, and they don't have any knowledge that they've 597 00:36:02,960 --> 00:36:05,759 Speaker 1: been blocked. So the entity is still allowed to make 598 00:36:05,920 --> 00:36:09,040 Speaker 1: posts on the platform. Like, let's say that it's a 599 00:36:09,080 --> 00:36:13,640 Speaker 1: social media platform: you could still post to it. The trick, though, 600 00:36:13,680 --> 00:36:16,160 Speaker 1: is that no one is seeing your posts. Yeah, your 601 00:36:16,160 --> 00:36:19,640 Speaker 1: posts are being published, but no one actually gets to 602 00:36:19,680 --> 00:36:23,200 Speaker 1: see them. That way, the entity that's making these posts 603 00:36:23,680 --> 00:36:27,120 Speaker 1: remains unaware that their work is purposefully being withheld from 604 00:36:27,160 --> 00:36:29,920 Speaker 1: the intended audience, and then they're just left to wonder 605 00:36:30,000 --> 00:36:33,759 Speaker 1: why the heck their traffic stats are in serious decline.
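To make that mechanism concrete, here is a minimal sketch in Python of how a shadow ban can work on a hypothetical platform. Every name here is illustrative, not how Google or any real service actually implements this; the key point is that posting succeeds normally, and only the visibility check quietly changes.

```python
# Minimal sketch of a shadow ban on a hypothetical platform.
# Posting always succeeds; the ban only affects what OTHER viewers see,
# so the banned author gets no signal that anything is wrong.

shadow_banned = {"example_site"}    # accounts quietly flagged by moderation
posts: list[tuple[str, str]] = []   # (author, text) pairs, stored normally

def submit_post(author: str, text: str) -> None:
    # No error, no warning: the banned author's post is accepted like any other.
    posts.append((author, text))

def visible_feed(viewer: str) -> list[str]:
    # Shadow-banned authors are silently filtered out of everyone else's feed,
    # but they still see their own posts, so their view looks completely normal.
    return [text for author, text in posts
            if author not in shadow_banned or author == viewer]

submit_post("example_site", "Our new article is up!")
print(visible_feed("example_site"))  # ['Our new article is up!'] -- looks fine
print(visible_feed("some_reader"))   # [] -- published, but no one ever sees it
```

The same idea maps onto search: the site still gets crawled and indexed, but the ranking step quietly stops surfacing it, and all the owner sees is a traffic graph falling off a cliff.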
606 00:36:33,920 --> 00:36:38,000 Speaker 1: Tyler and other site owners and operators began to answer 607 00:36:38,160 --> 00:36:42,120 Speaker 1: questions from Googlers. So this conversation was really Googlers asking 608 00:36:42,760 --> 00:36:46,160 Speaker 1: the site owners questions, and the owners tried to bring 609 00:36:46,280 --> 00:36:49,600 Speaker 1: up their concerns about how changes to Google's approach had 610 00:36:49,600 --> 00:36:54,440 Speaker 1: disproportionately affected their traffic stats. I recommend reading Tyler's full article. 611 00:36:54,560 --> 00:36:57,719 Speaker 1: It is upsetting, it is sad, and as someone who 612 00:36:57,760 --> 00:37:00,400 Speaker 1: worked for a company that was really heavily dependent on 613 00:37:00,880 --> 00:37:05,240 Speaker 1: ranking well in search results, I feel his frustration quite keenly. 614 00:37:05,800 --> 00:37:08,520 Speaker 1: Over in Russia, the government there is trying to force 615 00:37:08,600 --> 00:37:13,800 Speaker 1: Google to reinstate certain YouTube channels, particularly state-backed channels 616 00:37:13,920 --> 00:37:18,239 Speaker 1: that are frequently associated with distributing misinformation, particularly about the 617 00:37:18,280 --> 00:37:22,440 Speaker 1: war in Ukraine. YouTube has banned those channels for violations 618 00:37:22,480 --> 00:37:25,520 Speaker 1: of the company's various policies, and Russia says that's just 619 00:37:25,600 --> 00:37:30,399 Speaker 1: not cool. So Russia levied a fine against Google, which 620 00:37:30,440 --> 00:37:36,000 Speaker 1: is currently standing at two undecillion rubles. Now I'm pulling 621 00:37:36,000 --> 00:37:39,600 Speaker 1: this from Jon Brodkin's article in Ars Technica. It's titled 622 00:37:39,800 --> 00:37:44,319 Speaker 1: Russia fines Google an impossible amount in attempt to end 623 00:37:44,480 --> 00:37:47,960 Speaker 1: YouTube bans. So if you're wondering how much is two 624 00:37:48,400 --> 00:37:52,400 Speaker 1: undecillion rubles if you convert it to dollars, well, Brodkin 625 00:37:52,480 --> 00:37:56,160 Speaker 1: has kindly done that conversion for us, so it's roughly 626 00:37:56,480 --> 00:38:02,759 Speaker 1: the number twenty followed by thirty-three zeros in dollars. Like, 627 00:38:02,800 --> 00:38:08,080 Speaker 1: it's more money than the world's gross domestic product. And 628 00:38:08,239 --> 00:38:10,319 Speaker 1: to say that Google will not pay this is of 629 00:38:10,320 --> 00:38:14,400 Speaker 1: course obvious, because nobody could pay it. No one, not 630 00:38:14,520 --> 00:38:17,840 Speaker 1: the entire world, has that much money.
So it's not 631 00:38:18,000 --> 00:38:21,239 Speaker 1: really a real fine. It's a message to Google saying, hey, 632 00:38:21,320 --> 00:38:24,120 Speaker 1: give us our YouTube channels back. We mean it. Or rather, 633 00:38:24,160 --> 00:38:26,920 Speaker 1: it's a fine that really had a stipulation attached to it. 634 00:38:27,000 --> 00:38:30,160 Speaker 1: So originally this fine was much less than that. But 635 00:38:30,239 --> 00:38:33,040 Speaker 1: if Google refused to reverse its decision to ban the 636 00:38:33,120 --> 00:38:36,080 Speaker 1: channels and refused to pay the fine, that fine would double 637 00:38:36,239 --> 00:38:38,440 Speaker 1: after a given amount of time. So have you ever 638 00:38:38,480 --> 00:38:41,160 Speaker 1: played the game where you take a penny and 639 00:38:41,200 --> 00:38:44,200 Speaker 1: then you double it to see how many rounds of doubling 640 00:38:44,239 --> 00:38:47,319 Speaker 1: it would take before you're dealing with a good deal 641 00:38:47,320 --> 00:38:49,799 Speaker 1: of money? Like, if you took a penny and you 642 00:38:49,880 --> 00:38:52,400 Speaker 1: doubled it for eight rounds, you would end up with 643 00:38:52,440 --> 00:38:55,000 Speaker 1: two dollars and fifty-six cents in pennies. If you did 644 00:38:55,000 --> 00:38:57,760 Speaker 1: it for sixteen rounds, you've got six hundred and fifty 645 00:38:57,760 --> 00:39:00,799 Speaker 1: five dollars and thirty-six cents, also in pennies. If 646 00:39:00,800 --> 00:39:03,560 Speaker 1: you did it for thirty-two rounds, you would have 647 00:39:03,760 --> 00:39:09,880 Speaker 1: nearly forty-three million dollars in pennies. Doubling gets really big, 648 00:39:10,520 --> 00:39:15,560 Speaker 1: really fast, so no wonder that fine is now ginormously huge.
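If you want to check that doubling math yourself, here's a quick back-of-the-envelope Python sketch of the penny game and of why the fine's dollar figure has so many zeros. The 100-rubles-per-dollar exchange rate is just a rough assumption for scale, not a live market rate.

```python
# Back-of-the-envelope check on how fast doubling grows, and why the
# Russian fine reached an unpayable figure.

for rounds in (8, 16, 32):
    cents = 2 ** rounds  # start with a single penny and double it `rounds` times
    print(f"{rounds} rounds of doubling: ${cents / 100:,.2f}")
# 8 rounds of doubling: $2.56
# 16 rounds of doubling: $655.36
# 32 rounds of doubling: $42,949,672.96

fine_rubles = 2 * 10 ** 36   # two undecillion rubles
rubles_per_dollar = 100      # assumed rough rate, for illustration only
fine_dollars = fine_rubles // rubles_per_dollar
print(f"Fine in dollars: {fine_dollars:.0e}")  # 2e+34, i.e. 20 followed by 33 zeros
```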
649 00:39:16,080 --> 00:39:18,600 Speaker 1: Google's response, in large part over the course of 650 00:39:18,640 --> 00:39:21,640 Speaker 1: this dispute, has essentially been to pull up stakes out 651 00:39:21,640 --> 00:39:24,840 Speaker 1: of Russia and to shut down most operations there, including 652 00:39:24,920 --> 00:39:28,840 Speaker 1: relocating people who work for Google out of Russia 653 00:39:28,920 --> 00:39:32,120 Speaker 1: to work in other offices. So I suspect we're not 654 00:39:32,160 --> 00:39:35,720 Speaker 1: going to see much movement on Google's behalf here. Okay, 655 00:39:35,719 --> 00:39:38,720 Speaker 1: I've got a few recommendations for reading for all of y'all, 656 00:39:38,840 --> 00:39:40,960 Speaker 1: in addition to the articles that I've already mentioned in 657 00:39:41,000 --> 00:39:45,200 Speaker 1: this episode. First up is Eric Berger's piece in Ars Technica. 658 00:39:45,280 --> 00:39:49,399 Speaker 1: It is titled What Is Happening with Boeing's Starliner Spacecraft? 659 00:39:49,680 --> 00:39:52,440 Speaker 1: As we have covered on tech Stuff, the Starliner 660 00:39:52,520 --> 00:39:55,320 Speaker 1: has had a really bumpy go of it. A Starliner 661 00:39:55,400 --> 00:39:59,759 Speaker 1: spacecraft successfully delivered two astronauts to the International Space Station, 662 00:40:00,200 --> 00:40:03,480 Speaker 1: but on the way there, the spacecraft experienced problems with 663 00:40:03,520 --> 00:40:07,360 Speaker 1: its thruster systems, and ultimately NASA decided to return the 664 00:40:07,400 --> 00:40:11,160 Speaker 1: Starliner to Earth without the astronauts aboard it. They 665 00:40:11,200 --> 00:40:14,799 Speaker 1: will have to come back home aboard a SpaceX spacecraft. 666 00:40:15,040 --> 00:40:18,840 Speaker 1: At a debriefing meant to address the Starliner after 667 00:40:19,360 --> 00:40:22,719 Speaker 1: it had returned to Earth, the two reps expected from 668 00:40:22,800 --> 00:40:27,440 Speaker 1: Boeing didn't show up. The company has remained silent about 669 00:40:27,480 --> 00:40:30,319 Speaker 1: the Starliner in the weeks following the landing. Two 670 00:40:30,360 --> 00:40:32,879 Speaker 1: months have gone by and we haven't really received any 671 00:40:32,960 --> 00:40:36,759 Speaker 1: updates on it. Berger explains, you know, here's what 672 00:40:36,800 --> 00:40:40,440 Speaker 1: we know and here's what folks suspect about the spacecraft moving 673 00:40:40,480 --> 00:40:45,080 Speaker 1: forward, or maybe not moving forward, and so that one's 674 00:40:45,120 --> 00:40:47,880 Speaker 1: well worth a read. The other piece that I recommend 675 00:40:47,960 --> 00:40:52,120 Speaker 1: reading is Ariana Bindman's piece in SFGATE. It is 676 00:40:52,200 --> 00:40:56,800 Speaker 1: titled A Lot of Demoralized People: Ghost Jobs Are Wreaking 677 00:40:56,880 --> 00:41:00,880 Speaker 1: Havoc on Tech Workers. So this is about how companies 678 00:41:00,880 --> 00:41:04,440 Speaker 1: in general, but particularly in the tech space, will often 679 00:41:04,480 --> 00:41:08,560 Speaker 1: post jobs to various career sites, but those jobs aren't 680 00:41:08,640 --> 00:41:11,000 Speaker 1: really available. Now, it may be the case that the 681 00:41:11,000 --> 00:41:14,040 Speaker 1: company has already filled the positions in question. It may 682 00:41:14,080 --> 00:41:16,440 Speaker 1: be the case that they were never looking to hire 683 00:41:16,480 --> 00:41:21,280 Speaker 1: in the first place. And the article explains why companies 684 00:41:21,360 --> 00:41:24,360 Speaker 1: do this, and, here's a spoiler alert, it's for awful, 685 00:41:24,640 --> 00:41:29,480 Speaker 1: cruel, and selfish reasons. I highly recommend reading the article 686 00:41:29,600 --> 00:41:33,000 Speaker 1: if you have need of being really angry at corporations today. 687 00:41:33,239 --> 00:41:35,799 Speaker 1: This is a great way to get very mad at corporations. 688 00:41:35,840 --> 00:41:38,560 Speaker 1: It's also just a very good read in general. It's 689 00:41:38,560 --> 00:41:41,080 Speaker 1: also very important for anybody who might be in the 690 00:41:41,200 --> 00:41:44,680 Speaker 1: job market. If you're someone who has experienced the very 691 00:41:44,760 --> 00:41:49,400 Speaker 1: rough setback of having sent out countless resumes only to 692 00:41:49,480 --> 00:41:53,200 Speaker 1: hear nothing, this practice of posting fake jobs to sites might be 693 00:41:53,239 --> 00:41:58,400 Speaker 1: playing a part in that. Understanding 694 00:41:58,480 --> 00:42:02,919 Speaker 1: why that happens can also help explain your lack 695 00:42:02,960 --> 00:42:05,600 Speaker 1: of traction if you've been job hunting and nothing's come back, 696 00:42:05,640 --> 00:42:09,879 Speaker 1: because that can be really demoralizing. It can really add 697 00:42:09,920 --> 00:42:13,799 Speaker 1: to stress and depression if you're constantly putting yourself out 698 00:42:13,840 --> 00:42:16,759 Speaker 1: there but you're not hearing anything back.
Well, part of 699 00:42:16,800 --> 00:42:19,120 Speaker 1: the reason for that might be that the companies you're 700 00:42:19,160 --> 00:42:22,520 Speaker 1: applying to didn't actually have jobs available in the first place, 701 00:42:22,800 --> 00:42:25,640 Speaker 1: and this article explains why, so check that out as well. 702 00:42:26,080 --> 00:42:28,719 Speaker 1: That's it for me. I hope all of you had 703 00:42:28,800 --> 00:42:31,279 Speaker 1: a fantastic week. For those of you who were out 704 00:42:31,360 --> 00:42:33,800 Speaker 1: there trick-or-treating with the kiddos or whatever, I 705 00:42:33,840 --> 00:42:37,200 Speaker 1: hope you had a great time. Enjoy some Mounds and 706 00:42:37,280 --> 00:42:40,120 Speaker 1: Almond Joys on my behalf, and I'll talk to you 707 00:42:40,160 --> 00:42:50,640 Speaker 1: again really soon. Tech Stuff is an iHeartRadio production. For 708 00:42:50,800 --> 00:42:55,640 Speaker 1: more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 709 00:42:55,760 --> 00:43:01,799 Speaker 1: or wherever you listen to your favorite shows.