1 00:00:04,440 --> 00:00:12,559 Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. Hey there, 2 00:00:12,600 --> 00:00:16,440 Speaker 1: Welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm 3 00:00:16,480 --> 00:00:19,720 Speaker 1: an executive producer with iHeartRadio and how the tech are you. 4 00:00:20,400 --> 00:00:24,000 Speaker 1: It's time for the tech news for November two, twenty 5 00:00:24,160 --> 00:00:27,800 Speaker 1: twenty three, rapidly running out of days in twenty twenty 6 00:00:27,800 --> 00:00:31,800 Speaker 1: three. And yesterday, I did an episode about the history 7 00:00:31,880 --> 00:00:35,000 Speaker 1: of ad blockers, but one thing I didn't really cover 8 00:00:35,080 --> 00:00:41,680 Speaker 1: in great detail was YouTube and YouTube's battle against ad blocking. Well, 9 00:00:41,920 --> 00:00:46,600 Speaker 1: the platform has been rolling out an escalating response to 10 00:00:46,960 --> 00:00:51,240 Speaker 1: users who have been employing ad blockers on YouTube while 11 00:00:51,280 --> 00:00:55,560 Speaker 1: trying to watch YouTube videos. Gizmodo's Kevin Hurler has a 12 00:00:55,600 --> 00:00:59,880 Speaker 1: piece on this on Gizmodo titled YouTube's ad blocker crackdown 13 00:01:00,120 --> 00:01:04,280 Speaker 1: is getting harder to dodge, and he explains that earlier 14 00:01:04,319 --> 00:01:08,240 Speaker 1: in October, YouTube began to show a pop up notification 15 00:01:08,400 --> 00:01:12,800 Speaker 1: to ad blocker users, and the notification said that video 16 00:01:12,920 --> 00:01:16,160 Speaker 1: playback would not work until the user disabled ad block 17 00:01:16,800 --> 00:01:22,720 Speaker 1: or otherwise whitelisted YouTube. Now, initially, folks could just close 18 00:01:22,760 --> 00:01:25,400 Speaker 1: out the pop up and continue on their merry way. 19 00:01:25,959 --> 00:01:29,080 Speaker 1: A little bit later, YouTube beefed this up.
They then 20 00:01:29,200 --> 00:01:31,399 Speaker 1: had a little checkbox you had to click on to 21 00:01:32,319 --> 00:01:35,759 Speaker 1: notify YouTube that you were acknowledging you had been alerted 22 00:01:35,800 --> 00:01:39,399 Speaker 1: to this policy, that you should not be using ad blockers, 23 00:01:39,400 --> 00:01:42,440 Speaker 1: that this is against YouTube's terms of service, and then 24 00:01:42,440 --> 00:01:45,720 Speaker 1: you could go on your merry way. Now, YouTube will 25 00:01:45,760 --> 00:01:48,520 Speaker 1: just keep that pop up in place, and it'll block 26 00:01:49,040 --> 00:01:51,120 Speaker 1: all activity. You won't be able to do anything else 27 00:01:51,160 --> 00:01:54,720 Speaker 1: on YouTube until you actually take the steps to either 28 00:01:55,240 --> 00:02:00,520 Speaker 1: disable or pause ad blocking entirely, or to whitelist YouTube. 29 00:02:01,000 --> 00:02:03,600 Speaker 1: Hurler also points out that the Wall Street Journal has 30 00:02:03,680 --> 00:02:07,040 Speaker 1: tracked YouTube's ad revenue over the last three quarters and 31 00:02:07,080 --> 00:02:10,440 Speaker 1: that revenue has been on the decline. So we're not 32 00:02:10,480 --> 00:02:13,200 Speaker 1: saying that YouTube is losing money. It's not that. It's 33 00:02:13,240 --> 00:02:16,600 Speaker 1: just they're not making as much money as they had previously. 34 00:02:16,680 --> 00:02:20,760 Speaker 1: It's going down. That's a trend that most companies are 35 00:02:20,880 --> 00:02:24,960 Speaker 1: not super happy about. YouTube has been pushing for a 36 00:02:25,000 --> 00:02:28,160 Speaker 1: lot more folks to subscribe to YouTube premium. 
That's been 37 00:02:28,360 --> 00:02:30,720 Speaker 1: a campaign that's been going on for several months now, 38 00:02:31,320 --> 00:02:34,080 Speaker 1: and they have also introduced a lot of new types 39 00:02:34,120 --> 00:02:38,360 Speaker 1: of ads, including unskippable ads, longer ones too. Like, it 40 00:02:38,440 --> 00:02:40,760 Speaker 1: used to be that an unskippable ad would last maybe 41 00:02:40,840 --> 00:02:43,280 Speaker 1: five or ten seconds, but now we have full thirty 42 00:02:43,320 --> 00:02:47,200 Speaker 1: second unskippable ads on say YouTube TV, for example. And 43 00:02:47,480 --> 00:02:49,880 Speaker 1: it has also started to remove some of the controls 44 00:02:49,960 --> 00:02:53,680 Speaker 1: that video creators had when it came to where they 45 00:02:53,680 --> 00:02:58,640 Speaker 1: would place ads against their content. I remember talking about 46 00:02:58,680 --> 00:03:03,360 Speaker 1: how some ASMR creators were very upset because at least initially, 47 00:03:03,800 --> 00:03:07,119 Speaker 1: it looked like they wouldn't have the opportunity to specify 48 00:03:07,200 --> 00:03:10,440 Speaker 1: that ads should only be pre roll, for example, and 49 00:03:10,480 --> 00:03:14,679 Speaker 1: not post roll, and that by removing that control, it 50 00:03:14,800 --> 00:03:18,240 Speaker 1: meant that an ASMR ad, or an ASMR video rather, 51 00:03:18,760 --> 00:03:22,200 Speaker 1: might have allowed an intrusive ad to play after the video, 52 00:03:22,320 --> 00:03:26,560 Speaker 1: thus potentially undoing all the work that the ASMR video 53 00:03:26,680 --> 00:03:30,440 Speaker 1: was intended to do. Well, I suspect we're going to 54 00:03:30,480 --> 00:03:33,800 Speaker 1: continue to see shifts in strategy from YouTube because they're 55 00:03:33,880 --> 00:03:35,880 Speaker 1: going to continue to try and find the most lucrative 56 00:03:35,920 --> 00:03:38,800 Speaker 1: way to make as much money as possible.
That's kind 57 00:03:38,840 --> 00:03:43,000 Speaker 1: of what businesses do. And I'm sure we'll also see 58 00:03:43,040 --> 00:03:46,480 Speaker 1: new approaches to ad blocking go on as well. And 59 00:03:47,040 --> 00:03:52,280 Speaker 1: so it goes. David Ingram and Kat Tenbarge have a 60 00:03:52,320 --> 00:03:56,320 Speaker 1: really good piece over at Nbcnews dot Com. It's titled 61 00:03:56,600 --> 00:04:00,720 Speaker 1: Critics Renew Calls for a TikTok Ban, Claiming the App Has 62 00:04:00,720 --> 00:04:06,400 Speaker 1: an Anti-Israel Bias. So the journalists walk a pretty 63 00:04:06,440 --> 00:04:09,840 Speaker 1: tight line in an attempt to be really objective in 64 00:04:09,920 --> 00:04:14,320 Speaker 1: this piece, and they explain that several politicians are saying 65 00:04:14,360 --> 00:04:18,640 Speaker 1: that TikTok is spreading propaganda to young people and favoring 66 00:04:18,760 --> 00:04:24,240 Speaker 1: a pro Palestinian stance over a pro Israeli stance 67 00:04:24,279 --> 00:04:28,080 Speaker 1: with regard to the ongoing war between Israel and Hamas. 68 00:04:28,839 --> 00:04:32,120 Speaker 1: But the reporters actually found that if you look at 69 00:04:32,120 --> 00:04:36,200 Speaker 1: the stats on TikTok in the United States over the 70 00:04:36,279 --> 00:04:40,480 Speaker 1: last month or so, the period of time covering 71 00:04:40,760 --> 00:04:43,919 Speaker 1: the point when Hamas attacked Israel and then Israel's response 72 00:04:43,960 --> 00:04:47,600 Speaker 1: to that attack, not only are posts with a hashtag 73 00:04:47,760 --> 00:04:52,680 Speaker 1: such as stand with Palestine not outpacing ones that have 74 00:04:52,760 --> 00:04:56,960 Speaker 1: the hashtag stand with Israel, but also that those Israel 75 00:04:57,080 --> 00:05:02,320 Speaker 1: posts are actually outperforming the Palestine ones.
So, in other words, 76 00:05:02,800 --> 00:05:07,320 Speaker 1: the stats seem to directly contradict the claims these politicians 77 00:05:07,360 --> 00:05:12,800 Speaker 1: are making. They're saying there's this anti Israel bias across TikTok. 78 00:05:13,600 --> 00:05:16,000 Speaker 1: The reporters are saying, well, based upon what we're seeing 79 00:05:16,080 --> 00:05:18,440 Speaker 1: at least over the last thirty days or so here 80 00:05:18,440 --> 00:05:21,240 Speaker 1: in the United States, there doesn't appear to be any 81 00:05:21,279 --> 00:05:24,960 Speaker 1: indication of bias. In fact, we're seeing a trend toward 82 00:05:25,040 --> 00:05:28,240 Speaker 1: the opposite if you're looking at those stats. The reporters 83 00:05:28,279 --> 00:05:32,000 Speaker 1: say that the claims appear to be based mostly on 84 00:05:32,080 --> 00:05:36,239 Speaker 1: anecdotal evidence, and also by taking sort of a longer 85 00:05:36,320 --> 00:05:39,320 Speaker 1: view across the entire history of TikTok. Like if you 86 00:05:39,360 --> 00:05:43,159 Speaker 1: look at all of TikTok's history and you're looking at 87 00:05:43,200 --> 00:05:47,080 Speaker 1: the trends of, you know, stand with, say, stand with 88 00:05:47,480 --> 00:05:51,240 Speaker 1: Palestine or stand with Israel, then things get a 89 00:05:51,279 --> 00:05:54,400 Speaker 1: little less clear, right, because you're looking across the entire 90 00:05:54,520 --> 00:05:59,640 Speaker 1: history of the platform, but not the current events.
And 91 00:05:59,720 --> 00:06:03,039 Speaker 1: we also know that anecdotal evidence is really useless 92 00:06:03,160 --> 00:06:07,880 Speaker 1: with TikTok, right because TikTok's algorithm looks at what content 93 00:06:08,040 --> 00:06:11,760 Speaker 1: you are engaging with on the platform, and then TikTok 94 00:06:11,839 --> 00:06:16,440 Speaker 1: essentially draws the conclusion of, Oh, this is what this 95 00:06:16,560 --> 00:06:19,720 Speaker 1: person wants to see, so let me just give them 96 00:06:19,720 --> 00:06:22,000 Speaker 1: a whole bunch more of that. It reminds me of 97 00:06:22,040 --> 00:06:24,480 Speaker 1: a routine that Patton Oswalt used to do where he 98 00:06:24,520 --> 00:06:28,400 Speaker 1: talked about TiVo, where he said he used TiVo to 99 00:06:29,000 --> 00:06:31,719 Speaker 1: record a couple of Westerns that he wanted to watch, 100 00:06:32,200 --> 00:06:34,839 Speaker 1: and then TiVo made the conclusion that he just really 101 00:06:34,920 --> 00:06:37,520 Speaker 1: liked content that had horses in it, and so he 102 00:06:37,600 --> 00:06:41,360 Speaker 1: got a whole bunch of horse content recorded on his behalf.
103 00:06:41,920 --> 00:06:44,479 Speaker 1: This is sort of what it's reminding me of, which 104 00:06:44,520 --> 00:06:46,200 Speaker 1: means that, you know, if you were to go on 105 00:06:46,240 --> 00:06:48,960 Speaker 1: TikTok and let's say you actually did see a pro 106 00:06:49,120 --> 00:06:54,800 Speaker 1: Palestinian video and you objected to that video for whatever reason, 107 00:06:55,360 --> 00:06:58,920 Speaker 1: and maybe you even leave a comment giving your different 108 00:06:58,960 --> 00:07:00,720 Speaker 1: point of view in what I am sure would be 109 00:07:00,760 --> 00:07:06,520 Speaker 1: a reasonable and polite way, then TikTok doesn't see that as, Oh, 110 00:07:06,600 --> 00:07:11,040 Speaker 1: this person ends up disagreeing with this content, I should 111 00:07:11,120 --> 00:07:16,040 Speaker 1: show other points of view. Instead, TikTok says, Oh, this 112 00:07:16,360 --> 00:07:20,520 Speaker 1: user has engaged with this content, I should show them 113 00:07:20,560 --> 00:07:23,280 Speaker 1: a whole bunch of the same sort of stuff. So 114 00:07:23,440 --> 00:07:26,280 Speaker 1: to you, the user, it seems like there's just this 115 00:07:26,480 --> 00:07:29,880 Speaker 1: flood of content presenting just one side of an issue, 116 00:07:30,320 --> 00:07:33,720 Speaker 1: and you might not see very many, if any, examples 117 00:07:33,800 --> 00:07:36,520 Speaker 1: of the other side. But that's a very different thing 118 00:07:36,560 --> 00:07:40,240 Speaker 1: than arguing that the entire platform has a bias. In fact, 119 00:07:40,280 --> 00:07:44,040 Speaker 1: the reporters even point out that in other nations governments 120 00:07:44,040 --> 00:07:48,320 Speaker 1: are actually arguing that the opposite is true, that TikTok 121 00:07:48,360 --> 00:07:54,360 Speaker 1: has a pro Israel and anti Palestinian bias.
So there 122 00:07:54,440 --> 00:08:00,120 Speaker 1: are different governments drawing opposite conclusions about a supposed bias 123 00:08:00,120 --> 00:08:02,960 Speaker 1: within TikTok. And I think the truth of the matter 124 00:08:03,080 --> 00:08:06,440 Speaker 1: is this is really more to do with the sort 125 00:08:06,480 --> 00:08:11,080 Speaker 1: of brute force approach of TikTok's algorithm. That is probably, 126 00:08:11,400 --> 00:08:16,320 Speaker 1: in my opinion, what is fueling this. And, you know, 127 00:08:16,560 --> 00:08:18,920 Speaker 1: I'm not a huge fan of TikTok, right, I do 128 00:08:19,000 --> 00:08:23,000 Speaker 1: have some concerns about the potential link to China. That 129 00:08:23,120 --> 00:08:26,880 Speaker 1: is something that I worry about, although I also acknowledge 130 00:08:27,000 --> 00:08:30,600 Speaker 1: that as far as I can determine, there's not really 131 00:08:30,680 --> 00:08:35,800 Speaker 1: much evidence to point to actual cases of Chinese officials 132 00:08:35,920 --> 00:08:42,080 Speaker 1: using TikTok to, you know, surveil national security matters or 133 00:08:42,080 --> 00:08:44,760 Speaker 1: anything like that. But the potential is there, and that's 134 00:08:44,800 --> 00:08:48,520 Speaker 1: what is, I think, a little concerning. But I don't 135 00:08:48,559 --> 00:08:51,880 Speaker 1: really know that this particular argument 136 00:08:51,920 --> 00:08:54,680 Speaker 1: has much merit. In fact, I think it doesn't have 137 00:08:54,800 --> 00:08:57,680 Speaker 1: much merit, the argument that TikTok has some sort of 138 00:08:57,720 --> 00:09:01,640 Speaker 1: specific bias when it comes to the ongoing conflict between 139 00:09:01,800 --> 00:09:06,720 Speaker 1: Israel and Hamas.
I think that in most cases it's 140 00:09:06,760 --> 00:09:10,960 Speaker 1: either a failing of critical thinking, or perhaps it's someone 141 00:09:11,000 --> 00:09:15,440 Speaker 1: who's trying to score political points by, you know, bashing TikTok, 142 00:09:15,760 --> 00:09:18,280 Speaker 1: or perhaps a combination of the two that's really to 143 00:09:18,320 --> 00:09:23,040 Speaker 1: blame here. Next up, Scarlett Johansson is the latest famous 144 00:09:23,080 --> 00:09:27,600 Speaker 1: person in a face off against artificial intelligence. The Verge 145 00:09:27,600 --> 00:09:31,160 Speaker 1: reports that Johansson's lawyer is taking legal action against a 146 00:09:31,200 --> 00:09:36,280 Speaker 1: software development company. That company is called Convert Software. 147 00:09:36,360 --> 00:09:41,000 Speaker 1: The company creates an app that's called Lisa 148 00:09:41,080 --> 00:09:46,320 Speaker 1: AI: Nineties Yearbook and Avatar, and it ran an advertisement about 149 00:09:46,320 --> 00:09:49,840 Speaker 1: this app, and it included an AI generated copy of 150 00:09:49,880 --> 00:09:53,319 Speaker 1: Johansson's voice. So it starts off with like a little 151 00:09:53,360 --> 00:09:56,640 Speaker 1: segment of a behind the scenes thing from her filming 152 00:09:56,679 --> 00:10:00,800 Speaker 1: of Black Widow, and then you get the AI imitation 153 00:10:01,080 --> 00:10:05,760 Speaker 1: of Johansson giving information about this app. Apparently this was 154 00:10:05,800 --> 00:10:09,240 Speaker 1: all done without Johansson's knowledge or consent. And while the 155 00:10:09,240 --> 00:10:11,960 Speaker 1: ad did feature a little bit of text that said 156 00:10:12,040 --> 00:10:15,440 Speaker 1: the person being imitated doesn't have anything to do 157 00:10:15,520 --> 00:10:18,839 Speaker 1: with the app itself.
Her lawyer is saying, that doesn't 158 00:10:18,880 --> 00:10:21,640 Speaker 1: really cut the mustard, and we are definitely getting into 159 00:10:21,679 --> 00:10:25,000 Speaker 1: some tricky territory here. As I've talked about before, the 160 00:10:25,080 --> 00:10:27,560 Speaker 1: laws here in the United States aren't exactly up to 161 00:10:27,679 --> 00:10:31,800 Speaker 1: date with technological capabilities. There are very few protections in 162 00:10:31,840 --> 00:10:36,880 Speaker 1: place when it comes to impersonation. Now, there are limits, 163 00:10:36,920 --> 00:10:38,120 Speaker 1: like if you were to go so far as to 164 00:10:38,160 --> 00:10:42,280 Speaker 1: impersonate someone in an effort to slander them, for example, 165 00:10:42,640 --> 00:10:45,880 Speaker 1: that would be something that you could take legal action 166 00:10:45,920 --> 00:10:48,559 Speaker 1: against because it falls under slander. Or if you were 167 00:10:48,559 --> 00:10:53,800 Speaker 1: to impersonate someone and have that person apparently endorse a product, 168 00:10:53,880 --> 00:10:56,480 Speaker 1: not just voice an ad, but to actually say that 169 00:10:56,520 --> 00:10:59,640 Speaker 1: they had used and approved of that product. That could 170 00:10:59,679 --> 00:11:02,600 Speaker 1: get you in some big legal trouble too, because the 171 00:11:02,720 --> 00:11:05,720 Speaker 1: law states that in order to make an endorsement, you 172 00:11:05,880 --> 00:11:09,360 Speaker 1: have to have actually used the product or service in question, 173 00:11:09,520 --> 00:11:12,559 Speaker 1: and you have to give your real opinion about it, 174 00:11:13,280 --> 00:11:18,880 Speaker 1: and that otherwise the endorsement can be misleading and false advertising. 175 00:11:19,280 --> 00:11:23,000 Speaker 1: So obviously, if you're using an impersonation of a celebrity 176 00:11:23,040 --> 00:11:26,040 Speaker 1: to make an endorsement, that would be a huge problem.
177 00:11:26,679 --> 00:11:30,240 Speaker 1: But there doesn't seem to be, as far as I'm aware, 178 00:11:30,880 --> 00:11:36,000 Speaker 1: very much information legally about impersonating someone in an effort 179 00:11:36,040 --> 00:11:38,480 Speaker 1: to just do a regular ad, not an endorsement, but 180 00:11:38,520 --> 00:11:41,320 Speaker 1: a regular ad. So I don't know if there is 181 00:11:41,440 --> 00:11:45,560 Speaker 1: a legal basis to hold Convert Software accountable, but I 182 00:11:45,600 --> 00:11:48,640 Speaker 1: suspect that cases like this one and a few others 183 00:11:48,640 --> 00:11:50,640 Speaker 1: that have popped up over the last year or so 184 00:11:51,160 --> 00:11:54,440 Speaker 1: are going to lead to new legislation that will expressly 185 00:11:54,480 --> 00:11:58,200 Speaker 1: cover these situations in the future. All right, so who 186 00:11:58,240 --> 00:12:00,920 Speaker 1: do you go to if your employer lets you go 187 00:12:01,360 --> 00:12:04,000 Speaker 1: in order to automate your job and then hand your 188 00:12:04,080 --> 00:12:07,920 Speaker 1: job over to AI? What do you do? Well, may 189 00:12:07,960 --> 00:12:11,800 Speaker 1: I suggest that you use an AI powered career counselor 190 00:12:11,840 --> 00:12:14,920 Speaker 1: to get you out of the situation? Sounds a bit trippy. 191 00:12:15,440 --> 00:12:18,320 Speaker 1: So this actually has to do with LinkedIn. LinkedIn has 192 00:12:18,400 --> 00:12:22,120 Speaker 1: deployed an AI chatbot feature that's meant to serve 193 00:12:22,200 --> 00:12:26,440 Speaker 1: as a kind of career advisor. At the moment, the 194 00:12:26,480 --> 00:12:30,520 Speaker 1: feature is in a limited run. Only the people who are 195 00:12:30,600 --> 00:12:36,240 Speaker 1: actually LinkedIn Premium subscribers can use this feature right now.
196 00:12:36,800 --> 00:12:40,520 Speaker 1: And you don't just sit down with this AI chatbot 197 00:12:40,640 --> 00:12:43,440 Speaker 1: and then have a deep soul searching heart to heart 198 00:12:43,559 --> 00:12:46,000 Speaker 1: about where your career needs to go at this stage 199 00:12:46,040 --> 00:12:51,760 Speaker 1: in your life. Instead, the chatbot will activate when you 200 00:12:51,800 --> 00:12:56,200 Speaker 1: are attempting to evaluate a job offer or maybe a 201 00:12:56,280 --> 00:12:58,800 Speaker 1: job listing and you're trying to figure out, am I 202 00:12:58,880 --> 00:13:02,360 Speaker 1: a good fit for this position? Now, you can ask 203 00:13:02,559 --> 00:13:05,360 Speaker 1: those specific types of questions in a little drop down menu, 204 00:13:05,840 --> 00:13:09,000 Speaker 1: and the chatbot will essentially do an analysis of your 205 00:13:09,280 --> 00:13:13,040 Speaker 1: resume and job experience, that kind of thing, and then 206 00:13:13,160 --> 00:13:17,920 Speaker 1: also analyze the job listing in particular and give you 207 00:13:17,960 --> 00:13:21,080 Speaker 1: some guidance as to whether or not maybe you're a 208 00:13:21,280 --> 00:13:26,120 Speaker 1: good fit for the job, or maybe you're just fooling 209 00:13:26,160 --> 00:13:28,240 Speaker 1: yourself and you should go back to being a sign spinner. 210 00:13:29,000 --> 00:13:31,120 Speaker 1: No shade on sign spinners, by the way, folks got 211 00:13:31,120 --> 00:13:33,760 Speaker 1: to get that dolla dolla, and goodness knows, I'd be 212 00:13:33,760 --> 00:13:35,520 Speaker 1: the worst sign spinner in the history of the gig 213 00:13:35,559 --> 00:13:38,040 Speaker 1: if I were to give it a try. Also, real talk, 214 00:13:38,760 --> 00:13:41,600 Speaker 1: let's say I lose my job. I honestly do not 215 00:13:41,840 --> 00:13:43,840 Speaker 1: know what I would do for a living at this 216 00:13:43,920 --> 00:13:47,040 Speaker 1: point in my life.
It is a sobering thought, and 217 00:13:47,120 --> 00:13:50,160 Speaker 1: I think a lot of people struggle when they find 218 00:13:50,200 --> 00:13:53,400 Speaker 1: themselves in a similar situation where maybe they've done a 219 00:13:53,440 --> 00:13:56,880 Speaker 1: specific job for a very long time, maybe there aren't 220 00:13:56,880 --> 00:13:59,640 Speaker 1: that many analogs out there where they could port their 221 00:13:59,679 --> 00:14:03,800 Speaker 1: skills over to something else. It's terrifying. So I think 222 00:14:03,840 --> 00:14:08,000 Speaker 1: a tool that can help someone prioritize their own time 223 00:14:08,320 --> 00:14:11,200 Speaker 1: when they're in a job search mode, I think that's 224 00:14:11,240 --> 00:14:14,840 Speaker 1: a great thing. I think that's a very good use 225 00:14:15,080 --> 00:14:19,560 Speaker 1: of technology to help people kind of narrow their focus. However, 226 00:14:20,200 --> 00:14:23,240 Speaker 1: several things need to be true in order for me 227 00:14:23,280 --> 00:14:27,160 Speaker 1: to be really one hundred percent in favor of this tool. First, 228 00:14:27,960 --> 00:14:30,280 Speaker 1: the tool needs to work right. It needs to actually 229 00:14:30,360 --> 00:14:34,200 Speaker 1: work well. Second, I would want to see this tool 230 00:14:34,320 --> 00:14:37,160 Speaker 1: rolled out to anyone who's a LinkedIn user, not just 231 00:14:37,240 --> 00:14:41,560 Speaker 1: those who are Premium subscribers, because I mean, if you're 232 00:14:41,560 --> 00:14:43,920 Speaker 1: in a position where you need to find a new job, 233 00:14:44,280 --> 00:14:46,560 Speaker 1: you might also be in a position where you can't 234 00:14:46,600 --> 00:14:50,960 Speaker 1: really afford to pay for extras in order to get 235 00:14:51,000 --> 00:14:53,640 Speaker 1: that new job.
And I would like to see the 236 00:14:53,640 --> 00:14:56,280 Speaker 1: people who need this the most have access to it 237 00:14:56,360 --> 00:15:02,160 Speaker 1: without having to sacrifice to pay for that access. Third, 238 00:15:02,440 --> 00:15:05,240 Speaker 1: I also hope this AI powered tool isn't prone to 239 00:15:05,840 --> 00:15:09,200 Speaker 1: unfair bias. So, for example, I would hope the tool 240 00:15:09,200 --> 00:15:11,560 Speaker 1: doesn't assume that a woman would just be a bad 241 00:15:11,600 --> 00:15:14,480 Speaker 1: fit for an engineering position because she is a she. 242 00:15:14,960 --> 00:15:17,160 Speaker 1: I would want all those things to be true in 243 00:15:17,240 --> 00:15:19,920 Speaker 1: order to really feel good about this tool, so hopefully 244 00:15:20,560 --> 00:15:24,720 Speaker 1: those are all true eventually. Okay, we've got a lot 245 00:15:24,760 --> 00:15:26,520 Speaker 1: more news to go, but before we get to that, 246 00:15:26,600 --> 00:15:39,120 Speaker 1: let's take a quick break. We're back. So the Attorney 247 00:15:39,120 --> 00:15:42,240 Speaker 1: General for New York is going after a group of 248 00:15:42,360 --> 00:15:47,560 Speaker 1: companies that are in the cryptocurrency space, and she's arguing 249 00:15:47,600 --> 00:15:50,840 Speaker 1: that these companies have defrauded customers out of more than 250 00:15:50,960 --> 00:15:56,400 Speaker 1: a billion dollars. Yikes. And they do have a connection 251 00:15:56,640 --> 00:16:01,680 Speaker 1: with some pretty infamous failures in the crypto space from 252 00:16:01,760 --> 00:16:05,800 Speaker 1: last year. So the companies that are part of this 253 00:16:06,160 --> 00:16:11,280 Speaker 1: entire investigation and legal action include the Digital Currency Group, 254 00:16:11,640 --> 00:16:15,800 Speaker 1: which is the parent company for another party in 255 00:16:16,000 --> 00:16:20,560 Speaker 1: this legal action, Genesis.
That's a crypto lending company, 256 00:16:21,240 --> 00:16:25,200 Speaker 1: as well as a crypto exchange company called Gemini, or 257 00:16:25,880 --> 00:16:28,720 Speaker 1: Gemini if you prefer, and Gemini is the one that 258 00:16:28,840 --> 00:16:32,120 Speaker 1: was co founded by the infamous Winklevoss twins, you know, 259 00:16:32,200 --> 00:16:36,800 Speaker 1: who also played a part in the early days of Facebook, 260 00:16:37,120 --> 00:16:39,840 Speaker 1: and that's why it's called Gemini, right, Gemini are the 261 00:16:39,880 --> 00:16:43,600 Speaker 1: twins when you're talking about astrology. So very clever. But 262 00:16:43,680 --> 00:16:48,880 Speaker 1: the charges allege that these companies incurred massive losses and 263 00:16:48,920 --> 00:16:52,640 Speaker 1: then tried to hide that information from investors because they 264 00:16:52,640 --> 00:16:54,840 Speaker 1: wanted to keep the money flowing in. And if you're 265 00:16:54,880 --> 00:16:58,040 Speaker 1: telling investors we're losing lots of money, chances are they're 266 00:16:58,080 --> 00:17:02,560 Speaker 1: not going to be super willing to invest more into you. Now, 267 00:17:02,600 --> 00:17:05,960 Speaker 1: one key element of the case involves what was a 268 00:17:06,000 --> 00:17:12,640 Speaker 1: collaboration between Gemini, the crypto exchange, and Genesis, the crypto 269 00:17:12,800 --> 00:17:20,400 Speaker 1: lending company. The service was called Gemini Earn. So basically, 270 00:17:20,440 --> 00:17:24,720 Speaker 1: the idea was that the investors who were using the 271 00:17:24,800 --> 00:17:30,640 Speaker 1: crypto exchange, Gemini, could loan money out to the crypto 272 00:17:30,800 --> 00:17:35,080 Speaker 1: lending company, Genesis. So a Gemini customer says, all right, I 273 00:17:35,119 --> 00:17:39,119 Speaker 1: will give X amount of my cash out to Genesis 274 00:17:39,160 --> 00:17:43,840 Speaker 1: as a loan.
Genesis in turn would loan that investor 275 00:17:43,880 --> 00:17:48,679 Speaker 1: money out to its own customers, and the idea was 276 00:17:48,720 --> 00:17:52,000 Speaker 1: that the customers would pay interest on the loan, Genesis 277 00:17:52,000 --> 00:17:55,280 Speaker 1: would pay interest on the loaned money from the investors, 278 00:17:55,840 --> 00:17:58,720 Speaker 1: and the money would trickle back up the chain and 279 00:17:58,800 --> 00:18:03,840 Speaker 1: everybody would make cash, right? So the interest rates were 280 00:18:03,880 --> 00:18:06,560 Speaker 1: listed as high as eight percent at certain points. You 281 00:18:06,600 --> 00:18:11,280 Speaker 1: get eight percent return on your investment. Not bad. And 282 00:18:11,840 --> 00:18:16,600 Speaker 1: according to the lawsuit, Gemini performed a risk analysis on 283 00:18:17,080 --> 00:18:21,480 Speaker 1: Genesis and then determined that it would actually be really 284 00:18:21,600 --> 00:18:26,760 Speaker 1: risky to loan money to Genesis, but Gemini did not 285 00:18:27,040 --> 00:18:31,640 Speaker 1: communicate that information to its customers, the investors on Gemini, 286 00:18:31,920 --> 00:18:35,600 Speaker 1: so they didn't tell the investors, hey, heads up, this 287 00:18:35,680 --> 00:18:38,560 Speaker 1: is a risky loan. Instead they said, yeah, dog, it's 288 00:18:38,560 --> 00:18:42,480 Speaker 1: totally safe and you can increase your wealth this way. Meanwhile, 289 00:18:43,240 --> 00:18:47,280 Speaker 1: Genesis totally fumbled the bag because it was loaning out 290 00:18:47,400 --> 00:18:52,240 Speaker 1: considerable amounts of money to entities that then went broke, 291 00:18:52,920 --> 00:18:56,280 Speaker 1: one of which being Alameda Research.
So if that name 292 00:18:56,320 --> 00:18:59,480 Speaker 1: sounds familiar, that's because Alameda Research was one of the 293 00:18:59,600 --> 00:19:04,320 Speaker 1: two crypto companies that Sam Bankman-Fried co founded, you know, 294 00:19:04,359 --> 00:19:09,600 Speaker 1: the other one being obviously the FTX Exchange, and Alameda 295 00:19:09,680 --> 00:19:13,080 Speaker 1: Research went bankrupt. So this huge amount of money that 296 00:19:13,119 --> 00:19:18,639 Speaker 1: was loaned to Alameda Research was effectively lost. So this 297 00:19:18,800 --> 00:19:21,280 Speaker 1: lawsuit is saying that Gemini became aware that 298 00:19:21,320 --> 00:19:24,800 Speaker 1: Gemini Earn was not a good investment opportunity from a 299 00:19:24,880 --> 00:19:29,560 Speaker 1: risk standpoint, but then specifically did not tell their own 300 00:19:29,640 --> 00:19:33,720 Speaker 1: customers this information, which allowed the investors to continue to 301 00:19:33,760 --> 00:19:37,440 Speaker 1: funnel money into an operation that was really not likely 302 00:19:37,520 --> 00:19:40,320 Speaker 1: to create a return and in fact ended up being 303 00:19:40,359 --> 00:19:43,199 Speaker 1: a huge failure. So this is another story pointing out 304 00:19:43,240 --> 00:19:47,760 Speaker 1: how precarious a lot of the crypto community is. That's 305 00:19:47,800 --> 00:19:52,120 Speaker 1: not to say that all cryptocurrency companies or crypto companies 306 00:19:52,160 --> 00:19:54,919 Speaker 1: in general are in the same boat. I'm not saying 307 00:19:55,080 --> 00:19:58,440 Speaker 1: they're all just a guaranteed loss. That's not what I'm saying.
308 00:19:58,800 --> 00:20:01,280 Speaker 1: I am saying there are lots of examples of companies 309 00:20:01,320 --> 00:20:05,439 Speaker 1: that are in that space that tried to capitalize on 310 00:20:05,560 --> 00:20:09,119 Speaker 1: a boom period. Like, they were really quick in rushing in, 311 00:20:09,200 --> 00:20:13,680 Speaker 1: and they made some big, risky decisions, and then when 312 00:20:13,680 --> 00:20:17,600 Speaker 1: that boom kind of ended, these companies found themselves out 313 00:20:17,640 --> 00:20:20,840 Speaker 1: of their depth and a lot of them ended up collapsing. 314 00:20:21,320 --> 00:20:25,840 Speaker 1: So just again, a word of caution before you start 315 00:20:25,920 --> 00:20:30,320 Speaker 1: investing in cryptocurrency. I'm not a financial advisor. I cannot 316 00:20:30,359 --> 00:20:35,080 Speaker 1: give any sort of advice. Personally, I think that it 317 00:20:35,200 --> 00:20:39,000 Speaker 1: is not worth the risk in most cases. I'm still 318 00:20:39,000 --> 00:20:42,800 Speaker 1: not sold on cryptocurrency in general. I feel like it's 319 00:20:43,240 --> 00:20:48,200 Speaker 1: largely a technology that benefits a few at the expense 320 00:20:48,320 --> 00:20:51,840 Speaker 1: of many, similar to things like a pyramid scheme or 321 00:20:51,880 --> 00:20:54,520 Speaker 1: a Ponzi scheme. I'm not saying that all cryptocurrencies are 322 00:20:54,560 --> 00:20:59,159 Speaker 1: a Ponzi scheme either, just that there are some similarities, 323 00:20:59,320 --> 00:21:04,000 Speaker 1: and I haven't seen a lot of evidence for, you know, 324 00:21:04,200 --> 00:21:08,640 Speaker 1: ones that are treated less as a commodity exchange 325 00:21:08,960 --> 00:21:13,520 Speaker 1: and more like an actual currency. It's just, it's rare. Okay.
326 00:21:14,720 --> 00:21:18,440 Speaker 1: Reuters reports that the company WeWork could be filing 327 00:21:18,440 --> 00:21:22,760 Speaker 1: for bankruptcy as early as next week. So first of all, 328 00:21:22,800 --> 00:21:24,639 Speaker 1: to get this out of the way, WeWork isn't 329 00:21:24,720 --> 00:21:28,880 Speaker 1: really a tech company, but I cover it because 330 00:21:29,560 --> 00:21:33,439 Speaker 1: everyone treats it like it was a tech startup. It 331 00:21:33,560 --> 00:21:37,840 Speaker 1: was a darling among tech venture capitalists when WeWork 332 00:21:37,920 --> 00:21:42,440 Speaker 1: first launched, despite the fact that WeWork's entire business 333 00:21:42,480 --> 00:21:46,520 Speaker 1: model was predicated upon an idea that wasn't remotely new 334 00:21:46,760 --> 00:21:50,800 Speaker 1: or innovative. It wasn't like WeWork was breaking 335 00:21:50,800 --> 00:21:52,600 Speaker 1: the mold and coming up with something that had never 336 00:21:52,640 --> 00:21:55,760 Speaker 1: been tried before. It was actually a business idea that 337 00:21:55,800 --> 00:22:00,280 Speaker 1: traditionally had proven to be really challenging, and one that had 338 00:22:00,359 --> 00:22:03,439 Speaker 1: pretty low profit margins.
So WeWork's business model, in 339 00:22:03,480 --> 00:22:08,080 Speaker 1: case you're not aware, is the company purchases or leases 340 00:22:08,240 --> 00:22:14,919 Speaker 1: office space in large, typically urban communities, and then it 341 00:22:15,000 --> 00:22:20,360 Speaker 1: will lease or sublease those spaces to various individuals 342 00:22:20,400 --> 00:22:23,639 Speaker 1: and small companies so that they will have an office 343 00:22:23,640 --> 00:22:25,720 Speaker 1: location they can use, but they don't have to go 344 00:22:25,840 --> 00:22:30,200 Speaker 1: to the extent of actually, you know, purchasing or renting 345 00:22:30,800 --> 00:22:35,840 Speaker 1: dedicated office space that would require a much larger financial investment, 346 00:22:36,280 --> 00:22:39,560 Speaker 1: so they can save money by renting at a smaller 347 00:22:39,600 --> 00:22:44,199 Speaker 1: fee a section of office space in this communal office 348 00:22:44,200 --> 00:22:48,320 Speaker 1: location run by WeWork. Now, that's it as far 349 00:22:48,359 --> 00:22:51,040 Speaker 1: as the business plan goes. But the WeWork story has 350 00:22:51,080 --> 00:22:54,840 Speaker 1: all sorts of bizarre twists and turns. Some of it 351 00:22:54,880 --> 00:22:58,199 Speaker 1: is your classic conflict of interest stuff, because it was 352 00:22:58,240 --> 00:23:03,879 Speaker 1: discovered that the company's founder was acquiring real estate 353 00:23:04,000 --> 00:23:08,320 Speaker 1: on his own and then having WeWork lease 354 00:23:08,359 --> 00:23:13,600 Speaker 1: that property from him, which is certainly questionable.
355 00:23:13,720 --> 00:23:16,920 Speaker 1: Right? Like, if you run a company and then on 356 00:23:16,960 --> 00:23:18,800 Speaker 1: your own, you go out and buy something and then 357 00:23:18,840 --> 00:23:21,320 Speaker 1: you direct your company to purchase the thing you bought 358 00:23:21,560 --> 00:23:25,800 Speaker 1: at a huge profit for you personally, that right there is 359 00:23:26,040 --> 00:23:30,000 Speaker 1: pretty darn questionable. But on top of that, his wife 360 00:23:30,480 --> 00:23:34,399 Speaker 1: was drafting some very weird corporate documents that seemed to 361 00:23:34,400 --> 00:23:38,399 Speaker 1: have more in common with Gwyneth Paltrow's Goop company, which, 362 00:23:38,440 --> 00:23:40,159 Speaker 1: you know, kind of makes more sense when you realize 363 00:23:40,200 --> 00:23:43,000 Speaker 1: the two women are actually related to each other. The 364 00:23:43,040 --> 00:23:47,040 Speaker 1: whole story is weird. There are entire documentaries about it, 365 00:23:47,280 --> 00:23:51,840 Speaker 1: and great YouTube videos as well. Maybe at one point, 366 00:23:51,880 --> 00:23:53,640 Speaker 1: WeWork was seen as being, like, a 367 00:23:53,680 --> 00:23:57,320 Speaker 1: massive disruptor in office real estate, but now the company 368 00:23:57,320 --> 00:24:00,880 Speaker 1: finds itself on the verge of obsolescence. If Tech Stuff were 369 00:24:00,920 --> 00:24:04,880 Speaker 1: more of a general kind of business podcast, I would 370 00:24:04,920 --> 00:24:07,760 Speaker 1: probably do a full series of episodes about WeWork 371 00:24:07,760 --> 00:24:14,160 Speaker 1: and really dive into the evolution and fall of that company, 372 00:24:14,200 --> 00:24:16,639 Speaker 1: but I think I'm going to leave that to other shows.
373 00:24:16,680 --> 00:24:19,840 Speaker 1: I just figured I should cover this because WeWork 374 00:24:20,080 --> 00:24:23,240 Speaker 1: was treated as though it were a tech startup from 375 00:24:23,280 --> 00:24:26,680 Speaker 1: the very beginning, and now we're seeing the end of it. 376 00:24:27,400 --> 00:24:31,920 Speaker 1: Now let's talk about an actual tech company. So by 377 00:24:31,960 --> 00:24:37,520 Speaker 1: the end of today, you know, Thursday, November second, Apple 378 00:24:38,240 --> 00:24:44,800 Speaker 1: is expected to post a quarterly result that will indicate 379 00:24:44,840 --> 00:24:48,520 Speaker 1: a decline in revenue for the fourth consecutive quarter. So, 380 00:24:48,600 --> 00:24:52,600 Speaker 1: in other words, over the course of a year, Apple's 381 00:24:52,720 --> 00:24:57,119 Speaker 1: revenue has declined quarter by quarter, at least compared to 382 00:24:58,080 --> 00:25:02,879 Speaker 1: the same time last year, right, and revenue decline is 383 00:25:02,920 --> 00:25:05,040 Speaker 1: not something that you really want to see. This is 384 00:25:05,040 --> 00:25:07,960 Speaker 1: all according to CNBC. Now, this is not to say 385 00:25:07,960 --> 00:25:11,520 Speaker 1: that Apple is in trouble. That's not the case. Again, 386 00:25:11,560 --> 00:25:15,120 Speaker 1: we're talking about decline in revenue, but we're not talking 387 00:25:15,160 --> 00:25:17,880 Speaker 1: about the company losing money. It's just not making as 388 00:25:18,000 --> 00:25:20,639 Speaker 1: much money as it did this time last year. The 389 00:25:20,680 --> 00:25:25,600 Speaker 1: company is still making boatloads of money every year.
The 390 00:25:25,760 --> 00:25:29,479 Speaker 1: expectation for this quarter, which we'll know for sure by 391 00:25:29,480 --> 00:25:32,240 Speaker 1: the end of today, but the expectation is that Apple's going 392 00:25:32,280 --> 00:25:36,320 Speaker 1: to report making around eighty nine point two eight 393 00:25:36,640 --> 00:25:41,399 Speaker 1: billion dollars in sales. That's billion with a B. That 394 00:25:41,600 --> 00:25:45,360 Speaker 1: is an unimaginable amount of money to me. And that's 395 00:25:45,440 --> 00:25:49,840 Speaker 1: just one quarter of the year. However, even at eighty 396 00:25:49,920 --> 00:25:52,960 Speaker 1: nine point two eight billion dollars, that's still a one 397 00:25:53,040 --> 00:25:56,960 Speaker 1: percent drop from what the company made in sales this 398 00:25:57,080 --> 00:26:02,160 Speaker 1: time last year. Now, there are probably tons of reasons for 399 00:26:02,200 --> 00:26:04,520 Speaker 1: this decline, and a lot of those reasons, I would 400 00:26:04,600 --> 00:26:08,080 Speaker 1: argue, are totally outside of Apple's direct control, right? Like, 401 00:26:08,800 --> 00:26:12,240 Speaker 1: the state of the economy in general is an example. 402 00:26:12,400 --> 00:26:16,520 Speaker 1: Apple doesn't control that, and some people, I'm guessing, are 403 00:26:16,560 --> 00:26:19,919 Speaker 1: willing to stick with their current technology and use it 404 00:26:20,000 --> 00:26:23,960 Speaker 1: for maybe longer than they typically would because they're just 405 00:26:24,040 --> 00:26:26,040 Speaker 1: trying to be a little more careful with their own 406 00:26:26,200 --> 00:26:29,920 Speaker 1: personal budget.
Then you also have Apple customers who might 407 00:26:30,080 --> 00:26:33,160 Speaker 1: feel that some of the more recent products Apple has 408 00:26:33,200 --> 00:26:37,919 Speaker 1: released just haven't been innovative enough or shown enough reasons 409 00:26:38,000 --> 00:26:41,720 Speaker 1: to upgrade from what you're currently using. So this would 410 00:26:41,720 --> 00:26:44,399 Speaker 1: be the people who say, oh, you know, Apple doesn't 411 00:26:45,119 --> 00:26:48,480 Speaker 1: end up creating really cool products the way it used to. 412 00:26:49,080 --> 00:26:53,920 Speaker 1: I don't necessarily disagree with that, I guess, because part 413 00:26:53,920 --> 00:26:56,359 Speaker 1: of it is that it's been a while since Apple 414 00:26:56,400 --> 00:27:00,000 Speaker 1: has been able to introduce sort of a new product 415 00:27:00,119 --> 00:27:02,639 Speaker 1: category and do the same thing that it did with 416 00:27:02,720 --> 00:27:05,840 Speaker 1: things like MP3 players with the iPod, smartphones 417 00:27:05,880 --> 00:27:09,320 Speaker 1: with the iPhone, tablet computers with the iPad, et cetera. 418 00:27:10,240 --> 00:27:11,800 Speaker 1: It's been a while since they've been able to do 419 00:27:11,880 --> 00:27:14,280 Speaker 1: something like that, and I think a lot of people 420 00:27:14,359 --> 00:27:17,439 Speaker 1: had this expectation that every few years, Apple was just 421 00:27:17,480 --> 00:27:19,840 Speaker 1: going to come out with a new product line that 422 00:27:19,880 --> 00:27:25,080 Speaker 1: would redefine a gadget and create a brand new way 423 00:27:25,119 --> 00:27:27,920 Speaker 1: for Apple to print money. But that's hard to do, 424 00:27:28,400 --> 00:27:31,440 Speaker 1: and it's also hard to, like, innovate in a way 425 00:27:31,520 --> 00:27:37,040 Speaker 1: where you're significantly improving the performance and quality of the 426 00:27:37,080 --> 00:27:41,399 Speaker 1: products you're making.
So, you know, I don't fully blame 427 00:27:41,400 --> 00:27:43,080 Speaker 1: Apple for this. I do think it puts a lot 428 00:27:43,119 --> 00:27:47,240 Speaker 1: more pressure on the company to try and create those 429 00:27:47,280 --> 00:27:52,879 Speaker 1: innovative products. I think Apple's experimentation with augmented and mixed 430 00:27:52,880 --> 00:27:57,200 Speaker 1: reality kind of shows that the company has been thinking 431 00:27:57,280 --> 00:28:01,320 Speaker 1: about that but just hasn't been able to nail the 432 00:28:01,400 --> 00:28:03,520 Speaker 1: execution in a way that I think is going to 433 00:28:03,600 --> 00:28:06,000 Speaker 1: have a huge impact on the market. We'll see. Maybe 434 00:28:06,040 --> 00:28:08,760 Speaker 1: I'm wrong. I've been wrong about lots of other Apple stuff. 435 00:28:08,920 --> 00:28:10,639 Speaker 1: I didn't think the iPad was going to make it, 436 00:28:11,000 --> 00:28:13,480 Speaker 1: and I was completely wrong about that, so maybe I'm 437 00:28:13,520 --> 00:28:16,439 Speaker 1: totally wrong about mixed reality too. We'll have to see. 438 00:28:16,960 --> 00:28:20,359 Speaker 1: But I imagine that it's going to take another year 439 00:28:20,720 --> 00:28:25,160 Speaker 1: for investors to really get super excited about Apple again 440 00:28:26,040 --> 00:28:29,280 Speaker 1: in order to, you know, kind of reverse this trend. 441 00:28:29,480 --> 00:28:32,840 Speaker 1: I don't think investors are losing hope in Apple, because again, 442 00:28:32,880 --> 00:28:35,399 Speaker 1: the company is a juggernaut. It makes so much money 443 00:28:35,920 --> 00:28:38,760 Speaker 1: that it's hard to say, oh, well, I'm going to 444 00:28:38,840 --> 00:28:42,120 Speaker 1: pull out of Apple because their sales went down a little 445 00:28:42,120 --> 00:28:44,800 Speaker 1: bit this year compared to last year.
But then again, 446 00:28:45,600 --> 00:28:50,080 Speaker 1: investors are finicky people at times, so who knows. Okay, 447 00:28:50,600 --> 00:28:52,920 Speaker 1: I got a few more stories to cover. But before 448 00:28:52,960 --> 00:29:06,040 Speaker 1: I do that, let's take another quick break. So we're back, 449 00:29:06,160 --> 00:29:09,440 Speaker 1: and there is a deal, it's actually been in development 450 00:29:09,600 --> 00:29:13,960 Speaker 1: for several years, I think five years now, that may 451 00:29:14,280 --> 00:29:18,560 Speaker 1: finally be close to closing in the not too distant future. 452 00:29:18,960 --> 00:29:22,040 Speaker 1: And I'm talking about the Walt Disney Company's purchase of 453 00:29:22,400 --> 00:29:26,720 Speaker 1: the streaming platform Hulu. So Disney has been part of 454 00:29:26,800 --> 00:29:30,520 Speaker 1: Hulu for years, like that's not new. But once upon 455 00:29:30,560 --> 00:29:33,640 Speaker 1: a time, Hulu was kind of a collaborative project. There 456 00:29:33,640 --> 00:29:36,680 Speaker 1: were a lot of different companies that were part of 457 00:29:36,800 --> 00:29:41,000 Speaker 1: Hulu early on, largely because these were companies that were 458 00:29:41,040 --> 00:29:46,520 Speaker 1: trying to oppose Netflix. The idea being that they didn't 459 00:29:46,520 --> 00:29:51,680 Speaker 1: want to lose control of who actually has access to 460 00:29:51,880 --> 00:29:54,640 Speaker 1: certain content. The studios thought, well, if we have our 461 00:29:54,680 --> 00:29:59,400 Speaker 1: own streaming platform, we can compete against Netflix. Netflix will 462 00:29:59,400 --> 00:30:01,320 Speaker 1: not be the only game in town. They will not 463 00:30:01,640 --> 00:30:06,520 Speaker 1: dominate the streaming landscape. So Hulu became a thing. These days, however, 464 00:30:07,680 --> 00:30:11,480 Speaker 1: the stakeholders really come down to Disney and then Comcast.
465 00:30:11,720 --> 00:30:16,000 Speaker 1: Comcast owns about a third, in fact a third of Hulu, 466 00:30:16,040 --> 00:30:20,400 Speaker 1: like thirty three percent of Hulu. So again, five years ago, 467 00:30:20,480 --> 00:30:25,640 Speaker 1: these two entities agreed that Disney would purchase Hulu in 468 00:30:25,720 --> 00:30:30,600 Speaker 1: total and pay out Comcast for its shares. They recently 469 00:30:31,080 --> 00:30:34,160 Speaker 1: came to an agreement about how much that will be 470 00:30:34,280 --> 00:30:38,600 Speaker 1: at minimum, which is eight point six one billion dollars. 471 00:30:39,160 --> 00:30:41,120 Speaker 1: That's what Disney will have to pay to get full 472 00:30:41,200 --> 00:30:44,720 Speaker 1: ownership of Hulu. And again, that's not for all of Hulu. 473 00:30:45,000 --> 00:30:48,959 Speaker 1: That's representing one third of Hulu. Now, I say at minimum 474 00:30:49,400 --> 00:30:53,520 Speaker 1: because the amount actually will depend upon what Hulu's fair 475 00:30:53,560 --> 00:30:58,120 Speaker 1: market value was on September thirtieth. That's when this agreement 476 00:30:58,200 --> 00:31:01,959 Speaker 1: actually happened, although we're only really hearing details about it 477 00:31:02,040 --> 00:31:04,960 Speaker 1: now as we're getting close to earnings calls and such. 478 00:31:05,520 --> 00:31:08,880 Speaker 1: So the eight point six one was based on a 479 00:31:09,040 --> 00:31:12,480 Speaker 1: minimum valuation of Hulu of twenty seven point five 480 00:31:12,520 --> 00:31:16,040 Speaker 1: billion dollars. But that agreement was made half a decade ago.
481 00:31:16,200 --> 00:31:20,360 Speaker 1: It was kind of just based off of projections, so 482 00:31:21,000 --> 00:31:25,400 Speaker 1: it's possible that on September thirtieth, Hulu's fair market value 483 00:31:25,480 --> 00:31:28,840 Speaker 1: was higher than twenty seven point five billion dollars, perhaps 484 00:31:28,920 --> 00:31:33,760 Speaker 1: even significantly higher. And if that's the case, Disney will 485 00:31:33,760 --> 00:31:37,440 Speaker 1: be beholden to make up the difference, or, you know, 486 00:31:37,480 --> 00:31:40,560 Speaker 1: thirty three percent of that difference. So, like, if instead of 487 00:31:40,600 --> 00:31:44,200 Speaker 1: twenty seven point five billion, an analysis says that 488 00:31:44,280 --> 00:31:46,720 Speaker 1: on September thirtieth of this year, Hulu was worth, like, 489 00:31:46,800 --> 00:31:50,440 Speaker 1: forty billion, Disney's going to have to pay its share of the difference 490 00:31:50,480 --> 00:31:52,880 Speaker 1: in addition to that eight point six one 491 00:31:52,920 --> 00:31:58,240 Speaker 1: billion dollars to complete the transaction. So it's not paid 492 00:31:58,280 --> 00:32:02,120 Speaker 1: for and done yet, but it's close. It'll be interesting to 493 00:32:02,160 --> 00:32:04,520 Speaker 1: see what Disney does. You know, we've heard rumors that 494 00:32:04,560 --> 00:32:10,000 Speaker 1: Disney plans on integrating Hulu with the Disney Plus streaming platform, 495 00:32:10,680 --> 00:32:13,120 Speaker 1: which is one reason why I haven't subscribed to Hulu.
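To make that true-up arithmetic concrete, here's a quick back-of-the-envelope sketch in Python. The twenty seven point five billion dollar floor, the one-third stake, and the eight point six one billion dollar base payment are the figures from the story; the forty billion dollar appraisal is the same hypothetical used above, not a real number.

```python
# Back-of-the-envelope sketch of the Disney/Comcast payout described above.
# The floor valuation, one-third stake, and base payment come from the story;
# any appraised value passed in is hypothetical.

def disney_payout_billions(appraised_value, floor_value=27.5, stake=1/3,
                           base_payment=8.61):
    """Disney's total payment (in billions of dollars) for Comcast's stake.

    If the September 30 appraisal comes in above the agreed floor,
    Disney owes Comcast's share of the difference on top of the base payment.
    """
    top_up = max(0.0, appraised_value - floor_value) * stake
    return base_payment + top_up

print(round(disney_payout_billions(27.5), 2))  # appraisal at the floor: 8.61
print(round(disney_payout_billions(40.0), 2))  # hypothetical $40B appraisal: 12.78
```

So under the hypothetical forty billion dollar valuation, Disney's check to Comcast grows by a third of the twelve and a half billion dollar gap.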
496 00:32:13,440 --> 00:32:16,720 Speaker 1: I am a Disney Plus subscriber, and I thought, well, 497 00:32:16,760 --> 00:32:19,640 Speaker 1: maybe I'll just wait, because if these two merge, I 498 00:32:19,680 --> 00:32:22,440 Speaker 1: may end up just, you know, I'll probably have to 499 00:32:22,440 --> 00:32:26,040 Speaker 1: pay more, because I can't imagine the price remaining the same, 500 00:32:26,960 --> 00:32:29,040 Speaker 1: but I may be able to access both on the 501 00:32:29,080 --> 00:32:32,720 Speaker 1: same platform. So we'll see. Disney is going to hold 502 00:32:32,760 --> 00:32:35,720 Speaker 1: an earnings call next week, so there's a good chance 503 00:32:35,760 --> 00:32:38,960 Speaker 1: that I will have an update next week and I 504 00:32:39,000 --> 00:32:42,520 Speaker 1: can talk more about it then. Now, this 505 00:32:42,600 --> 00:32:44,600 Speaker 1: is technically our final story, but I do have 506 00:32:44,640 --> 00:32:50,000 Speaker 1: some recommendations afterwards. The Guardian reports that the Pentagon 507 00:32:50,760 --> 00:32:54,680 Speaker 1: here in the United States has launched an online reporting 508 00:32:54,720 --> 00:33:00,840 Speaker 1: tool for folks who have information about unidentified anomalous phenomena 509 00:33:01,160 --> 00:33:06,360 Speaker 1: or UAPs. So UAP is the current acceptable term for 510 00:33:06,440 --> 00:33:10,520 Speaker 1: what we used to call UFOs, right, unidentified flying objects. 511 00:33:10,840 --> 00:33:13,920 Speaker 1: Now, at the moment, this tool is not open to 512 00:33:13,960 --> 00:33:18,440 Speaker 1: the public.
Right now, only folks who have quote direct 513 00:33:18,520 --> 00:33:22,560 Speaker 1: knowledge of US government programs or activities related to UAP 514 00:33:22,800 --> 00:33:26,840 Speaker 1: dating back to nineteen forty five end quote have access 515 00:33:26,880 --> 00:33:29,960 Speaker 1: to this particular tool, which is like an online form. 516 00:33:30,240 --> 00:33:34,600 Speaker 1: The Pentagon does plan to roll out a publicly accessible 517 00:33:34,960 --> 00:33:38,880 Speaker 1: version of this tool sometime in the future. I don't 518 00:33:38,920 --> 00:33:41,320 Speaker 1: have a timeline on that, but it is supposed to 519 00:33:41,320 --> 00:33:46,000 Speaker 1: be coming. You could not pay me enough to be 520 00:33:46,080 --> 00:33:48,520 Speaker 1: the person who has to filter through all the stuff 521 00:33:48,560 --> 00:33:52,000 Speaker 1: that's going to come through that public portal. I'm sure 522 00:33:52,040 --> 00:33:56,239 Speaker 1: there are going to be countless hoaxes, there'll be people who 523 00:33:56,280 --> 00:34:00,200 Speaker 1: are just plain trolling, there'll be true believers who are 524 00:34:00,320 --> 00:34:05,240 Speaker 1: spouting off stuff about UAPs. The sheer volume of weirdness 525 00:34:05,280 --> 00:34:09,279 Speaker 1: that is going to hit that inbox is unfathomable to me. 526 00:34:10,160 --> 00:34:12,319 Speaker 1: This is all part of the US government's attempt to 527 00:34:12,440 --> 00:34:18,120 Speaker 1: make the whole UAP investigation stuff more transparent.
So does 528 00:34:18,160 --> 00:34:22,840 Speaker 1: this mean, like various conspiracy theories have argued, that the 529 00:34:22,920 --> 00:34:26,759 Speaker 1: US government has secretly been hiding evidence of aliens, that 530 00:34:27,040 --> 00:34:30,759 Speaker 1: maybe we've even found examples of alien technology and then 531 00:34:30,800 --> 00:34:34,400 Speaker 1: reverse engineered it in order to create amazing updates to 532 00:34:34,440 --> 00:34:40,359 Speaker 1: our tech? No, it does not mean that. In fact, 533 00:34:40,719 --> 00:34:44,160 Speaker 1: the government officials who have been part of this project 534 00:34:44,200 --> 00:34:48,960 Speaker 1: say, "I'm not aware of anyone doing anything even remotely 535 00:34:49,000 --> 00:34:51,439 Speaker 1: related to that. So if it's happening, I've never heard 536 00:34:51,440 --> 00:34:55,400 Speaker 1: of it." I suspect that no amount of transparency is 537 00:34:55,440 --> 00:34:59,040 Speaker 1: ever going to eliminate the various stories that circulate about 538 00:34:59,480 --> 00:35:03,040 Speaker 1: suspected cover-ups and conspiracies, all meant to hide 539 00:35:03,080 --> 00:35:06,080 Speaker 1: the truth from the rest of us. I don't think 540 00:35:06,080 --> 00:35:08,880 Speaker 1: that that's at all likely. I don't think it's even possible. 541 00:35:09,400 --> 00:35:13,440 Speaker 1: But I don't think there's any way to ever stop 542 00:35:13,520 --> 00:35:16,560 Speaker 1: those stories. I still find it very difficult to believe 543 00:35:16,600 --> 00:35:21,200 Speaker 1: that any alien civilization has visited us, not because it 544 00:35:21,280 --> 00:35:24,960 Speaker 1: would be completely impossible, but when you start to factor 545 00:35:25,000 --> 00:35:28,520 Speaker 1: in the requirements that would be necessary for aliens to 546 00:35:28,600 --> 00:35:35,560 Speaker 1: get here, it becomes vanishingly unlikely that it's happened.
And 547 00:35:35,600 --> 00:35:38,920 Speaker 1: by that I mean, let's take into consideration what has 548 00:35:38,960 --> 00:35:42,279 Speaker 1: to happen for aliens to find us. So, assuming that 549 00:35:42,360 --> 00:35:47,879 Speaker 1: the aliens are looking for signs of things like radio communications, 550 00:35:47,960 --> 00:35:50,600 Speaker 1: which would indicate the presence of intelligent 551 00:35:50,640 --> 00:35:54,319 Speaker 1: life somewhere, well, radio travels at the speed of light, 552 00:35:55,000 --> 00:35:58,920 Speaker 1: and let's say this alien civilization is three hundred light 553 00:35:59,000 --> 00:36:01,720 Speaker 1: years away from us. Well, we've only been really making 554 00:36:01,840 --> 00:36:05,440 Speaker 1: radio communications for a little more than a century, so 555 00:36:06,160 --> 00:36:09,160 Speaker 1: it would still take two hundred years for those very 556 00:36:09,239 --> 00:36:13,359 Speaker 1: weak radio signals to even be detectable to a civilization 557 00:36:13,400 --> 00:36:16,200 Speaker 1: that's three hundred light years away. Then you have to 558 00:36:16,200 --> 00:36:21,200 Speaker 1: actually travel that distance to get here to Earth. And 559 00:36:21,800 --> 00:36:24,680 Speaker 1: even if they had light speed travel, which, as far 560 00:36:24,719 --> 00:36:27,600 Speaker 1: as we are aware, is impossible to do unless you 561 00:36:27,640 --> 00:36:30,960 Speaker 1: are some sort of life form that's pure energy, then 562 00:36:31,000 --> 00:36:33,239 Speaker 1: it's going to take you three hundred years, at minimum, 563 00:36:33,280 --> 00:36:36,400 Speaker 1: to make that journey. So we're talking five hundred years 564 00:36:37,280 --> 00:36:42,839 Speaker 1: from now, at the earliest, before they could even get here, and that's counting from when we started transmitting radio waves.
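The arithmetic in that thought experiment is simple enough to sketch out. The three-hundred-light-year distance and the roughly one century of broadcasting are the episode's round hypothetical numbers, not measurements of anything.

```python
# Rough sketch of the timeline above: a civilization 300 light-years away,
# and we've been broadcasting radio for about a century. Both figures are
# the episode's round hypotheticals.

def years_until_earliest_visit(distance_ly=300, years_broadcasting=100):
    """Years from today until a civilization `distance_ly` light-years away
    could first detect our radio signals and then, even traveling at
    light speed, make the trip here."""
    years_until_detection = distance_ly - years_broadcasting  # signals still en route
    travel_time = distance_ly  # their return trip at light speed, in years
    return years_until_detection + travel_time

print(years_until_earliest_visit())  # 200 more years to detect + 300 to travel = 500
```

Even granting the impossible best case of light-speed travel, the answer is five centuries from now.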
That's 565 00:36:42,880 --> 00:36:44,400 Speaker 1: what I mean when I say I just don't think 566 00:36:44,440 --> 00:36:47,520 Speaker 1: it's likely that we've been visited by aliens. Now, could 567 00:36:47,560 --> 00:36:52,640 Speaker 1: an alien civilization capable of space travel just by happenstance 568 00:36:52,920 --> 00:36:59,040 Speaker 1: come across Earth? I guess. I don't know why they 569 00:36:59,120 --> 00:37:03,600 Speaker 1: would, like, just randomly joyriding across the universe, and 570 00:37:03,640 --> 00:37:05,759 Speaker 1: then you make a little pit stop in a tiny little 571 00:37:05,800 --> 00:37:09,400 Speaker 1: solar system and you happen to land on a planet 572 00:37:09,480 --> 00:37:14,000 Speaker 1: that supports life. That just, again, seems unlikely. But 573 00:37:14,080 --> 00:37:18,239 Speaker 1: I digress. I'm sure that this tool is going to 574 00:37:18,280 --> 00:37:21,319 Speaker 1: get a lot of use, and my thoughts go out 575 00:37:21,320 --> 00:37:24,080 Speaker 1: to whomever it is that has to go through all 576 00:37:24,080 --> 00:37:27,680 Speaker 1: that information. Okay, before we wrap up, I do have 577 00:37:27,719 --> 00:37:30,640 Speaker 1: a couple of article suggestions for you all today. One 578 00:37:30,800 --> 00:37:35,160 Speaker 1: is from MIT News. David L. Chandler is the author 579 00:37:35,280 --> 00:37:38,240 Speaker 1: of this piece. It is titled Engineers develop an efficient 580 00:37:38,320 --> 00:37:42,799 Speaker 1: process to make fuel from carbon dioxide.
So, as the 581 00:37:42,800 --> 00:37:45,640 Speaker 1: title indicates, the story is all about how some researchers 582 00:37:45,640 --> 00:37:48,719 Speaker 1: have used a variety of different strategies to make it 583 00:37:48,880 --> 00:37:52,360 Speaker 1: more practical to turn CO2 into a stable fuel, 584 00:37:52,960 --> 00:37:54,960 Speaker 1: and you would be able to store this fuel for 585 00:37:55,480 --> 00:37:58,520 Speaker 1: an indefinite amount of time and use the fuel to 586 00:37:58,600 --> 00:38:00,759 Speaker 1: power stuff. Like, you could use it to do things 587 00:38:00,840 --> 00:38:02,719 Speaker 1: like heat homes and things like that, or use it 588 00:38:02,719 --> 00:38:07,720 Speaker 1: in fuel cells. Typically, you would also be tapping 589 00:38:07,719 --> 00:38:12,759 Speaker 1: into renewable energy sources to actually power the conversion processes. So, 590 00:38:13,000 --> 00:38:16,000 Speaker 1: in other words, you could find ways to turn the 591 00:38:16,000 --> 00:38:20,400 Speaker 1: CO2 into fuel without having to place more demand 592 00:38:20,440 --> 00:38:25,279 Speaker 1: on carbon generating power plants. It's an interesting read. I 593 00:38:25,320 --> 00:38:28,520 Speaker 1: am curious as to what byproducts the fuel produces when 594 00:38:28,560 --> 00:38:32,040 Speaker 1: it's consumed, right? You know, like, your typical fuel cell 595 00:38:32,800 --> 00:38:37,400 Speaker 1: uses hydrogen and oxygen to generate electricity, and the byproducts 596 00:38:37,400 --> 00:38:41,640 Speaker 1: are heat and water vapor, and that's it. I don't 597 00:38:41,680 --> 00:38:44,840 Speaker 1: know what the byproducts are for the CO2 based 598 00:38:44,840 --> 00:38:47,560 Speaker 1: fuel cells, and I should look it up and 599 00:38:47,600 --> 00:38:50,239 Speaker 1: read into it, but I think the article itself is 600 00:38:50,239 --> 00:38:53,600 Speaker 1: worth a read.
The other article recommendation I have is 601 00:38:53,600 --> 00:38:58,880 Speaker 1: from Rollingstone dot com. It is titled HBO bosses used 602 00:38:59,080 --> 00:39:03,399 Speaker 1: secret fake accounts to troll TV critics. This one's by 603 00:39:03,480 --> 00:39:07,840 Speaker 1: Cheyenne Roundtree, and again, the title shows that some executives 604 00:39:07,880 --> 00:39:10,640 Speaker 1: over at HBO really wanted to make some TV critics 605 00:39:10,680 --> 00:39:15,000 Speaker 1: feel badly about their opinions about HBO programs. This 606 00:39:15,040 --> 00:39:19,040 Speaker 1: happened a couple of years ago, and honestly, reading over 607 00:39:19,360 --> 00:39:23,440 Speaker 1: the messages that were part of this ridiculous campaign over 608 00:39:23,480 --> 00:39:28,440 Speaker 1: at HBO gave me real Mean Girls in high school vibes. 609 00:39:28,800 --> 00:39:32,280 Speaker 1: So check it out. That's it for the Tech News 610 00:39:32,320 --> 00:39:35,560 Speaker 1: for Thursday, November second, twenty twenty three. I hope you 611 00:39:35,600 --> 00:39:39,799 Speaker 1: are all well, and I'll talk to you again really soon. 612 00:39:45,960 --> 00:39:50,640 Speaker 1: Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, 613 00:39:50,960 --> 00:39:54,640 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you listen 614 00:39:54,680 --> 00:39:59,240 Speaker 1: to your favorite shows.