Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It's time for the tech news for November 17, 2022, and I didn't have a news episode this past Tuesday, which means we've got a little bit of a Twitter backlog to get through, because things are still going bonkers over there.

Speaker 1: Starting off: in Twitter B.E. (that is, Before Elon), the corporate culture encouraged internal criticism. Some would say criticism was tolerated to a fault, in fact, but the point is Twitter employees were never punished for criticizing company leadership or policies. Well, that's very much changed. One engineer, Eric Frohnhoefer, who had worked for Twitter for eight years and got involved in a Twitter conversation with Elon Musk in an effort to correct what he saw as Musk's errors regarding how Twitter operates on a technical level, has been let go. He has since said that Musk was just completely wrong about Twitter's technical operations, and so he was stepping in to actually provide the real answers. Musk did not care for that critique, and Frohnhoefer was fired. A couple of other employees were let go for doing similar things, and the following day some Twitter employees expressed support for their former colleagues, and they too were shown the door. That led to a lot of Twitter employees going back and scrubbing their various messages on Twitter and Slack and that kind of thing, in an effort to not get fired too, I guess. And according to Casey Newton, who writes an incredible newsletter about tech several times a week, about two dozen folks were affected by these firings. Some other outlets are reporting even more than that. So the word seems to be: you don't criticize Elon Musk, or you'll get canned. So I guess the free speech absolutist has his own thoughts about the kinds of speech that shouldn't be expressed.
Speaker 1: Though, to be fair, free speech has never meant you're actually free from the consequences of what you say, only that you are allowed to say it. So yeah, I'm being a bit cheeky here. Anyway, several of the folks fired had been at Twitter for nearly a decade or longer, which is bad news for a company that recently purged about half of all employees and still needs to get stuff done. As for those left behind, well, they have a tough decision to make today. Elon Musk has issued a deadline: Twitter employees must fill out an online form to indicate whether or not they will be willing to do grueling work at long hours, or else leave the company. Musk alerted employees to this yesterday in an email titled "A Fork in the Road," so they have until the end of today to fill out the form. And y'all, this is smelling awfully similar to a loyalty pledge, which is giving me seriously bad vibes. But for some at Twitter, this could be the push they need to determine whether they in fact want to stay at the company and try to transform the platform (or heck, just keep it running while working with half the personnel), or whether they'd rather take a chance and make a big career change, which is tough in a market where you're seeing lots of layoffs across big tech. It could be that Musk is actually looking to further thin the ranks of Twitter, though much of the analysis that I have read suggests that such a move would be risky, because it's already going to be hard for the remaining staff to just keep things going while also creating the projects that Musk wants implemented. On a related note, Elon Musk had to appear in a court in Delaware to defend his $56 billion compensation package at Tesla. We'll talk more about Tesla in a second.
Speaker 1: This lawsuit came from Tesla shareholder Richard Tornetta, who accuses Tesla and Elon Musk of leveraging a compliant board of directors to create an insanely lucrative compensation package for Musk while hiding crucial information from shareholders, who were encouraged to sign off on the deal, which they did. In his testimony, Musk said he intends to find someone else to run Twitter in the future, but later that same day (which, again, was just yesterday) he said it would take some time, because he wants to hand Twitter over once it is in a quote unquote strong place. Well, considering how things have been going for Twitter over the last couple of weeks, and the amount of ground that's been lost as various advertisers have pulled back from the platform, it may take a long time for Twitter to be in a strong place. So why would Musk talk about Twitter in a trial that's really about his compensation at Tesla? Well, several Tesla shareholders have expressed concern that Musk is far too distracted by Twitter to lead Tesla effectively. According to The Wall Street Journal, Tesla board member James Murdoch has said that Musk has someone in mind who could serve as CEO of Tesla, indicating that perhaps Musk is thinking of handing control over to another person. Musk says he thinks of himself not as a leader but as an engineer, something that I think Frohnhoefer would take umbrage at. Anyway, this trial will be decided by a judge. It is not a jury trial, so a judge ultimately makes the decision about whether or not Tesla and the board of directors misled shareholders. An interesting side note: that judge is Chancellor Kathleen McCormick. She happens to be the judge who was also overseeing Twitter's lawsuit against Musk back when Musk was still trying to back out of his acquisition deal. That was the trial that got called off because Musk ultimately agreed to buy Twitter.
Speaker 1: Speaking of Tesla, the US government has released data showing that Tesla has identified two additional car crash fatalities connected to its driver assistance systems, but details are a bit scarce. One of those two accidents happened nearly a year ago, so it's kind of shocking to get word of it now. But both happened in California, and both involved some form of driver assistance mode being engaged. But the NHTSA, the National Highway Traffic Safety Administration, doesn't distinguish between Autopilot and Full Self-Driving, so I don't know which mode was actually in operation for either of these crashes. It could have been one or the other. The NHTSA categorizes both Autopilot and Full Self-Driving from Tesla as SAE Level 2 on the driving automation scale. Now remember, that scale goes from Level 0, which has really no active driver assist features (it could have warnings and stuff like that, but it doesn't take over control of the vehicle), all the way up to Level 5, which is where you have a fully automated vehicle with no controls for humans at all; the whole thing is vehicle operated. But Level 2, where both Autopilot and Full Self-Driving sit, still requires human operation and human attention. Since July of 2021, almost all of the fatal accidents involving driver assistance systems that have been reported to the NHTSA have involved Tesla vehicles. But the NHTSA also points out that there's not a standardized approach across all car manufacturers on how to track crashes that involve some form of driver assistance system. So, because there's not a unified and standardized approach here, you cannot definitively say that one brand of car is inherently less safe than another.
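To make that scale a bit more concrete, here is a minimal sketch of the SAE J3016 levels as a Python mapping. This is my own illustration, not anything published by SAE or the NHTSA, and the level descriptions are paraphrased:

```python
# A minimal sketch of the SAE J3016 driving-automation levels.
# Descriptions are paraphrased; see SAE J3016 for the authoritative wording.
from enum import IntEnum

class SAELevel(IntEnum):
    NO_AUTOMATION = 0           # warnings at most; the human does all the driving
    DRIVER_ASSISTANCE = 1       # steering OR speed support, but not both at once
    PARTIAL_AUTOMATION = 2      # steering AND speed support; human must supervise
    CONDITIONAL_AUTOMATION = 3  # system drives in limited conditions; human must take over on request
    HIGH_AUTOMATION = 4         # no takeover needed within its operating domain
    FULL_AUTOMATION = 5         # drives anywhere; no human controls required

def human_is_still_driving(level: SAELevel) -> bool:
    """At Level 2 and below, the human remains responsible for the vehicle
    at all times. This is where the NHTSA places both Tesla Autopilot and
    Full Self-Driving."""
    return level <= SAELevel.PARTIAL_AUTOMATION

assert human_is_still_driving(SAELevel.PARTIAL_AUTOMATION)          # Level 2: yes
assert not human_is_still_driving(SAELevel.CONDITIONAL_AUTOMATION)  # Level 3 and up: no
```

The detail that matters for this story is the one in that helper function: at Level 2, the human is still the driver.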
Speaker 1: So when you look at this set of figures and you say, "Wow, Tesla's driver assistance systems have led to some terrible accidents," I mean, that is true, but it doesn't give you the full story, because we don't have a unified set of metrics and processes that would let us definitively say, "All right, this company seems to be doing something far more effective and responsible than this company." We can't really do that, because it's not a level playing field, because we don't know all the different methodologies used to track and report this data, which makes it not really that useful in my mind. I think we need to create standards that are adopted by the entire industry in order to have a better view of what's actually happening. I know I dump on Tesla a lot, but I don't think it's fair to draw the conclusion that Tesla is inherently worse at this than other companies when we know that there's not a standardized way to actually get metrics on all of this.

Speaker 1: And some more bad news for Elon Musk, this time stemming from SpaceX. This past summer, five SpaceX employees signed off on a letter that called on the company to acknowledge and condemn Elon Musk's behavior on Twitter relating to a news report that SpaceX had settled a sexual harassment lawsuit out of court, one that involved Musk and an employee of SpaceX. So these five employees who wrote this letter, saying Elon Musk's behavior was unacceptable and the company should call him out on it, then found themselves out of a job, and the following day around twenty engineers had to attend a meeting in which a SpaceX VP named John Edwards allegedly equated the letter to a quote unquote extremist act. That loyalty pledge thing I mentioned about Twitter seems to apply to Musk's other operations, I guess. Anyway, some of these employees say that the letter ultimately led to nine people getting fired from SpaceX.
Speaker 1: Eight of them have now joined in an effort with the National Labor Relations Board, or NLRB, to bring unfair labor practices charges against SpaceX. Moreover, this issue seems to add to the perception that Elon Musk very much does not like having his authority questioned or restricted in any way. It's not a good look. It's kind of authoritarian in its approach, and again, having someone be very flippant about a sexual harassment lawsuit being settled doesn't come across great either. Anyway, that's all the Twitter slash Elon Musk stuff I have for now, so that's good. When we come back, we'll move on to totally different tech news.

Speaker 1: We're back. Sir Christopher Hohn, a man who came from humble beginnings to rise to billionaire in the world of financial trading, has some stern words for Google, or specifically for Google's parent company, Alphabet. Hohn's hedge fund, which is called TCI, has a massive stake in Alphabet, valued at around six billion dollars (billion with a B), and Hohn recently sent a letter to Alphabet arguing that Alphabet, one, has too many employees; two, pays its employees way too much; and three, wastes too much money on projects that don't drive revenue and accrue huge losses. So he's calling for Alphabet to downsize and eliminate employee positions that he feels the company has, you know, too many of. He's saying that there are too many people working there and they're not doing enough work, that this is inefficient and costly and should be addressed quickly. Moreover, he wants to see the folks who are at Alphabet make less money, because Alphabet's median salary is somewhere just shy of three hundred thousand dollars a year. That's according to filings with the SEC, which is a big old wow. Like, the median salary is around three hundred K. Wow. Now, I imagine that doesn't include contractors and that sort of thing, but still, three hundred K, that's a lot of cheddar.
Speaker 1: And you know, Hohn thinks it's too much cheddar, and more cheddar should be going to billionaires like him, I guess. I mean, I get that Alphabet's compensation is well above that of the competition, but I find it hard to be on the side of a guy who literally has more wealth than most people, including myself, will ever come close to. Hard to side with the super rich guy, you know. But anyway, he also wants to see subsidiaries like Waymo get the old Google treatment and get shut down. Waymo is Google's self-driving vehicle company, and Waymo, he argues, has lost billions of dollars while other car manufacturers have kind of backed off of trying to make autonomous cars a reality. He cited various initiatives that had previously been launched and since abandoned, and says Google should do the same thing. So will Alphabet capitulate to his demands? Well, the tech industry as a whole is definitely slimming down due to various economic pressures, because we're in that ding dang durn economic uncertainty that we still have not really named. I think we're gonna skip recession and go straight to depression, personally. But anyway, it would not be surprising to see Alphabet make a move like this. We know that Google has been pulling back on hires and has also started to restrict things like company travel, so it would not be surprising to see Alphabet make that move, especially since we know so many other companies in the tech space are doing that.

Speaker 1: In fact, while we're on the topic of layoffs, let's talk Amazon. Just as we're entering the frenzy of the holiday season (though maybe the season will be less of a frenzy due to the aforementioned economic uncertainty), Amazon is following suit and has begun to lay off employees. Some reports suggest that these layoffs could reach around ten thousand jobs total.
Speaker 1: The company started by laying off employees working in the hardware division, which produces stuff like the Amazon Fire, the Amazon Echo, the Kindle, and devices with Amazon's personal assistant, whom I will not name so that your smart speakers don't talk back at me, because they're mean. You should have a discussion with those smart speakers about their behavior. Words can hurt. Anyway, it's just a super tough time at the big tech companies right now, with so many people being let go. And I can speak from experience: it bites to get laid off around Thanksgiving. I had that happen to me at a former job, and even seventeen years later, it still stings. The cuts at Amazon are mostly targeting corporate staff and might amount to around three percent of the total workforce, but the company's warehouse divisions are unlikely to see extensive cuts. That makes sense. We are getting into the holidays, and there's going to be a lot of activity in those warehouses, so it makes sense that the cuts we're seeing are going to be on the corporate side, not necessarily the warehouse side. CNBC reports that Amazon has sent out a voluntary severance message to some employees, giving them the option to take a severance package and leave the company voluntarily rather than risk being laid off later down the road. Those employees will have until November 29 to make their decision, and if they take the package, their employment with Amazon will end on December 23. By the way, I saw that Jeff Bezos, Amazon's founder, has committed to giving away almost all of his wealth to various charitable organizations and efforts. And I saw that someone had posted, "Clearly he was visited by three spirits." So good on you. I wish I had your name in front of me, because that was a great joke and I really laughed when I saw it.

Speaker 1: Okay, let's talk about Netflix. Netflix has announced a change to how subscribers can manage access to various devices.
Speaker 1: So previously, revoking access to devices was an all-or-nothing deal: you could only revoke access to all the devices you had signed into at once. Let's say you did something silly. Let's say you went to stay at an Airbnb, and it had a smart TV, so you logged into your Netflix account, and then you left the Airbnb but realized you forgot to log out of that smart TV, so your account is still active on there. Well, you could revoke access, but the problem was it applied to every single device you were logged into, so you lost everything, which meant you had to go through the whole process of getting access again on the stuff you actually own, like, say, your own television. Well, now Netflix is letting users get more granular. You can revoke access to specific devices while you retain it everywhere else. It also gives users the chance to shut down access to their account if, for example, they lent it to that no-good moocher of a cousin of theirs or whatever. Maybe I'm projecting. Anyway, because Netflix will be instituting a sub-account charge for users who let other households access their accounts, this move lets users nope right on out of that by cutting off the access. So instead of Netflix saying, "Hey, we noticed that this other household is using your account regularly, sometimes at the same time you're using the account, so we're gonna charge you a monthly fee to have this additional feed," you could say, "You know what, I'm just going to revoke that access and I'm good. I'll just pay mine, and then they can go and get their own Netflix account." So that is going to be an option too, although really Netflix is pointing more at the case I mentioned, where you've perhaps traveled somewhere, logged into Netflix, forgot to log out, and you want to revoke access that way. Anyway, the feature is now live for all Netflix users, and may the odds be ever in your favor.
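For the curious, here is a purely hypothetical sketch of the per-device session model a feature like this implies. Netflix has not published its implementation, and every name below is made up; the point is just to show the difference between the old all-or-nothing sign-out and the new granular revocation:

```python
# Hypothetical sketch only: none of these names are Netflix's.
from dataclasses import dataclass, field

@dataclass
class DeviceSession:
    device_id: str
    device_name: str     # e.g. "My living room TV", "Airbnb smart TV"
    active: bool = True

@dataclass
class Account:
    sessions: list = field(default_factory=list)

    def sign_out_everywhere(self):
        # The old behavior: revoking the forgotten Airbnb TV
        # also kicked off every device you actually own.
        for session in self.sessions:
            session.active = False

    def revoke_device(self, device_id: str):
        # The new behavior: target just the one session.
        for session in self.sessions:
            if session.device_id == device_id:
                session.active = False

account = Account([DeviceSession("tv-1", "My living room TV"),
                   DeviceSession("tv-2", "Airbnb smart TV")])
account.revoke_device("tv-2")
assert account.sessions[0].active and not account.sessions[1].active
```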
Speaker 1: Meta is making some changes to what Facebook users can display in their profiles. Starting on December 1, Facebook is going to remove some fields that are currently active in profiles. Those fields include religious views, political views, the "interested in" field (which in this case relates to sexual orientation), as well as the address field. All of those are to go away starting December 1. Meta has not yet shared what prompted this decision. Like, we know what's going to happen, but the company has not yet said why it's made this decision. I think it might be to mitigate issues like harassment; that could potentially be part of it, but the company has not said for sure.

Speaker 1: Evernote, the note-taking app that burst onto the scene in 2008, announced that a mobile developer company called Bending Spoons will acquire Evernote early next year. Evernote made a pretty big splash when it first debuted, but honestly, I had not heard very much about it over the last several years. Apparently I'm not alone; it kind of fell into semi-obscurity. I mean, the people who love Evernote continued to love Evernote, although from what I understand, the service made some changes that upset a lot of users before backing off of those changes. But this announcement said that this upcoming merger will allow Evernote to make quote "accelerated improvements across our Teams, Professional, Personal, and Free offerings, which is fantastic news for lovers of Evernote everywhere," end quote. Glad to hear it. I would like to see folks continue to be gainfully employed and be able to work on a platform that they really believe in. Also, I want to say, Bending Spoons is a pretty funny name. It just makes me think of Uri Geller and the charlatans who claim that they can use psychic powers to bend spoons, when in fact the psychic powers are actually their hands. Okay, enough of my skeptical critique. We're gonna take another quick break.
Speaker 1: When we come back, we have a couple more news items to get through.

Speaker 1: Okay, this next story is pretty upsetting. Israel has deployed robotic guns on guard towers in the West Bank. The guns use artificial intelligence to identify and track targets, but are actually fired remotely by guards who are stationed inside these guard towers. So the guns are using computer-assisted aiming, kind of like what you see in video games, especially video games that allow players on console to go up against PC gamers. Because console controllers don't have the same level of speed and precision as a mouse and keyboard, console players often get a little aim assist boost, where the computer helps guide those crosshairs just a little bit so you can get hits. This AI does effectively the same thing, but for a real-world gun, not a video game gun. However, a human being still has to fire the gun. The guns are outfitted with what Israel refers to as non-lethal ammunition, which includes stuff like tear gas and sponge-tipped bullets. I think "non-lethal" might be a bit aspirational there; let's say that they are intended to be non-lethal. Anyway, these guns are positioned around a Palestinian refugee camp, which has led to criticism that Israel is using Palestinians as targets to train its AI, while the Israeli government claims that the guns are there to protect both Israeli and Palestinian lives. Activist Issa Amro criticized the tech and said that the possibility that it could be misused or even hacked, which is terrifying, puts thousands of lives at risk. Even if it's never hacked, the fact that people could think such a thing could happen means that you've got this tool of fear there, right? It's terrifying to think about. So you don't even need the hack to happen for it to already have a negative impact. Not to mention that, I think, just having a robotic gun has a negative impact on the face of it, full stop.
Speaker 1: Meanwhile, Omar Shakir, a leader at Human Rights Watch, warns that using AI-assisted weapons with remote fire could put Israel on a slippery slope toward becoming quote "a powder keg for human rights abuse" end quote. We've seen several robotics companies recently pledge to never develop any sort of weapons systems, but while that's reassuring, we also know that militaries around the world, including the United States military, are working hard to do just that: to build robotic platforms for weapons. And you know, we've seen that with drones. I mean, that's one robotic platform that has weapons, but we're seeing it across multiple implementations now. And you can kind of understand why. I mean, there's this desire to create weapons that keep your own soldiers out of harm's way, right? This way you're not putting your soldiers' lives at risk. But critics really fear that the use and implementation of robotic platforms will lead to more conflicts around the world. There will be less resistance to getting into armed conflicts if you're thinking that your side doesn't really risk any losses, and that could lead to widespread human rights crises as a result, which is the big reason why I'm dead set against robotic weapons platforms.

Speaker 1: Intel launched a new product called FakeCatcher, which is designed to detect deepfake videos. Now, according to Intel, FakeCatcher has a 96 percent accuracy rate in detecting deepfakes. The product looks for telltale signs of face and landmark manipulation, you know, movements that might be too subtle for humans to really pick up on, but that indicate a computer generated the image or video. It even looks for really small indicators, like how our veins change color as our hearts pump blood through them, which is kind of creepy. And there's a growing concern over how deepfakes could be used to push misinformation campaigns or to smear someone by posting fake content of them.
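Intel has not published FakeCatcher's internals, but that blood-flow cue is related to a known research technique called remote photoplethysmography, or rPPG: real skin flushes very slightly with each heartbeat, so genuine face video carries a faint periodic color signal in a plausible heart-rate band, while generated faces often do not. Here is a toy sketch of that general idea, with assumed inputs and a deliberately crude threshold; this is not Intel's method, just the signal-processing intuition:

```python
# Toy sketch of the rPPG intuition behind heartbeat-based deepfake cues.
# Not Intel's implementation; inputs and threshold are assumptions.
import numpy as np

def has_pulse_signal(face_frames: np.ndarray, fps: float = 30.0) -> bool:
    """face_frames: array of shape (num_frames, height, width, 3), an RGB
    crop of a face over several seconds of video. Returns True if the
    green channel shows a dominant frequency in the human heart-rate
    band (roughly 42 to 180 beats per minute)."""
    green = face_frames[..., 1].mean(axis=(1, 2))  # mean green value per frame
    green = green - green.mean()                   # remove the DC offset
    spectrum = np.abs(np.fft.rfft(green))
    freqs = np.fft.rfftfreq(len(green), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 3.0)         # 0.7-3.0 Hz = 42-180 bpm
    total = spectrum[1:].sum()                     # ignore the zero-frequency bin
    if total == 0:
        return False
    # Crude rule: call it a pulse if most spectral energy sits in the band.
    return spectrum[band].sum() / total > 0.5

# Usage idea: run on, say, ten seconds of face video. Real skin should
# show the periodic signal, while many synthetic faces will not.
```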
Speaker 1: And of course, there's also the terrible world of deepfake adult content, where innocent people have had their images used in this kind of adult content without their consent, and that can lead to really harmful situations, everything from, you know, professional impact to mental and emotional trauma. So it is important that we have tools that are good at detecting deepfakes. Of course, the flip side of that is that we often see this as a seesaw kind of thing, right? As tools get better at detecting deepfakes, deepfakes get better at evading tools, and the process continues.

Speaker 1: Activision Blizzard will be suspending several online games in China due to a failure to reach a licensing deal with Chinese game company NetEase. So let's break this down really quickly. NetEase is a games publisher and distributor in China. Blizzard has partnered with NetEase to bring certain titles, like World of Warcraft, Overwatch 2, Diablo 3, and more, to the Chinese market. But this licensing agreement expires every so often and then comes up for renegotiation, and apparently the most recent negotiation talks broke down, as Blizzard failed to reach an agreement for certain titles. Others will remain unaffected because they're under separate agreements. So after January 23, games that include World of Warcraft, Overwatch 2, Diablo 3, Hearthstone, Heroes of the Storm, StarCraft, and Warcraft 3: Reforged will become unavailable in China. Other titles, like Diablo Immortal, will continue development in China because, as I said, they were part of a separate deal that was already signed. Apparently, some of the obstacles blocking this deal centered not around distribution, but around the ownership of intellectual property and player data. So Blizzard says it's working to find alternatives to bring titles back to Chinese players, but there's no telling when or if that will happen.
Speaker 1: Microsoft has made a move that will likely send at least some corporate leaders into a tantrum. The company has introduced games in its Microsoft Teams product in an app called Games for Work. The games include a bunch of casual titles, stuff like Minesweeper, Solitaire, that sort of thing, but they also have multiplayer capability, so employees can match up against their coworkers in various games. So you can go up against someone else in a game of Solitaire to the death. Okay, I got a little carried away. It's probably not to the death. Microsoft says that its data shows employees who play games together for forty-five minutes are then more productive than those who engage in other team-building activities. But based on how some bosses, particularly in the tech space, seem to be obsessed with monitoring employees, I can't see this going over well at a lot of places. Nope, work is supposed to be oppressive and joyless, so forget about games. Maybe I'm just projecting now. To be clear, I'm being facetious. I still work remotely, I have not been pressured to return to the office, and as long as I keep regular hours, everything's pretty cool for me. But I realize that my situation is kind of an outlier in the tech space, and that stinks. I think tons of companies actually really benefited from employees working remotely with more autonomy. Anyway, the games app is available now, and Microsoft says it will soon be updating the service with more features and more games. So maybe you'll end up playing games on Microsoft Teams with teammates occasionally when you're taking a break, which sounds kind of cool to me.

Speaker 1: Finally, Upside Foods has received pre-market consultation approval from the Food and Drug Administration in the US, a.k.a. the FDA. And why the heck am I covering a food story on TechStuff? Well, Upside Foods makes chicken. Like, this company doesn't butcher chickens and then sell the meat. It makes chicken meat.
Speaker 1: That's right, Upside Foods is a company that makes lab-grown meat, meaning the company harvests animal cells but doesn't slaughter the animals, and it cultivates those cells in bioreactors to make lab-grown meat. The FDA approval is an important step toward bringing this product to market, though there are several other steps that have to happen first. For one thing, the lab will require an inspection from the United States Department of Agriculture, a.k.a. the USDA, and then the FDA will actually have to inspect the real food produced before it can be sold in the US. So there are other regulatory steps that have to be made, but the FDA pre-approval stage is a necessary and important step. Also, Upside Foods will have to take some time to scale up production if it passes all these inspections; it's not going to be able to go into mass production right away. And in fact, the company plans to provide lab-grown chicken only to fancy-schmancy restaurants when it starts off, so there will definitely be a huge upcharge in the beginning, because you'll have to travel to some pretty exclusive restaurants to get a bite of Upside chicken. And that makes sense, because the process to grow the meat is itself incredibly expensive. In fact, analysts estimate that once it hits market, lab-grown meat will be maybe three or more times as expensive as your typical butchered meat, and I'm guessing that's gonna be a massive impediment to its adoption. It might be a massive impediment even to getting into grocery stores. I don't think most folks can necessarily afford to triple the amount of money they spend on meat, even if they are comforted by the idea that no animal died to provide that meat. So I imagine this kind of gradual rollout is one that can lead to reductions in production cost over time. Personally, I'm all for the lab-grown approach.
Speaker 1: I do think that it's going to take time to perfect it, so that things like taste and consistency are really on par with what you would get if you were buying fresh meat. But the idea of being able to do that without harming animals, and also being able to scale back the livestock industry, which has a huge environmental impact on our world? If we're able to scale that back significantly, that would be a huge, huge win as far as fighting against climate change. So there are multiple reasons to really be on board with lab-grown meat, but it's gonna take a while for us to get to a point where it becomes cost effective. When it starts off, it's gonna be so prohibitively expensive that only super fancy-pants people are going to get a chance to try it. If a fancy-pants restaurant that serves lab-grown meat wants to invite me to try some, I'm happy to do it. I just can't, you know, I can't take out a loan to have an appetizer.

Speaker 1: All right, that wraps up this episode of TechStuff News. If you would like to get in touch with me, to tell me about something you would like me to cover in the future, you can do that in a couple of different ways. One is to download the iHeartRadio app. It is free to download, free to use. You can just navigate over to TechStuff by putting that into the search bar. You'll see TechStuff pop up, and there will be a little microphone icon there. If you click on that, you can leave a voice message up to thirty seconds in length. If you like, you can let me know that I can use that voice message in a future episode; that'd be great. And if you don't want me to, you don't have to tell me; just leave a message. I'm never going to include a message unless I'm expressly told that I can. Or, if you prefer, you can reach out to me on Twitter.
Speaker 1: The handle for the show is TechStuffHSW, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.