Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are ya? It's time for the tech news for Thursday, July fourteenth, two thousand twenty-two. Happy Bastille Day, everybody. Let's get to the news. A huge story this week, at least in my mind, is that famous designer Jony Ive will no longer be designing for Apple. All right, so let's give a quick rundown for those who are unfamiliar with Ive. He started at Apple in the early nineteen nineties. This was an Apple that lacked Steve Jobs, because Jobs had essentially been pushed out of the company in nineteen eighty-five, and he would not come back until nineteen ninety-seven. Ive had studied design all his life. His father was a silversmith and his grandfather was an engineer, so it kind of ran in the family.
And when Steve Jobs returned to Apple and subsequently was put in charge of the company, he and Ive formed a type of partnership that would redefine Apple's look, and it would bring the company from the verge of bankruptcy into an unprecedented era of success. So Ive worked on tons of iconic Apple products, including the colorful iMac line. You might not remember these. These were the CRT-monitor iMacs, so they looked like old televisions, but they had these very colorful shells on them, really vibrant ones, and everyone just wanted one because they looked cool and friendly. Ive spent countless hours on design, looking at stuff like individual materials and testing out those materials to make sure that they would properly hold up to what his design called for and create the aesthetic he was going after. He would obsess over the curve of an iPhone or the way an Apple Watch band would look on someone's wrist. It is not an exaggeration to say that a lot of Apple's appeal came out of Ive's work and input. Now, I don't want to overstate things here either.
Obviously, there are a lot of other factors with Apple that are important, and Ive led an entire department of designers who all contributed to the success of the company. It wasn't just one guy. But during the first decade of the two thousands, Jobs and Ive would be two of the most prominent figures at Apple. Things, however, would change after Jobs passed away in two thousand eleven. Tim Cook, who was Steve Jobs's successor as CEO, has a very different approach to leadership than Jobs did. Cook is a financial genius, and he's well known for his ability to find new ways to generate revenue while also limiting costs. But Cook isn't really known for being innovative with products, nor is he the salesman that Jobs was. That's just not who Cook is. His strengths lie elsewhere. And while Cook would lead Apple to unprecedented profits, the press and Apple fans began to worry that Apple was kind of losing its mojo, that it was falling behind in technical and design innovation. Ive gradually began to step back from his work.
In 54 00:03:32,960 --> 00:03:36,960 Speaker 1: twenty nineteen, he actually left Apple and founded a design 55 00:03:37,040 --> 00:03:40,600 Speaker 1: company called Love From, along with a fellow former Apple 56 00:03:40,640 --> 00:03:44,120 Speaker 1: designer named Mark Newsson. But even then, Apple would be 57 00:03:44,720 --> 00:03:48,440 Speaker 1: Love From his chief client, and the two companies signed 58 00:03:48,480 --> 00:03:51,320 Speaker 1: a three year deal rumored to be valued at around 59 00:03:51,360 --> 00:03:55,240 Speaker 1: a hundred million dollars, where Love From would do a 60 00:03:55,280 --> 00:03:58,800 Speaker 1: lot of design work for Apple, in some cases exclusively 61 00:03:58,920 --> 00:04:01,920 Speaker 1: for Apple because the agreement meant that love From was 62 00:04:01,960 --> 00:04:04,280 Speaker 1: not supposed to work for any kind of company that 63 00:04:04,920 --> 00:04:10,760 Speaker 1: competed against Apple in certain categories. Now, reportedly, Apple and 64 00:04:10,840 --> 00:04:15,120 Speaker 1: love From have decided not to extend that three year contract. 65 00:04:15,360 --> 00:04:18,600 Speaker 1: The contract terms her up and they appear to have said, 66 00:04:18,640 --> 00:04:21,000 Speaker 1: you know what, We're gonna walk away. That means that 67 00:04:21,080 --> 00:04:23,440 Speaker 1: for the first time in thirty years, Apple will really 68 00:04:23,480 --> 00:04:27,000 Speaker 1: be moving forward without any of ives involvement in the 69 00:04:27,040 --> 00:04:30,800 Speaker 1: design of its products. So does this mean that Apple's 70 00:04:30,839 --> 00:04:34,000 Speaker 1: aesthetic is going to hold steady or are we actually 71 00:04:34,040 --> 00:04:37,479 Speaker 1: going to see a shift. 
Will some other genius step forward and guide Apple's design in a new direction, or will the team within Apple do its best to kind of ape Ive's preferences? For some folks, the severing of this relationship is kind of a warning flag that Apple is going to gradually lose its appeal as a company that designs sexy and appealing technology that you just immediately want to pick up in your own hands when you see it. That's how I think of a lot of Apple's products, even though I don't own many of them. When I see those presentations, invariably I'm like, man, I want to get my hands on that and see what it feels like. I think I will have to do a full profile episode about Jony Ive before too long, so I'm not gonna go into any further detail here, but I will say it is a pretty remarkable end of an era. Now let's move on to Twitter. The company officially sued Elon Musk in a Delaware court over the issue of Musk trying to back out of his deal to acquire Twitter.
The filing is reportedly sixty-two pages long, and The Atlantic has a great rundown about the lawsuit, including opinions on why Musk's arguments aren't likely to hold up in court, namely that Musk is arguing that a flood of bots on Twitter means the platform is significantly overvalued, which Musk's team argues is enough to allow Musk to back out of the deal. The article has the title "Elon Musk Is a Nightmare Client." The piece also points out some ironic features in the lawsuit, namely that the filing portrays Musk as a capricious and petty person, someone that you probably wouldn't want to be in charge of your company. But the whole purpose of the lawsuit is to force Musk to go through with this acquisition deal and to purchase Twitter. So it's kind of like saying, this guy's a joke, he should not be running a company, and that's why you can't take his excuses for why he can't run our company seriously, so make him run our company. It's a mad, mad, mad, mad, mad world, y'all.
And of course it gets better. In the filing, Twitter revealed that part of the deal it made with Musk was that the company would not impede Musk's ability to tweet. He could tweet all he liked; however, he was not to disparage Twitter or its representatives, because that would be in violation of the agreement. So it is possible that when Musk tweeted a poop emoji at Twitter, that would eventually come back to haunt him and be used against him in a court of law. This is the world we live in now, where a poop emoji becomes evidence. This is where we are at as a society. None of the science fiction authors predicted this. So Musk tweeting out negative things about Twitter and its representatives violates this agreement, which Twitter then argues means that Musk has forfeited his right to terminate the agreement. And it gets juicier from there. At least from an outsider's perspective, I would say that it looks like Twitter has the upper hand going into this litigation.
But I am no expert, and Elon Musk has proven to be a slippery character on numerous occasions, so I'm not confident about whether he's out of serious trouble or is in serious trouble. I honestly can't tell one way or the other. It's a mystery to me, but it's an entertaining one. Now, let's stay Musky for a couple of stories and talk about the SpaceX rocket booster mishap earlier this week. I actually meant to cover this on Tuesday's episode because it happened on Monday. It was in Texas, where a SpaceX Starship booster rocket burst into flames during a ground test firing. Now, this was part of a regimen of fire tests with this booster, and it's safe to say this particular test was a failure. Musk himself acknowledged on Twitter that the outcome was, quote unquote, "not good," and Musk later said it appeared that some unburned propellant from the booster's Raptor 2 engines, of which the booster has thirty-three, was the cause of the fire. As of this recording, the teams are still assessing the causes and the damage.
The exterior structure of the booster appears to be in pretty good shape, but I've yet to see official word about the condition of the engines themselves. Now, originally SpaceX hoped to have a test orbital flight of the Starship by the end of this month. That seems unlikely now, but again, there's been no official word as of this recording about when the company expects to conduct the test. I think the takeaway here is that rocket science is hard. I mean, it's not brain surgery. And over at Tesla, the company is saying goodbye to its top executive in its AI department. Andrej Karpathy, who has served as director for Tesla's Autopilot vision team, has announced he is leaving Tesla. He has been part of the company for five years, and this comes as Tesla faces scrutiny from the US government regarding the safety of the Autopilot feature in general, which has been cited as a factor in some notable accidents, including some traffic accidents that resulted in fatalities. It's also a time when Tesla as a whole has been laying off workers.
The company recently closed its San Mateo office after laying off most of the workforce there and transferring the remaining staff to other offices. So is this a further indication that Tesla might be in a bit of a sticky situation? I just don't know. I mean, I'm not sure what Karpathy's motivations were for leaving the company. It could just be that he wants a break, or maybe he wants a new challenge in his career. I don't want to read too much into it, is what I'm saying. All right, we've got more news stories to cover today, but before we get to that, let's take a quick break. We're back. Earlier this week, I talked about how a whistleblower leaked more than a hundred twenty thousand internal documents from Uber to the press.
Those documents covered Uber activities between two thousand thirteen and two thousand seventeen, and mostly documented stuff that we already knew, just in much greater detail than we were aware of, namely that Uber took to some pretty aggressive tactics in order to expand into various markets, including, but not limited to, being pretty loosey-goosey with local laws in an effort to, one, establish business, and, two, then change the laws in Uber's favor. Now I have more news about Uber, and it's grim. A group of more than five hundred fifty women have brought a lawsuit against Uber, alleging that they had been victims of violence from Uber drivers, ranging from sexual harassment to kidnapping to rape. The list of accusations is absolutely horrifying, and the women claim that Uber did too little to prevent sexual predators from working as drivers for the company despite being aware of the issue. That's about where we are right now. I'm sure we'll hear a lot more about this as the case moves on. But yeah, that is a terrible, terrible story, and I'll keep an eye on how it develops.
Ring, the smart doorbell and security camera company that's owned by Amazon, recently released a transparency report in which the company revealed that in twenty twenty-one it had received three thousand, one hundred forty-seven legal demands from various law enforcement agencies to hand over surveillance footage that had been gathered from the smart doorbell cameras and security cameras, essentially turning customer cameras into law enforcement surveillance cameras. Now, that number of requests is up from around nineteen hundred demands the year before, an increase of around sixty percent. And out of those three thousand, one hundred forty-seven demands, more than eighty-five percent of them were accompanied by a court-issued order such as a search warrant. However, Ring said that it turned over data, as in footage, in only around four out of every ten demands when you average it all out. So while most of these demands had a court order attached to them, far fewer were ones where Ring actually handed information over. I'm not really sure how this all shook out, but in addition to that, Ring also received two thousand, seven hundred seventy-four preservation orders.
Now, that's when a law enforcement agency asks Amazon to hold onto data for up to six months while that agency attempts to secure a warrant or other court order that would be necessary to demand access to the footage. So, in other words, this is an agency saying, hey, don't delete this customer's data anytime soon, we might need access to it. On top of that, Amazon shared footage with police at least eleven times without any kind of warrant or court order at all, something that the company had previously indicated to customers wouldn't happen. The message to consumers is that Amazon is not going to hand over footage to police unless they have a court order to do so. But it turns out that in cases of emergency, essentially when someone somewhere determines that there's imminent danger, Amazon can share information with police, and that has led some critics to protest the practice. They say it sets a precedent and it could allow for future mission creep, meaning once you establish that there is an exception to the rules, it can be easy to have that exception expand over time.
And since this is a massive invasion of privacy, and potentially a violation of the Fourth Amendment to the US Constitution, which protects against unreasonable search and seizure, that is a significant problem. Yesterday, US courts convicted former CIA employee Joshua Schulte on nine charges relating to leaking classified information. Whoo boy, this story is filled with a whole bunch of awful stuff. So first, Schulte used to work at the CIA as a software engineer. In twenty fifteen, Schulte filed a restraining order against one of his coworkers at the CIA; he filed this restraining order in state court. The two people, Schulte and this coworker, had been in what CNN would describe as a feud. They got sent to different offices as a result. And then, according to prosecutors in this case, Schulte got angry at his employer, the CIA, when the agency started looking into bringing in a contractor to design a tool that was similar to one that Schulte himself was already working on, and that this sent Schulte on a vindictive path.
So, according to prosecutors, Schulte then stole several cyber tools belonging to the CIA and sent those tools to WikiLeaks, the nonprofit organization meant to serve as sort of a clearinghouse for whistleblowers. Schulte left the CIA in twenty sixteen. He got arrested in twenty seventeen on charges of possessing child pornography. Around that same time, WikiLeaks began to publish what would become known as the Vault 7 leaks, the largest leak of data in the CIA's history. Prosecutors said that those leaks originated from two programs that Schulte was able to access, which implies that the tools Schulte handed over to WikiLeaks made the Vault 7 leaks possible. While Schulte was being investigated and prosecuted for the child pornography charges, he was then also charged with leaking this classified information. He had an initial trial back in twenty twenty, but it ended with a mistrial. The government chose to retry the case, and that concluded yesterday, on July thirteenth. That one ended with Schulte being convicted on those counts of leaking classified information.
He has yet to be sentenced, but this incident has become known, as I said, as the largest leak of CIA information in history. All right, let's move on to something less grim, at least for anyone who doesn't happen to be Google or YouTube. TechCrunch reports that the youths today are spending way more time on TikTok than they are on YouTube, and I think that comes as a surprise to nobody. VidCon, for example. VidCon is a big conference that focuses on online video, and traditionally the big attraction of VidCon was the attendance of YouTube content creators. Well, this past VidCon, it was very clear that the focus had switched from YouTube to TikTok. Not that there weren't still YouTube creators there, there were, but TikTok was largely the main attraction at VidCon this year. So that's a pretty strong example of where the trend is going. Now, TechCrunch cited a research team from Qustodio. That's Q-U-S-T-O-D-I-O, a company that makes parental control software.
And according to Qustodio's data, Gen Z and Gen Alpha users typically spend somewhere around ninety-one minutes on average per day watching content on TikTok, and fifty-six minutes on average per day watching content on YouTube. So that would be about two and a half hours of watching stuff online from just those two sources. But considering how much television I soaked up when I was a kid, I can't get judgy about any of this. Anyway, the data shows what I think a lot of folks already knew: that younger generations are gravitating more towards the short-form content of TikTok than toward what you would find on YouTube. It's why a lot of platforms, including YouTube itself, have promoted features that present content in shorter formats, essentially trying to mimic what TikTok is doing. On YouTube, it's literally called YouTube Shorts, and the goal there is to serve as kind of a means of discovery so that users will see Shorts that attract them.
They watch the Short, and then they'll say, you know what, I'm gonna go and find the video that this came from, the longer-form video, and watch that too. Meanwhile, according to Business Insider, Google SVP Prabhakar Raghavan said at a conference that younger audiences are turning to TikTok and Instagram to perform searches rather than using Google to do so. Now, that doesn't necessarily mean that kids are using TikTok to do research for homework or anything like that, though maybe that's the case. Instead, the example given was more like you're trying to figure out where you want to go to grab lunch, and kids these days will look to TikTok and Instagram to find, you know, good suggestions, as opposed to using Google Search. So anyway, that's something Google wants to address; it wants to develop new tools that will appeal to younger users so that they'll use Google for search. Meanwhile, I'm sitting over here thinking I need to start a TikTok account where I dress up in various quasi-historical outfits and do quick lessons on stuff like American history and Shakespeare or whatever.
I could be the CliffsNotes 323 00:20:24,040 --> 00:20:27,920 Speaker 1: of TikTok, except I'm sure that already exists, probably on 324 00:20:28,040 --> 00:20:30,960 Speaker 1: multiple channels, but maybe I should do it. Anyway, while 325 00:20:31,000 --> 00:20:33,080 Speaker 1: I think about that, let's take another quick break and 326 00:20:33,119 --> 00:20:43,119 Speaker 1: we'll be back with a couple more stories. Okay, we 327 00:20:43,200 --> 00:20:45,520 Speaker 1: got a couple of stories to close out this episode. 328 00:20:45,640 --> 00:20:49,320 Speaker 1: First up, The Register reports that a letter signed by 329 00:20:49,440 --> 00:20:52,479 Speaker 1: executives at some of the biggest tech companies in the world, 330 00:20:52,880 --> 00:20:58,520 Speaker 1: which include Apple, Microsoft, Amazon, Google, and Meta, among others, 331 00:20:59,200 --> 00:21:03,720 Speaker 1: encourages state leaders to introduce computer science lessons for students 332 00:21:03,920 --> 00:21:07,920 Speaker 1: in elementary, middle, and high school here in the United States. 333 00:21:07,960 --> 00:21:10,159 Speaker 1: So the letter calls for kids to be given the 334 00:21:10,200 --> 00:21:13,360 Speaker 1: opportunity to learn the basics of computer science early on, 335 00:21:14,000 --> 00:21:18,439 Speaker 1: including eventually how to code, with classes becoming more complex 336 00:21:18,440 --> 00:21:21,720 Speaker 1: and in depth at higher grade levels. I think this 337 00:21:21,800 --> 00:21:24,920 Speaker 1: is a great idea. Getting a grounding in computer science 338 00:21:25,119 --> 00:21:28,080 Speaker 1: is practical, and it also means that students will learn 339 00:21:28,119 --> 00:21:31,040 Speaker 1: to put other skills to use.
Plus, it will mean 340 00:21:31,040 --> 00:21:33,720 Speaker 1: that students who move on to colleges and universities will 341 00:21:33,760 --> 00:21:37,720 Speaker 1: already have a fundamental understanding before they jump into computer 342 00:21:37,760 --> 00:21:42,320 Speaker 1: science classes. That is critical. Computer science has advanced so 343 00:21:42,400 --> 00:21:45,800 Speaker 1: much in such a relatively short time that it can 344 00:21:45,840 --> 00:21:48,119 Speaker 1: be pretty challenging for students who have little to no 345 00:21:48,280 --> 00:21:53,080 Speaker 1: background in the subject to make progress at the university level. Plus, 346 00:21:53,200 --> 00:21:57,320 Speaker 1: computer science encourages students to learn problem solving skills like 347 00:21:57,400 --> 00:22:01,240 Speaker 1: critical thinking, breaking problems down in a way 348 00:22:01,240 --> 00:22:04,560 Speaker 1: where you can approach a solution, skills that can be applied 349 00:22:04,760 --> 00:22:08,520 Speaker 1: outside the discipline of computer science. It's a really powerful 350 00:22:08,600 --> 00:22:11,320 Speaker 1: tool to have in your arsenal, and it's something that 351 00:22:11,400 --> 00:22:13,960 Speaker 1: I wish I had had access to back when I 352 00:22:14,000 --> 00:22:15,960 Speaker 1: was a kid. You know, when I was going to 353 00:22:16,600 --> 00:22:19,560 Speaker 1: high school, the most advanced class we had in our 354 00:22:19,600 --> 00:22:23,399 Speaker 1: computer lab was data entry, right, that was 355 00:22:23,480 --> 00:22:25,480 Speaker 1: the peak. That's as far as you could go. I 356 00:22:25,520 --> 00:22:30,360 Speaker 1: actually ended up taking an additional course in the computer 357 00:22:30,520 --> 00:22:33,640 Speaker 1: lab that wasn't even offered officially. I had to get 358 00:22:33,840 --> 00:22:38,600 Speaker 1: special permission to do it rather than take some other elective.
359 00:22:38,880 --> 00:22:41,919 Speaker 1: And it turned out that I ended up teaching the 360 00:22:41,960 --> 00:22:44,160 Speaker 1: class, at least the portion of the class that was 361 00:22:44,160 --> 00:22:48,040 Speaker 1: on the IBM computers, because the teacher of my computer 362 00:22:48,160 --> 00:22:51,359 Speaker 1: lab was more familiar with Apple computers, so that's what 363 00:22:51,440 --> 00:22:54,199 Speaker 1: she focused on. Turns out I didn't do a 364 00:22:54,200 --> 00:22:57,719 Speaker 1: whole lot other than, like, again, data entry and word processing. 365 00:22:57,840 --> 00:23:00,959 Speaker 1: So even though I was put in charge, I didn't 366 00:23:01,280 --> 00:23:04,399 Speaker 1: get to really advance my knowledge that much. And I 367 00:23:04,480 --> 00:23:08,840 Speaker 1: consider that, you know, a real failure of the 368 00:23:08,840 --> 00:23:11,840 Speaker 1: school system, if I'm being honest, because it would have 369 00:23:11,880 --> 00:23:15,720 Speaker 1: been great to have had access to better educational tools 370 00:23:16,200 --> 00:23:19,840 Speaker 1: to get into computer science, even just a little bit, 371 00:23:19,840 --> 00:23:24,120 Speaker 1: because again, in my case, it didn't happen at all. 372 00:23:24,200 --> 00:23:29,600 Speaker 1: It was literally superficial applications of computers, nothing important about 373 00:23:29,600 --> 00:23:34,600 Speaker 1: the computers themselves. So the letter also stresses the importance 374 00:23:34,640 --> 00:23:37,960 Speaker 1: of fostering new generations of computer scientists because it will 375 00:23:37,960 --> 00:23:41,320 Speaker 1: be a critical component for the workplace in general. That's 376 00:23:41,320 --> 00:23:44,840 Speaker 1: clearly been the case for a while. It's also critical 377 00:23:44,880 --> 00:23:47,960 Speaker 1: for U.S. strategy to have experts in computer science.
378 00:23:48,160 --> 00:23:50,720 Speaker 1: You do not want the country to fall behind in 379 00:23:50,800 --> 00:23:53,520 Speaker 1: that regard. In fact, a lot of people have argued 380 00:23:53,560 --> 00:23:56,320 Speaker 1: that China is already well ahead of the United States 381 00:23:56,320 --> 00:23:59,679 Speaker 1: at this point, and that if we don't start addressing 382 00:24:00,080 --> 00:24:04,240 Speaker 1: this issue, then we will be hopelessly left behind. The 383 00:24:04,320 --> 00:24:06,960 Speaker 1: letter also calls on leaders to make certain that such classes 384 00:24:06,960 --> 00:24:10,280 Speaker 1: are available to all students everywhere, including those belonging to 385 00:24:10,320 --> 00:24:14,639 Speaker 1: classically underserved communities. And I really hope that the letter 386 00:24:14,680 --> 00:24:17,720 Speaker 1: helps spur some action across the states. I think it 387 00:24:17,760 --> 00:24:22,320 Speaker 1: would be incredibly helpful if that did happen. Finally, Nintendo 388 00:24:22,440 --> 00:24:26,080 Speaker 1: has a new acquisition. The company has purchased a CG 389 00:24:26,320 --> 00:24:31,240 Speaker 1: production company called Dynamo Pictures, which Nintendo plans to rebrand 390 00:24:31,480 --> 00:24:35,800 Speaker 1: as Nintendo Pictures. The subsidiary will be tasked with creating 391 00:24:35,880 --> 00:24:40,280 Speaker 1: quote, visual content utilizing Nintendo IP, end quote, according 392 00:24:40,320 --> 00:24:43,920 Speaker 1: to a Nintendo press release. That visual content is likely 393 00:24:43,960 --> 00:24:46,760 Speaker 1: going to include work that will actually appear in video 394 00:24:46,760 --> 00:24:51,080 Speaker 1: games themselves.
Dynamo Pictures has previously done motion capture work 395 00:24:51,320 --> 00:24:55,240 Speaker 1: on several large video game titles, 396 00:24:55,240 --> 00:24:58,320 Speaker 1: Death Stranding being one of them, so a lot 397 00:24:58,320 --> 00:25:01,720 Speaker 1: of that work may be to produce content that's actually 398 00:25:01,720 --> 00:25:04,760 Speaker 1: going to go into video games. However, I expect we'll 399 00:25:04,800 --> 00:25:09,440 Speaker 1: also see Nintendo Pictures work on stuff that's meant to 400 00:25:09,480 --> 00:25:14,639 Speaker 1: be on television or streaming or film. The acquisition is 401 00:25:14,680 --> 00:25:19,080 Speaker 1: not yet complete. It's still in the process. Nintendo said 402 00:25:19,200 --> 00:25:23,919 Speaker 1: in its release to shareholders that it expects the 403 00:25:23,960 --> 00:25:28,040 Speaker 1: deal to close on October three of this year. But yeah, 404 00:25:28,080 --> 00:25:29,960 Speaker 1: I'm really curious to see what comes out of this. 405 00:25:30,040 --> 00:25:32,439 Speaker 1: It would be interesting to see if Nintendo starts to 406 00:25:32,800 --> 00:25:39,160 Speaker 1: generate more shows and films and stuff that leverage 407 00:25:39,240 --> 00:25:41,679 Speaker 1: the IP. We know there's the Super Mario Brothers 408 00:25:41,720 --> 00:25:45,040 Speaker 1: movie coming out soon, and, you know, the biggest story 409 00:25:45,080 --> 00:25:48,520 Speaker 1: about that is how everyone reacts to Chris Pratt doing 410 00:25:48,600 --> 00:25:52,280 Speaker 1: the voice for Mario.
A lot of folks were not 411 00:25:52,440 --> 00:25:56,240 Speaker 1: super happy to hear that, although I haven't heard any 412 00:25:56,280 --> 00:25:59,119 Speaker 1: specific, like, recordings from any of the sessions, so I 413 00:25:59,160 --> 00:26:02,240 Speaker 1: don't know what kind of voice he's doing for the character, 414 00:26:02,359 --> 00:26:04,040 Speaker 1: or if he's doing a voice for the character, if 415 00:26:04,040 --> 00:26:07,080 Speaker 1: he's just being Chris Pratt. Don't know, haven't paid attention, 416 00:26:07,200 --> 00:26:10,399 Speaker 1: haven't checked in. But if he is just being Chris Pratt, 417 00:26:10,440 --> 00:26:13,520 Speaker 1: I imagine that's going to upset a lot of folks 418 00:26:13,560 --> 00:26:18,320 Speaker 1: who are really hoping for the "It's-a me, Mario!" type character. Anyway, 419 00:26:18,480 --> 00:26:21,280 Speaker 1: I can't wait to find out more about Nintendo Pictures 420 00:26:21,320 --> 00:26:23,480 Speaker 1: and what they'll be making in the future, so I 421 00:26:23,480 --> 00:26:25,680 Speaker 1: will be keeping an eye on that, and I'm sure 422 00:26:25,920 --> 00:26:29,399 Speaker 1: we'll talk about it in future episodes of this show. And 423 00:26:29,480 --> 00:26:33,240 Speaker 1: with that, we are done with the news for Thursday, 424 00:26:33,480 --> 00:26:37,080 Speaker 1: July two thousand twenty two. If you have suggestions for 425 00:26:37,160 --> 00:26:39,439 Speaker 1: topics I should cover in future episodes of Tech Stuff, 426 00:26:39,880 --> 00:26:41,760 Speaker 1: you know, like if you really want to hear that 427 00:26:41,760 --> 00:26:45,160 Speaker 1: full profile on Jony Ive, for example, although I'm 428 00:26:45,160 --> 00:26:48,040 Speaker 1: pretty sure I'm gonna do that anyway, please reach out 429 00:26:48,080 --> 00:26:49,240 Speaker 1: to me. There are a couple of ways you can 430 00:26:49,240 --> 00:26:52,280 Speaker 1: do that.
One is to download the I Heart Radio app, 431 00:26:52,480 --> 00:26:55,480 Speaker 1: which is free to use. You can navigate over to 432 00:26:55,480 --> 00:26:59,160 Speaker 1: the tech Stuff section of the I Heart Radio app, 433 00:26:59,600 --> 00:27:02,359 Speaker 1: and on that section you're gonna see a little microphone icon. 434 00:27:02,400 --> 00:27:04,200 Speaker 1: If you click on that, you can leave a voice 435 00:27:04,240 --> 00:27:06,800 Speaker 1: message up to thirty seconds in length. If you like, 436 00:27:06,880 --> 00:27:08,480 Speaker 1: you can let me know if I can use the 437 00:27:08,560 --> 00:27:11,879 Speaker 1: audio in a future episode, because I am all 438 00:27:11,880 --> 00:27:14,520 Speaker 1: about opt in. Or if you prefer not to do 439 00:27:14,600 --> 00:27:16,760 Speaker 1: that, and I get it, maybe you're like, listen, 440 00:27:16,880 --> 00:27:19,720 Speaker 1: I've got enough apps already, or I don't really want to 441 00:27:19,760 --> 00:27:25,639 Speaker 1: talk on your little microphone feature, I totally understand. 442 00:27:26,080 --> 00:27:28,000 Speaker 1: Another way to reach out to me is on Twitter. 443 00:27:28,359 --> 00:27:31,159 Speaker 1: The handle for the show is TechStuff 444 00:27:31,359 --> 00:27:40,320 Speaker 1: HSW, and I'll talk to you again really soon. Tech 445 00:27:40,400 --> 00:27:43,840 Speaker 1: Stuff is an I Heart Radio production. For more podcasts 446 00:27:43,880 --> 00:27:46,639 Speaker 1: from I Heart Radio, visit the I Heart Radio app, 447 00:27:46,760 --> 00:27:49,919 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.